Bae, Young-Hyeon; Ko, Mansoo; Lee, Suk Min
2016-04-29
Revised high-heeled shoes (HHSs) were designed to improve the shortcomings of standard HHSs. This study was conducted to compare revised and standard HHSs with regard to joint angles and electromyographic (EMG) activity of the lower extremities during standing. The participants were five healthy young women. Data regarding joint angles and EMG activity of the lower extremities were obtained under three conditions: barefoot, wearing revised HHSs, and wearing standard HHSs. Lower extremity joint angles in the three-dimensional plane were confirmed using a VICON motion capture system. EMG activity of the lower extremities was measured using active bipolar surface EMG. Kruskal-Wallis one-way analysis of variance by ranks was applied to analyze differences among the three standing conditions. Compared with the barefoot condition, lower extremity joint angles during standing deviated more in the standard HHS condition than in the revised HHS condition. EMG activity of the lower extremities differed in the revised HHS condition, but the differences among the three conditions were not significant. Wearing revised HHSs may positively impact joint angles and EMG activity of the lower extremities by improving body alignment while standing.
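The statistical comparison described above can be sketched in a few lines; the EMG amplitudes below are hypothetical placeholders, not the study's data.

```python
from scipy.stats import kruskal

# Hypothetical EMG amplitudes (%MVC) for one muscle under the three
# standing conditions; real values would come from the surface EMG system.
barefoot = [12.1, 10.8, 11.5, 13.0, 12.4]
revised_hhs = [13.2, 12.5, 14.0, 13.8, 12.9]
standard_hhs = [15.9, 14.7, 16.3, 15.1, 16.8]

# Kruskal-Wallis one-way analysis of variance by ranks across conditions
h_stat, p_value = kruskal(barefoot, revised_hhs, standard_hhs)
print(f"H = {h_stat:.2f}, p = {p_value:.3f}")
```

With only five participants per condition, as here, the test has little power, which is consistent with the non-significant EMG differences the study reports.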
NASA Technical Reports Server (NTRS)
Ramesham, Rajeshuni
2012-01-01
This paper provides the experimental test results of advanced CCGA packages tested in extreme temperature thermal environments. Standard optical inspection and x-ray non-destructive inspection tools were used to assess the reliability of high density CCGA packages for deep space extreme temperature missions. Ceramic column grid array (CCGA) packages have been increasing in use based on their advantages such as high interconnect density, very good thermal and electrical performance, compatibility with standard surface-mount packaging assembly processes, and so on. CCGA packages are used in space applications such as in logic and microprocessor functions, telecommunications, payload electronics, and flight avionics. As these packages have less solder joint strain relief than leaded packages but more strain relief than lead-less chip carrier packages, the reliability of CCGA packages is very important for short-term and long-term deep space missions. We have employed high density CCGA 1152 and 1272 daisy-chained electronic packages in this preliminary reliability study. Each package is divided into several daisy-chained sections. The physical dimensions of the CCGA1152 package are 35 mm x 35 mm with a 34 x 34 array of columns at a 1 mm pitch. The dimensions of the CCGA1272 package are 37.5 mm x 37.5 mm with a 36 x 36 array at a 1 mm pitch. The columns are made of 80%Pb/20%Sn material. CCGA interconnect electronic packages on polyimide printed wiring boards have been assembled and inspected using non-destructive x-ray imaging techniques. The assembled CCGA boards were subjected to extreme temperature atmospheric thermal cycling to assess their reliability for future deep space missions. The resistance of the daisy-chained interconnect sections was monitored continuously during thermal cycling.
Keywords: extreme temperatures, high density CCGA qualification, CCGA reliability, solder joint failures, optical inspection, x-ray inspection.
Optimizing Illumina next-generation sequencing library preparation for extremely AT-biased genomes.
Oyola, Samuel O; Otto, Thomas D; Gu, Yong; Maslen, Gareth; Manske, Magnus; Campino, Susana; Turner, Daniel J; Macinnis, Bronwyn; Kwiatkowski, Dominic P; Swerdlow, Harold P; Quail, Michael A
2012-01-03
Massively parallel sequencing technology is revolutionizing approaches to genomic and genetic research. Since its advent, the scale and efficiency of Next-Generation Sequencing (NGS) have rapidly improved. In spite of this success, sequencing genomes or genomic regions with extremely biased base composition remains a great challenge for the currently available NGS platforms. The genomes of some important pathogenic organisms like Plasmodium falciparum (high AT content) and Mycobacterium tuberculosis (high GC content) display extremes of base composition. The standard library preparation procedures that employ PCR amplification have been shown to cause uneven read coverage, particularly across AT- and GC-rich regions, leading to problems in genome assembly and variation analyses. Alternative library-preparation approaches that omit PCR amplification require large quantities of starting material and hence are not suitable for small amounts of DNA/RNA such as those from clinical isolates. We have developed and optimized library-preparation procedures suitable for low-quantity starting material and tolerant of extremely high AT content sequences. We have used our optimized conditions in parallel with standard methods to prepare Illumina sequencing libraries from a non-clinical and a clinical isolate (containing ~53% host contamination). By analyzing and comparing the quality of the sequence data generated, we show that our optimized conditions, which involve a PCR additive (TMAC), produce amplified libraries with improved coverage of extremely AT-rich regions and reduced bias toward GC-neutral templates. We have developed a robust and optimized Next-Generation Sequencing library amplification method suitable for extremely AT-rich genomes. The new amplification conditions significantly reduce bias and retain library complexity at either extreme of base composition. 
This development will greatly benefit sequencing clinical samples that often require amplification due to low mass of DNA starting material.
NASA Astrophysics Data System (ADS)
Bennett, D. L.; Brene, N.; Nielsen, H. B.
1987-01-01
The goal of random dynamics is the derivation of the laws of Nature as we know them (standard model) from inessential assumptions. The inessential assumptions made here are expressed as sets of general models at extremely high energies: gauge glass and spacetime foam. Both sets of models lead tentatively to the standard model.
ERIC Educational Resources Information Center
Lyons, Kevin J.; Greening, Shirley; Robeson, Mary
2000-01-01
A modified Delphi procedure assessed the content validity of accreditation standards for cardiovascular technologists, cytotechnologists, medical sonographers, electroneurodiagnostic technologists, medical assistants, perfusionists, physician assistants, and surgical technologists. Although validity and reliability were extremely high, some…
A comparative assessment of statistical methods for extreme weather analysis
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
Extreme weather exposure assessment is of major importance for scientists and practitioners alike. We compare different extreme value approaches and fitting methods with respect to their value for assessing extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series over the standardly used annual maxima series in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption that the threshold excess approach (employing partial duration series, PDS) is superior to the block maxima approach (employing annual maxima series, AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas the opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may far outweigh the possible gain of information from including additional extreme events. This effect was visible neither from the square-root criterion nor from the standardly used graphical diagnosis (mean residual life plot), but only from a direct comparison of AMS and PDS in synoptic quantile plots. We therefore recommend performing the AMS and PDS approaches simultaneously in order to select the best suited approach. This will make the analyses more robust, both in cases where threshold selection and dependency introduce biases to the PDS approach and in cases where the AMS contains non-extreme events that may introduce similar biases. 
For assessing the performance of extreme events we recommend conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of this study are of relevance for a broad range of environmental variables, including meteorological and hydrological quantities.
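A minimal sketch of the block maxima (AMS) side of the comparison, using synthetic data. Note that scipy fits the GEV by maximum likelihood, whereas the study found L-moment estimation more robust, so this stands in for the general workflow rather than the authors' estimator.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual maximum daily precipitation series (mm), 50 "years"
ams = genextreme.rvs(c=-0.1, loc=40, scale=10, size=50, random_state=rng)

# Fit a GEV to the annual maxima (scipy uses maximum likelihood; the
# more robust L-moment estimation is not provided out of the box)
shape, loc, scale = genextreme.fit(ams)

# Return level for a T-year return period: the (1 - 1/T) quantile
def return_level(T):
    return genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)

print(f"100-year return level: {return_level(100):.1f} mm")
```

The PDS counterpart would fit a generalized Pareto distribution to threshold excesses instead; comparing both in the same quantile plot is exactly the diagnostic the study recommends.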
NASA Astrophysics Data System (ADS)
Hatzopoulos, N.; Kim, S. H.; Kafatos, M.; Nghiem, S. V.; Myoung, B.
2016-12-01
Live Fuel Moisture is a dryness measure used by fire departments to determine how dry the fuels in forest areas currently are. To map Live Fuel Moisture, we conducted an analysis using a standardized regression approach on various vegetation indices derived from MODIS remote sensing data. After analyzing the results, we settled on mapping Live Fuel Moisture using a standardized NDVI product. From the mapped remotely sensed product, we observed that the appearance of extremely dry fuels was highly correlated with very dry years, as judged by overall yearly precipitation. The appearance of extremely dry mapped fuels tends to be directly associated with fire events and was also observed to be a post-fire indicator. In addition, we studied the appearance of extremely dry fuels during the critical months when the season changes from spring to summer, as well as their relation to fire events.
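The standardized NDVI product described can be illustrated with a simple z-score standardization; the NDVI values below are hypothetical, not MODIS data.

```python
import numpy as np

def standardized_index(series):
    """Standardize a vegetation-index series to zero mean, unit variance
    (a z-score), analogous to the standardized NDVI product described."""
    series = np.asarray(series, dtype=float)
    return (series - series.mean()) / series.std()

# Hypothetical seasonal NDVI values for one pixel across years
ndvi = [0.62, 0.58, 0.65, 0.41, 0.60, 0.37, 0.63]
z = standardized_index(ndvi)

# Strongly negative z-scores flag extremely dry fuel conditions
dry_years = [i for i, v in enumerate(z) if v < -1.0]
print(dry_years)
```

In this toy series the two low-NDVI years stand out as extreme-dryness candidates, mirroring the association with dry years reported above.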
NASA Astrophysics Data System (ADS)
Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro
2016-04-01
High temperatures have an impact on the energy balance of any living organism and on the operational capabilities of critical infrastructures. Heat-wave indicators have mainly been developed with the aim of capturing the potential impacts on specific sectors (agriculture, health, wildfires, transport, power generation and distribution). However, the ability to capture the occurrence of extreme temperature events is an essential property of a multi-hazard extreme climate indicator. The aim of this study is to develop a standardized heat-wave indicator that can be combined with other indices in order to describe multiple hazards in a single indicator. The proposed approach can be used to obtain a quantified indicator of the strength of a given extreme. Extremes are usually distributed following exponential or double-exponential functions, and it is difficult to quickly assess how strong an extreme event was from its magnitude alone. The proposed approach simplifies the quantitative and qualitative communication of extreme magnitude.
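One common way to build such a standardized indicator is an SPI-like empirical-CDF transform onto a standard-normal scale; this is a sketch under that assumption, not necessarily the authors' exact construction.

```python
import numpy as np
from scipy.stats import norm

def standardize_empirical(values):
    """Map raw index values to a standard-normal scale via empirical
    (Gringorten-type) plotting positions, as done for SPI-like
    standardized indicators."""
    values = np.asarray(values, dtype=float)
    ranks = values.argsort().argsort() + 1          # 1-based ranks
    prob = (ranks - 0.44) / (len(values) + 0.12)    # plotting positions
    return norm.ppf(prob)

# Hypothetical annual heat-wave magnitudes (exponential-like tail)
magnitudes = np.random.default_rng(0).exponential(scale=2.0, size=30)
shi = standardize_empirical(magnitudes)
print(f"strongest event on the standardized scale: {shi.max():.2f}")
```

On this scale a value of, say, 2 is immediately readable as a roughly 1-in-40 event, which is the communication benefit the abstract argues for.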
Hot air vulcanization of rubber profiles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gerlach, J.
1995-07-01
Elastomer profiles are deployed in quantity by the automobile industry as seals and waterproofing in coachwork. The high standards demanded by the industry (improved weather protection, noise reduction, tighter tolerances), together with powerful demand for EPDM, force the rubber processing industry into development, particularly of elastomers. Complex proofing systems must also be achieved with extremely complicated profile forms. All too often such profiles have an extremely large surface together with a low cross-section density. They frequently consist of two or three rubber compounds and are steel reinforced. Sometimes they are flocked and coated with a low-friction finish. Such high-tech seals require an adjustment of the vulcanization method. The consistent trend in the nineties towards lower quantities of elastomer per sealing unit, and the dielectric factor, especially with EPDM, have brought an old-fashioned vulcanization method once more to the fore, a method developed over the past years to an extremely high standard: the hot-air method. This paper describes various vulcanization and curing methods and their relative merits and disadvantages, the Gerlach hot-air concept, the hot-air installation concept, and the energy saving and efficiency afforded by this technique. 4 figs.
NASA Astrophysics Data System (ADS)
Schlögl, Matthias; Laaha, Gregor
2017-04-01
The assessment of road infrastructure exposure to extreme weather events is of major importance for scientists and practitioners alike. In this study, we compare the different extreme value approaches and fitting methods with respect to their value for assessing the exposure of transport networks to extreme precipitation and temperature impacts. Based on an Austrian data set from 25 meteorological stations representing diverse meteorological conditions, we assess the added value of partial duration series (PDS) over the standardly used annual maxima series (AMS) in order to give recommendations for performing extreme value statistics of meteorological hazards. Results show the merits of the robust L-moment estimation, which yielded better results than maximum likelihood estimation in 62 % of all cases. At the same time, results question the general assumption of the threshold excess approach (employing PDS) being superior to the block maxima approach (employing AMS) due to information gain. For low return periods (non-extreme events) the PDS approach tends to overestimate return levels as compared to the AMS approach, whereas an opposite behavior was found for high return levels (extreme events). In extreme cases, an inappropriate threshold was shown to lead to considerable biases that may outperform the possible gain of information from including additional extreme events by far. This effect was visible from neither the square-root criterion nor standardly used graphical diagnosis (mean residual life plot) but rather from a direct comparison of AMS and PDS in combined quantile plots. We therefore recommend performing AMS and PDS approaches simultaneously in order to select the best-suited approach. This will make the analyses more robust, not only in cases where threshold selection and dependency introduce biases to the PDS approach but also in cases where the AMS contains non-extreme events that may introduce similar biases. 
For assessing the performance of extreme events we recommend the use of conditional performance measures that focus on rare events only in addition to standardly used unconditional indicators. The findings of the study directly address road and traffic management but can be transferred to a range of other environmental variables including meteorological and hydrological quantities.
Lee, Kevin; Murphy, Patrick B; Ingves, Matthew V; Duncan, Audra; DeRose, Guy; Dubois, Luc; Forbes, Thomas L; Power, Adam
2017-12-01
Surgical site infection (SSI) after groin incision for lower extremity revascularization can lead to significant morbidity and mortality. This trial was designed to study the effect of negative pressure wound therapy (NPWT) on SSI in closed groin wounds after lower extremity revascularization in patients at high risk for SSI. A single-center, randomized, controlled trial was performed at an academic tertiary medical center. Patients with previous femoral artery surgical exposure, a body mass index of >30 kg/m², or the presence of ischemic tissue loss were classified as high-risk patients for SSI. All wounds were closed primarily, and patients were randomized to either NPWT or a standard dressing. The primary outcome of the trial was postoperative 30-day SSI in the groin wound. The secondary outcomes included 90-day SSI, hospital duration of stay, readmissions or reoperations for SSI, and mortality. A total of 102 patients were randomized between August 2014 and December 2015. Patients were classified as high risk owing to the presence of previous femoral artery cut-down (29%), a body mass index of >30 kg/m² (39%), or the presence of ischemic tissue loss (32%). Revascularization procedures performed included femoral to distal artery bypass (57%), femoral endarterectomy (18%), femoral to femoral artery crossover (17%), and other procedures (8%). The primary outcome of 30-day SSI occurred in 11% of the NPWT group versus 19% of the standard dressing group (P = .24). There was a statistically significantly shorter mean duration of hospital stay in the NPWT group (6.4 days) compared with the standard group (8.9 days; P = .01). There was no difference in readmission or reoperation for SSI or mortality between the two groups. This study demonstrated a nonsignificantly lower rate of groin SSI in high-risk revascularization patients with NPWT compared with a standard dressing. 
Owing to a lower than expected infection rate, the study was underpowered to detect a difference at the prespecified level. The NPWT group did show significantly shorter mean hospital duration of stay compared with the standard dressing group. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.
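The primary-outcome comparison is a simple two-proportion test; the per-arm counts below are assumptions chosen to be consistent with the reported 11% vs. 19% rates, since the abstract does not give them.

```python
from scipy.stats import fisher_exact

# Hypothetical allocation consistent with the reported rates
npwt_ssi, npwt_total = 6, 53          # ~11% 30-day SSI
standard_ssi, standard_total = 9, 49  # ~19% 30-day SSI

table = [[npwt_ssi, npwt_total - npwt_ssi],
         [standard_ssi, standard_total - standard_ssi]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```

With ~50 patients per arm and event rates this low, a difference of this size is well within chance variation, which is the underpowering issue the authors note.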
NASA Astrophysics Data System (ADS)
Ramesham, Rajeshuni
2012-03-01
Ceramic column grid array (CCGA) packages have been increasing in use based on their advantages such as high interconnect density, very good thermal and electrical performance, compatibility with standard surface-mount packaging assembly processes, and so on. CCGA packages are used in space applications such as in logic and microprocessor functions, telecommunications, payload electronics, and flight avionics. As these packages have less solder joint strain relief than leaded packages but more strain relief than lead-less chip carrier packages, the reliability of CCGA packages is very important for short-term and long-term deep space missions. We have employed high density CCGA 1152 and 1272 daisy-chained electronic packages in this preliminary reliability study. Each package is divided into several daisy-chained sections. The physical dimensions of the CCGA1152 package are 35 mm x 35 mm with a 34 x 34 array of columns at a 1 mm pitch. The dimensions of the CCGA1272 package are 37.5 mm x 37.5 mm with a 36 x 36 array at a 1 mm pitch. The columns are made of 80%Pb/20%Sn material. CCGA interconnect electronic packages on polyimide printed wiring boards have been assembled and inspected using non-destructive x-ray imaging techniques. The assembled CCGA boards were subjected to extreme temperature atmospheric thermal cycling to assess their reliability for future deep space missions. The resistance of the daisy-chained interconnect sections was monitored continuously during thermal cycling. This paper provides the experimental test results of advanced CCGA packages tested in extreme temperature thermal environments. Standard optical inspection and x-ray non-destructive inspection tools were used to assess the reliability of high density CCGA packages for deep space extreme temperature missions.
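The in-situ failure detection described (continuous daisy-chain resistance monitoring during thermal cycling) can be sketched as a threshold check. The 20% resistance-increase criterion and the log values are assumptions for illustration (IPC-9701-style criteria are common), not the paper's data.

```python
import numpy as np

def detect_opens(resistance, baseline=None, threshold=1.2):
    """Flag cycles where a daisy-chain section's resistance exceeds the
    baseline by a given factor; a sustained jump is commonly treated as
    an interconnect (solder column) failure. The 20% threshold here is
    an assumption, not the paper's criterion."""
    resistance = np.asarray(resistance, dtype=float)
    if baseline is None:
        baseline = resistance[0]
    return np.flatnonzero(resistance > threshold * baseline)

# Hypothetical in-situ resistance log (ohms) for one daisy-chained
# section across successive thermal cycles
log = [1.00, 1.01, 1.02, 1.01, 1.45, 1.50]
print(detect_opens(log))
```

A sustained jump like the one at the end of this toy log would then be correlated with the x-ray and optical inspection findings for that section.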
Best Available Evidence: Three Complementary Approaches
ERIC Educational Resources Information Center
Slocum, Timothy A.; Spencer, Trina D.; Detrich, Ronnie
2012-01-01
The best available evidence is one of the three critical features of evidence-based practice. Best available evidence is often considered to be synonymous with extremely high standards for research methodology. However, this notion may limit the scope and impact of evidence based practice to those educational decisions on which high quality…
Quantifying the relationship between extreme air pollution events and extreme weather events
NASA Astrophysics Data System (ADS)
Zhang, Henian; Wang, Yuhang; Park, Tae-Won; Deng, Yi
2017-05-01
Extreme weather events can strongly affect surface air quality, which has become a major environmental factor affecting human health. Here, we examined the relationship between extreme ozone and PM2.5 (particulate matter with an aerodynamic diameter less than 2.5 μm) events and representative meteorological parameters such as daily maximum temperature (Tmax), minimum relative humidity (RHmin), and minimum wind speed (Vmin), using location-specific 95th or 5th percentile thresholds derived from historical reanalysis data (30 years for ozone and 10 years for PM2.5). We found that ozone and PM2.5 extremes were decreasing over the years, reflecting the EPA's tightened standards and efforts to reduce the corresponding precursor emissions. Annual ozone and PM2.5 extreme days were highly correlated with Tmax and RHmin, especially in the eastern U.S. They were positively (negatively) correlated with Vmin in urban (rural and suburban) stations. The overlapping ratios of ozone extreme days with Tmax were fairly constant, about 32%, and tended to be high in fall and low in winter. Ozone extreme days were most sensitive to Tmax, then RHmin, and least sensitive to Vmin. The majority of ozone extremes occurred when Tmax was between 300 K and 320 K, RHmin was less than 40%, and Vmin was less than 3 m/s. The number of annual extreme PM2.5 days was highly positively correlated with the number of extreme RHmin/Tmax days, with the PM2.5/RHmin correlation coefficient highest in urban and suburban regions and the PM2.5/Tmax correlation coefficient highest in rural areas. Tmax has more impact on PM2.5 extremes over the eastern U.S. Extreme PM2.5 days were more likely to occur at low RH conditions in the central and southeastern U.S., especially during springtime, and at high RH conditions in the northern U.S. and the Great Plains. Most extreme PM2.5 events occurred when Tmax was between 300 K and 320 K and RHmin was between 10% and 50%. 
Extreme PM2.5 days usually occurred when Vmin was under 2 m/s. However, during spring season in the Southeast and fall season in Northwest, high winds were found to accompany extreme PM2.5 days, likely reflecting the impact of fire emissions.
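The percentile-threshold definition of an extreme day used in this study can be sketched as follows, with synthetic data standing in for the reanalysis record.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 10-year daily Tmax record (K) for one location
tmax = rng.normal(295, 8, size=3650)

# Location-specific 95th-percentile threshold, as in the study
threshold = np.percentile(tmax, 95)
extreme_days = tmax > threshold

print(f"threshold = {threshold:.1f} K, extreme days = {extreme_days.sum()}")
```

The overlapping ratio reported above would then be the fraction of, say, ozone-extreme days that also fall on `extreme_days == True` dates.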
The Extreme Climate Index: a novel and multi-hazard index for extreme weather events.
NASA Astrophysics Data System (ADS)
Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro
2017-04-01
In this presentation we introduce the Extreme Climate Index (ECI): an objective, multi-hazard index capable of tracking changes in the frequency or magnitude of extreme weather events in African countries, thus indicating that a shift to a new climate regime is underway in a particular area. This index has been developed in the context of the XCF (eXtreme Climate Facilities) project led by ARC (African Risk Capacity, a specialised agency of the African Union), and will be used in the payout-triggering mechanism of an insurance programme against risks related to the increase in frequency and magnitude of extreme weather events due to changes in climate regimes. The main hazards covered by the ECI will be extreme dry, wet, and heat events, with the possibility of adding region-specific risk events such as tropical cyclones for the most vulnerable areas. It will be based on data coming from consistent, sufficiently long, high-quality historical records and will be standardized across broad geographical regions, so that extreme events occurring under different climatic regimes in Africa can be compared. The first step in constructing such an index is to define single-hazard indicators. In this first study we focused on extreme dry/wet and heat events, describing them respectively with the well-known SPI (Standardized Precipitation Index) and an index we developed, called the SHI (Standardized Heat-waves Index). The second step consists of developing a computational strategy to combine these, and possibly other, indices so that the ECI can describe, by means of a single indicator, different types of climatic extremes. According to the methodology proposed in this paper, the ECI is defined by two statistical components: the ECI intensity, which indicates whether an event is extreme or not, and the angular component, which represents the contribution of each hazard to the overall intensity of the index. 
The ECI can thus be used to identify "extremes" after defining a suitable threshold above which events can be held as extreme. In this presentation, after describing the methodology we used for the construction of the ECI, we present results obtained for different African regions, using the NCEP Reanalysis dataset for air temperature at the sig995 level and the CHIRP dataset for precipitation. Particular attention will be devoted to the 2015/2016 Malawi drought, which received some media attention due to the failure of the risk assessment model used to trigger due payouts: it will be shown how, on the contrary, the combination of hydrological and temperature data used in the ECI succeeded in evaluating the extremeness of this event.
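A minimal sketch of the two-component decomposition described (intensity plus angular contribution), assuming a Euclidean combination of two standardized indices; the actual ECI combination rule may differ.

```python
import numpy as np

def eci(spi, shi):
    """Combine two standardized hazard indices into the two statistical
    components described for the ECI: an overall intensity (here the
    Euclidean norm) and an angular component giving each hazard's
    contribution. The exact combination rule is an assumption."""
    intensity = np.hypot(spi, shi)
    angle = np.degrees(np.arctan2(shi, spi))
    return intensity, angle

# A hypothetical wet-and-hot month: SPI = 1.8 (very wet), SHI = 2.1
intensity, angle = eci(1.8, 2.1)
print(f"intensity = {intensity:.2f}, angle = {angle:.1f} deg")
```

An angle near 90° would mean the heat hazard dominates, near 0° the precipitation hazard; the intensity alone is compared against the payout threshold.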
NASA Astrophysics Data System (ADS)
Platonov, Vladimir S.; Kislov, Alexander V.
2016-11-01
A statistical analysis of extreme weather events over coastal areas of the Russian Arctic based on observational data has revealed many interesting features of wind velocity distributions. It has been shown that the extremes contain data belonging to two different statistical populations, each of which is reliably described by a Weibull distribution. Following the standard terminology, these sets of extremes are named 'black swans' and 'dragons'. The 'dragons' are responsible for most extremes, surpassing the 'black swans' by 10-30 %. Since the data of the global climate model INM-CM4 do not contain 'dragons', the wind speed extremes were investigated on the mesoscale using the COSMO-CLM model. The modelling results reveal no differences between the 'swan' and 'dragon' situations. This could be an artifact of the limited sample used; however, based on many case studies and modelling results, we assume that it is caused by a rare superposition of large-scale synoptic factors and many local meso- and microscale factors (surface, coastline configuration, etc.). Further studies of extreme wind speeds in the Arctic, such as 'black swans' and 'dragons', should focus on non-hydrostatic high-resolution atmospheric modelling using downscaling techniques.
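Fitting a Weibull distribution to wind-speed extremes, as done here for each of the two populations, can be sketched with synthetic data standing in for the station observations.

```python
import numpy as np
from scipy.stats import weibull_min

rng = np.random.default_rng(7)
# Synthetic wind-speed extremes (m/s) from a single Weibull population
speeds = weibull_min.rvs(c=2.0, scale=12.0, size=200, random_state=rng)

# Fit a two-parameter Weibull (location fixed at zero), a standard
# choice for wind-speed distributions
shape, loc, scale = weibull_min.fit(speeds, floc=0)
print(f"shape k = {shape:.2f}, scale A = {scale:.2f} m/s")
```

For the mixture described in the abstract, one would fit the 'swan' and 'dragon' subsamples separately and compare the recovered shape/scale pairs.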
Estimating extreme stream temperatures by the standard deviate method
NASA Astrophysics Data System (ADS)
Bogan, Travis; Othmer, Jonathan; Mohseni, Omid; Stefan, Heinz
2006-02-01
It is now widely accepted that global climate warming is taking place on the earth. Among many other effects, a rise in air temperatures is expected to increase stream temperatures. However, due to evaporative cooling, stream temperatures do not increase linearly with increasing air temperatures indefinitely. Within the anticipated bounds of climate warming, extreme stream temperatures may therefore not rise substantially. With this concept in mind, past extreme temperatures measured at 720 USGS stream gauging stations were analyzed by the standard deviate method. In this method the highest stream temperatures are expressed as the mean temperature of a measured partial maximum stream temperature series plus its standard deviation multiplied by a factor KE (the standard deviate). Various KE values were explored; values of KE larger than 8 were found physically unreasonable. It is concluded that the value of KE should be in the range from 7 to 8. A unit error in estimating KE translates into a typical stream temperature error of about 0.5 °C. Using a logistic model for the stream temperature/air temperature relationship, a one degree error in air temperature gives a typical error of 0.16 °C in stream temperature. With a projected error in the enveloping standard deviate dKE = 1.0 (range 0.5-1.5) and an error in projected high air temperature dTa = 2 °C (range 0-4 °C), the total projected stream temperature error is estimated as dTs = 0.8 °C.
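The standard deviate method itself is a one-line estimator; a sketch with hypothetical station data (the series values and the mid-range K_E are illustrative only).

```python
import numpy as np

def extreme_stream_temperature(partial_max_series, k_e=7.5):
    """Standard deviate method: the extreme stream temperature is the
    mean of a measured partial maximum temperature series plus K_E
    sample standard deviations; the study bounds K_E between 7 and 8."""
    series = np.asarray(partial_max_series, dtype=float)
    return series.mean() + k_e * series.std(ddof=1)

# Hypothetical partial maximum series (deg C) for one gauging station
maxima = [27.1, 27.8, 26.9, 28.0, 27.4, 27.6]
print(f"{extreme_stream_temperature(maxima):.1f} deg C")
```

The ~0.5 °C per unit of K_E sensitivity quoted above corresponds to a series standard deviation of about 0.5 °C, in line with the small spread of such partial maximum series.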
Digital Semaphore: Technical Feasibility of QR Code Optical Signaling for Fleet Communications
2013-06-01
Standards (http://www.iso.org); JIS: Japanese Industrial Standard; JPEG: Joint Photographic Experts Group (digital image format; http://www.jpeg.org); LED... Denso Wave Corporation in the 1990s for the Japanese automotive manufacturing industry. See Appendix A for full details. Reed-Solomon Error... eliminates camera blur induced by the shutter, providing clear images at extremely high frame rates. Thus, digital cinema cameras are more suitable
Kim, Jin Woo; Choo, Ki Seok; Jeon, Ung Bae; Kim, Tae Un; Hwang, Jae Yeon; Yeom, Jeong A; Jeong, Hee Seok; Choi, Yoon Young; Nam, Kyung Jin; Kim, Chang Won; Jeong, Dong Wook; Lim, Soo Jin
2016-07-01
Multi-detector computed tomography (MDCT) angiography is now used for diagnosing patients with peripheral arterial disease. The radiation dose depends on several factors, such as tube current, tube voltage, and helical pitch. To assess the diagnostic performance and radiation dose of lower extremity CT angiography (CTA) using a 128-slice dual-source CT at 80 kVp and high pitch in patients with critical limb ischemia (CLI). Twenty-eight patients (mean age, 64.1 years; range, 39-80 years) with CLI were enrolled in this retrospective study and underwent CTA using a 128-slice dual-source CT at 80 kVp and high pitch, with subsequent intra-arterial digital subtraction angiography (DSA) used as the reference standard for assessing diagnostic performance. For arterial segments with significant disease (>50% stenosis), the overall sensitivity, specificity, and accuracy of lower extremity CTA were 94.8% (95% CI, 91.7-98.0%), 91.5% (95% CI, 87.7-95.2%), and 93.1% (95% CI, 90.6-95.6%), respectively, and its positive and negative predictive values were 91.0% (95% CI, 87.1-95.0%) and 95.1% (95% CI, 92.1-98.1%), respectively. The mean radiation dose delivered to the lower extremities was 266.6 mGy·cm. Lower extremity CTA using a 128-slice dual-source CT at 80 kVp and high pitch was found to have good diagnostic performance for the assessment of patients with CLI at an extremely low radiation dose. © The Foundation Acta Radiologica 2015.
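The reported diagnostic metrics follow from a standard 2x2 confusion matrix against the DSA reference; the per-segment counts below are hypothetical, chosen only to be consistent with the reported percentages.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, accuracy, PPV, and NPV from a 2x2
    table, the quantities reported against the DSA reference standard."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    acc = (tp + tn) / (tp + fp + fn + tn)
    ppv = tp / (tp + fp)
    npv = tn / (tn + fn)
    return sens, spec, acc, ppv, npv

# Hypothetical segment counts (the abstract reports only percentages)
sens, spec, acc, ppv, npv = diagnostic_metrics(tp=182, fp=18, fn=10, tn=193)
print(f"sens={sens:.1%} spec={spec:.1%} acc={acc:.1%} "
      f"ppv={ppv:.1%} npv={npv:.1%}")
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on the prevalence of significant stenosis in the studied segments.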
Detection of meteorological extreme effect on historical crop yield anomaly
NASA Astrophysics Data System (ADS)
Kim, W.; Iizumi, T.; Nishimori, M.
2017-12-01
Meteorological extremes of temperature and precipitation are a critical issue in global climate change, and studies investigating how these extremes change in accordance with climate change are continuously reported. However, little is known about how the extremes affect crop yield worldwide through heatwaves, cold waves, droughts, and floods, although some local and national reports are available. Therefore, we globally investigated the effects of extremes on the variability of the historical yields of maize, rice, soy, and wheat using a standardized index and historical yield anomalies. For the regression analysis, the standardized index was aggregated annually in consideration of a crop calendar, and the historical yield was detrended with a 5-year moving average. Throughout this investigation, we found that the relationship between the aggregated standardized index and the historical yield anomaly shows not only positive but also negative correlations for all crops across the globe. That is, the extremes reduce crop yields in many regions, as expected, but by contrast increase them in others. These results help us quantify the effect of extremes on historical crop yield anomalies.
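The detrending-plus-correlation step can be sketched as follows; the yield series and the aggregated standardized index values are hypothetical.

```python
import numpy as np

def yield_anomaly(yields, window=5):
    """Detrend a historical yield series with a centered 5-year moving
    average and return relative anomalies, as in the analysis described."""
    yields = np.asarray(yields, dtype=float)
    kernel = np.ones(window) / window
    trend = np.convolve(yields, kernel, mode="valid")
    aligned = yields[window // 2 : -(window // 2)]
    return (aligned - trend) / trend

# Hypothetical national maize yields (t/ha)
yields = [4.0, 4.2, 4.1, 4.5, 3.6, 4.7, 4.8, 4.6, 5.0, 5.2]
anom = yield_anomaly(yields)

# Correlate against a hypothetical annually aggregated standardized
# extreme index for the same (trimmed) years
index = np.array([-0.2, 0.4, -1.5, 0.3, 0.1, -0.4])
r = np.corrcoef(anom, index)[0, 1]
print(f"r = {r:.2f}")
```

The sign of r for each crop-region pair is exactly the positive/negative distinction the study maps globally.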
Extreme value problems without calculus: a good link with geometry and elementary maths
NASA Astrophysics Data System (ADS)
Ganci, Salvatore
2016-11-01
Some classical examples of problem solving, where an extreme value condition is required, are here considered and/or revisited. The search for non-calculus solutions appears pedagogically useful and intriguing, as shown by a rich literature. A teacher who teaches both maths and physics (as happens in Italian high schools) can find in these kinds of problems a mind-stimulating exercise compared with the standard solution obtained by differential calculus. A good link between the geometric and analytical explanations is thus established.
Measurement of the Thermal Expansion Coefficient for Ultra-High Temperatures up to 3000 K
NASA Astrophysics Data System (ADS)
Kompan, T. A.; Kondratiev, S. V.; Korenev, A. S.; Puhov, N. F.; Inochkin, F. M.; Kruglov, S. K.; Bronshtein, I. G.
2018-03-01
The paper is devoted to a new high-temperature dilatometer, a part of the State Primary Standard of the thermal expansion coefficient (TEC) unit. The dilatometer is designed for investigation and certification of materials for TEC standards in the range of extremely high temperatures. The critical review of existing methods of TEC measurements is given. Also, the design, principles of operation and metrological parameters of the new device are described. The main attention is paid to the system of machine vision that allows accurate measurement of elongation at high temperatures. The results of TEC measurements for graphite GIP-4, single crystal Al2O3, and some other materials are also presented.
Baarts, C; Mikkelsen, K L; Hannerz, H; Tüchsen, F
2000-12-01
Data indicates that Denmark has relatively high risks of occupational injuries. We evaluated all injuries resulting in hospitalization by occupation. All gainfully employed men younger than 60 in 1990 were divided into 47 industrial groups and followed using the National Inpatient Registry, for hospitalized injuries 1991-1993. Following ICD-8, injuries were grouped into six categories: head, upper extremities, back, trunk, lower extremities and ruptures, sprains and strains. Standardized industrial hospitalization ratios (SHRs) were calculated and Pearson's independence test was performed for each category. Industrial differences were ascertained for each injury category. The highest associated injury category was upper extremity injuries ranging from SHR = 43 (fire services and salvage corps) to SHR = 209 (slaughterhouse industry). Carpentry, joinery, bricklaying and construction work had significantly high SHRs for all injury categories, whereas administrative work was significantly low throughout. Occupational surveillance systems based on hospitalized injuries can be used to identify high-risk industries, and thereby suggest where to direct prevention efforts. Copyright 2000 Wiley-Liss, Inc.
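A standardized hospitalization ratio is the ratio of observed to expected hospitalizations, scaled by 100, with the expectation built from stratum-specific reference rates; a minimal sketch with hypothetical strata and rates (the study's actual standardization variables are not given in the abstract):

```python
def shr(observed, person_years_by_stratum, reference_rates):
    """Standardized hospitalization ratio: 100 * observed / expected, where
    expected sums person-years per stratum times the reference rate per stratum."""
    expected = sum(py * rate for py, rate in
                   zip(person_years_by_stratum, reference_rates))
    return 100.0 * observed / expected

# hypothetical age strata for one industrial group
print(shr(observed=52,
          person_years_by_stratum=[1000, 2000, 1500],
          reference_rates=[0.005, 0.008, 0.012]))
```

An SHR above 100 (e.g., the slaughterhouse industry's 209) means more hospitalized injuries than expected from the reference population; below 100 (e.g., 43) means fewer.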
NASA Technical Reports Server (NTRS)
Carr, Gregory A.; Iannello, Christopher J.; Chen, Yuan; Hunter, Don J.; DelCastillo, Linda; Bradley, Arthur T.; Stell, Christopher; Mojarradi, Mohammad M.
2013-01-01
This paper presents a concept for a modular and scalable High Temperature Boost (HTB) Power Processing Unit (PPU) capable of operating at temperatures beyond the standard military temperature range. The various extreme-environment technologies are also described as the fundamental technology path to this concept. The proposed HTB PPU is intended for power processing in the area of space solar electric propulsion, where reduction of in-space mass and volume is desired, and sometimes even critical, to achieving the goals of future space flight missions. The concept of the HTB PPU can also be applied to other extreme-environment applications, such as geothermal and petroleum deep-well drilling, where higher temperature operation is required.
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Rocco, Stefania Di; Staehelin, Johannes; Maeder, Joerg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; de Backer, Hugo; Koehler, Ulf; Krzyścin, Janusz; Vaníček, Karel
2011-11-01
We apply methods from extreme value theory to identify extreme events in high (termed EHOs) and low (termed ELOs) total ozone and to describe the distribution tails (i.e. very high and very low values) of five long-term European ground-based total ozone time series. The influence of these extreme events on observed mean values, long-term trends and changes is analysed. The results show a decrease in EHOs and an increase in ELOs during the last decades, and establish that the observed downward trend in column ozone during the 1970-1990s is strongly dominated by changes in the frequency of extreme events. Furthermore, it is shown that clear ‘fingerprints’ of atmospheric dynamics (NAO, ENSO) and chemistry [ozone depleting substances (ODSs), polar vortex ozone loss] can be found in the frequency distribution of ozone extremes, even if no attribution is possible from standard metrics (e.g. annual mean values). The analysis complements earlier analysis for the world's longest total ozone record at Arosa, Switzerland, confirming and revealing the strong influence of atmospheric dynamics on observed ozone changes. The results provide clear evidence that in addition to ODS, volcanic eruptions and strong/moderate ENSO and NAO events had significant influence on column ozone in the European sector.
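The paper uses extreme value theory to characterize the distribution tails; as a much simpler illustration of identifying extreme-high (EHO) and extreme-low (ELO) events and tracking their frequency over time, one can count exceedances of record-wide quantile thresholds per year. The quantile levels here are assumptions, not the paper's EVT-based definitions:

```python
import numpy as np

def extreme_counts(ozone, years, hi_q=0.99, lo_q=0.01):
    """Count extreme-high (EHO) and extreme-low (ELO) total-ozone days per year,
    with thresholds fixed at record-wide quantiles."""
    hi, lo = np.quantile(ozone, [hi_q, lo_q])
    out = {}
    for y in np.unique(years):
        sel = years == y
        out[int(y)] = (int((ozone[sel] > hi).sum()),   # EHOs this year
                       int((ozone[sel] < lo).sum()))   # ELOs this year
    return out
```

A trend in these per-year counts (fewer EHOs, more ELOs) is the kind of frequency change the abstract reports dominating the 1970-1990s column-ozone decline.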
A Standard Atmosphere of the Antarctic Plateau
NASA Technical Reports Server (NTRS)
Mahesh, Ashwin; Lubin, Dan
2004-01-01
Climate models often rely on standard atmospheres to represent various regions; these broadly capture the important physical and radiative characteristics of regional atmospheres, and become benchmarks for simulations by researchers. The high Antarctic plateau is a significant region of the earth for which such standard atmospheres are as yet unavailable. Moreover, representative profiles from atmospheres over other regions of the planet, including from the northern high latitudes, are not comparable to the atmosphere over the Antarctic plateau, and are therefore only of limited value as substitutes in climate models. Using data from radiosondes, ozonesondes and satellites along with other observations from South Pole station, typical seasonal atmospheric profiles for the high plateau are compiled. Proper representations of rapidly changing ozone concentrations (during the ozone hole) and the effect of surface elevation on tropospheric temperatures are discussed. The differences between standard profiles developed here and the most similar standard atmosphere that already exists - namely, the Arctic Winter profile - suggest that these new profiles will be extremely useful for making accurate representations of the atmosphere over the high plateau.
Call for national dialogue: Adapting standards of care in extreme events. We are not ready.
Cusack, Lynette; Gebbie, Kristine
Clinical practices are based on a common understanding of nursing's professional standards in all aspects of patient care, no matter what the circumstances are. Circumstances can, however, change dramatically due to emergencies, disasters, or pandemics, and may make it difficult to meet the standard of care in the way nurses are accustomed to. The Australian nursing profession has not yet facilitated a broad discussion and debate at the professional and institutional level about adapting standards of care under extreme conditions, a dialogue which goes beyond the content of basic emergency and disaster preparedness. The purpose of this paper is to encourage discussion within the nursing profession on this important ethical and legal issue. A comprehensive review of the literature was undertaken to determine the state of the evidence in relation to adapting standards of care under extreme conditions. Content analysis of the literature identified categories related to adapting standards of care, already considered by individuals or groups elsewhere, that should be considered in Australia should a dialogue be undertaken. The categories include ethical expectations of professional practice; legal interpretation of care requirements; resource prioritization between hospital and public health; and informing communities. Literature reviews and commentary may provide the background for a national dialogue on the nursing response in an extreme event. However, it is only with the engagement of a broadly representative segment of the professional nursing community that appropriate guidance on adapting standards of care under extreme conditions can be developed and then integrated into the professional worldview of nursing in Australia.
A novel approach for detecting heat waves: the Standardized Heat-Wave Index.
NASA Astrophysics Data System (ADS)
Cucchi, Marco; Petitta, Marcello; Calmanti, Sandro
2016-04-01
Extreme temperatures have an impact on the energy balance of any living organism and on the operational capabilities of critical infrastructures. The ability to capture the occurrence of extreme temperature events is therefore an essential property of a multi-hazard extreme climate indicator. In this paper we introduce a new index for the detection of such extreme temperature events, called the SHI (Standardized Heat-Wave Index), developed in the context of the XCF project for the construction of a multi-hazard extreme climate indicator (ECI). The SHI is a probabilistic index based on the analysis of maximum daily temperature time series; it is standardized, enabling comparisons over space/time and with other indices, and it is capable of describing both extreme cold and hot events. Given a particular location, the SHI is constructed from the time series of local maximum daily temperatures with the following procedure: a three-day cumulated maximum daily temperature is assigned to each day of the time series; for each of these values, the probability of occurrence within the calendar month the reference day belongs to is computed; these probabilities are then projected onto a standard normal distribution, yielding the standardized index. In this work we present results obtained using the NCEP Reanalysis dataset for air temperature at the sigma 0.995 level, whose timespan ranges from 1948 to 2014. Given the specific framework of this work, the geographical focus of the study is limited to the African continent. We present a validation of the index by showing its use for monitoring heat waves under different climate regimes.
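The three-step construction of the SHI described above can be sketched directly; the plotting-position probability estimator and the alignment of the 3-day accumulation window are assumptions, since the abstract does not specify them:

```python
import numpy as np
from statistics import NormalDist

def shi(tmax, months):
    """Sketch of the SHI: 3-day cumulated daily maximum temperature is
    converted to an empirical probability within its calendar month and
    then mapped onto a standard normal distribution."""
    cum3 = np.convolve(tmax, np.ones(3), mode="same")  # 3-day accumulation
    out = np.empty_like(cum3, dtype=float)
    inv = NormalDist().inv_cdf
    for m in np.unique(months):
        sel = np.where(months == m)[0]
        ranks = np.argsort(np.argsort(cum3[sel])) + 1    # 1..n, assuming no ties
        out[sel] = [inv(r / (len(sel) + 1)) for r in ranks]  # plotting position
    return out
```

The standardization (last step) is what makes SHI values comparable across locations and with other standardized indices such as the SPI.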
GROMACS 4: Algorithms for Highly Efficient, Load-Balanced, and Scalable Molecular Simulation.
Hess, Berk; Kutzner, Carsten; van der Spoel, David; Lindahl, Erik
2008-03-01
Molecular simulation is an extremely useful, but computationally very expensive, tool for studies of chemical and biomolecular systems. Here, we present a new implementation of our molecular simulation toolkit GROMACS which now both achieves extremely high performance on single processors, from algorithmic optimizations and hand-coded routines, and simultaneously scales very well on parallel machines. The code encompasses a minimal-communication domain decomposition algorithm, full dynamic load balancing, a state-of-the-art parallel constraint solver, and efficient virtual site algorithms that allow removal of hydrogen atom degrees of freedom to enable integration time steps up to 5 fs for atomistic simulations also in parallel. To improve the scaling properties of the common particle mesh Ewald electrostatics algorithms, we have in addition used a Multiple-Program, Multiple-Data approach, with separate node domains responsible for direct and reciprocal space interactions. Not only does this combination of algorithms enable extremely long simulations of large systems, but it also delivers high simulation performance on quite modest numbers of standard cluster nodes.
Reflections on STEM, Standards, and Disciplinary Focus
ERIC Educational Resources Information Center
Reed, Philip A.
2018-01-01
Technology Education as a discipline is at a historical point of two extremes. On one hand it is clear that what we do in technology education is highly valued; after all, imitation is said to be the sincerest form of flattery. The proliferation of "Makermania," technical competitions, engineering design in Next Generation Science…
Barron, John A.; Bukry, David B.; Hendy, Ingrid L.
2015-01-01
Diatom and silicoflagellate assemblages documented in a high-resolution time series spanning 800 to 1600 AD in varved sediment recovered in Kasten core SPR0901-02KC (34°16.845’ N, 120°02.332’ W, water depth 588 m) from the Santa Barbara Basin (SBB) reveal that SBB surface water conditions during the Medieval Climate Anomaly (MCA) and the early part of the Little Ice Age (LIA) were not extreme by modern standards, mostly falling within one standard deviation of mean conditions during the pre-anthropogenic interval of 1748 to 1900. No clear differences between the character of MCA and early LIA conditions are apparent. During intervals of extreme droughts identified by terrigenous proxy scanning XRF analyses, diatom and silicoflagellate proxies for coastal upwelling typically exceed one standard deviation above mean values for 1748-1900, supporting the hypothesis that droughts in southern California are associated with cooler (or La Niña-like) sea surface temperatures (SSTs). Increased percentages of diatoms transported downslope generally coincide with intervals of increased siliciclastic flux to the SBB identified by scanning XRF analyses. Diatom assemblages suggest only two intervals of the MCA (at ~897 to 922 and ~1151 to 1167) when proxy SSTs exceeded one standard deviation above mean values for 1748 to 1900. Conversely, silicoflagellates imply extreme warm water events only at ~830 to 860 (early MCA) and ~1360 to 1370 (early LIA) that are not supported by the diatom data. Silicoflagellates appear to be more suitable than diatoms for characterizing average climate during the 5- to 11-year-long sample intervals studied in the SPR0901-02KC core, probably because diatom relative abundances may be dominated by seasonal blooms of a particular year.
Optimization of an on-board imaging system for extremely rapid radiation therapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cherry Kemmerling, Erica M.; Wu, Meng, E-mail: mengwu@stanford.edu; Yang, He
2015-11-15
Purpose: Next-generation extremely rapid radiation therapy systems could mitigate the need for motion management, improve patient comfort during the treatment, and increase patient throughput for cost effectiveness. Such systems require an on-board imaging system that is competitively priced, fast, and of sufficiently high quality to allow good registration between the image taken on the day of treatment and the image taken on the day of treatment planning. In this study, three different detectors for a custom on-board CT system were investigated to select the best design for integration with an extremely rapid radiation therapy system. Methods: Three different CT detectors are proposed: low-resolution (all 4 × 4 mm pixels), medium-resolution (a combination of 4 × 4 mm pixels and 2 × 2 mm pixels), and high-resolution (all 1 × 1 mm pixels). An in-house program was used to generate projection images of a numerical anthropomorphic phantom and to reconstruct the projections into CT datasets, henceforth called “realistic” images. Scatter was calculated using a separate Monte Carlo simulation, and the model included an antiscatter grid and bowtie filter. Diagnostic-quality images of the phantom were generated to represent the patient scan at the time of treatment planning. Commercial deformable registration software was used to register the diagnostic-quality scan to images produced by the various on-board detector configurations. The deformation fields were compared against a “gold standard” deformation field generated by registering initial and deformed images of the numerical phantoms that were used to make the diagnostic and treatment-day images. Registrations of on-board imaging system data were judged by how much their deformation fields differed from the corresponding gold standard deformation fields: the smaller the difference, the better the system.
To evaluate the registrations, the pointwise distance between the gold standard and realistic registration deformation fields was computed. Results: By most global metrics (e.g., mean, median, and maximum pointwise distance), the high-resolution detector had the best performance, but the medium-resolution detector was comparable. For all medium- and high-resolution detector registrations, the mean error between the realistic and gold standard deformation fields was less than 4 mm. By pointwise metrics (e.g., tracking a small lesion), the high- and medium-resolution detectors performed similarly. For these detectors, the smallest error between the realistic and gold standard registrations was 0.6 mm and the largest error was 3.6 mm. Conclusions: The medium-resolution CT detector was selected as the best choice for an extremely rapid radiation therapy system. In essentially all test cases, data from this detector produced a significantly better registration than data from the low-resolution detector and a comparable registration to data from the high-resolution detector. The medium-resolution detector provides an appropriate compromise between registration accuracy and system cost.
An Extracorporeal Artificial Placenta Supports Extremely Premature Lambs for One Week
Bryner, Benjamin; Gray, Brian; Perkins, Elena; Davis, Ryan; Hoffman, Hayley; Barks, John; Owens, Gabe; Bocks, Martin; Rojas-Peña, Alvaro; Hirschl, Ronald; Bartlett, Robert; Mychaliska, George
2015-01-01
Purpose The treatment of extreme prematurity remains an unsolved problem. We developed an artificial placenta (AP) based on extracorporeal life support (ECLS) that simulates the intrauterine environment and provides gas exchange without mechanical ventilation (MV), and compared it to the current standard of neonatal care. Methods Extremely premature lambs (110-120d; term=145d) were used. AP lambs (n=9) were cannulated (jugular drainage, umbilical vein reinfusion) for ECLS .Control lambs (n=7) were intubated, ventilated, given surfactant, and transitioned to high-frequency oscillatory ventilation. All lambs received parenteral nutrition, antibiotics, and steroids. Hemodynamics, blood gases, hemoglobin, and circuit flows were measured. Results Four premature lambs survived for 1 week on the AP; one survived 6 days. Adequate oxygenation and ventilation were provided by the AP. The MV lambs survived 2-8 hours. Each of these lambs experienced a transient improvement with surfactant, but developed progressive hypercapnea and hypoxia despite high airway pressures and HFOV. Conclusions Extremely premature lambs were supported for 1 week with the AP with hemodynamic stability and adequate gas exchange; mechanically ventilated lambs succumbed within 8 hours. Further studies will assess control of fetal circulation and organ maturation on the AP. PMID:25598091
[Objectives and limits of test standards].
Kaddick, C; Blömer, W
2014-06-01
Test standards are developed worldwide by extremely committed expert groups working mostly in an honorary capacity and have substantially contributed to the currently achieved safety standards in reconstructive orthopedics. Independent of the distribution and quality of a test specification, it cannot replace the specialist knowledge of the user or a well-founded risk analysis, and if used unthinkingly it can lead to a false estimation of safety. The limits of standardization are reached where new indications or highly innovative products are concerned. In this case the manufacturer must undertake the time- and cost-intensive route of a self-developed testing procedure, which in the ideal case leads to a further test standard. Test standards make a substantial contribution to implant safety but cannot replace the expert knowledge of the user. Tests treated as an end in themselves reduce the actual objectives of standardization to absurdity.
The development and production of thermo-mechanically forged tool steel spur gears
NASA Technical Reports Server (NTRS)
Bamberger, E. N.
1973-01-01
A development program to establish the feasibility and applicability of high energy rate forging procedures to tool steel spur gears was performed. Included in the study were relatively standard forging procedures as well as a thermo-mechanical process termed ausforming. The subject gear configuration was essentially a standard spur gear having 28 teeth, a pitch diameter of 3.5 inches, and a diametral pitch of 8. Initially it had been planned to use a high-contact-ratio gear design; however, a comprehensive evaluation indicated that severe forging problems would be encountered as a result of the extremely small teeth required by this type of design. The forging studies were successful in achieving gear blanks having integrally formed teeth using both standard and thermo-mechanical forging procedures.
PPM mixtures of formaldehyde in gas cylinders: Stability and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wong, K.C.; Miller, S.B.; Patterson, L.M.
1999-07-01
Scott Specialty Gases has been successful in producing stable calibration gases of formaldehyde at low concentration. Critical to this success has been the development of a treatment process for high-pressure aluminum cylinders. Formaldehyde cylinders having concentrations of 20 ppm and 4 ppm were found to show only a small decline in concentration over a period of approximately 12 months. Since no NIST-traceable formaldehyde standards (or Standard Reference Materials) are available, all Scott's formaldehyde cylinders were originally certified by the traditional impinger method. This method involves an extremely tedious purification procedure for 2,4-dinitrophenylhydrazine (2,4-DNPH). A modified version of the impinger method has been developed that does not require extensive reagent purification for formaldehyde analysis. Extremely low formaldehyde blanks have been obtained with the modified method. The HPLC conditions of the original method were used for chromatographic separations. The modified method results in a lower analytical uncertainty for the formaldehyde standard mixtures. Consequently, it is possible to discern small differences between analytical results that are important for stability studies.
Cazelle, Elodie; Eskes, Chantra; Hermann, Martina; Jones, Penny; McNamee, Pauline; Prinsen, Menk; Taylor, Hannah; Wijnands, Marcel V W
2015-04-01
A.I.S.E. investigated the suitability of the regulatory adopted ICE in vitro test method (OECD TG 438) with or without histopathology to identify detergent and cleaning formulations having extreme pH that require classification as EU CLP/UN GHS Category 1. To this aim, 18 extreme pH detergent and cleaning formulations were tested covering both alkaline and acidic extreme pHs. The ICE standard test method following OECD Test Guideline 438 showed good concordance with in vivo classification (83%) and good and balanced specificity and sensitivity values (83%) which are in line with the performances of currently adopted in vitro test guidelines, confirming its suitability to identify Category 1 extreme pH detergent and cleaning products. In contrast to previous findings obtained with non-extreme pH formulations, the use of histopathology did not improve the sensitivity of the assay whilst it strongly decreased its specificity for the extreme pH formulations. Furthermore, use of non-testing prediction rules for classification showed poor concordance values (33% for the extreme pH rule and 61% for the EU CLP additivity approach) with high rates of over-prediction (100% for the extreme pH rule and 50% for the additivity approach), indicating that these non-testing prediction rules are not suitable to predict Category 1 hazards of extreme pH detergent and cleaning formulations. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kürbis, K.; Mudelsee, M.; Tetzlaff, G.; Brázdil, R.
2009-09-01
For the analysis of trends in weather extremes, we introduce a diagnostic index variable, the exceedance product, which combines intensity and frequency of extremes. We separate trends in higher moments from trends in mean or standard deviation and use bootstrap resampling to evaluate statistical significances. The application of the concept of the exceedance product to daily meteorological time series from Potsdam (1893 to 2005) and Prague-Klementinum (1775 to 2004) reveals that extremely cold winters occurred only until the mid-20th century, whereas warm winters show upward trends. These changes were significant in higher moments of the temperature distribution. In contrast, trends in summer temperature extremes (e.g., the 2003 European heatwave) can be explained by linear changes in mean or standard deviation. While precipitation at Potsdam does not show pronounced trends, dew point does exhibit a change from maximum extremes during the 1960s to minimum extremes during the 1970s.
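The abstract introduces the exceedance product as a diagnostic combining intensity and frequency of extremes; one natural reading, shown here as an assumption rather than the authors' exact formula, is the product of the exceedance count and the mean exceedance magnitude (equivalently, the summed exceedance above the threshold):

```python
import numpy as np

def exceedance_product(x, threshold):
    """Combine frequency and intensity of extremes above a threshold: the
    product of the exceedance count and the mean exceedance magnitude,
    which equals the summed exceedance."""
    exc = x[x > threshold] - threshold
    if exc.size == 0:
        return 0.0
    return float(exc.size * exc.mean())
```

Computed per season over a sliding threshold (e.g., a high percentile of winter temperature), a vanishing exceedance product for cold extremes after the mid-20th century would reflect the trend the abstract reports for Potsdam and Prague.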
NASA Astrophysics Data System (ADS)
Lewis, Sophie; Karoly, David
2013-04-01
Changes in extreme climate events pose significant challenges for both human and natural systems. Some climate extremes are likely to become "more frequent, more widespread and/or more intense during the 21st century" (Intergovernmental Panel on Climate Change, 2007) due to anthropogenic climate change. Particularly in Australia, the El Niño-Southern Oscillation (ENSO) has a relationship to the relative frequency of temperature and precipitation extremes. In this study, we investigate the record-high rainfall observed in Australia over two consecutive summers (2010-2011 and 2011-2012). This record rainfall occurred in association with an extended two-year La Niña event and resulted in severe and extensive flooding. We examine simulated changes in seasonal-scale rainfall extremes in the Australian region in a suite of models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5). In particular, we utilise the novel CMIP5 detection and attribution historical experiments with various forcings (natural forcings only and greenhouse gas forcings only) to examine the impact of various anthropogenic forcings on seasonal-scale extreme rainfall across Australia. Using these standard detection and attribution experiments over the period of 1850 to 2005, we examine La Niña contributions to the two-season record rainfall, as well as the longer-term climate change contribution to rainfall extremes. Was there an anthropogenic influence in the record high Australian summer rainfall over 2010 to 2012, and if so, how much influence? Intergovernmental Panel on Climate Change (2007), Climate Change 2007: The Physical Science Basis, Contribution of Working Group I to the Fourth Assessment Report on the Intergovernmental Panel on Climate Change, edited by S. Solomon et al., 996 pp., Cambridge Univ. Press, Cambridge, U. K.
Quality-control of an hourly rainfall dataset and climatology of extremes for the UK.
Blenkinsop, Stephen; Lewis, Elizabeth; Chan, Steven C; Fowler, Hayley J
2017-02-01
Sub-daily rainfall extremes may be associated with flash flooding, particularly in urban areas but, compared with extremes on daily timescales, have been relatively little studied in many regions. This paper describes a new, hourly rainfall dataset for the UK based on ∼1600 rain gauges from three different data sources. This includes tipping bucket rain gauge data from the UK Environment Agency (EA), which has been collected for operational purposes, principally flood forecasting. Significant problems in the use of such data for the analysis of extreme events include the recording of accumulated totals, high frequency bucket tips, rain gauge recording errors and the non-operation of gauges. Given the prospect of an intensification of short-duration rainfall in a warming climate, the identification of such errors is essential if sub-daily datasets are to be used to better understand extreme events. We therefore first describe a series of procedures developed to quality control this new dataset. We then analyse ∼380 gauges with near-complete hourly records for 1992-2011 and map the seasonal climatology of intense rainfall based on UK hourly extremes using annual maxima, n-largest events and fixed threshold approaches. We find that the highest frequencies and intensities of hourly extreme rainfall occur during summer when the usual orographically defined pattern of extreme rainfall is replaced by a weaker, north-south pattern. A strong diurnal cycle in hourly extremes, peaking in late afternoon to early evening, is also identified in summer and, for some areas, in spring. This likely reflects the different mechanisms that generate sub-daily rainfall, with convection dominating during summer. 
The resulting quality-controlled hourly rainfall dataset will provide considerable value in several contexts, including the development of standard, globally applicable quality-control procedures for sub-daily data, the validation of the new generation of very high-resolution climate models and improved understanding of the drivers of extreme rainfall.
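Two of the three approaches used to map the climatology (annual maxima and n-largest events) reduce to simple selections over each gauge record; a minimal sketch with hypothetical hourly totals:

```python
import numpy as np

def annual_maxima(values, years):
    """Annual maximum hourly rainfall per year (the AMAX series)."""
    return {int(y): float(values[years == y].max()) for y in np.unique(years)}

def n_largest(values, n=5):
    """The n largest hourly totals in the record (n-largest-events approach),
    in descending order."""
    return np.sort(values)[-n:][::-1]

# hypothetical hourly totals (mm) tagged by year
v = np.array([1.0, 5.0, 2.0, 8.0])
y = np.array([2000, 2000, 2001, 2001])
```

The third approach, a fixed threshold, would instead count values above an absolute intensity; in practice all three would be applied only after the quality-control steps described above have flagged accumulations, spurious tips, and gauge outages.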
NDE standards for high temperature materials
NASA Technical Reports Server (NTRS)
Vary, Alex
1991-01-01
High temperature materials include monolithic ceramics for automotive gas turbine engines and also metallic/intermetallic and ceramic matrix composites for a range of aerospace applications. These are materials that can withstand the extreme operating temperatures that will prevail in advanced high-efficiency gas turbine engines. High temperature engine components are very likely to consist of complex composite structures with three-dimensionally interwoven and intermixed ceramic fibers. The thermomechanical properties of components made of these materials are actually created in place during the processing and fabrication stages. The complex nature of these new materials creates strong incentives for exact standards for unambiguous evaluation of defects and microstructural characteristics. NDE techniques and standards that will ultimately be applicable to production and quality control of high temperature materials and structures are still emerging. The needs range from detection of flaws below the 100-micron level in monolithic ceramics to global imaging of fiber architecture and matrix densification anomalies in composites. The needs differ depending on the processing stage, fabrication method, and nature of the finished product. The standards discussed must be developed in concert with advances in NDE technology, materials processing research, and fabrication development. High temperature materials and structures that fail to meet stringent specifications and standards are unlikely to compete successfully either technologically or in international markets.
Su, Ning; Zhai, Fei-Fei; Zhou, Li-Xin; Ni, Jun; Yao, Ming; Li, Ming-Li; Jin, Zheng-Yu; Gong, Gao-Lang; Zhang, Shu-Yang; Cui, Li-Ying; Tian, Feng; Zhu, Yi-Cheng
2017-01-01
Objective: To investigate the correlation between cerebral small vessel disease (CSVD) burden and motor performance of lower and upper extremities in community-dwelling populations. Methods: We performed a cross-sectional analysis on 770 participants enrolled in the Shunyi study, which is a population-based cohort study. CSVD burden, including white matter hyperintensities (WMH), lacunes, cerebral microbleeds (CMBs), perivascular spaces (PVS), and brain atrophy were measured using 3T magnetic resonance imaging. All participants underwent quantitative motor assessment of lower and upper extremities, which included 3-m walking speed, 5-repeat chair-stand time, 10-repeat pronation–supination time, and 10-repeat finger-tapping time. Data on demographic characteristics, vascular risk factors, and cognitive functions were collected. General linear model analysis was performed to identify potential correlations between motor performance measures and imaging markers of CSVD after controlling for confounding factors. Results: For motor performance of the lower extremities, WMH was negatively associated with gait speed (standardized β = -0.092, p = 0.022) and positively associated with chair-stand time (standardized β = 0.153, p < 0.0001, surviving FDR correction). For motor performance of the upper extremities, pronation–supination time was positively associated with WMH (standardized β = 0.155, p < 0.0001, surviving FDR correction) and negatively with brain parenchymal fraction (BPF; standardized β = -0.125, p = 0.011, surviving FDR correction). Only BPF was found to be negatively associated with finger-tapping time (standardized β = -0.123, p = 0.012). However, lacunes, CMBs, or PVS were not found to be associated with motor performance of lower or upper extremities in multivariable analysis. Conclusion: Our findings suggest that cerebral microstructural changes related to CSVD may affect motor performance of both lower and upper extremities. 
WMH and brain atrophy are most strongly associated with motor function deterioration in community-dwelling populations. PMID:29021757
NASA Astrophysics Data System (ADS)
Fyodorov, Yan V.; Bouchaud, Jean-Philippe
2008-09-01
We investigate some implications of the freezing scenario proposed by Carpentier and Le Doussal (CLD) for a random energy model (REM) with logarithmically correlated random potential. We introduce a particular (circular) variant of the model, and show that the integer moments of the partition function in the high-temperature phase are given by the well-known Dyson Coulomb gas integrals. The CLD freezing scenario allows one to use those moments for extracting the distribution of the free energy in both high- and low-temperature phases. In particular, it yields the full distribution of the minimal value in the potential sequence. This provides an explicit new class of extreme-value statistics for strongly correlated variables, manifestly different from the standard Gumbel class.
Olmos, Jorge A; Piskorz, María Marta; Vela, Marcelo F
2016-06-01
GERD is a highly prevalent disease in our country. It has a profound impact on patients' quality of life and represents extremely high health-care costs. A correct understanding of its pathophysiology is crucial for the rational use of diagnostic methods and for implementing appropriate treatment tailored to each individual case. In this review we evaluate this disorder based on the best available evidence, focusing on pathophysiological mechanisms, epidemiology, modern diagnostic methods, and current management standards.
Neutron reflecting supermirror structure
Wood, J.L.
1992-12-01
An improved neutron reflecting supermirror structure comprising a plurality of stacked sets of bilayers of neutron reflecting materials. The improved neutron reflecting supermirror structure is adapted to provide extremely good performance at high incidence angles, i.e., up to four times the critical angle of standard neutron mirror structures. The reflection of neutrons striking the supermirror structure at a high critical angle provides enhanced neutron throughput, and hence more efficient and economical use of neutron sources. 2 figs.
Stability and UV completion of the Standard Model
NASA Astrophysics Data System (ADS)
Branchina, Vincenzo; Messina, Emanuele
2017-03-01
The knowledge of the electroweak vacuum stability condition is of the greatest importance for our understanding of beyond-Standard-Model physics. It is widely believed that new physics living at very high energy scales should have no impact on the stability analysis. This expectation has recently been challenged, but the results were controversial, as the new physics was given in terms of non-renormalizable higher-order operators. Here we consider for the first time new physics at extremely high energy scales (say, close to the Planck scale) in terms of renormalizable operators, in other words a sort of toy UV completion of the Standard Model, and show definitively that its presence can be crucial in determining the vacuum stability condition. This result has important phenomenological consequences, as it provides useful guidance in studying beyond-Standard-Model theories. Moreover, it suggests that very popular speculations based on the so-called “criticality” of the Standard Model do not appear to be well founded.
Role of Microstructure in High Temperature Oxidation.
1980-05-01
[Extraction residue from a scanned report; recoverable fragments:] Surface Preparation Upon Oxidation ... Experimental Methods ... Specimen Preparation ... angle sectioning method ... Figure 3. Application of the test line upon the image of NiO scale to determine the number of the NiO grain boundary ... of knowledge in this field was readily accounted for by extreme experimental difficulty in applying standard methods of microscopy to the thin ...
The New Teacher Education in the United States: Directions Forward
ERIC Educational Resources Information Center
Cochran-Smith, Marilyn
2008-01-01
There is now unprecedented emphasis on teacher quality in the USA and in many nations around the world with extremely high expectations for teacher performance. Based on the assumption that education and the economy are inextricably linked, it is now assumed that teachers can--and should--teach all students to world-class standards, serve as the…
NASA Astrophysics Data System (ADS)
Mirzoyan, Razmik
2018-04-01
The MAGIC collaboration reports the first detection of very-high-energy (VHE; E > 100 GeV) gamma-ray emission from PGC 2402248, also known as 2WHSP J073326.7+515354 (Chang et al. 2016, A&A, 598, A17) with coordinates R.A.: 07:33:26.7 h, Dec: +51:53:54.99 deg. The source is classified as an extreme high-energy peaked BL Lacertae object of unknown redshift, included in the 2WHSP catalog with a synchrotron peak located at 10^17.9 Hz. PGC 2402248 was observed with the MAGIC telescopes from 2018/01/23 to 2018/04/18 (MJD 58141-58226) for about 23 h. The preliminary analysis of these data resulted in the detection of PGC 2402248 with a statistical significance of more than 6 standard deviations.
Culpepper, Steven Andrew
2016-06-01
Standardized tests are frequently used for selection decisions, and the validation of test scores remains an important area of research. This paper builds upon prior literature about the effect of nonlinearity and heteroscedasticity on the accuracy of standard formulas for correcting correlations in restricted samples. Existing formulas for direct range restriction require three assumptions: (1) the criterion variable is missing at random; (2) a linear relationship between independent and dependent variables; and (3) constant error variance or homoscedasticity. The results in this paper demonstrate that the standard approach for correcting restricted correlations is severely biased in cases of extreme monotone quadratic nonlinearity and heteroscedasticity. This paper offers at least three significant contributions to the existing literature. First, a method from the econometrics literature is adapted to provide more accurate estimates of unrestricted correlations. Second, derivations establish bounds on the degree of bias attributed to quadratic functions under the assumption of a monotonic relationship between test scores and criterion measurements. New results are presented on the bias associated with using the standard range restriction correction formula, and the results show that the standard correction formula yields estimates of unrestricted correlations that deviate by as much as 0.2 for high to moderate selectivity. Third, Monte Carlo simulation results demonstrate that the new procedure for correcting restricted correlations provides more accurate estimates in the presence of quadratic and heteroscedastic test score and criterion relationships.
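The correction formula the paper critiques can be sketched numerically. Below is a hedged Python illustration of the standard direct range restriction (Thorndike Case II) correction in the benign case where its three assumptions (missing at random, linearity, homoscedasticity) actually hold; the simulated data, sample size, and selection cutoff are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def correct_range_restriction(r_restricted, sd_unrestricted, sd_restricted):
    """Standard correction for direct range restriction (Thorndike Case II)."""
    u = sd_unrestricted / sd_restricted
    return (r_restricted * u) / np.sqrt(1.0 + r_restricted ** 2 * (u ** 2 - 1.0))

# Simulate a linear, homoscedastic test-criterion relationship, select the
# top of the predictor distribution, and check that the correction recovers
# the unrestricted correlation when the formula's assumptions are satisfied.
rng = np.random.default_rng(0)
x = rng.normal(size=200_000)                  # test scores
y = x + rng.normal(size=x.size)               # criterion: linear + constant error
sel = x > 1.0                                 # direct selection on the test
r_full = np.corrcoef(x, y)[0, 1]
r_restr = np.corrcoef(x[sel], y[sel])[0, 1]
r_hat = correct_range_restriction(r_restr, x.std(), x[sel].std())
```

Under monotone quadratic nonlinearity or heteroscedasticity, the paper's point is precisely that this recovery fails, with bias up to about 0.2.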
Cornwell, Andrew S.; Liao, James Y.; Bryden, Anne M.; Kirsch, Robert F.
2013-01-01
We have developed a set of upper extremity functional tasks to guide the design and test the performance of rehabilitation technologies that restore arm motion in people with high tetraplegia. Our goal was to develop a short set of tasks that would be representative of a much larger set of activities of daily living while also being feasible for a unilateral user of an implanted Functional Electrical Stimulation (FES) system. To compile this list of tasks, we reviewed existing clinical outcome measures related to arm and hand function, and were further informed by surveys of patient desires. We ultimately selected a set of five tasks that captured the most common components of movement seen in these tasks, making them highly relevant for assessing FES-restored unilateral arm function in individuals with high cervical spinal cord injury (SCI). The tasks are intended to be used when setting design specifications and for evaluation and standardization of rehabilitation technologies under development. While not unique, this set of tasks will provide a common basis for comparing different interventions (e.g., FES, powered orthoses, robotic assistants) and testing different user command interfaces (e.g., sip-and-puff, head joysticks, brain-computer interfaces). PMID:22773199
NASA Astrophysics Data System (ADS)
Bashir, F.; Zeng, X.; Gupta, H. V.; Hazenberg, P.
2017-12-01
Drought as an extreme event may have far-reaching socio-economic impacts on agriculture-based economies such as Pakistan's. Effective assessment of drought requires high-resolution, spatiotemporally continuous hydrometeorological information. For this purpose, new gridded analyses of precipitation, maximum, minimum, and mean temperature, and diurnal temperature range, based on in-situ daily observations, were developed, covering the whole of Pakistan at 0.01° latitude-longitude resolution for a 54-year period (1960-2013). The number of meteorological observatories contributing to these gridded analyses is 2 to 6 times greater than in any other similar product available. This data set is used to identify extreme wet and dry periods and their spatial patterns across Pakistan using the Palmer Drought Severity Index (PDSI) and the Standardized Precipitation Index (SPI). The periodicity of extreme events is estimated at seasonal to decadal scales. Spatiotemporal signatures of drought incidence, indicating its extent and longevity in different areas, may help water resource managers and policy makers mitigate the severity of drought and its impact on food security through suitable adaptive techniques. Moreover, these high-resolution gridded in-situ observations of precipitation and temperature are used to evaluate other coarser-resolution gridded products.
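The SPI used above follows a standard recipe: fit a distribution (commonly a gamma) to accumulated precipitation, then map each value's cumulative probability to a standard-normal quantile, so that values near -2 flag extreme drought and near +2 extreme wetness. A minimal sketch, assuming SciPy is available and ignoring the mixed zero-precipitation handling used in operational SPI; the synthetic data are illustrative only:

```python
import numpy as np
from scipy import stats

def spi(precip_totals):
    """Minimal SPI: fit a gamma distribution to accumulated precipitation
    and transform cumulative probabilities to standard-normal quantiles.
    (Operational SPI also treats zero-rain periods with a mixed distribution.)"""
    shape, loc, scale = stats.gamma.fit(precip_totals, floc=0)
    cdf = stats.gamma.cdf(precip_totals, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(42)
monthly = rng.gamma(shape=2.0, scale=50.0, size=5000)  # synthetic monthly totals, mm
index = spi(monthly)
```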
Frouzan, Arash; Masoumi, Kambiz; Delirroyfard, Ali; Mazdaie, Behnaz; Bagherzadegan, Elnaz
2017-08-01
Long bone fractures are common injuries caused by trauma. Some studies have demonstrated that ultrasound has high sensitivity and specificity in the diagnosis of upper and lower extremity long bone fractures. The aim of this study was to determine the accuracy of ultrasound compared with plain radiography in diagnosing upper and lower extremity long bone fractures in trauma patients. This cross-sectional study assessed 100 patients with trauma to the upper and lower extremities admitted to the emergency department of Imam Khomeini Hospital, Ahvaz, Iran, from September 2014 through October 2015. In all patients, ultrasound was performed first, followed by standard plain radiography of the upper or lower limb. Data were analyzed with SPSS version 21 to determine specificity and sensitivity. The mean ages of patients with upper and lower limb trauma were 31.43±12.32 years and 29.63±5.89 years, respectively. Radius fractures were the most frequent upper limb fractures (27%). The sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of ultrasound compared with plain radiography in the diagnosis of upper extremity long bone fractures were 95.3%, 87.7%, 87.2%, and 96.2%, respectively, and the highest accuracy was observed in left arm fractures (100%). Tibia and fibula fractures were the most frequent lower limb fractures (89.2%). The sensitivity, specificity, PPV, and NPV of ultrasound compared with plain radiography in the diagnosis of lower extremity long bone fractures were 98.6%, 83%, 65.4%, and 87.1%, respectively, and the highest accuracy was observed in men, younger patients, and femoral fractures. These results show that ultrasound has high accuracy compared with plain radiography in the diagnosis of upper and lower extremity long bone fractures.
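The four accuracy measures reported above all derive from a 2x2 table of the index test (ultrasound) against the reference standard (radiography). A small Python helper makes the definitions explicit; the counts below are made-up for illustration, since the paper reports only the rates:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from a 2x2 diagnostic table."""
    return {
        "sensitivity": tp / (tp + fn),  # true-positive rate
        "specificity": tn / (tn + fp),  # true-negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts, not from the study
m = diagnostic_metrics(tp=95, fp=12, fn=5, tn=88)
```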
Body composition and military performance--many things to many people.
Friedl, Karl E
2012-07-01
Soldiers are expected to maintain the highest possible level of physical readiness because they must be ready to mobilize and perform their duties anywhere in the world at any time. The objective of Army body composition standards is to motivate physical training and good nutrition habits to ensure a high state of readiness. Establishment of enforceable and rational standards to support this objective has been challenging even at extremes of body size. Morbidly obese individuals are clearly not suited to military service, but very large muscular individuals may be superbly qualified for soldier performance demands. For this reason, large individuals are measured for body fat using a waist circumference-based equation (female soldiers are also measured for hip circumference). The main challenge comes in setting appropriate fat standards to support the full range of Army requirements. Military appearance ideals dictate the most stringent body fat standards, whereas health risk thresholds anchor the most liberal standards, and physical performance associations fall on a spectrum between these 2 poles. Standards should not exclude or penalize specialized performance capabilities such as endurance running or power lifting across a spectrum of body sizes and fat. The full integration of women into the military further complicates the issue because of sexually dimorphic characteristics that make gender-appropriate standards essential and where inappropriately stringent standards can compromise both health and performance of this segment of the force. Other associations with body composition such as stress effects on intraabdominal fat distribution patterns and metabolic implications of a fat reserve for survival in extreme environments are also relevant considerations. This is a review of the science that underpins the U.S. Army body composition standards.
USDA-ARS?s Scientific Manuscript database
Botulinum neurotoxins (BoNTs) are one of the six highest-risk threat agents for bioterrorism, due to their extreme potency and lethality, ease of production, and need for prolonged intensive care of intoxicated patients. The current standard of treatment, equine antitoxin, has a high incidence of al...
Use of computer games as an intervention for stroke.
Proffitt, Rachel M; Alankus, Gazihan; Kelleher, Caitlin L; Engsberg, Jack R
2011-01-01
Current rehabilitation for persons with hemiparesis after stroke requires high numbers of repetitions to be in accordance with contemporary motor learning principles. The motivational characteristics of computer games can be harnessed to create engaging interventions for persons with hemiparesis after stroke that incorporate this high number of repetitions. The purpose of this case report was to test the feasibility of using computer games as a 6-week home therapy intervention to improve upper extremity function for a person with stroke. One person with left upper extremity hemiparesis after stroke participated in a 6-week home therapy computer game intervention. The games were customized to her preferences and abilities and modified weekly. Her performance was tracked and analyzed. Data from pre-, mid-, and postintervention testing using standard upper extremity measures and the Reaching Performance Scale (RPS) were analyzed. After 3 weeks, the participant demonstrated increased upper extremity range of motion at the shoulder and decreased compensatory trunk movements during reaching tasks. After 6 weeks, she showed functional gains in activities of daily living (ADLs) and instrumental ADLs despite no further improvements on the RPS. Results indicate that computer games have the potential to be a useful intervention for people with stroke. Future work will add additional support to quantify the effectiveness of the games as a home therapy intervention for persons with stroke.
Risk management for food and beverage industry using Australia/New Zealand 4360 Standard
NASA Astrophysics Data System (ADS)
Kristina, S.; Wijaya, B. M.
2017-12-01
This research aims to identify, measure, and manage risk in the food and beverage industry. Risk management is implemented with reference to the Australia/New Zealand 4360 Standard, which has four phases: problem formulation, risk analysis, risk characterization, and risk management. The implementation is examined through a case study at Back Alley Café. Risk identification, based on observation and interviews with the café manager, who knows the café's operations well, found 59 risks at Back Alley Café. Based on the assessment of impact and probability, the risk mapping produced four risks at the extreme level, 16 at the high level, 24 at the medium level, and 18 at the low level. Mitigation strategies were then designed for the extreme- and high-level risks. The strategies for preventing or treating risk at Back Alley Café fall into three categories: reducing the risk, sharing the risk, and accepting the risk. Each strategy is then implemented for the relevant activities.
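AS/NZS 4360 prescribes a risk management process rather than a specific scoring scheme, but probability-impact mapping into extreme/high/medium/low levels of the kind described above is commonly done with a matrix. A sketch under assumed 1-5 rating scales and assumed band boundaries (both are illustrative choices, not taken from the standard or the paper):

```python
def risk_level(likelihood, consequence):
    """Classify a risk from 1-5 likelihood and consequence ratings.
    The score bands below are illustrative assumptions."""
    score = likelihood * consequence  # ranges from 1 to 25
    if score >= 20:
        return "extreme"
    if score >= 12:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Classify every cell of the 5x5 matrix
levels = [risk_level(l, c) for l in range(1, 6) for c in range(1, 6)]
```

In practice the mapping is tuned per organization; the point is only that each identified risk gets a reproducible level that drives the reduce/share/accept decision.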
NASA Astrophysics Data System (ADS)
Ramesham, Rajeshuni
2010-02-01
Ceramic column grid array (CCGA) packages have been increasing in use because of advantages such as high interconnect density, very good thermal and electrical performance, and compatibility with standard surface-mount packaging assembly processes. CCGA packages are used in space applications such as logic and microprocessor functions, telecommunications, flight avionics, and payload electronics. As these packages tend to have less solder-joint strain relief than leaded packages, the reliability of CCGA packages is very important for short- and long-term space missions. CCGA interconnect electronic packages on polyimide printed wiring boards (PWBs) were assembled, inspected non-destructively, and subsequently subjected to extreme-temperature thermal cycling to assess their reliability for future deep-space, short- and long-term, extreme-temperature missions. The temperature range employed in this investigation spans -185°C to +125°C. The test hardware consists of two CCGA717 packages, each divided into four daisy-chained sections, for a total of eight daisy chains to be monitored. The CCGA717 package is 33 mm × 33 mm with a 27×27 array of 80%/20% Pb/Sn columns on a 1.27 mm pitch. The resistance of the daisy-chained CCGA interconnects was continuously monitored as a function of thermal cycling. Electrical resistance measurements as a function of thermal cycling are reported, and the tests to date have shown significant changes in daisy-chain resistance that become more noticeable with an increasing number of thermal cycles. This paper describes the experimental test results of CCGA testing under wide temperature extremes. Standard Weibull analysis tools were used to extract the Weibull parameters and understand the CCGA failures.
Optical inspection results clearly indicate that the solder joints of the columns, both at the board and at the ceramic package, failed as a function of thermal cycling. The first failure was observed at the 137th thermal cycle, and 63.2% of the daisy chains had failed by about 664 thermal cycles. The shape parameter extracted from the Weibull plot was about 1.47, which indicates that the failures occurred during the flat, useful-life region of the standard bathtub curve. Based on this experimental test data, the CCGAs can be used over the temperature range studied for ~100 thermal cycles (ΔT = 310°C, 5°C/minute, 15-minute dwell) with a high degree of confidence for high-reliability space and other applications.
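The Weibull quantities quoted above can be checked directly: by definition, the characteristic life η is the cycle count at which 63.2% (1 - 1/e) of units have failed, whatever the shape parameter β. A short Python check using the paper's η ≈ 664 cycles and β ≈ 1.47 (the ~100-cycle usage margin is the authors' recommendation; the CDF value at 100 cycles here is just the model's prediction under those two parameters):

```python
import math

def weibull_cdf(t, eta, beta):
    """Two-parameter Weibull CDF: fraction of units failed by t cycles."""
    return 1.0 - math.exp(-((t / eta) ** beta))

eta, beta = 664.0, 1.47          # parameters reported in the paper
frac_at_eta = weibull_cdf(eta, eta, beta)    # 1 - 1/e, independent of beta
frac_at_100 = weibull_cdf(100.0, eta, beta)  # predicted fraction failed by 100 cycles
```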
Universal Behavior of Extreme Price Movements in Stock Markets
Fuentes, Miguel A.; Gerig, Austin; Vicente, Javier
2009-01-01
Many studies assume stock prices follow a random process known as geometric Brownian motion. Although approximately correct, this model fails to explain the frequent occurrence of extreme price movements, such as stock market crashes. Using a large collection of data from three different stock markets, we present evidence that a modification to the random model—adding a slow, but significant, fluctuation to the standard deviation of the process—accurately explains the probability of different-sized price changes, including the relative high frequency of extreme movements. Furthermore, we show that this process is similar across stocks so that their price fluctuations can be characterized by a single curve. Because the behavior of price fluctuations is rooted in the characteristics of volatility, we expect our results to bring increased interest to stochastic volatility models, and especially to those that can produce the properties of volatility reported here. PMID:20041178
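The modification described above, a slow but significant fluctuation in the standard deviation of an otherwise Gaussian return process, is easy to illustrate with a toy simulation: constant-volatility returns produce roughly the Gaussian rate of 4-sigma moves, while slowly varying volatility produces them far more often. All parameters below are arbitrary choices for illustration, not fitted to the paper's market data:

```python
import numpy as np

rng = np.random.default_rng(7)
n, block = 200_000, 200

# Constant-volatility (geometric-Brownian-motion-style) log-returns
r_gbm = rng.normal(0.0, 0.02, n)

# Same Gaussian draws, but sigma is redrawn only once per 200-step block,
# giving a slow lognormal volatility fluctuation
sigma = 0.02 * np.exp(0.5 * rng.normal(size=n // block)).repeat(block)
r_sv = sigma * rng.normal(0.0, 1.0, n)

# Frequency of moves beyond 4 standard deviations under each model
p_gbm = np.mean(np.abs(r_gbm) > 4 * r_gbm.std())
p_sv = np.mean(np.abs(r_sv) > 4 * r_sv.std())
```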
Anomalous Near-Surface Low-Salinity Pulses off the Central Oregon Coast
Mazzini, Piero L. F.; Risien, Craig M.; Barth, John A.; Pierce, Stephen D.; Erofeev, Anatoli; Dever, Edward P.; Kosro, P. Michael; Levine, Murray D.; Shearman, R. Kipp; Vardaro, Michael F.
2015-01-01
From mid-May to August 2011, extreme runoff in the Columbia River ranged from 14,000 to over 17,000 m3/s, more than two standard deviations above the mean for this period. The extreme runoff was the direct result of both melting of anomalously high snowpack and rainfall associated with the 2010–2011 La Niña. The effects of this increased freshwater discharge were observed off Newport, Oregon, 180 km south of the Columbia River mouth. Salinity values as low as 22, nine standard deviations below the climatological value for this period, were registered at the mid-shelf. Using a network of ocean observing sensors and platforms, it was possible to capture the onshore advection of the Columbia River plume from the mid-shelf, 20 km offshore, to the coast and eventually into Yaquina Bay (Newport) during a sustained wind reversal event. Increased freshwater delivery can influence coastal ocean ecosystems and delivery of offshore, river-influenced water may influence estuarine biogeochemistry. PMID:26607750
Min and Max Exponential Extreme Interval Values and Statistics
ERIC Educational Resources Information Center
Jance, Marsha; Thomopoulos, Nick
2009-01-01
The extreme interval values and statistics (expected value, median, mode, standard deviation, and coefficient of variation) for the smallest (min) and largest (max) values of exponentially distributed variables with parameter ? = 1 are examined for different observation (sample) sizes. An extreme interval value g[subscript a] is defined as a…
Translating weather extremes into the future - a case for Norway
NASA Astrophysics Data System (ADS)
Sillmann, Jana; Mueller, Malte; Gjertsen, Uta; Haarsma, Rein; Hazeleger, Wilco; Amundsen, Helene
2017-04-01
We introduce a new project, "Translating weather extremes into the future - a case for Norway" (TWEX - http://www.cicero.uio.no/en/twex). In TWEX, we take a novel "Tales of future weather" approach in which we use future scenarios tailored to a specific region and stakeholder in order to gain a more realistic picture of what future weather extremes might look like in a particular context. We focus on hydroclimatic extremes associated with a particular circulation pattern (the so-called "Atmospheric River") that leads to heavy rainfall in fall and winter along the west coast of Norway and causes high-impact floods in Norwegian communities. We translate selected past events into the future (e.g., 2090) using an approach very similar to what is used today for weather prediction. The data generated in TWEX will be distributed through the standard (weather prediction) communication channels of the Norwegian Meteorological Institute and thus will be accessible to end-users in a well-known data format for analyzing the impact of the events in the future and supporting decision-making on hazard prevention and adaptation planning.
Thermal stress, human performance, and physical employment standards.
Cheung, Stephen S; Lee, Jason K W; Oksa, Juha
2016-06-01
Many physically demanding occupations in both developed and developing economies involve exposure to extreme thermal environments that can affect work capacity and ultimately health. Thermal extremes may be present in either an outdoor or an indoor work environment, and can be due to a combination of the natural or artificial ambient environment, the rate of metabolic heat generation from physical work, processes specific to the workplace (e.g., steel manufacturing), or through the requirement for protective clothing impairing heat dissipation. Together, thermal exposure can elicit acute impairment of work capacity and also chronic effects on health, greatly contributing to worker health risk and reduced productivity. Surprisingly, in most occupations even in developed economies, there are rarely any standards regarding enforced heat or cold safety for workers. Furthermore, specific physical employment standards or accommodations for thermal stressors are rare, with workers commonly tested under near-perfect conditions. This review surveys the major occupational impact of thermal extremes and existing employment standards, proposing guidelines for improvement and areas for future research.
Artz, Neil; Dixon, Samantha; Wylde, Vikki; Marques, Elsa; Beswick, Andrew D; Lenguerrand, Erik; Blom, Ashley W; Gooberman-Hill, Rachael
2017-04-01
To evaluate the feasibility of conducting a randomized controlled trial comparing group-based outpatient physiotherapy with usual care in patients following total knee replacement. A feasibility study for a randomized controlled trial. One secondary-care hospital orthopaedic centre, Bristol, UK. A total of 46 participants undergoing primary total knee replacement. The intervention group were offered six group-based exercise sessions after surgery. The usual care group received standard postoperative care. Participants were not blinded to group allocation. Feasibility was assessed by recruitment, reasons for non-participation, attendance, and completion rates of study questionnaires that included the Lower Extremity Functional Scale and Knee Injury and Osteoarthritis Outcome Score. Recruitment rate was 37%. Five patients withdrew or were no longer eligible to participate. Intervention attendance was high (73%) and 84% of group participants reported they were 'very satisfied' with the exercises. Return of study questionnaires at six months was lower in the usual care (75%) than in the intervention group (100%). Mean (standard deviation) Lower Extremity Functional Scale scores at six months were 45.0 (20.8) in the usual care and 57.8 (15.2) in the intervention groups. Recruitment and retention of participants in this feasibility study was good. Group-based physiotherapy was acceptable to participants. Questionnaire return rates were lower in the usual care group, but might be enhanced by telephone follow-up. The Lower Extremity Functional Scale had high responsiveness and completion rates. Using this outcome measure, 256 participants would be required in a full-scale randomized controlled trial.
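The 256-participant figure for a full-scale trial would come from a standard two-sample power calculation on the chosen outcome. A generic sketch using the normal approximation; the inputs are placeholders, since the authors' actual assumptions (target effect size, dropout inflation, t-distribution correction) are not given in the abstract:

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Sample size per group for a two-sample comparison of means,
    normal approximation: n = 2 * ((z_alpha/2 + z_power) * sd / delta)^2."""
    z = NormalDist()
    z_a = z.inv_cdf(1 - alpha / 2)
    z_b = z.inv_cdf(power)
    return math.ceil(2 * ((z_a + z_b) * sd / delta) ** 2)

# Placeholder effect size with the abstract's usual-care SD of 20.8 points
n_example = n_per_group(delta=10.0, sd=20.8)
```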
Kaposi’s sarcoma (KS), a disease characterized by the development of malignant tumors usually in the lower extremities, is a major complication of HIV/AIDS. KS continues to be a problem even with the use of highly active antiretroviral therapy (HAART), today’s standard of care for patients with HIV/AIDS. CCR investigators recently investigated the effects of interleukin-12
Lower extremity lawn-mower injuries in children.
Farley, F A; Senunas, L; Greenfield, M L; Warschausky, S; Loder, R T; Kewman, D G; Hensinger, R N
1996-01-01
Lower extremity lawn-mower injuries in children result in significant morbidity with a significant financial burden to the family and society. We reviewed 24 children with lower extremity lawn-mower injuries; all mothers completed standardized psychologic assessments of their children, and 18 children were interviewed. Fifty percent of the mothers had defensive profiles on the standardized psychologic assessment, suggesting the likelihood of denial or underreporting of the child's psychologic difficulties. Therefore, we found the interview with the child to be a more accurate measure of psychologic distress. Prevention measures aimed at parents must emphasize that a child must not be allowed in a yard that is being mowed with a riding mower.
NASA Astrophysics Data System (ADS)
Singh, Vishal; Goyal, Manish Kumar
2016-01-01
This paper highlights the spatial and temporal variability of the precipitation lapse rate (PLR) and precipitation extreme indices (PEIs) through a mesoscale characterization of the Teesta river catchment, in the north Sikkim eastern Himalayas. The PLR is an important variable for snowmelt runoff models, and in a mountainous region it can vary from lower-elevation to high-elevation areas. In this study, the PLR was computed by accounting for elevation differences, which range from around 1500 m to 7000 m. Precipitation variability and extremes were analysed using multiple mathematical tools, viz. quantile regression, spatial mean, spatial standard deviation, the Mann-Kendall test, and Sen's slope estimator. Daily precipitation data were used: observed gridded data for the historical period (1980-2005) and projections for the 21st century (2006-2100) simulated by the CMIP5 ESM-2M model (Coupled Model Intercomparison Project Phase 5 Earth System Model 2) under three radiative forcing scenarios (Representative Concentration Pathways, RCPs). The outcomes of this study suggest that the PLR varies significantly from lower-elevation to high-elevation areas. The PEI-based analysis showed that extreme high-intensity events increase significantly, especially after the 2040s. The PEI-based observations also showed that the number of wet days increases for all the RCPs. The quantile regression plots showed significant increases in the upper and lower quantiles of the various extreme indices. The Mann-Kendall test and Sen's estimator clearly indicated significantly changing patterns in the frequency and intensity of the precipitation indices across all sub-basins and RCP scenarios in an intra-decadal time series domain. RCP8.5 showed the most extreme projected outcomes.
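The trend statistics named above follow standard formulas. A minimal sketch (not the authors' code) of the Mann-Kendall S statistic, its normal-approximation z-score without tie correction, and Sen's slope for a series at unit time steps:

```python
from itertools import combinations
from math import sqrt
from statistics import median

def mann_kendall_sen(x):
    """Mann-Kendall S statistic, approximate z-score (no tie correction),
    and Sen's slope for a series x sampled at unit time steps."""
    n = len(x)
    # S = sum of sign(x_j - x_i) over all pairs i < j
    s = sum((xj > xi) - (xj < xi) for xi, xj in combinations(x, 2))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / sqrt(var_s)
    elif s < 0:
        z = (s + 1) / sqrt(var_s)
    else:
        z = 0.0
    # Sen's slope: median of all pairwise slopes
    slope = median((x[j] - x[i]) / (j - i)
                   for i, j in combinations(range(n), 2))
    return s, z, slope
```

A strictly increasing series gives the maximum S = n(n-1)/2 and a z-score well beyond the 5% significance threshold of 1.96.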
Probing the Molecular Outflows of the Coldest Known Object in the Universe: The Boomerang Nebula
NASA Astrophysics Data System (ADS)
Sahai, Raghvendra; Vlemmings, W.; Nyman, L. A.; Huggins, P.
2012-05-01
The Boomerang Nebula is the coldest known object in the Universe, and an extreme member of the class of Pre-Planetary Nebulae, objects which represent a short-lived transitional phase between the AGB and Planetary Nebula evolutionary stages. The Boomerang's estimated prodigious mass-loss rate (0.001 solar masses/year) and low luminosity (300 Lsun) lack an explanation in terms of current paradigms for dusty mass loss and standard evolutionary theory of intermediate-mass stars. Single-dish CO J=1-0 observations (with a 45 arcsec beam) show that the high-speed outflow in this object has cooled to a temperature significantly below that of the cosmic background radiation. We report on our high-resolution ALMA mapping of the CO lines in this ultra-cold nebula to determine the origin of these extreme conditions and robustly confirm current estimates of the fundamental physical properties of its ultra-cold outflow.
Application of IUS equipment and experience to orbit transfer vehicles of the 90's
NASA Astrophysics Data System (ADS)
Bangsund, E.; Keeney, J.; Cowgill, E.
1985-10-01
This paper relates experiences with the IUS program and the application of that experience to future orbit transfer vehicles. More specifically, it covers the implementation of the U.S. Air Force Space Division high-reliability parts standard (SAMSO STD 73-2C) and the component/system test standard (MIL-STD-1540A). Test results from the parts- and component-level testing and the resulting system-level test program for fourteen IUS flight vehicles are discussed. The IUS program has had the highest compliance with these standards and thus offers a benchmark of experience for future programs demanding extreme reliability. In summary, application of the stringent parts standard has resulted in fewer failures during testing, and the stringent test standard has eliminated design problems in the hardware. Both have been expensive in cost and schedule, and should be applied with flexibility.
Goldenberg, Neil A.; Durham, Janette D.; Knapp-Clevenger, R.
2007-01-01
Important predictors of adverse outcomes of thrombosis in children, including postthrombotic syndrome (PTS), have recently been identified. Given this knowledge and the encouraging preliminary pediatric experience with systemic thrombolysis, we sought to retrospectively analyze our institutional experience with a thrombolytic regimen versus standard anticoagulation for acute, occlusive deep venous thrombosis (DVT) of the proximal lower extremities in children in whom plasma factor VIII activity and/or D-dimer concentration were elevated at diagnosis, from within a longitudinal pediatric cohort. Nine children who underwent the thrombolytic regimen and 13 who received standard anticoagulation alone were followed from time of diagnosis with serial clinical evaluation and standardized PTS outcome assessments conducted in uniform fashion. The thrombolytic regimen was associated with a markedly decreased odds of PTS at 18 to 24 months compared with standard anticoagulation alone, which persisted after adjustment for significant covariates of age and lag time to therapy (odds ratio [OR] = 0.018, 95% confidence interval [CI] = < 0.001-0.483; P = .02). Major bleeding developed in 1 child, clinically judged as not directly related to thrombolysis for DVT. These findings suggest that the use of a thrombolysis regimen may safely and substantially reduce the risk of PTS in children with occlusive lower-extremity acute DVT, providing the basis for a future clinical trial. PMID:17360940
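The reported odds ratio came from an adjusted analysis; as an unadjusted illustration only, a crude odds ratio and Wald 95% confidence interval can be computed from a 2x2 table. The counts below are hypothetical, not the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Crude odds ratio and Wald 95% CI from a 2x2 table:
    a/b = treated with/without outcome, c/d = comparison with/without."""
    or_ = (a * d) / (b * c)
    half_width = z * sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - half_width), exp(log(or_) + half_width)

# Hypothetical counts for illustration only.
print(odds_ratio_ci(1, 8, 9, 4))
```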
Extreme sensitivity biosensing platform based on hyperbolic metamaterials
NASA Astrophysics Data System (ADS)
Sreekanth, Kandammathe Valiyaveedu; Alapan, Yunus; Elkabbash, Mohamed; Ilker, Efe; Hinczewski, Michael; Gurkan, Umut A.; de Luca, Antonio; Strangi, Giuseppe
2016-06-01
Optical sensor technology offers significant opportunities in the field of medical research and clinical diagnostics, particularly for the detection of small numbers of molecules in highly diluted solutions. Several methods have been developed for this purpose, including label-free plasmonic biosensors based on metamaterials. However, the detection of lower-molecular-weight (<500 Da) biomolecules in highly diluted solutions is still a challenging issue owing to their lower polarizability. In this context, we have developed a miniaturized plasmonic biosensor platform based on a hyperbolic metamaterial that can support highly confined bulk plasmon guided modes over a broad wavelength range from visible to near infrared. By exciting these modes using a grating-coupling technique, we achieved different extreme sensitivity modes with a maximum of 30,000 nm per refractive index unit (RIU) and a record figure of merit (FOM) of 590. We report the ability of the metamaterial platform to detect ultralow-molecular-weight (244 Da) biomolecules at picomolar concentrations using a standard affinity model streptavidin-biotin.
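For refractometric sensors, the figure of merit is commonly defined as bulk sensitivity divided by resonance linewidth (FWHM). Assuming that definition here (the abstract does not state it), the quoted numbers imply a mode linewidth of roughly 51 nm:

```python
# Assumed definition: FOM = sensitivity / FWHM, common for refractometric
# plasmonic sensors. The two inputs are the values quoted in the abstract.
sensitivity_nm_per_riu = 30_000  # maximum bulk sensitivity, nm/RIU
fom = 590                        # record figure of merit

implied_fwhm_nm = sensitivity_nm_per_riu / fom
print(round(implied_fwhm_nm, 1))  # ~50.8 nm implied mode linewidth
```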
Creating a global sub-daily precipitation dataset
NASA Astrophysics Data System (ADS)
Lewis, Elizabeth; Blenkinsop, Stephen; Fowler, Hayley
2017-04-01
Extremes of precipitation can cause flooding and droughts which can lead to substantial damages to infrastructure and ecosystems and can result in loss of life. It is still uncertain how hydrological extremes will change with global warming as we do not fully understand the processes that cause extreme precipitation under current climate variability. The INTENSE project is using a novel and fully-integrated data-modelling approach to provide a step-change in our understanding of the nature and drivers of global precipitation extremes and change on societally relevant timescales, leading to improved high-resolution climate model representation of extreme rainfall processes. The INTENSE project is in conjunction with the World Climate Research Programme (WCRP)'s Grand Challenge on 'Understanding and Predicting Weather and Climate Extremes' and the Global Water and Energy Exchanges Project (GEWEX) Science questions. The first step towards achieving this is to construct a new global sub-daily precipitation dataset. Data collection is ongoing and already covers North America, Europe, Asia and Australasia. Comprehensive, open source quality control software is being developed to set a new standard for verifying sub-daily precipitation data and a set of global hydroclimatic indices will be produced based upon stakeholder recommendations. This will provide a unique global data resource on sub-daily precipitation whose derived indices, e.g. monthly/annual maxima, will be freely available to the wider scientific community.
Photonic crystal fiber Fabry-Perot interferometers with high-reflectance internal mirrors
NASA Astrophysics Data System (ADS)
Fan, Rong; Hou, Yuanbin; Sun, Wei
2015-06-01
We demonstrated an in-line micro fiber-optic Fabry-Perot interferometer with an air cavity created by multi-step fusion splicing of a multi-mode photonic crystal fiber (MPCF) to a standard single-mode fiber (SMF). The fringe visibility of the interference pattern was up to 20 dB after reshaping the air cavity. Experimental results showed that such a device can be used as a highly sensitive strain sensor with a sensitivity of 4.5 pm/μɛ. Moreover, it offers other outstanding advantages, such as an extremely compact structure, easy fabrication, low cost, and high accuracy.
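Given the quoted sensitivity of 4.5 pm/μɛ, converting a measured fringe shift to strain is a one-line calculation; a sketch with a hypothetical shift value:

```python
SENSITIVITY_PM_PER_MICROSTRAIN = 4.5  # quoted strain sensitivity

def strain_microstrain(shift_pm):
    """Strain implied by a measured wavelength shift of the FP fringe."""
    return shift_pm / SENSITIVITY_PM_PER_MICROSTRAIN

# Hypothetical measurement: a 45 pm shift corresponds to 10.0 microstrain.
print(strain_microstrain(45.0))
```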
The Use of Quantitative SPECT/CT Imaging to Assess Residual Limb Health
2017-10-01
loss in 2020. BACKGROUND The integrity of the vasculature, nerves, and soft tissue within the extremities is of high importance, as an impairment or… formation, and vascular abnormalities. Therefore, effective diagnosis can be critical in directing the medical treatment of patients. Standard noninva… and identify damage to their normal fascicular pattern, nerve swelling or thickening, loss of nerve bundle integrity, and development of neuromas
Frequency metrology using highly charged ions
NASA Astrophysics Data System (ADS)
Crespo López-Urrutia, J. R.
2016-06-01
Due to the scaling laws of relativistic fine structure splitting, many forbidden optical transitions appear within the ground state configurations of highly charged ions (HCI). In some hydrogen-like ions, even the hyperfine splitting of the 1s ground state gives rise to optical transitions. Given the very low polarizability of HCI, such laser-accessible transitions are extremely impervious to external perturbations and systematics that limit optical clock performance and arise from AC and DC Stark effects, such as black-body radiation and light shifts. Moreover, AC and DC Zeeman splitting are symmetric due to the much larger relativistic spin-orbit coupling and corresponding fine-structure splitting. Appropriate choice of states or magnetic sub-states with suitable total angular momentum and magnetic quantum numbers can lead to a cancellation of residual quadrupolar shifts. All these properties are very advantageous for the proposed use of HCI forbidden lines as optical frequency standards. Extremely magnified relativistic, quantum electrodynamic, and nuclear size contributions to the binding energies of the optically active electrons make HCI ideal tools for fundamental research, as in proposed studies of a possible time variation of the fine structure constant. Beyond this, HCI that cannot be photoionized by vacuum-ultraviolet photons could also provide frequency standards for future lasers operating in that range.
Frouzan, Arash; Masoumi, Kambiz; Delirroyfard, Ali; Mazdaie, Behnaz; Bagherzadegan, Elnaz
2017-01-01
Background: Long bone fractures are common injuries caused by trauma. Some studies have demonstrated that ultrasound has high sensitivity and specificity in the diagnosis of upper and lower extremity long bone fractures. Objective: The aim of this study was to determine the accuracy of ultrasound compared with plain radiography in the diagnosis of upper and lower extremity long bone fractures in trauma patients. Methods: This cross-sectional study assessed 100 patients admitted to the emergency department of Imam Khomeini Hospital, Ahvaz, Iran with trauma to the upper and lower extremities, from September 2014 through October 2015. In all patients, ultrasound was performed first, followed by standard plain radiography of the upper and lower limb. Data were analyzed with SPSS version 21 to determine specificity and sensitivity. Results: The mean ages of patients with upper and lower limb trauma were 31.43±12.32 years and 29.63±5.89 years, respectively. Radius fractures were the most frequent upper limb fractures (27%). Sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of ultrasound compared with plain radiography in the diagnosis of upper extremity long bone fractures were 95.3%, 87.7%, 87.2%, and 96.2%, respectively, and the highest accuracy was observed in left arm fractures (100%). Tibia and fibula fractures were the most frequent lower limb fractures (89.2%). Sensitivity, specificity, PPV, and NPV of ultrasound compared with plain radiography in the diagnosis of lower extremity long bone fractures were 98.6%, 83%, 65.4%, and 87.1%, respectively, and the highest accuracy was observed in men, younger patients, and femoral fractures. Conclusion: The results of this study showed that ultrasound has high accuracy compared with plain radiography in the diagnosis of upper and lower extremity long bone fractures. PMID:28979747
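The four reported accuracy measures follow directly from a 2x2 confusion matrix against the reference standard (here, plain radiography). A sketch with illustrative counts, not the study's raw data:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from counts
    versus the reference test."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Illustrative counts only.
print(diagnostic_accuracy(tp=41, fp=6, fn=2, tn=51))
```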
Neutron Time-of-Flight Diffractometer HIPPO at LANSCE
NASA Astrophysics Data System (ADS)
Vogel, Sven; Williams, Darrick; Zhao, Yusheng; Bennett, Kristin; von Dreele, Bob; Wenk, Hans-Rudolf
2004-03-01
The High-Pressure Preferred Orientation (HIPPO) neutron diffractometer is the first third-generation neutron time-of-flight powder diffractometer to be constructed in the United States. It produces extremely high intensity by virtue of a short (9 m) initial flight path on a high-intensity water moderator and 1380 3He detector tubes covering 4.5 m2 of detector area at scattering angles from 10° to 150°. HIPPO was designed and manufactured as a joint effort between LANSCE and the University of California with the goals of attaining world-class science and making neutron powder diffractometry an accessible and available tool for the national user community. Over two decades of momentum transfer are available (0.1-30 Å-1) to support studies of amorphous solids; magnetic diffraction; small crystalline samples; and samples subjected to extreme environments such as temperature, pressure, or magnetic fields. The exceptionally high data rates of HIPPO also make it useful for time-resolved studies. In addition to the standard ancillary equipment (100-position sample/texture changer, closed-cycle He refrigerator, furnace), HIPPO has unique high-pressure cells capable of achieving pressures of 30 GPa at ambient and high (2000 K) temperature with samples up to 100 mm3 in volume.
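In a time-of-flight diffractometer, neutron wavelength follows from the de Broglie relation λ = h·t/(m·L) and d-spacing from Bragg's law. A sketch, where the 10 m total flight path and 5 ms flight time are assumed example values (the abstract gives only the 9 m initial path):

```python
from math import radians, sin

H = 6.62607e-34    # Planck constant, J*s
M_N = 1.67493e-27  # neutron mass, kg

def wavelength_angstrom(tof_s, path_m):
    """de Broglie wavelength of a neutron from its time of flight:
    lambda = h * t / (m * L), converted to Angstrom."""
    return H * tof_s / (M_N * path_m) * 1e10

def d_spacing_angstrom(lam_angstrom, two_theta_deg):
    """Bragg's law (n = 1): d = lambda / (2 sin theta)."""
    return lam_angstrom / (2 * sin(radians(two_theta_deg / 2)))

lam = wavelength_angstrom(5e-3, 10.0)  # assumed: 5 ms over a 10 m total path
print(round(lam, 2))                   # ~1.98 Angstrom
```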
Flat conductor cable for electrical packaging
NASA Technical Reports Server (NTRS)
Angele, W.
1972-01-01
Flat conductor cable (FCC) is a relatively new, highly promising means of electrical packaging and system integration. FCC offers numerous desirable traits (weight, volume, and cost savings, flexibility, high reliability, predictable and repeatable electrical characteristics) which make it extremely attractive as a packaging medium. FCC today finds wide application in everything from the integration of lunar equipment to the packaging of electronics in nuclear submarines. Described are cable construction and means of termination, applicable specifications and standards, and total FCC systems. A list of additional sources of data is also included for more intensive study.
Mediterranean space-time extremes of wind wave sea states
NASA Astrophysics Data System (ADS)
Barbariol, Francesco; Carniel, Sandro; Sclavo, Mauro; Marcello Falcieri, Francesco; Bonaldo, Davide; Bergamasco, Andrea; Benetazzo, Alvise
2014-05-01
Traditionally, wind wave sea states during storms have been observed, modeled, and predicted mostly in the time domain, i.e. at a fixed point. In fact, the standard statistical models used in ocean wave analysis rely on the implicit assumption of long-crested waves. Nevertheless, waves in storms are mainly short-crested. Hence, spatio-temporal features of the wave field are crucial to accurately model the sea state characteristics and to provide reliable predictions, particularly of wave extremes. Indeed, the experimental evidence provided by novel instrumentation, e.g. WASS (Wave Acquisition Stereo System), showed that the maximum sea surface elevation gathered in time over an area, i.e. the space-time extreme, is larger than that measured in time at a point, i.e. the time extreme. Recently, stochastic models used to estimate maxima of multidimensional Gaussian random fields have been applied to ocean wave statistics. These models are based either on Piterbarg's theorem or on Adler and Taylor's Euler characteristics approach. Besides a probability of exceedance of a certain threshold, they can provide the expected space-time extreme of a sea state, as long as space-time wave features (i.e. some parameters of the directional variance density spectrum) are known. These models have recently been validated against WASS observations from fixed and moving platforms. In this context, our focus was modeling and predicting extremes of wind waves during storms. Thus, to gather space-time extremes data intensively over the Mediterranean region, we used directional spectra provided by the numerical wave model SWAN (Simulating WAves Nearshore). We set up a 6x6 km2 resolution grid covering most of the Mediterranean Sea and forced it with COSMO-I7 high-resolution (7x7 km2) hourly wind fields over the 2007-2013 period. To obtain the space-time features, i.e. 
the spectral parameters, at each grid node and over the simulated years, we developed a modified version of the SWAN model, SWAN Space-Time (SWAN-ST). SWAN-ST results were post-processed to obtain the expected space-time extremes over the model domain. To this end, we applied the stochastic model of Fedele, developed from Adler and Taylor's approach, which we found to be more accurate and versatile than Piterbarg's theorem. The results we obtained provide an alternative view of the Mediterranean extreme wave climate, which could represent the first step towards operational forecasting of space-time wave extremes, on the one hand, and the basis for a novel statistical standard wave model, on the other. These results may benefit marine designers, seafarers, and other parties operating at sea and exposed to the frequent and severe hazard posed by extreme wave conditions.
Vector dark energy and high-z massive clusters
NASA Astrophysics Data System (ADS)
Carlesi, Edoardo; Knebe, Alexander; Yepes, Gustavo; Gottlöber, Stefan; Jiménez, Jose Beltrán.; Maroto, Antonio L.
2011-12-01
The detection of extremely massive clusters at z > 1 such as SPT-CL J0546-5345, SPT-CL J2106-5844 and XMMU J2235.3-2557 has been considered by some authors as a challenge to the standard Λ cold dark matter cosmology. In fact, assuming Gaussian initial conditions, the theoretical expectation of detecting such objects is as low as ≤1 per cent. In this paper we discuss the probability of the existence of such objects in the light of the vector dark energy paradigm, showing by means of a series of N-body simulations that chances of detection are substantially enhanced in this non-standard framework.
The Extreme-Technology Industry
NASA Astrophysics Data System (ADS)
Hoefflinger, Bernd
The persistent annual R&D quota of >15% of revenue in the semiconductor industry has been, and continues to be, more than twice the OECD definition for High-Technology Industry. At the frontiers of miniaturization, the Cost-of-Ownership (COO) continues to rise, to beyond 10 billion for a Gigafactory. Only leaders in the world market for selected processors and memories, or for foundry services, can afford this. Others can succeed with high-value custom products equipped with high-performance application-specific standard products acquired from the leaders in their specific fields, or as fabless original-device manufacturers buying wafers from top foundries and packaging/testing from contract manufacturers, thus eliminating the fixed cost of a factory. An overview is offered of the leaders in these different business models. In view of the coming highly diversified and heterogeneous world of nanoelectronic-systems competence, the case is made for global networks of manufacturing and services with the highest standards for product quality and liability.
Public Perception of Climate Change and the New Climate Dice
NASA Technical Reports Server (NTRS)
Hansen, James; Sato, Makiko; Ruedy, Reto
2012-01-01
"Climate dice", describing the chance of unusually warm or cool seasons, have become more and more "loaded" in the past 30 years, coincident with rapid global warming. The distribution of seasonal mean temperature anomalies has shifted toward higher temperatures and the range of anomalies has increased. An important change is the emergence of a category of summertime extremely hot outliers, more than three standard deviations (3 sigma) warmer than the climatology of the 1951-1980 base period. This hot extreme, which covered much less than 1% of Earth's surface during the base period, now typically covers about 10% of the land area. It follows that we can state, with a high degree of confidence, that extreme anomalies such as those in Texas and Oklahoma in 2011 and Moscow in 2010 were a consequence of global warming, because their likelihood in the absence of global warming was exceedingly small. We discuss practical implications of this substantial, growing, climate change.
Effects of Standard Extremity on Mixed Standard Scale Performance Ratings.
1983-03-01
It was concluded that the mixed standard scale form performed … levels. MSS (HE) was composed of statements reflecting maximally extreme scale values for each of the four dimensions rated on the appraisal instrument. MSS (ME) was composed of statements with moderately extreme scale values. MSS (LE) was composed …
Measurement Properties of Instruments for Measuring of Lymphedema: Systematic Review.
Hidding, Janine T; Viehoff, Peter B; Beurskens, Carien H G; van Laarhoven, Hanneke W M; Nijhuis-van der Sanden, Maria W G; van der Wees, Philip J
2016-12-01
Lymphedema is a common complication of cancer treatment, resulting in swelling and subjective symptoms. Reliable and valid measurement of this side effect of medical treatment is important. The purpose of this study was to provide best evidence regarding which measurement instruments are most appropriate in measuring lymphedema in its different stages. The PubMed and Web of Science databases were used, and the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines were followed. Clinical studies on measurement instruments assessing lymphedema were reviewed using the Quality Assessment of Diagnostic Accuracy Studies-2 (QUADAS-2) scoring instrument for quality assessment. Data on reliability, concurrent validity, convergent validity, sensitivity, specificity, applicability, and costs were extracted. Pooled data showed good intrarater intraclass correlation coefficients (ICCs) (.89) for bioimpedance spectroscopy (BIS) in the lower extremities and high intrarater and interrater ICCs for water volumetry, tape measurement, and perometry (.98-.99) in the upper extremities. In the upper extremities, the standard error of measurement was 3.6% (σ=0.7%) for water volumetry, 5.6% (σ=2.1%) for perometry, and 6.6% (σ=2.6%) for tape measurement. Sensitivity of tape measurement in the upper extremities, using different cutoff points, varied from 0.73 to 0.90, and specificity values varied from 0.72 to 0.78. No uniform definition of lymphedema was available, and a gold standard as a reference test was lacking. Items concerning risk of bias were study design, patient selection, description of lymphedema, blinding of test outcomes, and number of included participants. 
Measurement instruments with evidence for good reliability and validity were BIS, water volumetry, tape measurement, and perometry, where BIS can detect alterations in extracellular fluid in stage 1 lymphedema and the other measurement instruments can detect alterations in volume starting from stage 2. In research, water volumetry is indicated as a reference test for measuring lymphedema in the upper extremities. © 2016 American Physical Therapy Association.
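The standard error of measurement figures quoted above follow the usual relation SEM = SD·√(1−ICC), from which a minimal detectable change can also be derived. A sketch under that standard definition; the input values are illustrative, not the review's pooled data:

```python
from math import sqrt

def sem(sd, icc):
    """Standard error of measurement: SEM = SD * sqrt(1 - ICC)."""
    return sd * sqrt(1 - icc)

def mdc95(sem_value):
    """Minimal detectable change at 95% confidence: 1.96 * sqrt(2) * SEM."""
    return 1.96 * sqrt(2) * sem_value

# Illustrative values only.
s = sem(sd=10.0, icc=0.91)
print(round(s, 2), round(mdc95(s), 2))
```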
The anesthesia and brain monitor (ABM). Concept and performance.
Kay, B
1984-01-01
Three integral components of the ABM, the frontalis electromyogram (EMG), the processed unipolar electroencephalogram (EEG) and the neuromuscular transmission monitor (NMT) were compared with standard research methods, and their clinical utility indicated. The EMG was compared with the method of Dundee et al (2) for measuring the induction dose of thiopentone; the EEG was compared with the SLE Galileo E8-b and the NMT was compared with the Medelec MS6. In each case correlation of results was extremely high, and the ABM offered some advantages over the standard research methods. We conclude that each of the integral units of the ABM is simple to apply and interpret, yet as accurate as standard apparatus used for research. In addition the ABM offers excellent display and recording facilities and alarm systems.
Sex Differences During an Overhead Squat Assessment.
Mauntel, Timothy C; Post, Eric G; Padua, Darin A; Bell, David R
2015-08-01
A disparity exists between the rates of male and female lower extremity injuries. One factor that may contribute to this disparity is high-risk biomechanical patterns that are commonly displayed by females. It is unknown what biomechanical differences exist between males and females during an overhead squat. This study compared lower extremity biomechanics during an overhead squat and ranges of motion between males and females. An electromagnetic motion tracking system interfaced with a force platform was used to quantify peak lower extremity kinematics and kinetics during the descent phase of each squat. Range of motion measurements were assessed with a standard goniometer. Differences between male and female kinematics, kinetics, and ranges of motion were identified with t tests. Males displayed greater peak knee valgus angle, peak hip flexion angle, peak vertical ground reaction forces, and peak hip extension moments. Males also displayed less active ankle dorsiflexion with the knee extended and hip internal and external rotation than females. No other differences were observed. The biomechanical differences between males and females during the overhead squat may result from differences in lower extremity ranges of motion. Therefore, sex-specific injury prevention programs should be developed to improve biomechanics and ranges of motion.
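The group comparisons above used t tests; a minimal sketch of Welch's (unequal-variance) t statistic, assuming that variant since the abstract does not specify which was used:

```python
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and approximate degrees of freedom
    for two independent samples (sketch, not the authors' code)."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / (va + vb) ** 0.5
    # Welch-Satterthwaite degrees of freedom
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical example values (e.g. peak knee valgus angles), not study data.
t, df = welch_t([12.1, 9.8, 14.3, 11.0], [8.2, 7.9, 10.1, 6.5])
```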
Safe surgical technique for associated acetabular fractures
2013-01-01
Associated acetabular fractures are challenging injuries to manage. The complex surgical approaches and the technical difficulty in achieving anatomical reduction imply that the learning curve to achieve high-quality care of patients with such challenging injuries is extremely steep. This first article in the Journal’s “Safe Surgical Technique” section presents the standard surgical care, in conjunction with intraoperative tips and tricks, for the safe management of all subgroups of associated acetabular fractures. PMID:23414782
ANS ultraviolet observations of dwarf Cepheids
NASA Astrophysics Data System (ADS)
Sturch, C. R.; Wu, C.-C.
1983-03-01
Ultraviolet observations of three dwarf Cepheids (VZ Cnc, SX Phe, and AI Vel) are presented. The UV light curves are consistent with those in the visual region. When compared to standard stars, all three dwarf Cepheids exhibit flux deficiencies at the shortest observed wavelengths. The most extreme deficiencies appear for SX Phe; these may be related to the other properties previously noted for this star, including low metallicity, high space motion, and low luminosity.
Ultimately Reliable Pyrotechnic Systems
NASA Technical Reports Server (NTRS)
Scott, John H.; Hinkel, Todd
2015-01-01
This paper presents the methods by which NASA has designed, built, tested, and certified pyrotechnic devices for high-reliability operation in extreme environments and illustrates the potential applications in the oil and gas industry. NASA's extremely successful application of pyrotechnics is built upon documented procedures and test methods that have been maintained and developed since the Apollo Program. Standards are managed and rigorously enforced for performance margins, redundancy, lot sampling, and personnel safety. The pyrotechnics utilized in spacecraft include such devices as small initiators and detonators with the power of a shotgun shell, detonating cord systems for explosive energy transfer across many feet, precision linear shaped charges for breaking structural membranes, and booster charges to actuate valves and pistons. NASA's pyrotechnics program is one of the most successful in the history of human spaceflight. No pyrotechnic device developed in accordance with NASA's Human Spaceflight standards has ever failed in flight use. NASA's pyrotechnic initiators work reliably in temperatures as low as -420 F. Each of the 135 Space Shuttle flights fired 102 of these initiators, some setting off multiple pyrotechnic devices, with never a failure. The recent landing on Mars of the Curiosity rover fired 174 of NASA's pyrotechnic initiators to complete the famous '7 minutes of terror.' Even after traveling through extreme radiation and thermal environments on the way to Mars, every one of them worked. These initiators have fired on the surface of Titan. NASA's design controls, procedures, and processes produce the most reliable pyrotechnics in the world. 
Application of pyrotechnics designed and procured in this manner could enable the energy industry's emergency equipment, such as shutoff valves and deep-sea blowout preventers, to be left in place for years in extreme environments and still be relied upon to function when needed, thus greatly enhancing safety and operational availability.
HDR Imaging for Feature Detection on Detailed Architectural Scenes
NASA Astrophysics Data System (ADS)
Kontogianni, G.; Stathopoulou, E. K.; Georgopoulos, A.; Doulamis, A.
2015-02-01
3D reconstruction relies on accurate detection, extraction, description, and matching of image features. This is even truer for complex architectural scenes, which require 3D models of high quality without any loss of detail in geometry or color. Illumination conditions influence the radiometric quality of images, as standard sensors cannot properly depict a wide range of intensities in the same scene. Indeed, overexposed or underexposed pixels cause irreplaceable information loss and degrade the digital representation. Images taken under extreme lighting conditions may thus be prohibitive for feature detection/extraction and consequently for matching and 3D reconstruction. High Dynamic Range (HDR) images can be helpful for these operators because they broaden the limits of the illumination range that Standard or Low Dynamic Range (SDR/LDR) images can capture, thereby increasing the amount of detail contained in the image. Experimental results of this study support this assumption, as state-of-the-art feature detectors are examined on both standard dynamic range and HDR images.
Upper extremity transplantation: current concepts and challenges in an emerging field.
Elliott, River M; Tintle, Scott M; Levin, L Scott
2014-03-01
Loss of an isolated upper limb is an emotionally and physically devastating event that results in significant impairment. Patients who lose both upper extremities experience profound disability that affects nearly every aspect of their lives. While prosthetics and surgery can eventually provide the single limb amputee with a suitable assisting hand, limited utility, minimal haptic feedback, weight, and discomfort are persistent problems with these techniques that contribute to high rates of prosthetic rejection. Moreover, despite ongoing advances in prosthetic technology, bilateral amputees continue to experience high levels of dependency, disability, and distress. Hand and upper extremity transplantation holds several advantages over prosthetic rehabilitation. The missing limb is replaced with one of similar skin color and size. Sensibility, voluntary motor control, and proprioception are restored to a greater degree, and afford better dexterity and function than prosthetics. The main shortcomings of transplantation include the hazards of immunosuppression, the complications of rejection and its treatment, and high cost. Hand and upper limb transplantation represents the most commonly performed surgery in the growing field of Vascularized Composite Allotransplantation (VCA). As upper limb transplantation and VCA have become more widespread, several important challenges and controversies have emerged. These include: refining indications for transplantation, optimizing immunosuppression, establishing reliable criteria for monitoring, diagnosing, and treating rejection, and standardizing outcome measures. This article will summarize the historical background of hand transplantation and review the current literature and concepts surrounding it.
Abrupt state change of river water quality (turbidity): Effect of extreme rainfalls and typhoons.
Lee, Chih-Sheng; Lee, Yi-Chao; Chiang, Hui-Min
2016-07-01
River turbidity is dynamic in nature, and its stable state changes significantly during heavy rainfall events. The frequent occurrence of typhoons in Taiwan has caused serious problems in drinking water treatment due to extremely high turbidity. The aim of the present study is to evaluate the impact of typhoons on river turbidity. The statistical methods used included analyses of paired annual means and standard deviations, frequency distributions, and moving standard deviation, skewness, and autocorrelation; all clearly indicate significant state changes in river turbidity. Typhoon Morakot of 2009 (recorded rainfall over 2000 mm in three days, responsible for a significant disaster in southern Taiwan) is taken as the major initiating event leading to the critical state change. In addition, the rate of turbidity increase during rainfall events is strongly and positively correlated with rainfall intensity for both the pre- and post-Morakot periods. Daily turbidity is also well correlated with daily flow rate for all eleven events evaluated, implying that river turbidity can potentially be predicted from river flow rate during rainfall and typhoon events. Based on the analysis of stable-state changes, more effective regulations for better basin management, including soil and water conservation in the watershed, are necessary. Furthermore, municipal and industrial water treatment plants need to prepare for and ensure adequate operation of water treatment with high raw-water turbidity (e.g., >2000 NTU). Finally, the methodology used in the present study can be applied to other environmental problems with abrupt state changes.
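The moving-window indicators named above (standard deviation, skewness, lag-1 autocorrelation) can be computed in a few lines. The sketch below uses pandas on a synthetic record whose variance jumps mid-series, standing in for a turbidity time series; it is illustrative only, not the study's data or code:

```python
import numpy as np
import pandas as pd

def rolling_indicators(series, window):
    """Rolling early-warning indicators of an abrupt state change:
    standard deviation, skewness, and lag-1 autocorrelation."""
    s = pd.Series(series)
    return pd.DataFrame({
        "std": s.rolling(window).std(),
        "skew": s.rolling(window).skew(),
        "ac1": s.rolling(window).apply(
            lambda w: pd.Series(w).autocorr(lag=1), raw=False),
    })

# Synthetic turbidity-like record: mean and variance jump after index
# 200, mimicking a post-typhoon state change (invented numbers).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(50, 5, 200), rng.normal(120, 40, 200)])
ind = rolling_indicators(x, window=60)
# The rolling standard deviation after the shift far exceeds that before it.
print(bool(ind["std"].iloc[150] < ind["std"].iloc[350]))
```

A sustained rise in the rolling standard deviation (as here) is the kind of signal the paper uses to date the state change.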
[The German DRG system 2003-2010 from the perspective of intensive care medicine].
Franz, Dominik; Bunzemeier, Holger; Roeder, Norbert; Reinecke, Holger
2010-01-01
Intensive care medicine is extremely heterogeneous, expensive and can only be partially planned and controlled. A correct and fair representation of intensive care medicine in the G-DRG system is an essential requirement for the use as a pricing system. From the perspective of intensive care medicine, pertinent changes of the DRG structure and differentiation of relevant parameters have been established within the G-DRG systems 2003-2010. Analysis of relevant diagnoses, medical procedures, co-payment structures and G-DRGs in the versions 2003-2010 based on the publications of the German DRG Institute (InEK) and the German Institute of Medical Documentation and Information (DIMDI). Since the first G-DRG system version 2003, numerous measures improved quality of case allocation of intensive care medicine. Highly relevant to the system version 2010 are duration of mechanical ventilation, the intensive care treatment complex and complicating constellations. The number of G-DRGs relevant to intensive medical care increased from n = 3 (2003) to n = 58 (2010). For standard cases, quality of case allocation and G-DRG reimbursement are adequate in 2010. The G-DRG system gained complexity again. High demands are made on correct and complete coding of complex cases. Nevertheless, further adjustments of the G-DRG system especially for cases with extremely high costs are necessary. Where the G-DRG system is unable to cover extremely high-cost cases, reimbursement solutions beyond the G-DRG structure should be taken into account.
Extreme Mean and Its Applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.
1979-01-01
Extreme value statistics obtained from normally distributed data are considered. An extreme mean is defined as the mean of the p-th-probability truncated normal distribution. An unbiased estimate of this extreme mean and its large-sample distribution are derived. The distribution of this estimate, even for very large samples, is found to be nonnormal. Further, as the sample size increases, the variance of the unbiased estimate converges to the Cramer-Rao lower bound. The computer program used to obtain the density and distribution functions of the standardized unbiased estimate, and the confidence intervals of the extreme mean for any data, is included for ready application. An example is included to demonstrate the usefulness of the extreme mean in application.
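One concrete reading of this definition is the mean of the upper tail of a normal distribution beyond its p-th quantile. The sketch below implements that reading with a Monte Carlo cross-check; it is an illustrative interpretation, not the paper's program, and the helper names are invented:

```python
import math
import random

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def norm_ppf(p, lo=-10.0, hi=10.0):
    # Simple bisection inverse of the standard normal CDF.
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if norm_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def extreme_mean(mu, sigma, p):
    """Mean of the upper tail of N(mu, sigma^2) beyond the p-th
    quantile, E[X | X > x_p] -- one reading of the 'extreme mean'
    (an illustrative assumption, not the paper's exact estimator)."""
    z = norm_ppf(p)
    return mu + sigma * norm_pdf(z) / (1.0 - norm_cdf(z))

# Monte Carlo cross-check for a standard normal with p = 0.95.
random.seed(1)
draws = [random.gauss(0.0, 1.0) for _ in range(200_000)]
cut = norm_ppf(0.95)
tail = [d for d in draws if d > cut]
mc = sum(tail) / len(tail)
print(abs(extreme_mean(0.0, 1.0, 0.95) - mc) < 0.02)
```

For p = 0.95 the closed form gives roughly 2.06, matching the simulated tail mean.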
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang Dian, E-mail: dwang@mcw.edu; Bosch, Walter; Kirsch, David G.
Purpose: To evaluate variability in the definition of preoperative radiotherapy gross tumor volume (GTV) and clinical target volume (CTV) delineated by sarcoma radiation oncologists. Methods and Materials: Extremity sarcoma planning CT images, along with the corresponding diagnostic MRI, from two patients were distributed to 10 Radiation Therapy Oncology Group sarcoma radiation oncologists with instructions to define the GTV and CTV using standardized guidelines. The CT data with contours were then returned for central analysis. Contours representing statistically corrected 95% (V95) and 100% (V100) agreement were computed for each structure. Results: For the GTV, the minimum, maximum, and mean (SD) volumes (mL) were 674, 798, and 752 ± 35 for the lower extremity case and 383, 543, and 447 ± 46 for the upper extremity case. The volumes (mL) of the union, V95, and V100 were 882, 761, and 752 for the lower extremity and 587, 461, and 455 for the upper extremity, respectively. The overall GTV agreement was judged to be almost perfect in both the lower and upper extremity cases (kappa = 0.9 [p < 0.0001] and kappa = 0.86 [p < 0.0001]). For the CTV, the minimum, maximum, and mean (SD) volumes (mL) were 1145, 1911, and 1605 ± 211 for the lower extremity case and 637, 1246, and 1006 ± 180 for the upper extremity case. The volumes (mL) of the union, V95, and V100 were 2094, 1609, and 1593 for the lower extremity and 1533, 1020, and 965 for the upper extremity cases, respectively. The overall CTV agreement was judged to be almost perfect in the lower extremity case (kappa = 0.85 [p < 0.0001]) but only substantial in the upper extremity case (kappa = 0.77 [p < 0.0001]). Conclusions: Almost perfect agreement existed in the GTV of these two representative cases. There was no significant disagreement in the CTV of the lower extremity, but variation in the CTV of the upper extremity was seen, perhaps related to positional differences between the planning CT and the diagnostic MRI.
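The agreement grades above come from kappa statistics. A minimal two-rater Cohen's kappa, with invented in/out labels, shows the mechanics; the study's multi-observer, volume-based analysis is more involved than this sketch:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels:
    observed agreement corrected for chance agreement."""
    n = len(rater_a)
    assert n == len(rater_b) and n > 0
    p_observed = sum(x == y for x, y in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    p_chance = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_observed - p_chance) / (1.0 - p_chance)

# Invented in/out labels from two hypothetical observers.
a = [1, 1, 1, 0, 0, 0, 1, 0, 1, 1]
b = [1, 1, 1, 0, 0, 0, 1, 0, 1, 0]
print(round(cohens_kappa(a, b), 2))  # 9/10 observed agreement -> kappa 0.8
```

On the conventional scale, 0.8 sits at the boundary between substantial and almost-perfect agreement, the range reported above.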
A Case Report of a Patient Carrying CYP2C9*3/4 Genotype with Extremely Low Warfarin Dose Requirement
Lee, Soo-Youn; Nam, Myung-Hyun; Kim, June Soo; Kim, Jong Won
2007-06-01
We report a case of intolerance to warfarin dosing due to impaired drug metabolism in a patient with CYP2C9*3/*4. A 73-yr-old woman with atrial fibrillation was taking warfarin. She attained a high prothrombin time international normalized ratio (INR) at standard doses during the induction of anticoagulation, and an extremely low dose of warfarin (6.5 mg/week) was finally chosen to reach the target INR. Genotyping for CYP2C9 revealed that this patient had the genotype CYP2C9*3/*4. This is the first Korean compound heterozygote for CYP2C9*3 and *4. This case suggests the clinical usefulness of pharmacogenetic testing for individualized dosage adjustments of warfarin. PMID:17596671
Quality‐control of an hourly rainfall dataset and climatology of extremes for the UK
Lewis, Elizabeth; Chan, Steven C.; Fowler, Hayley J.
2016-01-01
Sub‐daily rainfall extremes may be associated with flash flooding, particularly in urban areas but, compared with extremes on daily timescales, have been relatively little studied in many regions. This paper describes a new, hourly rainfall dataset for the UK based on ∼1600 rain gauges from three different data sources. This includes tipping bucket rain gauge data from the UK Environment Agency (EA), which has been collected for operational purposes, principally flood forecasting. Significant problems in the use of such data for the analysis of extreme events include the recording of accumulated totals, high frequency bucket tips, rain gauge recording errors and the non‐operation of gauges. Given the prospect of an intensification of short‐duration rainfall in a warming climate, the identification of such errors is essential if sub‐daily datasets are to be used to better understand extreme events. We therefore first describe a series of procedures developed to quality control this new dataset. We then analyse ∼380 gauges with near‐complete hourly records for 1992–2011 and map the seasonal climatology of intense rainfall based on UK hourly extremes using annual maxima, n‐largest events and fixed threshold approaches. We find that the highest frequencies and intensities of hourly extreme rainfall occur during summer when the usual orographically defined pattern of extreme rainfall is replaced by a weaker, north–south pattern. A strong diurnal cycle in hourly extremes, peaking in late afternoon to early evening, is also identified in summer and, for some areas, in spring. This likely reflects the different mechanisms that generate sub‐daily rainfall, with convection dominating during summer.
The resulting quality‐controlled hourly rainfall dataset will provide considerable value in several contexts, including the development of standard, globally applicable quality‐control procedures for sub‐daily data, the validation of the new generation of very high‐resolution climate models and improved understanding of the drivers of extreme rainfall. PMID:28239235
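The three sampling approaches named above (annual maxima, n-largest events, fixed threshold) all begin by extracting extremes per block or per exceedance. A toy annual-maxima extraction might look like this; the records are invented, not the UK dataset:

```python
from collections import defaultdict

def annual_maxima(records):
    """records: iterable of (year, hourly_rainfall_mm).
    Returns the block (annual) maximum per year -- the first of the
    three extreme-value sampling approaches named above."""
    best = defaultdict(float)
    for year, value in records:
        best[year] = max(best[year], value)
    return dict(best)

# Invented hourly totals for two years.
toy = [(1992, 3.0), (1992, 21.5), (1993, 9.8), (1993, 4.1)]
print(annual_maxima(toy))  # {1992: 21.5, 1993: 9.8}
```

The n-largest and fixed-threshold variants differ only in keeping the top n values per block, or all values above a chosen intensity.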
Floridian heatwaves and extreme precipitation: future climate projections
NASA Astrophysics Data System (ADS)
Raghavendra, Ajay; Dai, Aiguo; Milrad, Shawn M.; Cloutier-Bisbee, Shealynn R.
2018-02-01
Observational analysis and climate modeling efforts concur that the frequency, intensity, and duration of heatwaves will increase as the Earth's mean climate shifts towards warmer temperatures. While the impacts and mechanisms of heatwaves have been well explored, extreme temperatures over Florida are generally understudied. This paper sheds light on Floridian heatwaves by exploring 13 years of daily data from surface observations and high-resolution WRF climate simulations for the same timeframe. The characteristics of current and future heatwaves under the RCP8.5 high-emissions scenario for 2070-2099 were then investigated. Results show a tripling in frequency and a greater than sixfold increase in the mean duration of heatwaves over Florida under the current standard heatwave definition. The intensity of heatwaves also increased by 4-6 °C, due to the combined effects of rising mean temperatures and a 1-2 °C increase attributed to the flattening of the temperature distribution. Since Florida's atmospheric boundary layer is rich in moisture and heatwaves could further increase the moisture content of the lower troposphere, the relationship between heatwaves and extreme precipitation was also explored in both the current and future climate. As expected, rainfall during a heatwave event was anomalously low, but it quickly recovered to normal within 3 days after the passage of a heatwave. Finally, the late 21st-century climate could see a slight decrease in mean precipitation over Florida, accompanied by heavier heatwave-associated extreme precipitation events over central and southern Florida.
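A standard heatwave definition of the kind referred to above is typically a run of consecutive days with maximum temperature above a threshold. One plausible sketch follows; the study's exact criterion may differ, and the temperatures are invented:

```python
def heatwaves(daily_tmax, threshold, min_days=3):
    """Identify heatwaves as runs of >= min_days consecutive days with
    Tmax above a fixed threshold (a common standard definition; an
    assumption here, not necessarily the study's exact criterion).
    Returns (start_index, length) pairs."""
    events, run_start = [], None
    for i, t in enumerate(daily_tmax):
        if t > threshold:
            if run_start is None:
                run_start = i
        else:
            if run_start is not None and i - run_start >= min_days:
                events.append((run_start, i - run_start))
            run_start = None
    if run_start is not None and len(daily_tmax) - run_start >= min_days:
        events.append((run_start, len(daily_tmax) - run_start))
    return events

# Invented daily maxima (deg C); two qualifying runs, one too short.
temps = [31, 36, 37, 36, 30, 35, 36, 29, 36, 36, 36, 36]
print(heatwaves(temps, threshold=34))  # [(1, 3), (8, 4)]
```

Frequency, duration, and intensity statistics like those reported above then follow directly from the detected (start, length) pairs.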
Development of a low-cost multiple diode PIV laser for high-speed flow visualization
NASA Astrophysics Data System (ADS)
Bhakta, Raj; Hargather, Michael
2017-11-01
Particle imaging velocimetry (PIV) is an optical visualization technique that typically incorporates a single high-powered laser to illuminate seeded particles in a fluid flow. Standard PIV lasers are extremely costly and have low repetition rates that severely limit their capability in high-speed, time-resolved imaging. The development of a multiple-diode laser system consisting of continuous lasers allows for flexible high-speed imaging with a wider range of test parameters. The developed laser system was fabricated from off-the-shelf parts for approximately $500. A series of experimental tests was conducted to compare the laser apparatus to a standard Nd:YAG double-pulsed PIV laser. Steady and unsteady flows were processed to compare the two systems and validate the accuracy of the multiple-laser design. PIV results indicate good correlation between the two laser systems and verify the construction of a precise laser instrument. The key technical obstacle to this approach was laser calibration and positioning, which will be discussed. Supported by grant HDTRA1-14-1-0070.
Artificial O3 formation during fireworks
NASA Astrophysics Data System (ADS)
Fiedrich, M.; Kurtenbach, R.; Wiesen, P.; Kleffmann, J.
2017-09-01
Several previous studies have reported the emission of ozone (O3) during fireworks, attributed to photolysis of either molecular oxygen (O2) or nitrogen dioxide (NO2) by the short/near-UV radiation emitted during the high-temperature combustion of fireworks. In contrast, in the present study no O3 formation was observed using a selective O3-LOPAP instrument during the combustion of pyrotechnical material in the laboratory, while a standard O3 monitor using UV absorption showed extremely high O3 signals. The artificial O3 response of the standard monitor was caused by known interferences associated with high levels of co-emitted VOCs and was also confirmed in field measurements during New Year's Eve in the city of Wuppertal, Germany. The present results help to explain the unreasonably high ozone levels documented during ambient fireworks, which contradict the fast titration of O3 by nitrogen monoxide (NO) in the night-time atmosphere.
Grid Frequency Extreme Event Analysis and Modeling: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Florita, Anthony R; Clark, Kara; Gevorgian, Vahan
Sudden losses of generation or load can lead to instantaneous changes in electric grid frequency and voltage. Extreme frequency events pose a major threat to grid stability. As renewable energy sources supply power to grids in increasing proportions, it becomes increasingly important to examine when and why extreme events occur to prevent destabilization of the grid. To better understand frequency events, including extrema, historic data were analyzed to fit probability distribution functions to various frequency metrics. Results showed that a standard Cauchy distribution fit the difference between the frequency nadir and prefault frequency (f_(C-A)) metric well, a standard Cauchy distribution fit the settling frequency (f_B) metric well, and a standard normal distribution fit the difference between the settling frequency and frequency nadir (f_(B-C)) metric very well. Results were inconclusive for the frequency nadir (f_C) metric, meaning it likely has a more complex distribution than those tested. This probabilistic modeling should facilitate more realistic modeling of grid faults.
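Because the Cauchy distribution has no finite mean or variance, a simple way to fit it to such frequency metrics is by quantiles. The sketch below is a stand-in for the report's fitting procedure, not its exact method, checked on synthetic draws rather than real grid data:

```python
import math
import random
import statistics

def fit_cauchy(samples):
    """Quantile-based fit of a Cauchy distribution: location = median,
    scale = half the interquartile range (the Cauchy quartiles lie at
    location +/- scale)."""
    xs = sorted(samples)
    n = len(xs)
    location = statistics.median(xs)
    q1, q3 = xs[n // 4], xs[(3 * n) // 4]
    return location, (q3 - q1) / 2.0

# Synthetic frequency-deviation samples (Hz) drawn from Cauchy(0, 0.02)
# via its inverse CDF -- invented numbers, not measured grid data.
random.seed(2)
data = [0.02 * math.tan(math.pi * (random.random() - 0.5)) for _ in range(100_000)]
location, scale = fit_cauchy(data)
print(abs(location) < 0.005, abs(scale - 0.02) < 0.005)
```

Quantile estimators remain well behaved here precisely because Cauchy samples contain enormous outliers that would ruin moment-based fits.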
Guidelines on Lithium-ion Battery Use in Space Applications
NASA Technical Reports Server (NTRS)
Mckissock, Barbara; Loyselle, Patricia; Vogel, Elisa
2009-01-01
This guideline discusses a standard approach for defining, determining, and addressing safety, handling, and qualification standards for lithium-ion (Li-ion) batteries to help the implementation of the technology in aerospace applications. Information from a variety of other sources relating to Li-ion batteries and their aerospace uses has been collected and included in this document. The sources used are listed in the reference section at the end of this document. The Li-ion chemistry is highly energetic due to its inherent high specific energy and its flammable electrolyte. Due to the extreme importance of appropriate design, test, and hazard control of Li-ion batteries, it is recommended that all Government and industry users and vendors of this technology for space applications, especially involving humans, use this document for appropriate guidance prior to implementing the technology.
Rennie, Laura J; Bazillier-Bruneau, Cécile; Rouëssé, Jacques
2016-04-01
Electronic cigarettes are marketed as a tool to give up or reduce cigarette smoking, and their use has risen sharply in recent years. There is concern that use is increasing particularly among adolescents and that e-cigarettes are being used not as a cessation tool but as a novel experience in their own right. The present research assessed the prevalence and sociodemographic correlates of e-cigarette use and standard cigarette use, and also explored the extent to which e-cigarettes appear to be used as a cessation tool. This was assessed using a questionnaire administered to 1,486 French adolescents aged 16 years. Prevalence of e-cigarette experimentation was high (54%) and comparable to that for standard cigarettes (55%). Furthermore, 20% of those who had experimented with e-cigarettes had never tried standard cigarettes, and among regular smokers of standard cigarettes, intentions to quit were not associated with e-cigarette usage frequency. Experimentation with both e-cigarettes and standard cigarettes was significantly predicted by higher age, higher socioeconomic status, and parental smoking of standard cigarettes (in particular by the father). Being male marginally predicted e-cigarette use, whereas being female significantly predicted standard cigarette use. These findings give cause for concern: experimentation with e-cigarettes is extremely high and is not associated with attempts to quit smoking standard cigarettes. Rather, it exposes adolescents to a highly addictive drug (nicotine) and may pave the way for a future cigarette habit.
NASA Astrophysics Data System (ADS)
Wadey, M. P.; Brown, J. M.; Haigh, I. D.; Dolphin, T.; Wisse, P.
2015-10-01
The extreme sea levels and waves experienced around the UK's coast during the 2013/14 winter caused extensive coastal flooding and damage. Coastal managers seek to place such extremes in relation to the anticipated standards of flood protection, and the long-term recovery of the natural system. In this context, return periods are often used as a form of guidance. This paper provides these levels for the winter storms, and discusses their application to the given data sets for two UK case study sites: Sefton, northwest England, and Suffolk, east England. Tide gauge records and wave buoy data were used to compare the 2013/14 storms with return periods from a national data set, and also joint probabilities of sea level and wave heights were generated, incorporating the recent events. The 2013/14 high waters and waves were extreme due to the number of events, as well as the extremity of the 5 December 2013 "Xaver" storm, which had a high return period at both case study sites. The national-scale impact of this event was due to its coincidence with spring high tide at multiple locations. Given that this event is such an outlier in the joint probability analyses of these observed data sets, and that the season saw several events in close succession, coastal defences appear to have provided a good level of protection. This type of assessment could in the future be recorded alongside defence performance and upgrade. Ideally other variables (e.g. river levels at estuarine locations) would also be included, and with appropriate offsetting for local trends (e.g. mean sea-level rise) so that the storm-driven component of coastal flood events can be determined. This could allow long-term comparison of storm severity, and an assessment of how sea-level rise influences return levels over time, which is important for consideration of coastal resilience in strategic management plans.
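Return periods of the kind used for guidance here can be estimated empirically from block maxima with a plotting-position formula. A toy sketch using the Weibull plotting position, with invented annual-maximum sea levels rather than the national dataset:

```python
def return_periods(annual_max_levels):
    """Empirical return periods via the Weibull plotting position
    T = (n + 1) / rank, with rank 1 assigned to the largest value.
    Returns (level, return_period_years) pairs, largest first."""
    ranked = sorted(annual_max_levels, reverse=True)
    n = len(ranked)
    return [(level, (n + 1) / (rank + 1)) for rank, level in enumerate(ranked)]

# Four invented annual-maximum sea levels (m above datum).
print(return_periods([5.1, 5.6, 4.9, 5.9])[0])  # (5.9, 5.0)
```

In practice a fitted extreme-value distribution, often with joint-probability treatment of sea level and wave height as in this paper, replaces this purely empirical estimate for extrapolation beyond the record length.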
Metsavaht, Leonardo; Leporace, Gustavo; Riberto, Marcelo; Sposito, Maria Matilde M; Del Castillo, Letícia N C; Oliveira, Liszt P; Batista, Luiz Alberto
2012-11-01
Clinical measurement. To translate and culturally adapt the Lower Extremity Functional Scale (LEFS) into a Brazilian Portuguese version, and to test the construct and content validity and reliability of this version in patients with knee injuries. There is no Brazilian Portuguese version of an instrument to assess the function of the lower extremity after orthopaedic injury. The translation of the original English version of the LEFS into a Brazilian Portuguese version was accomplished using standard guidelines and tested in 31 patients with knee injuries. Subsequently, 87 patients with a variety of knee disorders completed the Brazilian Portuguese LEFS, the Medical Outcomes Study 36-Item Short-Form Health Survey, the Western Ontario and McMaster Universities Osteoarthritis Index, and the International Knee Documentation Committee Subjective Knee Evaluation Form and a visual analog scale for pain. All patients were retested within 2 days to determine reliability of these measures. Validation was assessed by determining the level of association between the Brazilian Portuguese LEFS and the other outcome measures. Reliability was documented by calculating internal consistency, test-retest reliability, and standard error of measurement. The Brazilian Portuguese LEFS had a high level of association with the physical component of the Medical Outcomes Study 36-Item Short-Form Health Survey (r = 0.82), the Western Ontario and McMaster Universities Osteoarthritis Index (r = 0.87), the International Knee Documentation Committee Subjective Knee Evaluation Form (r = 0.82), and the pain visual analog scale (r = -0.60) (all, P<.05). The Brazilian Portuguese LEFS had a low level of association with the mental component of the Medical Outcomes Study 36-Item Short-Form Health Survey (r = 0.38, P<.05). The internal consistency (Cronbach α = .952) and test-retest reliability (intraclass correlation coefficient = 0.957) of the Brazilian Portuguese version of the LEFS were high. 
The standard error of measurement was low (3.6 points), and agreement was considered high, as demonstrated by the small differences between test and retest and the narrow limits of agreement observed in Bland-Altman and survival-agreement plots. The translation of the LEFS into Brazilian Portuguese was successful in preserving the semantic and measurement properties of the original version, and the Brazilian Portuguese LEFS was shown to be valid and reliable in a Brazilian population with knee injuries.
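The reported SEM is consistent with the classical relation SEM = SD·sqrt(1 − r), taking the test-retest ICC as the reliability r. A small check; the between-subject SD is back-calculated for illustration, not taken from the paper:

```python
import math

def standard_error_of_measurement(sd, reliability):
    """Classical SEM = SD * sqrt(1 - r), with test-retest reliability
    (here the ICC) used as r."""
    return sd * math.sqrt(1.0 - reliability)

# With the reported ICC of 0.957, an SEM of 3.6 LEFS points implies a
# between-subject SD of about 3.6 / sqrt(1 - 0.957) ~= 17.4 points.
sd = 3.6 / math.sqrt(1.0 - 0.957)
print(round(standard_error_of_measurement(sd, 0.957), 1))  # 3.6
```

The SEM expresses test-retest noise in the scale's own units, which is why it complements the unitless ICC when judging whether a score change is real.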
Subtropical air masses over eastern Canada: Their links to extreme precipitation
NASA Astrophysics Data System (ADS)
Gyakum, John; Wood, Alice; Milrad, Shawn; Atallah, Eyad
2017-04-01
We investigate extremely warm, moist air masses with an analysis of 850-hPa equivalent potential temperature (θe) extremes at Montreal, Quebec. The utility of this metric is that it represents the thermodynamic property of the air that ascends during a precipitation event. We produce an analysis of the 40 most extreme cases of positive θe anomalies, 10 for each season, based upon standardized anomalies from the 33-year climatology. The analysis shows the cases to be characterized by air masses with distinct subtropical traits in all seasons: reduced static stability, anomalously high precipitable water, and anomalously elevated dynamic tropopause heights. Persistent, slow-moving upper- and lower-level features were essential in the build-up of high-θe air encompassing much of eastern Canada. The trajectory analysis also showed anticyclonic curvature in all paths in all seasons, implying that the subtropical anticyclone is crucial in the transport of high-θe air. The atmospheric rivers during winter are characterized by trajectories from the subtropical North Atlantic, over the Gulf Stream current, northward into Montreal. In contrast, the summer anticyclonic trajectories are primarily continental, traveling from Texas north-northeastward into the Great Lakes and then eastward into Montreal. The role of the air mass in modulating the strength of a precipitation event is addressed with an analysis of the expression P = RD, where P is the total precipitation and R is the precipitation rate, averaged over the duration, D, of the event. Though it appears simple, this expression includes R (assumed equal to the condensation rate, with a precipitation efficiency of 1), which may be expressed as the product of the vertical motion and the change of saturation mixing ratio following a moist adiabat through the troposphere. This expression for R thus includes the essential ingredients of lift, air-mass temperature, and static stability (implicit in the vertical motion).
We use this expression for the precipitation rate to study the extreme precipitation events in Montreal associated with these same cases of extremely warm, moist air masses, and their physical impacts on the precipitation rate. Implications of this air-mass modulation of the precipitation rate are discussed in the context of longer-term global climate change.
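The expression P = RD can be made concrete with a trivial sketch (hypothetical numbers):

```python
def total_precipitation(mean_rate_mm_per_h, duration_h):
    """P = R * D: event total as mean precipitation rate times
    event duration."""
    return mean_rate_mm_per_h * duration_h

# Hypothetical numbers: a 6 h event at a mean rate of 4 mm/h delivers
# the same total as a 2 h convective burst at 12 mm/h.
print(total_precipitation(4.0, 6.0) == total_precipitation(12.0, 2.0))  # True
```

The decomposition matters because the same total P can arise from a high-θe air mass boosting R or from slow-moving features extending D, which is exactly the distinction the analysis draws.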
Inter- and intra-observer reliability of clinical movement-control tests for marines
2012-01-01
Background: Musculoskeletal disorders, particularly in the back and lower extremities, are common among marines. Movement-control tests are considered clinically useful for screening and follow-up evaluation, but few studies have addressed the reliability of such clinical tests, and no published data exist for marines. The present aim was therefore to determine the inter- and intra-observer reliability of clinically convenient tests emphasizing movement control of the back and hip among marines. A secondary aim was to investigate the sensitivity and specificity of these clinical tests for discriminating musculoskeletal pain disorders in this group of military personnel. Methods: This inter- and intra-observer reliability study used a test-retest approach with six standardized clinical tests focusing on movement control of the back and hip. Thirty-three marines (mean age 28.7 yr, SD 5.9) on active duty volunteered and were recruited. They followed an in-vivo observation test procedure that covered both low- and high-load (threshold) tasks relevant for marines on operational duty. Two independent observers simultaneously rated performance as "correct" or "incorrect" following a standardized assessment protocol. Re-testing followed 7-10 days later. Reliability was analysed using kappa (κ) coefficients, while the discriminative power of the best-fitting tests for back and lower-extremity pain was assessed using a multiple-variable regression model. Results: Inter-observer reliability for the six tests was moderate to almost perfect, with κ-coefficients ranging from 0.56 to 0.95. Three tests reached almost perfect inter-observer reliability, with mean κ-coefficients > 0.81. However, intra-observer reliability was fair to moderate, with mean κ-coefficients between 0.22 and 0.58. Three tests achieved moderate intra-observer reliability, with κ-coefficients > 0.41.
Combinations of one low- and one high-threshold test best discriminated prior back pain, but results were inconsistent for lower-extremity pain. Conclusions: Our results suggest that clinical tests of movement control of the back and hip are reliable for use in screening protocols with several observers among marines. However, test-retest reproducibility was less accurate, which should be considered in follow-up evaluations. The results also indicate that combinations of low- and high-threshold tests have discriminative validity for prior back pain but were inconclusive for lower-extremity pain. PMID:23273285
Kalenik, Tatiana K; Costa, Rui; Motkina, Elena V; Kosenko, Tamara A; Skripko, Olga V; Kadnikova, Irina A
2017-01-01
There is a need to develop new foods for participants of expeditions in extreme conditions, which must be self-sufficient. These foods should be light to carry, with a long shelf life, tasty and with high nutrient density. Currently, protein sources are limited mainly to dried and canned meat. In this work, a protein-rich dried concentrate suitable for extreme expeditions was developed using soya, tomato, milk whey and meat by-products. Protein concentrates were developed using minced beef liver and heart, dehydrated and mixed with a soya protein-lycopene coagulate (SPLC) obtained from a solution prepared with germinated soybeans and mixed with tomato paste in milk whey, and finally dried. The technological parameters of pressing SPLC and of drying the protein concentrate were optimized using response surface methodology. The optimized technological parameters to prepare the protein concentrates were obtained, with 70:30 being the ideal ratio of minced meat to SPLC. The developed protein concentrates are characterized by a high calorific value of 376 kcal/100 g of dry product, with a water content of 98 g·kg-1, and 641-644 g·kg-1 of proteins. The essential amino acid indices are 100, with minimum essential amino acid content constituting 100-128% of the FAO standard, depending on the raw meat used. These concentrates are also rich in micronutrients such as β-carotene and vitamin C. Analysis of the nutrient content showed that these non-perishable concentrates present a high nutritional value and complement other widely available vegetable concentrates to prepare a two-course meal. The soups and porridges prepared with these concentrates can be classified as functional foods, and comply with army requirements applicable to food products for extreme conditions.
Wei, Xi-Jun; Tong, Kai-Yu; Hu, Xiao-Ling
2011-12-01
Responsiveness of clinical assessments is an important element in the report of clinical effectiveness after rehabilitation. The correlation could reflect the validity of assessments as an indication of clinical performance before and after interventions. This study investigated the correlation and responsiveness of the Fugl-Meyer Assessment (FMA), Motor Status Scale (MSS), Action Research Arm Test (ARAT) and the Modified Ashworth Scale (MAS), which are used frequently in effectiveness studies of robotic upper-extremity training in stroke rehabilitation. Twenty-seven chronic stroke patients were recruited for a 20-session upper-extremity rehabilitation robotic training program. This was a rater-blinded randomized controlled trial. All participants were evaluated with FMA, MSS, ARAT, MAS, and the Functional Independence Measure before and after robotic training. Spearman's rank correlation coefficient was applied for the analysis of correlation. The standardized response mean (SRM) and Guyatt's responsiveness index (GRI) were used to analyze responsiveness. Spearman's correlation coefficient showed a significantly high correlation (ρ=0.91-0.96) among FMA, MSS, and ARAT and a fair-to-moderate correlation (ρ=0.40-0.62) between MAS and the other assessments. FMA, MSS, and MAS on the wrist showed higher responsiveness (SRM=0.85-0.98, GRI=1.59-3.62), whereas ARAT showed relatively less responsiveness (SRM=0.22, GRI=0.81). The results showed that FMA or MSS would be the best choice for evaluating functional improvement in stroke studies on robotic upper-extremity training, given their high responsiveness and good correlation with ARAT. MAS, given its high responsiveness, could be used separately to evaluate spasticity changes after intervention.
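The standardized response mean and rank-correlation analysis described above can be sketched as follows; the pre/post scores are simulated, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical pre/post motor scores for 27 patients (illustrative only).
pre = rng.normal(30.0, 8.0, 27)
post = pre + rng.normal(4.0, 5.0, 27)   # simulated improvement after training

# SRM: mean change divided by the standard deviation of change.
change = post - pre
srm = change.mean() / change.std(ddof=1)

def spearman(a, b):
    """Spearman rank correlation (no ties expected for continuous data)."""
    ra = a.argsort().argsort().astype(float)
    rb = b.argsort().argsort().astype(float)
    ra -= ra.mean()
    rb -= rb.mean()
    return (ra @ rb) / np.sqrt((ra @ ra) * (rb @ rb))

rho = spearman(pre, post)
print(f"SRM = {srm:.2f}, Spearman rho = {rho:.2f}")
```

An SRM near 0.8 or above is conventionally read as high responsiveness, which is the scale the abstract's SRM values should be read against.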
Pardo, Deborah; Jenouvrier, Stéphanie; Weimerskirch, Henri; Barbraud, Christophe
2017-06-19
Climate changes include concurrent changes in environmental mean, variance and extremes, and it is challenging to understand their respective impact on wild populations, especially when contrasted age-dependent responses to climate occur. We assessed how changes in mean and standard deviation of sea surface temperature (SST), frequency and magnitude of warm SST extreme climatic events (ECE) influenced the stochastic population growth rate log( λ s ) and age structure of a black-browed albatross population. For changes in SST around historical levels observed since 1982, changes in standard deviation had a larger (threefold) and negative impact on log( λ s ) compared to changes in mean. By contrast, the mean had a positive impact on log( λ s ). The historical SST mean was lower than the optimal SST value for which log( λ s ) was maximized. Thus, a larger environmental mean increased the occurrence of SST close to this optimum that buffered the negative effect of ECE. This 'climate safety margin' (i.e. difference between optimal and historical climatic conditions) and the specific shape of the population growth rate response to climate for a species determine how ECE affect the population. For a wider range in SST, both the mean and standard deviation had negative impact on log( λ s ), with changes in the mean having a greater effect than the standard deviation. Furthermore, around SST historical levels increases in either mean or standard deviation of the SST distribution led to a younger population, with potentially important conservation implications for black-browed albatrosses.This article is part of the themed issue 'Behavioural, ecological and evolutionary responses to extreme climatic events'. © 2017 The Author(s).
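The opposing effects of changes in environmental mean and standard deviation on the stochastic growth rate can be illustrated with a toy model; the Gaussian growth-rate response to SST and all parameter values below are assumptions for illustration, not the fitted albatross model:

```python
import numpy as np

rng = np.random.default_rng(1)

def annual_lambda(sst, opt=1.0, width=2.0, lam_max=1.05):
    """Assumed Gaussian response: growth peaks at an optimal SST anomaly."""
    return lam_max * np.exp(-((sst - opt) / width) ** 2 / 2)

def stochastic_log_growth(mean_sst, sd_sst, years=100_000):
    """Long-run stochastic growth rate log(lambda_s) under variable SST."""
    sst = rng.normal(mean_sst, sd_sst, years)
    return np.log(annual_lambda(sst)).mean()

# Historical mean below the optimum: a "climate safety margin" of 1 unit.
low_var  = stochastic_log_growth(mean_sst=0.0, sd_sst=0.5)
high_var = stochastic_log_growth(mean_sst=0.0, sd_sst=1.5)
warmer   = stochastic_log_growth(mean_sst=0.5, sd_sst=0.5)

print(low_var, high_var, warmer)
```

Because log growth is concave in SST here, increasing the standard deviation lowers log(λs), while shifting the mean toward the optimum raises it, mirroring the sign pattern the abstract reports around historical SST levels.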
Zhang, Mi; Wen, Xue Fa; Zhang, Lei Ming; Wang, Hui Min; Guo, Yi Wen; Yu, Gui Rui
2018-02-01
Extreme high temperature is one of the most important extreme weather events affecting the forest ecosystem carbon cycle. In this study, using CO₂ flux and routine meteorological data measured during 2003-2012, we examined the impacts of extreme high temperature and extreme high temperature events on net carbon uptake of a subtropical coniferous plantation in Qianyanzhou. Combining with wavelet analysis, we analyzed environmental controls on net carbon uptake at different temporal scales when extreme high temperature and extreme high temperature events occurred. The results showed that mean daily cumulative NEE decreased by 51% on days with daily maximum air temperature between 35 ℃ and 40 ℃, compared with days in the range 30-34 ℃. The effects of extreme high temperature and extreme high temperature events on monthly and annual NEE were related to the strength and duration of the event. In 2003, when a strong extreme high temperature event occurred, the sum of monthly cumulative NEE in July and August was only -11.64 g C·m⁻² over the two months, a decrease of 90% compared with the multi-year average; at the same time, the relative variation of annual NEE reached -6.7%. In July and August, when extreme high temperature and extreme high temperature events occurred, air temperature (Ta) and vapor pressure deficit (VPD) were the dominant controls on the daily variation of NEE; the coherency between NEE and Ta and between NEE and VPD was 0.97 and 0.95, respectively. At 8-, 16-, and 32-day periods, Ta, VPD, soil water content at 5 cm depth (SWC), and precipitation (P) controlled NEE; the coherency between NEE and SWC and between NEE and P was higher than 0.8 at the monthly scale. The results indicated that atmospheric water deficit impacted NEE at short temporal scales when extreme high temperature and extreme high temperature events occurred, while both atmospheric water deficit and soil drought stress impacted NEE at long temporal scales in this ecosystem.
Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue
2018-01-01
Traffic speed meters are important legal measuring instruments used for traffic speed enforcement and must be tested and verified in the field every year with a vehicular mobile standard speed-measuring instrument to ensure their speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor degrades rapidly when too few satellites are received, as often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specific requirements for its mounting distance, no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument. PMID:29621142
Du, Lei; Sun, Qiao; Cai, Changqing; Bai, Jie; Fan, Zhe; Zhang, Yue
2018-04-05
Traffic speed meters are important legal measuring instruments used for traffic speed enforcement and must be tested and verified in the field every year with a vehicular mobile standard speed-measuring instrument to ensure their speed-measuring performance. The non-contact optical speed sensor and the GPS speed sensor are the two most common types of standard speed-measuring instruments. The non-contact optical speed sensor requires extremely high installation accuracy, and its speed-measuring error is nonlinear and uncorrectable. The speed-measuring accuracy of the GPS speed sensor degrades rapidly when too few satellites are received, as often occurs in urban high-rise regions, tunnels, and mountainous regions. In this paper, a new standard speed-measuring instrument using a dual-antenna Doppler radar sensor is proposed based on a tradeoff between the installation accuracy requirement and the usage region limitation; it has no specific requirements for its mounting distance, no limitation on usage regions, and can automatically compensate for the effect of an inclined installation angle on its speed-measuring accuracy. Theoretical model analysis, simulated speed measurement results, and field experimental results compared with a high-accuracy GPS speed sensor showed that the dual-antenna Doppler radar sensor is effective and reliable as a new standard speed-measuring instrument.
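One way a dual-antenna arrangement can compensate for an unknown installation angle is sketched below; the two-beam geometry and all angle and speed values are illustrative assumptions, not the paper's exact design:

```python
import math

# Assumed geometry: two antenna beams at +/- theta0 about the radar axis;
# the axis is mounted with an unknown tilt alpha relative to the road.
def recover_speed(v1, v2, theta0):
    """Recover vehicle speed from the two Doppler radial speeds."""
    a = (v1 + v2) / (2 * math.cos(theta0))   # equals v * cos(alpha)
    b = (v1 - v2) / (2 * math.sin(theta0))   # equals v * sin(alpha)
    return math.hypot(a, b)                  # sqrt of sum of squares = v

theta0 = math.radians(20)
v_true, alpha = 30.0, math.radians(7)        # 30 m/s, 7 deg mounting tilt

# Each beam measures the projection of the velocity on its line of sight.
v1 = v_true * math.cos(theta0 - alpha)
v2 = v_true * math.cos(theta0 + alpha)

print(round(recover_speed(v1, v2, theta0), 6))
```

With one antenna, the cosine error from the tilt α is uncorrectable without knowing α; with two beams the sum and difference of the radial speeds isolate v·cos α and v·sin α, so the true speed drops out regardless of the tilt.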
A systematic review of hypofractionation for primary management of prostate cancer.
Koontz, Bridget F; Bossi, Alberto; Cozzarini, Cesare; Wiegel, Thomas; D'Amico, Anthony
2015-10-01
Technological advances in radiation therapy delivery have permitted the use of high-dose-per-fraction radiation therapy (RT) for early-stage prostate cancer (PCa). Level 1 evidence supporting the safety and efficacy of hypofractionated RT is evolving as this modality becomes more widely utilized and refined. To perform a systematic review of the current evidence on the safety and efficacy of hypofractionated RT for early-stage PCa and to provide in-context recommendations for current application of this technology. Embase, PubMed, and Scopus electronic databases were queried for English-language articles from January 1990 through June 2014. Prospective studies with a minimum of 50 patients were included. Separate consideration was made for studies involving moderate hypofractionation (doses of 2.5-4 Gy per fraction) and extreme hypofractionation (5-10 Gy in 4-7 fractions). Six relatively small superiority-designed randomized trials of standard fractionation versus moderate hypofractionation in predominantly low- and intermediate-risk PCa have been published with follow-up ranging from 4 to 8 yr, noting similar biochemical control (5-yr freedom from biochemical failure in modern studies is >80% for low-risk and intermediate-risk patients) and late grade ≥2 genitourinary and gastrointestinal toxicities (between 2% and 20%). Noninferiority studies are pending. In prospective phase 2 studies, extreme hypofractionation has promising 2- to 5-yr biochemical control rates of >90% for low-risk patients. Results from a randomized trial are expected in 2015. Moderate hypofractionation has 5-yr data to date establishing safety compared with standard fractionation, but 10-yr outcomes and longer follow-up are needed to establish noninferiority for clinical effectiveness. Extreme hypofractionation is promising but as yet requires reporting of randomized data prior to application outside of a clinical protocol.
Hypofractionation for prostate cancer delivers relatively high doses of radiation per treatment. Prospective studies support the safety of moderate hypofractionation, while extreme fractionation may have greater toxicity. Both show promising cancer control but long-term results of noninferiority studies of both methods are required before use in routine treatment outside of clinical protocols. Copyright © 2014 European Association of Urology. Published by Elsevier B.V. All rights reserved.
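The rationale for comparing fractionation schemes is often expressed through the biologically effective dose (BED) of the standard linear-quadratic model; the regimens and α/β value below are illustrative textbook figures, not taken from the review:

```python
def bed(n, d, alpha_beta):
    """Biologically effective dose (Gy) under the linear-quadratic model:
    BED = n * d * (1 + d / (alpha/beta))."""
    return n * d * (1 + d / alpha_beta)

# Illustrative regimens: conventional 2 Gy x 39, moderate hypofractionation
# 3 Gy x 20, extreme hypofractionation 7.25 Gy x 5. alpha/beta = 1.5 Gy is a
# commonly assumed value for prostate tumor tissue.
for n, d in ((39, 2.0), (20, 3.0), (5, 7.25)):
    print(f"{n} x {d} Gy -> BED = {bed(n, d, alpha_beta=1.5):.1f} Gy")
```

Because the assumed α/β for prostate tumor is low, larger doses per fraction achieve a comparable or higher tumor BED in far fewer sessions, which is the radiobiological argument behind hypofractionation.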
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Fei; Lin, Zhenhong
This paper explored factors that affect market-driven compliance with both Corporate Average Fuel Economy (CAFE) and greenhouse gas (GHG) standards (together called the National Program) in the United States for phase I 2012–2016 and phase II 2017–2025. We considered a consumer-choice-based simulation approach, using the MA3T model, to estimate the market acceptance of fuel efficiency (FE) technologies and alternative fuel technologies as reflected by new sales of light-duty vehicles (LDV). Because both full and extremely low FE valuations are common in the literature, we use a moderate assumption of a 10-year perceived vehicle lifetime at a 7% annual discount rate in the baseline and include both extreme views (5 years and 15 years) in the sensitivity analysis. The study focuses on market-driven compliance and therefore excludes manufacturers' cross-subsidization. The model results suggest that the LDV industry is able to comply with both standards even without cross-subsidization and with projected high technology cost, mainly thanks to the multiple credit programs and technology advancements. The compliance robustness, while encouraging, is however based on moderate market assumptions, such as the Annual Energy Outlook 2016 Reference oil price projection and moderate FE consumer valuation. Finally, sensitivity analysis results reveal two significant risk factors for compliance: low oil prices and consumers' FE undervaluation.
Xie, Fei; Lin, Zhenhong
2017-06-09
This paper explored factors that affect market-driven compliance with both Corporate Average Fuel Economy (CAFE) and greenhouse gas (GHG) standards (together called the National Program) in the United States for phase I 2012–2016 and phase II 2017–2025. We considered a consumer-choice-based simulation approach, using the MA3T model, to estimate the market acceptance of fuel efficiency (FE) technologies and alternative fuel technologies as reflected by new sales of light-duty vehicles (LDV). Because both full and extremely low FE valuations are common in the literature, we use a moderate assumption of a 10-year perceived vehicle lifetime at a 7% annual discount rate in the baseline and include both extreme views (5 years and 15 years) in the sensitivity analysis. The study focuses on market-driven compliance and therefore excludes manufacturers' cross-subsidization. The model results suggest that the LDV industry is able to comply with both standards even without cross-subsidization and with projected high technology cost, mainly thanks to the multiple credit programs and technology advancements. The compliance robustness, while encouraging, is however based on moderate market assumptions, such as the Annual Energy Outlook 2016 Reference oil price projection and moderate FE consumer valuation. Finally, sensitivity analysis results reveal two significant risk factors for compliance: low oil prices and consumers' FE undervaluation.
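The perceived-lifetime assumption above amounts to discounting future fuel savings over a horizon the consumer actually credits; a minimal sketch, with a hypothetical annual saving:

```python
# Present value of a stream of annual fuel savings under different perceived
# vehicle lifetimes; the $400/yr figure is illustrative, not from the paper.
def pv_fuel_savings(annual_saving, years, rate=0.07):
    """Discounted value of annual_saving received at the end of years 1..n."""
    return sum(annual_saving / (1 + rate) ** t for t in range(1, years + 1))

saving = 400.0
for years in (5, 10, 15):   # extreme-low, baseline, extreme-high valuation
    print(f"{years:2d} yr horizon -> PV = ${pv_fuel_savings(saving, years):,.2f}")
```

A consumer who only credits 5 years of savings values an FE technology at a bit more than half of what the 10-year baseline implies, which is why the perceived-lifetime assumption is a pivotal sensitivity in the compliance results.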
1982-02-08
is printed in any year-month block when the extreme value is based on an incomplete month (at least one day missing for the month). When a month has...means, standard deviations, and total number of valid observations for each month and annual (all months). An asterisk (*) is printed in each data block...becomes the extreme or monthly total in any of these tables it is printed as "TRACE." Values for means and standard
Classifying BCI signals from novice users with extreme learning machine
NASA Astrophysics Data System (ADS)
Rodríguez-Bermúdez, Germán; Bueno-Crespo, Andrés; José Martinez-Albaladejo, F.
2017-07-01
Brain-computer interfaces (BCI) allow external devices to be controlled using only the electrical activity of the brain. Several approaches have been proposed to improve such systems; however, algorithms are usually tested with standard BCI signals from expert users or from repositories available on the Internet. In this work, extreme learning machine (ELM) was tested with signals from five novice users and compared with standard classification algorithms. Experimental results show that ELM is a suitable method for classifying electroencephalogram signals from novice users.
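A minimal ELM classifier of the kind tested above can be sketched with a random hidden layer and a least-squares output layer; the toy two-class features below stand in for real EEG data:

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_train(X, y, hidden=50):
    """Single-hidden-layer ELM: random input weights, analytic output weights."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)          # random nonlinear feature map
    beta = np.linalg.pinv(H) @ y    # least-squares solution for output layer
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.sign(np.tanh(X @ W + b) @ beta)

# Toy two-class "EEG feature" data (illustrative, not real BCI signals).
X0 = rng.normal(-1.0, 0.5, (100, 4))
X1 = rng.normal(+1.0, 0.5, (100, 4))
X = np.vstack([X0, X1])
y = np.array([-1.0] * 100 + [1.0] * 100)

W, b, beta = elm_train(X, y)
acc = (elm_predict(X, W, b, beta) == y).mean()
print(f"training accuracy = {acc:.2f}")
```

The appeal of ELM for BCI work is that only the output weights are learned, by a single pseudoinverse, so training is fast enough to retrain per user, which matters when calibrating for novice users.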
Chowdhary, Mudit; Sen, Neilayan; Jeans, Elizabeth B; Miller, Luke; Batus, Marta; Gitelis, Steven; Wang, Dian; Abrams, Ross A
2018-05-18
Patients with large, high-grade extremity soft tissue sarcoma (STS) are at high risk for both local and distant recurrence. RTOG 95-14, using a regimen of neoadjuvant interdigitated chemoradiotherapy with mesna, doxorubicin, ifosfamide, and dacarbazine followed by surgery and 3 cycles of adjuvant mesna, doxorubicin, ifosfamide, and dacarbazine, demonstrated high rates of disease control at the cost of significant toxicity (83% grade 4, 5% grade 5). As such, this regimen has not been widely adopted. Herein, we report our institutional outcomes utilizing a modified interdigitated chemoradiotherapy regimen, without dacarbazine, and current radiotherapy planning and delivery techniques for high-risk STS. Adults with large (≥5 cm; median, 12.9 cm), grade 3 extremity STS who were prospectively treated as part of our institutional standard of care from 2008 to 2016 are included. Neoadjuvant chemoradiotherapy consisted of 3 cycles of mesna, doxorubicin, and ifosfamide (MAI) and 44 Gy (22 Gy in 11 fractions between cycles of MAI) after which patients underwent surgical resection and received 3 additional cycles of MAI. Twenty-six patients received the MAI treatment protocol. At a median follow-up of 47.3 months, 23 (88.5%) patients are still alive. Three year locoregional recurrence-free survival, disease-free survival, and overall survival are 95.0%, 64.0%, and 95.0%, respectively. There have been no therapy-related deaths or secondary malignancies. The nonhematologic grade 4 toxicity rate was 7.7%. Neoadjuvant interdigitated MAI radiotherapy followed by resection and 3 cycles of adjuvant MAI has resulted in acceptable and manageable toxicity and highly favorable survival in patients at greatest risk for treatment failure.
Gauthier, Lynne V; Kane, Chelsea; Borstad, Alexandra; Strahl, Nancy; Uswatte, Gitendra; Taub, Edward; Morris, David; Hall, Alli; Arakelian, Melissa; Mark, Victor
2017-06-08
Constraint-Induced Movement therapy (CI therapy) is shown to reduce disability, increase use of the more affected arm/hand, and promote brain plasticity for individuals with upper extremity hemiparesis post-stroke. Randomized controlled trials consistently demonstrate that CI therapy is superior to other rehabilitation paradigms, yet it is available to only a small minority of the estimated 1.2 million chronic stroke survivors with upper extremity disability. The current study aims to establish the comparative effectiveness of a novel, patient-centered approach to rehabilitation utilizing newly developed, inexpensive, and commercially available gaming technology to disseminate CI therapy to underserved individuals. Video game delivery of CI therapy will be compared against traditional clinic-based CI therapy and standard upper extremity rehabilitation. Additionally, individual factors that differentially influence response to one treatment versus another will be examined. This protocol outlines a multi-site, randomized controlled trial with parallel group design. Two hundred twenty four adults with chronic hemiparesis post-stroke will be recruited at four sites. Participants are randomized to one of four study groups: (1) traditional clinic-based CI therapy, (2) therapist-as-consultant video game CI therapy, (3) therapist-as-consultant video game CI therapy with additional therapist contact via telerehabilitation/video consultation, and (4) standard upper extremity rehabilitation. After 6-month follow-up, individuals assigned to the standard upper extremity rehabilitation condition crossover to stand-alone video game CI therapy preceded by a therapist consultation. All interventions are delivered over a period of three weeks. 
Primary outcome measures include motor improvement as measured by the Wolf Motor Function Test (WMFT), quality of arm use for daily activities as measured by the Motor Activity Log (MAL), and quality of life as measured by the Quality of Life in Neurological Disorders (NeuroQOL). This multi-site RCT is designed to determine the comparative effectiveness of in-home technology-based delivery of CI therapy versus standard upper extremity rehabilitation and in-clinic CI therapy. The study design also enables evaluation of the effect of therapist contact time on treatment outcomes within a therapist-as-consultant model of gaming and technology-based rehabilitation. Trial registration: ClinicalTrials.gov, NCT02631850.
Knapp, Alan K.; Avolio, Meghan L.; Beier, Claus; Carroll, Charles J.W.; Collins, Scott L.; Dukes, Jeffrey S.; Fraser, Lauchlan H.; Griffin-Nolan, Robert J.; Hoover, David L.; Jentsch, Anke; Loik, Michael E.; Phillips, Richard P.; Post, Alison K.; Sala, Osvaldo E.; Slette, Ingrid J.; Yahdjian, Laura; Smith, Melinda D.
2017-01-01
Intensification of the global hydrological cycle, ranging from larger individual precipitation events to more extreme multiyear droughts, has the potential to cause widespread alterations in ecosystem structure and function. With evidence that the incidence of extreme precipitation years (defined statistically from historical precipitation records) is increasing, there is a clear need to identify ecosystems that are most vulnerable to these changes and understand why some ecosystems are more sensitive to extremes than others. To date, opportunistic studies of naturally occurring extreme precipitation years, combined with results from a relatively small number of experiments, have provided limited mechanistic understanding of differences in ecosystem sensitivity, suggesting that new approaches are needed. Coordinated distributed experiments (CDEs) arrayed across multiple ecosystem types and focused on water can enhance our understanding of differential ecosystem sensitivity to precipitation extremes, but there are many design challenges to overcome (e.g., cost, comparability, standardization). Here, we evaluate contemporary experimental approaches for manipulating precipitation under field conditions to inform the design of ‘Drought-Net’, a relatively low-cost CDE that simulates extreme precipitation years. A common method for imposing both dry and wet years is to alter each ambient precipitation event. We endorse this approach for imposing extreme precipitation years because it simultaneously alters other precipitation characteristics (i.e., event size) consistent with natural precipitation patterns. However, we do not advocate applying identical treatment levels at all sites – a common approach to standardization in CDEs. This is because precipitation variability varies >fivefold globally resulting in a wide range of ecosystem-specific thresholds for defining extreme precipitation years. 
For CDEs focused on precipitation extremes, treatments should be based on each site's past climatic characteristics. This approach, though not often used by ecologists, allows ecological responses to be directly compared across disparate ecosystems and climates, facilitating process-level understanding of ecosystem sensitivity to precipitation extremes.
Knapp, Alan K; Avolio, Meghan L; Beier, Claus; Carroll, Charles J W; Collins, Scott L; Dukes, Jeffrey S; Fraser, Lauchlan H; Griffin-Nolan, Robert J; Hoover, David L; Jentsch, Anke; Loik, Michael E; Phillips, Richard P; Post, Alison K; Sala, Osvaldo E; Slette, Ingrid J; Yahdjian, Laura; Smith, Melinda D
2017-05-01
Intensification of the global hydrological cycle, ranging from larger individual precipitation events to more extreme multiyear droughts, has the potential to cause widespread alterations in ecosystem structure and function. With evidence that the incidence of extreme precipitation years (defined statistically from historical precipitation records) is increasing, there is a clear need to identify ecosystems that are most vulnerable to these changes and understand why some ecosystems are more sensitive to extremes than others. To date, opportunistic studies of naturally occurring extreme precipitation years, combined with results from a relatively small number of experiments, have provided limited mechanistic understanding of differences in ecosystem sensitivity, suggesting that new approaches are needed. Coordinated distributed experiments (CDEs) arrayed across multiple ecosystem types and focused on water can enhance our understanding of differential ecosystem sensitivity to precipitation extremes, but there are many design challenges to overcome (e.g., cost, comparability, standardization). Here, we evaluate contemporary experimental approaches for manipulating precipitation under field conditions to inform the design of 'Drought-Net', a relatively low-cost CDE that simulates extreme precipitation years. A common method for imposing both dry and wet years is to alter each ambient precipitation event. We endorse this approach for imposing extreme precipitation years because it simultaneously alters other precipitation characteristics (i.e., event size) consistent with natural precipitation patterns. However, we do not advocate applying identical treatment levels at all sites - a common approach to standardization in CDEs. This is because precipitation variability varies >fivefold globally resulting in a wide range of ecosystem-specific thresholds for defining extreme precipitation years. 
For CDEs focused on precipitation extremes, treatments should be based on each site's past climatic characteristics. This approach, though not often used by ecologists, allows ecological responses to be directly compared across disparate ecosystems and climates, facilitating process-level understanding of ecosystem sensitivity to precipitation extremes. © 2016 John Wiley & Sons Ltd.
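Defining extreme precipitation years statistically from each site's historical record, as recommended above, can be sketched with percentile thresholds; the two synthetic site records below are illustrative, not Drought-Net data:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 100-year annual precipitation records (mm) for two sites
# with very different means and variability.
mesic = rng.normal(900.0, 90.0, 100).clip(min=0)   # low relative variability
arid = rng.normal(300.0, 120.0, 100).clip(min=0)   # high relative variability

for name, record in (("mesic", mesic), ("arid", arid)):
    # Site-specific thresholds: e.g., 5th/95th percentiles of the record.
    dry, wet = np.percentile(record, [5, 95])
    print(f"{name}: extreme dry year < {dry:.0f} mm, extreme wet year > {wet:.0f} mm")
```

An "extreme dry" treatment level at the mesic site can exceed an "extreme wet" level at the arid site, which is exactly why identical absolute treatment amounts across sites would not impose comparably extreme years.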
NASA Astrophysics Data System (ADS)
Wang, W.
2017-12-01
Theoretical results: Wang's left-skew L distribution has a density function f defined on the interval (-∞, +1), where x denotes the hurricane center pressure and xA its long-term mean. The standardized random variable is [(x - xA)/x], with boundary conditions f(+1) = 0 and f(-∞) = 0. The standardized variable is negative when x < xA, positive when x > xA, and zero when x = xA; it tends to -∞ as x approaches 0 and to +1 as x approaches +∞, so it falls within the interval (-∞, +1).
Application: in the table below, a "-" sign indicates that an individual hurricane's center pressure is below the long-term average, and a "+" sign that it is above; the mean xA may also be substituted by another standard or expected value.
Table. Multi-level index of hurricane strength (probabilities as given in the source; blank where not reported).
[(X-XA)/X] (%) | XA/X      | Index | Description             | X/XA      | Probability | Formula
< -900         | > 10.0    | < -15 | extreme (Ⅵ)             | < 0.10    |             |
-800 to -900   | 9.0-10.0  | -15   | extreme (Ⅵ)             | 0.11-0.10 |             |
-700 to -800   | 8.0-9.0   | -14   | extreme (Ⅴ)             | 0.13-0.11 |             |
-600 to -700   | 7.0-8.0   | -13   | extreme (Ⅳ)             | 0.14-0.13 |             |
-500 to -600   | 6.0-7.0   | -12   | extreme (Ⅲ)             | 0.17-0.14 | 0.05287%    | L(-5.0)-L(-6.0)
-400 to -500   | 5.0-6.0   | -11   | extreme (Ⅱ)             | 0.20-0.17 | 0.003%      | L(-4.0)-L(-5.0)
-300 to -400   | 4.0-5.0   | -10   | extreme (Ⅰ)             | 0.25-0.20 | 0.132%      | L(-3.0)-L(-4.0)
-267 to -300   | 3.67-4.00 | -9    | strongest (Ⅲ), superior | 0.27-0.25 | 0.24%       | L(-2.67)-L(-3.00)
-233 to -267   | 3.33-3.67 | -8    | strongest (Ⅱ), medium   | 0.30-0.27 | 0.61%       | L(-2.33)-L(-2.67)
-200 to -233   | 3.00-3.33 | -7    | strongest (Ⅰ), inferior | 0.33-0.30 | 1.28%       | L(-2.00)-L(-2.33)
-167 to -200   | 2.67-3.00 | -6    | strong (Ⅲ), superior    | 0.37-0.33 | 2.47%       | L(-1.67)-L(-2.00)
-133 to -167   | 2.33-2.67 | -5    | strong (Ⅱ), medium      | 0.43-0.37 | 4.43%       | L(-1.33)-L(-1.67)
-100 to -133   | 2.00-2.33 | -4    | strong (Ⅰ), inferior    | 0.50-0.43 | 6.69%       | L(-1.00)-L(-1.33)
-67 to -100    | 1.67-2.00 | -3    | normal (Ⅲ), superior    | 0.60-0.50 | 9.27%       | L(-0.67)-L(-1.00)
-33 to -67     | 1.33-1.67 | -2    | normal (Ⅱ), medium      | 0.75-0.60 | 11.93%      | L(-0.33)-L(-0.67)
0 to -33       | 1.00-1.33 | -1    | normal (Ⅰ), inferior    | 1.00-0.75 | 12.93%      | L(0.00)-L(-0.33)
0 to 33        | 0.67-1.00 | +1    | normal                  | 1.49-1.00 | 34.79%      | L(0.33)-L(0.00)
33 to 67       | 0.33-0.67 | +2    | weak                    | 3.03-1.49 | 12.12%      | L(0.67)-L(0.33)
67 to 100      | 0.00-0.33 | +3    | weaker                  | ∞-3.03    | 3.08%       | L(1.00)-L(0.67)
Luria, Shai; Rivkin, Gurion; Avitzour, Malka; Liebergall, Meir; Mintz, Yoav; Mosheiff, Ram
2013-03-01
Explosion injuries to the upper extremity have specific clinical characteristics that differ from injuries due to other mechanisms. To evaluate the upper extremity injury pattern of attacks on civilian targets, comparing bomb explosion injuries to gunshot injuries and their functional recovery using standard outcome measures. Of 157 patients admitted to the hospital between 2000 and 2004, 72 (46%) sustained explosion injuries and 85 (54%) gunshot injuries. The trauma registry files were reviewed and the patients completed the DASH Questionnaire (Disabilities of Arm, Shoulder and Hand) and SF-12 (Short Form-12) after a minimum period of 1 year. Of the 157 patients, 72 (46%) had blast injuries and 85 (54%) had shooting injuries. The blast casualties had higher Injury Severity Scores (47% vs. 22% with a score of > 16, P = 0.02) and higher percent of patients treated in intensive care units (47% vs. 28%, P = 0.02). Although the Abbreviated Injury Scale score of the upper extremity injury was similar in the two groups, the blast casualties were found to have more bilateral and complex soft tissue injuries and were treated surgically more often. No difference was found in the SF-12 or DASH scores between the groups at follow up. The casualties with upper extremity blast injuries were more severely injured and sustained more bilateral and complex soft tissue injuries to the upper extremity. However, the rating of the local injury to the isolated limb is similar, as was the subjective functional recovery.
Chemical and physical microenvironments at the Viking landing sites
NASA Technical Reports Server (NTRS)
Clark, B. C.
1979-01-01
Physical and chemical considerations permit the division of the near-surface regolith on Mars into at least six zones of distinct microenvironments. The zones are euphotic, duricrust/peds, tempofrost, permafrost, endolithic, and interfacial/transitional. Microenvironments vary significantly in temperature extremes, mean temperature, salt content, relative pressure of water vapor, UV and visible light irradiance, and exposure to ionizing radiation events (100 Mrad) and oxidative molecular species. From what is known of the chemistry of the atmosphere and regolith fines (soil), limits upon the aqueous chemistry of soil pastes may be estimated. Heat of wetting could reach 45 cal/g dry soil; initial pH is indeterminate between 1 and 10; ionic strength and salinity are predicted to be extremely high; freezing point depression is inadequate to provide quantities of liquid water except in special cases. The prospects for biotic survival are grim by terrestrial standards, but the extremes of biological resiliency are inaccessible to evaluation. Second-generation in situ experiments which will better define Martian microenvironments are clearly possible. Antarctic dry valleys are approximations to Martian conditions, but deviate significantly by at least half-a-dozen criteria.
Ashnagar, Zinat; Hadian, Mohammad Reza; Olyaei, Gholamreza; Talebian Moghadam, Saeed; Rezasoltani, Asghar; Saeedi, Hassan; Yekaninejad, Mir Saeed; Mahmoodi, Rahimeh
2017-07-01
The aim of this study was to investigate the intratester reliability of digital photographic method for quantifying static lower extremity alignment in individuals with flatfeet and normal feet types. Thirteen females with flexible flatfeet and nine females with normal feet types were recruited from university communities. Reflective markers were attached over the participant's body landmarks. Frontal and sagittal plane photographs were taken while the participants were in a standardized standing position. The markers were removed and after 30 min the same procedure was repeated. Pelvic angle, quadriceps angle, tibiofemoral angle, genu recurvatum, femur length and tibia length were measured from photographs using the Image j software. All measured variables demonstrated good to excellent intratester reliability using digital photography in both flatfeet (ICC: 0.79-0.93) and normal feet type (ICC: 0.84-0.97) groups. The findings of the current study indicate that digital photography is a highly reliable method of measurement for assessing lower extremity alignment in both flatfeet and normal feet type groups. Copyright © 2016. Published by Elsevier Ltd.
Chemical and physical microenvironments at the Viking landing sites.
Clark, B C
1979-12-01
Physical and chemical considerations permit the division of the near-surface regolith on Mars into at least six zones of distinct microenvironments. The zones are euphotic, duricrust/peds, tempofrost, permafrost, endolithic, and interfacial/transitional. Microenvironments vary significantly in temperature extremes, mean temperature, salt content, relative pressure of water vapor, UV and visible light irradiance, and exposure to ionizing radiation events (100 Mrad) and oxidative molecular species. From what is known of the chemistry of the atmosphere and regolith fines (soil), limits upon the aqueous chemistry of soil pastes may be estimated. Heat of wetting could reach 45 cal/g dry soil; initial pH is indeterminate between 1 and 10; ionic strength and salinity are predicted to be extremely high; freezing point depression is inadequate to provide quantities of liquid water except in special cases. The prospects for biotic survival are grim by terrestrial standards, but the extremes of biological resiliency are inaccessible to evaluation. Second-generation in situ experiments which will better define Martian microenvironments are clearly possible. Antarctic dry valleys are approximations to Martian conditions, but deviate significantly by at least half-a-dozen criteria.
Griffiths, Silja Torvik; Aukland, Stein Magnus; Markestad, Trond; Eide, Geir Egil; Elgen, Irene; Craven, Alexander R; Hugdahl, Kenneth
2014-10-01
The purpose of the study was to investigate a possible association between brain activation in functional magnetic resonance imaging scans, cognition and school performance in extremely preterm children and term-born controls. Twenty-eight preterm and 28 term-born children were scanned while performing a working memory/selective attention task, and school results from national standardized tests were collected. Brain activation maps reflected differences in cognitive skills but not in school performance. Differences in brain activation were found between children born preterm and at term, and between high and low performers in cognitive tests. However, the differences were located in different brain areas. The implication may be that lack of cognitive skills does not alone explain low performance due to prematurity. © 2014 Scandinavian Psychological Associations and John Wiley & Sons Ltd.
Ciesla, Nancy; Dinglas, Victor; Fan, Eddy; Kho, Michelle; Kuramoto, Jill; Needham, Dale
2011-04-12
Survivors of acute respiratory distress syndrome (ARDS) and other causes of critical illness often have generalized weakness, reduced exercise tolerance, and persistent nerve and muscle impairments after hospital discharge. Using an explicit protocol with a structured approach to training and quality assurance of research staff, manual muscle testing (MMT) provides a highly reliable, standardized clinical examination of strength for patients following ARDS, and can be completed with mechanically ventilated patients who can tolerate sitting upright in bed and are able to follow two-step commands. (7, 8) This video demonstrates a protocol for MMT, which has been taught to ≥43 research staff who have performed >800 assessments on >280 ARDS survivors. Modifications for the bedridden patient are included. Each muscle is tested with specific techniques for positioning, stabilization, resistance, and palpation for each score of the 6-point ordinal Medical Research Council scale. Three upper and three lower extremity muscles are graded in this protocol: shoulder abduction, elbow flexion, wrist extension, hip flexion, knee extension, and ankle dorsiflexion. These muscles were chosen based on the standard approach for evaluating patients for ICU-acquired weakness used in prior publications. (1,2).
Wu, Zhibin; Li, Nianping; Cui, Haijiao; Peng, Jinqing; Chen, Haowen; Liu, Penglong
2017-01-01
Existing thermal comfort field studies are mainly focused on the relationship between the indoor physical environment and the thermal comfort. In numerous chamber experiments, physiological parameters were adopted to assess thermal comfort, but the experiments’ conclusions may not represent a realistic thermal environment due to the highly controlled thermal environment and few occupants. This paper focuses on determining the relationships between upper extremity skin temperatures (i.e., finger, wrist, hand and forearm) and the indoor thermal comfort. Also, the applicability of predicting thermal comfort by using upper extremity skin temperatures was explored. Field studies were performed in office buildings equipped with split air-conditioning (SAC) located in the hot summer and cold winter (HSCW) climate zone of China during the summer of 2016. Psychological responses of occupants were recorded and physical and physiological factors were measured simultaneously. Standard effective temperature (SET*) was used to incorporate the effect of humidity and air velocity on thermal comfort. The results indicate that upper extremity skin temperatures are good indicators for predicting thermal sensation, and could be used to assess the thermal comfort in terms of physiological mechanism. In addition, the neutral temperature was 24.7 °C and the upper limit for 80% acceptability was 28.2 °C in SET*. PMID:28934173
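The neutral temperature reported above is the SET* at which the mean thermal sensation vote (TSV) is zero, commonly obtained by regressing binned sensation votes on SET* and solving for a zero vote. A minimal sketch of that regression with hypothetical vote bins (the study's raw survey data are not given in the abstract):

```python
import numpy as np

# Hypothetical (SET*, mean thermal sensation vote) bins from a field survey;
# these values are illustrative, not the study's data.
set_star = np.array([22.0, 24.0, 26.0, 28.0, 30.0])   # SET* bin centers (deg C)
tsv = np.array([-0.8, -0.2, 0.4, 1.0, 1.6])           # mean vote per bin

# Least-squares line TSV = slope * SET* + intercept
slope, intercept = np.polyfit(set_star, tsv, 1)

# Neutral temperature: the SET* at which the fitted mean vote crosses zero
neutral_temp = -intercept / slope
```

With these made-up bins the neutral SET* lands in the mid-20s; the study's own regression yielded 24.7 °C.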
Solar Imaging UV/EUV Spectrometers Using TVLS Gratings
NASA Technical Reports Server (NTRS)
Thomas, Roger J.
2003-01-01
It is a particular challenge to develop a stigmatic spectrograph for UV/EUV wavelengths, since the very low normal-incidence reflectance of standard materials most often requires that the design be restricted to a single optical element which must simultaneously provide both reimaging and spectral dispersion. This problem has been solved in the past by the use of toroidal gratings with uniform line-spaced rulings (TULS). A number of solar extreme ultraviolet (EUV) spectrometers have been based on such designs, including SOHO/CDS, Solar-B/EIS, and the sounding rockets Solar Extreme ultraviolet Research Telescope and Spectrograph (SERTS) and Extreme Ultraviolet Normal Incidence Spectrograph (EUNIS). More recently, Kita, Harada, and collaborators have developed the theory of spherical gratings with varied line-space rulings (SVLS) operated at unity magnification, which have been flown on several astronomical satellite missions. We now combine these ideas into a spectrometer concept that puts varied line-space rulings onto toroidal gratings. Such TVLS designs are found to provide excellent imaging even at very large spectrograph magnifications and beam speeds, permitting extremely high-quality performance in remarkably compact instrument packages. Optical characteristics of three new solar spectrometers based on this concept are described: SUMI and RAISE, two sounding rocket payloads, and NEXUS, currently being proposed as a Small-Explorer (SMEX) mission.
NASA Astrophysics Data System (ADS)
Lawhead, Carlos; Cooper, Nathan; Anderson, Josiah; Shiver, Tegan; Ujj, Laszlo
2014-03-01
Electronic and vibrational spectroscopy are extremely important tools in material characterization; therefore, a table-top laser spectrometer system was built in the spectroscopy lab at the UWF physics department. The system is based upon an injection-seeded nanosecond Nd:YAG laser. The second and third harmonics of the fundamental 1064 nm radiation are used to generate Raman and fluorescence spectra, measured with an MS260i imaging spectrograph equipped with a CCD detector cooled to -85 °C in order to minimize the dark background noise. The wavelength calibration was performed with the emission spectra of standard gas-discharge lamps. Spectral sensitivity calibration is needed before any spectra are recorded because of the table-top nature of the instrument. A variety of intensity standards were investigated to find standards suitable for our table-top setup that do not change the geometry of the system. High-quality measurements of Raman standards were analyzed to test the spectral corrections. Background fluorescence removal methods were used to improve Raman signal intensity readings of highly fluorescent molecules. This instrument will be used to measure vibrational and electronic spectra of biological molecules.
Standardized Patients Provide Realistic and Worthwhile Experiences for Athletic Training Students
ERIC Educational Resources Information Center
Walker, Stacy E.; Weidner, Thomas G.
2010-01-01
Context: Standardized patients are more prominently used to both teach and evaluate students' clinical skills and abilities. Objective: To investigate whether athletic training students perceived an encounter with a standardized patient (SP) as realistic and worthwhile and to determine their perceived comfort in future lower extremity evaluations…
Move towards New ILO Standards on Child Labour.
ERIC Educational Resources Information Center
World of Work, 1998
1998-01-01
Discusses major issues to be addressed during the debate on the proposed new international labor standards on child labor. The subject of the standards is extreme forms of child labor: work that is likely to jeopardize the health, safety, and morals of children; slavery; and child prostitution and pornography. (JOW)
76 FR 17673 - Bienvenido Tan, M.D.; Denial of Application
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-30
...]'' and that his ``prescribing was `an extreme departure from the standard of care expected of a licensed... reprimanded the Respondent `for his departures from the standard of care regarding his medical record keeping... each party's experts (who had examined various patient records) regarding the standard of care for...
Paleo-event data standards for dendrochronology
Elaine Kennedy Sutherland; P. Brewer; W. Gross
2017-01-01
Extreme environmental events, such as storm winds, landslides, insect infestations, and wildfire, cause loss of life, resources, and human infrastructure. Disaster risk-reduction analysis can be improved with information about past frequency, intensity, and spatial patterns of extreme events. Tree-ring analyses can provide such information: tree rings reflect events as...
14 CFR 25.27 - Center of gravity limits.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Center of gravity limits. 25.27 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Flight General § 25.27 Center of gravity limits. The extreme forward and the extreme aft center of gravity limitations must be established for each practicably...
14 CFR 29.27 - Center of gravity limits.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Center of gravity limits. 29.27 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Flight General § 29.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity...
14 CFR 25.27 - Center of gravity limits.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Center of gravity limits. 25.27 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Flight General § 25.27 Center of gravity limits. The extreme forward and the extreme aft center of gravity limitations must be established for each practicably...
14 CFR 29.27 - Center of gravity limits.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Center of gravity limits. 29.27 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Flight General § 29.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity...
14 CFR 25.27 - Center of gravity limits.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Center of gravity limits. 25.27 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Flight General § 25.27 Center of gravity limits. The extreme forward and the extreme aft center of gravity limitations must be established for each practicably...
14 CFR 29.27 - Center of gravity limits.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Center of gravity limits. 29.27 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Flight General § 29.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity...
14 CFR 25.27 - Center of gravity limits.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Center of gravity limits. 25.27 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Flight General § 25.27 Center of gravity limits. The extreme forward and the extreme aft center of gravity limitations must be established for each practicably...
14 CFR 29.27 - Center of gravity limits.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Center of gravity limits. 29.27 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Flight General § 29.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity...
14 CFR 25.27 - Center of gravity limits.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Center of gravity limits. 25.27 Section 25... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY AIRPLANES Flight General § 25.27 Center of gravity limits. The extreme forward and the extreme aft center of gravity limitations must be established for each practicably...
14 CFR 29.27 - Center of gravity limits.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Center of gravity limits. 29.27 Section 29... AIRWORTHINESS STANDARDS: TRANSPORT CATEGORY ROTORCRAFT Flight General § 29.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity...
Lim, Changwon
2015-03-30
Nonlinear regression is often used to evaluate the toxicity of a chemical or a drug by fitting data from a dose-response study. Toxicologists and pharmacologists may draw a conclusion about whether a chemical is toxic by testing the significance of the estimated parameters. However, sometimes the null hypothesis cannot be rejected even though the fit is quite good. One possible reason for such cases is that the estimated standard errors of the parameter estimates are extremely large. In this paper, we propose robust ridge regression estimation procedures for nonlinear models to solve this problem. The asymptotic properties of the proposed estimators are investigated; in particular, their mean squared errors are derived. The performances of the proposed estimators are compared with several standard estimators using simulation studies. The proposed methodology is also illustrated using high throughput screening assay data obtained from the National Toxicology Program. Copyright © 2014 John Wiley & Sons, Ltd.
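The ridge idea in this abstract, shrinking parameter estimates whose standard errors blow up, can be sketched for a nonlinear dose-response fit by appending L2 penalty terms to the residual vector. The four-parameter log-logistic model, the data values, and the penalty weight below are illustrative assumptions, not the robust ridge estimators derived in the paper:

```python
import numpy as np
from scipy.optimize import least_squares

def hill(theta, dose):
    """Four-parameter log-logistic (Hill-type) dose-response curve."""
    bottom, top, ec50, slope = theta
    return bottom + (top - bottom) / (1.0 + (dose / ec50) ** slope)

def ridge_residuals(theta, dose, resp, lam):
    # Data residuals augmented with sqrt(lam)*theta: the sum of squares of
    # this vector is the nonlinear least-squares loss plus an L2 (ridge)
    # penalty lam * ||theta||^2, which shrinks unstable estimates.
    return np.concatenate([resp - hill(theta, dose), np.sqrt(lam) * theta])

# Hypothetical dose-response assay data (fraction of control response)
dose = np.array([0.01, 0.1, 1.0, 10.0, 100.0, 1000.0])
resp = np.array([0.98, 0.95, 0.80, 0.40, 0.10, 0.05])

# Bounds keep ec50 and slope positive so the power term stays well defined;
# a robust loss (e.g. loss="soft_l1") could be added for outlier resistance.
fit = least_squares(ridge_residuals, x0=[0.0, 1.0, 5.0, 1.0],
                    args=(dose, resp, 1e-4),
                    bounds=([-1.0, 0.0, 1e-3, 0.1], [1.0, 2.0, 1e4, 5.0]))
bottom, top, ec50, slope = fit.x
```

The penalty weight `lam` trades bias for variance; the paper's contribution is choosing and analyzing such estimators (robust ridge) so the standard errors of the parameter estimates stay finite.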
Subsonic roll oscillation experiments on the Standard Dynamics Model
NASA Technical Reports Server (NTRS)
Beyers, M. E.
1983-01-01
The experimental determination of the subsonic roll derivatives of the Standard Dynamics Model, which is representative of a current fighter aircraft configuration, is described. The direct, cross and cross-coupling derivatives are presented for angles of attack up to 41 deg and sideslip angles in the range from -5 deg to 5 deg, as functions of oscillation frequency. The derivatives exhibited significant nonlinear trends at high incidences and were found to be extremely sensitive to sideslip angle at angles of attack near 36 deg. The roll damping and dynamic cross derivatives were highly frequency dependent at angles of attack above 30 deg. The highest values measured for the dynamic cross and cross-coupling derivatives were comparable in magnitude with the maximum roll damping. The effects of oscillation amplitude and Mach number were also investigated, and the direct derivatives were correlated with data from another facility.
A New Standard for Assessing the Performance of High Contrast Imaging Systems
NASA Astrophysics Data System (ADS)
Jensen-Clem, Rebecca; Mawet, Dimitri; Gomez Gonzalez, Carlos A.; Absil, Olivier; Belikov, Ruslan; Currie, Thayne; Kenworthy, Matthew A.; Marois, Christian; Mazoyer, Johan; Ruane, Garreth; Tanner, Angelle; Cantalloube, Faustine
2018-01-01
As planning for the next generation of high contrast imaging instruments (e.g., WFIRST, HabEx, and LUVOIR, TMT-PFI, EELT-EPICS) matures and second-generation ground-based extreme adaptive optics facilities (e.g., VLT-SPHERE, Gemini-GPI) finish their principal surveys, it is imperative that the performance of different designs, post-processing algorithms, observing strategies, and survey results be compared in a consistent, statistically robust framework. In this paper, we argue that the current industry standard for such comparisons—the contrast curve—falls short of this mandate. We propose a new figure of merit, the “performance map,” that incorporates three fundamental concepts in signal detection theory: the true positive fraction, the false positive fraction, and the detection threshold. By supplying a theoretical basis and recipe for generating the performance map, we hope to encourage the widespread adoption of this new metric across subfields in exoplanet imaging.
Simple adaptation of the Bridgman high pressure technique for use with liquid media
NASA Astrophysics Data System (ADS)
Colombier, E.; Braithwaite, D.
2007-09-01
We present a simple, novel technique to adapt a standard Bridgman cell for the use of a liquid pressure-transmitting medium. The technique has been implemented in a compact cell, able to fit in a commercial Quantum Design PPMS system, and would also be easily adaptable to extreme conditions of very low temperatures or high magnetic fields. Several media have been tested, and a mix of fluorinert FC84:FC87 has been shown to produce a considerable improvement over the pressure conditions in the standard steatite solid medium, while allowing a relatively easy setup procedure. For optimized hydrostatic conditions, the success rate is about 80% and the maximum pressure achieved so far is 7.1 GPa. Results are shown for the heavy fermion system YbAl3 and for NaV6O15, an insulator showing charge order.
Ground and Flight Testing for Aircraft Guidance and Control,
1984-12-01
almost rigid structure (Figure 3). It is equipped with control surfaces (inner flaps, outer flaps, elevator) which are driven by fast acting... extremely fast-response actuators combined with a full fly-by-wire/light system is envisaged. The technology for doing this is not yet available today... Standard deviation 23.7 (77.8), 6.5, 12.0; maximum error 51.5 (169), 12.9, 29.0. The values of these errors were judged by the
Horstkotte, M A; Knobloch, J K; Rohde, H; Mack, D
2001-10-01
The detection of PBP 2a by the MRSA-Screen latex agglutination test with 201 clinical coagulase-negative staphylococci had an initial sensitivity of 98% and a high degree of specificity for Staphylococcus epidermidis strains compared to PCR for mecA. Determination of oxacillin MICs evaluated according to the new breakpoint (0.5 microg/ml) of the National Committee for Clinical Laboratory Standards exhibited an extremely low specificity for this population.
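The sensitivity and specificity quoted above reduce to ratios over a 2×2 table of test results against the PCR reference. A minimal sketch with hypothetical counts (the study's raw 2×2 data are not reported in the abstract):

```python
def diagnostic_performance(tp: int, fp: int, fn: int, tn: int):
    """Sensitivity and specificity of an index test (here, the latex
    agglutination test) against a reference standard (mecA PCR)."""
    sensitivity = tp / (tp + fn)   # fraction of reference-positives detected
    specificity = tn / (tn + fp)   # fraction of reference-negatives cleared
    return sensitivity, specificity

# Hypothetical counts chosen only to illustrate the arithmetic
sens, spec = diagnostic_performance(tp=98, fp=2, fn=2, tn=99)
```

With these made-up counts, sensitivity is 98/(98+2) = 0.98; the abstract's point is that the oxacillin-MIC breakpoint, by contrast, produced an extremely low specificity in this population.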
Tong, Jasper W K; Kong, Pui W
2013-10-01
Systematic literature review with meta-analysis. To investigate the association between nonneutral foot types (high arch and flatfoot) and lower extremity and low back injuries, and to identify the most appropriate methods to use for foot classification. A search of 5 electronic databases (PubMed, Embase, CINAHL, SPORTDiscus, and ProQuest Dissertations and Theses), Google Scholar, and the reference lists of included studies was conducted to identify relevant articles. The review included comparative cross-sectional, case-control, and prospective studies that reported qualitative/quantitative associations between foot types and lower extremity and back injuries. Quality of the selected studies was evaluated, and data synthesis for the level of association between foot types and injuries was conducted. A random-effects model was used to pool odds ratio (OR) and standardized mean difference (SMD) results for meta-analysis. Twenty-nine studies were included for meta-analysis. A significant association between nonneutral foot types and lower extremity injuries was determined (OR = 1.23; 95% confidence interval [CI]: 1.11, 1.37; P<.001). Foot posture index (OR = 2.58; 95% CI: 1.33, 5.02; P<.01) and visual/physical examination (OR = 1.17; 95% CI: 1.06, 1.28; P<.01) were 2 assessment methods using distinct foot-type categories that showed a significant association with lower extremity injuries. For foot-assessment methods using a continuous scale, measurements of lateral calcaneal pitch angle (SMD, 1.92; 95% CI: 1.44, 2.39; P<.00001), lateral talocalcaneal angle (SMD, 1.36; 95% CI: 0.93, 1.80; P<.00001), and navicular height (SMD, 0.34; 95% CI: 0.16, 0.52; P<.001) showed significant effect sizes in identifying high-arch foot, whereas the navicular drop test (SMD, 0.45; 95% CI: 0.03, 0.87; P<.05) and relaxed calcaneal stance position (SMD, 0.49; 95% CI: 0.01, 0.97; P<.05) displayed significant effect sizes in identifying flatfoot. 
Subgroup analyses revealed no significant associations in children with flatfoot, in cross-sectional studies, or in prospective studies of high arch. High-arch and flatfoot foot types are associated with lower extremity injuries, but the strength of this relationship is low. Although the foot posture index and visual/physical examination showed significance, they are qualitative measures. Radiographic and navicular height measurements can delineate high-arch foot effectively, with only anthropometric measures accurately classifying flatfoot. Prognosis, level 2a.
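The random-effects pooling of odds ratios described in this review is typically the DerSimonian-Laird estimator; a sketch that recovers standard errors from the 95% CI widths on the log scale, using hypothetical study-level ORs (not the review's data):

```python
import numpy as np

def pool_or_dersimonian_laird(or_values, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of odds ratios.

    Standard errors are recovered from the 95% CI limits on the log scale:
    se = (ln(hi) - ln(lo)) / (2 * 1.96).
    """
    y = np.log(or_values)
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)
    w = 1.0 / se**2                              # fixed-effect weights
    y_fe = (w * y).sum() / w.sum()
    q = (w * (y - y_fe) ** 2).sum()              # Cochran's Q heterogeneity
    c = w.sum() - (w**2).sum() / w.sum()
    tau2 = max(0.0, (q - (len(y) - 1)) / c)      # between-study variance
    w_re = 1.0 / (se**2 + tau2)                  # random-effects weights
    y_re = (w_re * y).sum() / w_re.sum()
    se_re = np.sqrt(1.0 / w_re.sum())
    return (np.exp(y_re),
            np.exp(y_re - 1.96 * se_re),
            np.exp(y_re + 1.96 * se_re))

# Hypothetical per-study ORs with 95% CIs, for illustration only
or_, lo, hi = pool_or_dersimonian_laird(
    np.array([1.5, 1.1, 2.0]),
    np.array([1.0, 0.8, 1.2]),
    np.array([2.3, 1.5, 3.3]))
```

When the heterogeneity statistic Q does not exceed its degrees of freedom, tau² is truncated at zero and the estimator collapses to the fixed-effect (inverse-variance) pool.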
FAME: freeform active mirror experiment
NASA Astrophysics Data System (ADS)
Aitink-Kroes, Gabby; Agócs, Tibor; Miller, Chris; Black, Martin; Farkas, Szigfrid; Lemared, Sabri; Bettonvil, Felix; Montgomery, David; Marcos, Michel; Jaskó, Attila; van Duffelen, Farian; Challita, Zalpha; Fok, Sandy; Kiaeerad, Fatemeh; Hugot, Emmanuel; Schnetler, Hermine; Venema, Lars
2016-07-01
FAME is a four-year project and part of the OPTICON/FP7 program that is aimed at providing a breakthrough component for future compact, wide field, high resolution imagers or spectrographs, based on both freeform technology and the flexibility and versatility of active systems. By opening a new parameter space in optical design, freeform optics are a revolution in imaging systems for a broad range of applications from high tech cameras to astronomy, via earth observation systems, drones and defense. Freeform mirrors are defined by a non-rotational symmetry of the surface shape, and the fact that the surface shape cannot be simply described by conicoid extensions or off-axis conicoids. An extreme freeform surface is a significantly challenging optical surface, especially for UV/VIS/NIR diffraction-limited instruments. The aim of the FAME effort is to use an extreme freeform mirror with standard optics in order to propose an integrated system solution for use in future instruments. The work done so far concentrated on identification of compact, fast, wide-field optical designs working in the visible, with diffraction-limited performance; optimization of the number of required actuators and their layout; the design of an active array to manipulate the face sheet; as well as the actuator design. In this paper we present the status of the demonstrator development, with focus on the different building blocks: an extreme freeform thin face sheet, the active array, a highly controllable thermal actuator array, and the metrology and control system.
NASA Astrophysics Data System (ADS)
Zolina, Olga; Simmer, Clemens; Kapala, Alice; Mächel, Hermann; Gulev, Sergey; Groisman, Pavel
2014-05-01
We present new high resolution precipitation daily grids developed at Meteorological Institute, University of Bonn and German Weather Service (DWD) under the STAMMEX project (Spatial and Temporal Scales and Mechanisms of Extreme Precipitation Events over Central Europe). Daily precipitation grids have been developed from the daily-observing precipitation network of DWD, which runs one of the World's densest rain gauge networks comprising more than 7500 stations. Several quality-controlled daily gridded products with homogenized sampling were developed covering the periods 1931-onwards (with 0.5 degree resolution), 1951-onwards (0.25 degree and 0.5 degree), and 1971-2000 (0.1 degree). Different methods were tested to select the best gridding methodology that minimizes errors of integral grid estimates over hilly terrain. Besides daily precipitation values with uncertainty estimates (which include standard estimates of the kriging uncertainty as well as error estimates derived by a bootstrapping algorithm), the STAMMEX data sets include a variety of statistics that characterize temporal and spatial dynamics of the precipitation distribution (quantiles, extremes, wet/dry spells, etc.). Comparisons with existing continental-scale daily precipitation grids (e.g., CRU, ECA E-OBS, GCOS) which include considerably less observations compared to those used in STAMMEX, demonstrate the added value of high-resolution grids for extreme rainfall analyses. These data exhibit spatial variability pattern and trends in precipitation extremes, which are missed or incorrectly reproduced over Central Europe from coarser resolution grids based on sparser networks. The STAMMEX dataset can be used for high-quality climate diagnostics of precipitation variability, as a reference for reanalyses and remotely-sensed precipitation products (including the upcoming Global Precipitation Mission products), and for input into regional climate and operational weather forecast models. 
We will present numerous applications of the STAMMEX grids, spanning from case studies of the major Central European floods to long-term changes in different precipitation statistics, including those accounting for the alternation of dry and wet periods and the precipitation intensities associated with prolonged rainy episodes.
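The bootstrap error estimates mentioned for the STAMMEX grids can be illustrated for a single grid cell: resample the contributing station observations with replacement and take the spread of the resampled means as the standard error. The station values and cell size below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_cell_uncertainty(station_values, n_boot=2000):
    """Bootstrap standard error of a grid-cell precipitation estimate.

    Resamples the contributing station observations with replacement and
    returns (cell mean, bootstrap standard error of that mean).
    """
    vals = np.asarray(station_values, dtype=float)
    means = np.array([rng.choice(vals, size=vals.size, replace=True).mean()
                      for _ in range(n_boot)])
    return vals.mean(), means.std(ddof=1)

# Hypothetical daily totals (mm) from stations falling in one 0.25-degree cell
mean_mm, se_mm = bootstrap_cell_uncertainty([2.1, 0.0, 3.4, 1.2, 0.8, 2.9])
```

Unlike the kriging variance, which depends on an assumed covariance model, this resampling estimate reflects only the sampling variability of the stations actually present in the cell, which is why the dataset reports both.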
Jackson, Steven M; Cheng, M Samuel; Smith, A Russell; Kolber, Morey J
2017-02-01
Hand-held dynamometry (HHD) is a more objective way to quantify muscle force production than traditional manual muscle testing. HHD reliability can be negatively impacted by the strength of both the tester and the subject, particularly in the lower extremities due to the larger muscle groups. The primary aim of this investigation was to assess the intrarater reliability of HHD with use of a portable stabilization device for lower extremity muscle force production in an athletic population. Isometric strength of bilateral lower extremity muscle groups, including the hip abductors, external rotators, adductors, knee extensors, and ankle plantar flexors, was measured in a sample of healthy recreational runners (8 males, 7 females; n = 30 limbs) training for a marathon. These measurements were assessed using an intrasession intrarater reliability design. Intraclass correlation coefficients (ICC) were calculated using the (3,1) model based on the single-rater design. The standard error of measurement (SEM) for each muscle group was also calculated. ICCs were excellent, ranging from ICC(3,1) = 0.93-0.98, with standard errors of measurement ranging from 0.58 to 17.2 N. This study establishes that an HHD with a portable stabilization device demonstrates good intrarater reliability for measuring lower extremity muscle performance in an active healthy population. Copyright © 2016 Elsevier Ltd. All rights reserved.
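ICC(3,1) is the two-way mixed-effects, consistency, single-measurement coefficient, computable from the ANOVA mean squares of a subjects-by-sessions matrix. A sketch with hypothetical test-retest forces in newtons standing in for the study's dynamometry data; the SEM convention used here (SD·√(1−ICC)) is one common choice, and may differ from the study's:

```python
import numpy as np

def icc_3_1(scores: np.ndarray) -> float:
    """ICC(3,1): two-way mixed effects, consistency, single measurement.

    scores: (n_subjects, k_sessions) matrix of repeated measurements.
    """
    n, k = scores.shape
    grand = scores.mean()
    ss_rows = k * ((scores.mean(axis=1) - grand) ** 2).sum()  # between subjects
    ss_cols = n * ((scores.mean(axis=0) - grand) ** 2).sum()  # between sessions
    ss_total = ((scores - grand) ** 2).sum()
    ms_rows = ss_rows / (n - 1)
    ms_err = (ss_total - ss_rows - ss_cols) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)

# Hypothetical test-retest data: 5 limbs measured in two sessions (forces in N)
data = np.array([[210.0, 214.0],
                 [180.0, 176.0],
                 [250.0, 255.0],
                 [160.0, 158.0],
                 [230.0, 228.0]])
icc = icc_3_1(data)
sem = data.std(ddof=1) * np.sqrt(1.0 - icc)  # SEM in the same units (N)
```

High between-subject variance relative to session-to-session noise drives the ICC toward 1, which is why stabilizing the dynamometer (reducing tester-induced noise) improves reliability.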
Cao, Q; Brehler, M; Sisniega, A; Stayman, J W; Yorkston, J; Siewerdsen, J H; Zbijewski, W
2017-03-01
CMOS x-ray detectors offer small pixel sizes and low electronic noise that may support the development of novel high-resolution imaging applications of cone-beam CT (CBCT). We investigate the effects of CsI scintillator thickness on the performance of CMOS detectors in high resolution imaging tasks, in particular in quantitative imaging of bone microstructure in extremity CBCT. A scintillator thickness-dependent cascaded systems model of CMOS x-ray detectors was developed. Detectability in low-, high- and ultra-high resolution imaging tasks (Gaussian with FWHM of ~250 μm, ~80 μm and ~40 μm, respectively) was studied as a function of scintillator thickness using the theoretical model. Experimental studies were performed on a CBCT test bench equipped with DALSA Xineos3030 CMOS detectors (99 μm pixels) with CsI scintillator thicknesses of 400 μm and 700 μm, and a 0.3 FS compact rotating anode x-ray source. The evaluation involved a radiographic resolution gauge (0.6-5.0 lp/mm), a 127 μm tungsten wire for assessment of 3D resolution, a contrast phantom with tissue-mimicking inserts, and an excised fragment of human tibia for visual assessment of fine trabecular detail. Experimental studies show ~35% improvement in the frequency of 50% MTF modulation when using the 400 μm scintillator compared to the standard nominal CsI thickness of 700 μm. Even though the high-frequency DQE of the two detectors is comparable, theoretical studies show a 14% to 28% increase in detectability index (d′²) of high- and ultra-high resolution tasks, respectively, for the detector with 400 μm CsI compared to 700 μm CsI. Experiments confirm the theoretical findings, showing improvements with the adoption of the 400 μm panel in the visibility of the radiographic pattern (2× improvement in peak-to-trough distance at 4.6 lp/mm) and a 12.5% decrease in the FWHM of the tungsten wire.
Reconstructions of the tibial plateau reveal enhanced visibility of trabecular structures with the CMOS detector with the 400 μm scintillator. Applications of CMOS detectors in high resolution CBCT imaging of trabecular bone will benefit from using a thinner scintillator than the current standard in general radiography. The results support the translation of the CMOS sensor with 400 μm CsI onto the clinical prototype of CMOS-based extremity CBCT.
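The task-based detectability index d′² compared above combines the MTF, the NPS, and a task function; for a non-prewhitening observer in one dimension it can be sketched numerically. The Gaussian MTF curves and flat (white) NPS below are made-up stand-ins, not the paper's cascaded-systems model:

```python
import numpy as np

def npw_detectability(f, mtf, nps, task):
    """Non-prewhitening-observer detectability index d'^2 (1-D sketch):
    d'^2 = [∫ (MTF·W)² df]² / ∫ (MTF·W)² · NPS df, with W the task function.
    Integrals are approximated by Riemann sums on a uniform frequency grid."""
    df = f[1] - f[0]
    signal = (mtf * task) ** 2
    return (signal.sum() * df) ** 2 / ((signal * nps).sum() * df)

f = np.linspace(0.01, 5.0, 500)            # spatial frequency (lp/mm)
mtf_thin = np.exp(-(f / 2.5) ** 2)         # hypothetical thinner-CsI MTF (sharper)
mtf_thick = np.exp(-(f / 1.8) ** 2)        # hypothetical thicker-CsI MTF
nps = np.full_like(f, 1e-3)                # flat NPS, arbitrary units

# High-resolution task: FT of a Gaussian stimulus with 0.08 mm (80 um) FWHM
sigma = 0.08 / 2.355
task = np.exp(-2.0 * (np.pi * sigma * f) ** 2)

d2_thin = npw_detectability(f, mtf_thin, nps, task)
d2_thick = npw_detectability(f, mtf_thick, nps, task)
```

With equal noise, the sharper MTF wins more for narrow (high-frequency) task functions than for broad ones, which is the mechanism behind the paper's larger d′² gain for the ultra-high resolution task.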
Cao, Q.; Brehler, M.; Sisniega, A.; Stayman, J. W.; Yorkston, J.; Siewerdsen, J. H.; Zbijewski, W.
2017-01-01
Purpose: CMOS x-ray detectors offer small pixel sizes and low electronic noise that may support the development of novel high-resolution imaging applications of cone-beam CT (CBCT). We investigate the effects of CsI scintillator thickness on the performance of CMOS detectors in high resolution imaging tasks, in particular in quantitative imaging of bone microstructure in extremity CBCT. Methods: A scintillator thickness-dependent cascaded systems model of CMOS x-ray detectors was developed. Detectability in low-, high- and ultra-high resolution imaging tasks (Gaussian with FWHM of ~250 μm, ~80 μm and ~40 μm, respectively) was studied as a function of scintillator thickness using the theoretical model. Experimental studies were performed on a CBCT test bench equipped with DALSA Xineos3030 CMOS detectors (99 μm pixels) with CsI scintillator thicknesses of 400 μm and 700 μm, and a 0.3 FS compact rotating anode x-ray source. The evaluation involved a radiographic resolution gauge (0.6–5.0 lp/mm), a 127 μm tungsten wire for assessment of 3D resolution, a contrast phantom with tissue-mimicking inserts, and an excised fragment of human tibia for visual assessment of fine trabecular detail. Results: Experimental studies show ~35% improvement in the frequency of 50% MTF modulation when using the 400 μm scintillator compared to the standard nominal CsI thickness of 700 μm. Even though the high-frequency DQE of the two detectors is comparable, theoretical studies show a 14% to 28% increase in detectability index (d′²) of high- and ultrahigh resolution tasks, respectively, for the detector with 400 μm CsI compared to 700 μm CsI. Experiments confirm the theoretical findings, showing improvements with the adoption of the 400 μm panel in the visibility of the radiographic pattern (2× improvement in peak-to-trough distance at 4.6 lp/mm) and a 12.5% decrease in the FWHM of the tungsten wire.
Reconstructions of the tibial plateau reveal enhanced visibility of trabecular structures with the CMOS detector with the 400 μm scintillator. Conclusion: Applications of CMOS detectors in high resolution CBCT imaging of trabecular bone will benefit from using a thinner scintillator than the current standard in general radiography. The results support the translation of the CMOS sensor with 400 μm CsI onto the clinical prototype of CMOS-based extremity CBCT. PMID:28989220
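The detectability index above follows the standard task-based form d′² = Σ_f MTF(f)²·W_task(f)²/NPS(f)·Δf. A self-contained toy calculation (the exponential MTF shapes, flat NPS, and all numbers below are invented stand-ins, not the paper's cascaded-systems model) shows why a sharper MTF pays off most for fine-detail tasks:

```python
import math

def gaussian_task(f, fwhm_mm):
    # Fourier transform of a Gaussian spatial stimulus with the given FWHM
    sigma = fwhm_mm / 2.355
    return math.exp(-2.0 * (math.pi * sigma * f) ** 2)

def detectability(mtf, fwhm_mm, freqs, df, nps=1.0):
    # d'^2 = sum over frequency of MTF^2 * W_task^2 / NPS * df
    return sum(mtf(f) ** 2 * gaussian_task(f, fwhm_mm) ** 2 / nps * df
               for f in freqs)

df = 0.05                                    # frequency step, lp/mm
freqs = [i * df for i in range(1, 201)]      # 0.05 .. 10 lp/mm
mtf_thin = lambda f: math.exp(-f / 3.0)      # hypothetical thinner scintillator
mtf_thick = lambda f: math.exp(-f / 2.0)     # hypothetical thicker scintillator

# d'^2 gain of the sharper detector, for a fine (40 um) and a coarse (250 um) task
gain_fine = (detectability(mtf_thin, 0.040, freqs, df)
             / detectability(mtf_thick, 0.040, freqs, df))
gain_coarse = (detectability(mtf_thin, 0.250, freqs, df)
               / detectability(mtf_thick, 0.250, freqs, df))
print(round(gain_fine, 2), round(gain_coarse, 2))
```

Because the two hypothetical MTFs differ most at high frequencies, the relative detectability gain grows as the task FWHM shrinks, mirroring the 14-28% trend the abstract reports.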
Developments in United Kingdom Waveguide Power Standards,
1980-04-01
would manifest itself when a calibrated bolometer was compared with a non-bolometric standard (including a thermistor standard where the current...Geneva mechanism and this ensures extremely smooth mechanical operation. d) temperature control of the thermistor power meters at DI and D2 to better... thermistor heads. During calibration in terms of a power standard, and a subsequent measurement, the noise and drift in the standard power meter and device
[The comparative evaluation of level of security culture in medical organizations].
Roitberg, G E; Kondratova, N V; Galanina, E V
2016-01-01
The study of security culture was carried out at the clinic "Medicine" in 2014-2015. The sampling included 465 completed HSPSC questionnaires, and a comparative analysis of the received data was implemented. The "Zubovskaia district hospital", which has no accreditation according to security standards, and a group of clinics from the USA that have functioned for many years within a system of patient security support, were selected as objects for comparison. The dynamics of security culture in an organization implementing patient security strategies over 5 years were evaluated, and the obtained results were compared with those of the USA clinics. The study results demonstrated that, in the absence of implemented security standards in a medical organization, the total evaluation of security remains extremely low. The study of security culture using the HSPSC questionnaire is an effective tool for evaluating the implementation of various patient security strategies. Functioning within the system of international quality standards, primarily JCI standards, permits achieving high indices of security culture within several years.
Extreme Droughts In Sydney And Melbourne Since The 1850s
NASA Astrophysics Data System (ADS)
Dogan, Selim
2014-05-01
Sydney and Melbourne are two highly populated and very well known Australian cities, each with a population of over 4 million. These cities are subject to extreme droughts which affect regional water resources and cause substantial agricultural and economic losses. This study presents a drought analysis of Sydney and Melbourne for the period from the 1850s to date using the Effective Drought Index (EDI) and the Standardized Precipitation Index (SPI). EDI is a function of the precipitation needed for a return to normal conditions, i.e. the amount of precipitation necessary for recovery from the deficit accumulated since the beginning of a drought. SPI has been the most popular and widely used drought index over the last decades. According to the results of the EDI analysis, 8 different extreme drought events were identified in Sydney, and 5 events in Melbourne, since the 1850s. These extreme drought events were characterized in terms of magnitude, duration, intensity and interarrival time since the previous drought event. EDI results were compared with the results of SPI, and the similarities and differences were then discussed in more detail. The most severe drought event in Sydney was identified for the period of July 1979 to February 1981 (lasting 19 months), while the most severe drought in Melbourne lasted longer, from March 2006 to February 2010 (47 months). Besides presenting the extreme drought case study of Sydney and Melbourne, this study focuses on the benefits of using the EDI and SPI methods to monitor droughts.
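The idea behind an SPI-style index can be sketched in a few lines. This is a simplified stand-in: operational SPI fits a gamma distribution to precipitation totals per calendar period, whereas the sketch below uses an empirical (Weibull) plotting position instead, and the monthly totals are invented:

```python
from statistics import NormalDist

def spi_empirical(precip):
    """Simplified SPI: map each precipitation total to a standard normal
    deviate via its empirical (Weibull) plotting position. Negative values
    indicate drier-than-normal conditions."""
    n = len(precip)
    ranked = sorted(precip)
    nd = NormalDist()
    index = []
    for x in precip:
        rank = ranked.index(x) + 1      # 1 = driest on record
        p = rank / (n + 1)              # Weibull plotting position
        index.append(nd.inv_cdf(p))     # probability -> standard normal deviate
    return index

# Hypothetical monthly precipitation totals (mm)
totals = [82, 45, 12, 60, 95, 30, 5, 70, 55, 40, 88, 20]
index = spi_empirical(totals)
print(round(min(index), 2))   # most negative SPI marks the driest month
```

Drought events are then typically declared when the index stays below a threshold (e.g. SPI < -2 for "extreme drought") for consecutive periods.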
Discovery of extreme [O III] λ5007 Å outflows in high-redshift red quasars
NASA Astrophysics Data System (ADS)
Zakamska, Nadia L.; Hamann, Fred; Pâris, Isabelle; Brandt, W. N.; Greene, Jenny E.; Strauss, Michael A.; Villforth, Carolin; Wylezalek, Dominika; Alexandroff, Rachael M.; Ross, Nicholas P.
2016-07-01
Black hole feedback is now a standard component of galaxy formation models. These models predict that the impact of black hole activity on its host galaxy likely peaked at z = 2-3, the epoch of strongest star formation activity and black hole accretion activity in the Universe. We used XSHOOTER on the Very Large Telescope to measure rest-frame optical spectra of four z ~ 2.5 extremely red quasars with infrared luminosities ~10^47 erg s^-1. We present the discovery of very broad (full width at half max = 2600-5000 km s^-1), strongly blueshifted (by up to 1500 km s^-1) [O III] λ5007 Å emission lines in these objects. In a large sample of type 2 and red quasars, [O III] kinematics are positively correlated with infrared luminosity, and the four objects in our sample are on the extreme end in both [O III] kinematics and infrared luminosity. We estimate that at least 3 per cent of the bolometric luminosity in these objects is being converted into the kinetic power of the observed wind. Photo-ionization estimates suggest that the [O III] emission might be extended on a few kpc scales, which would suggest that the extreme outflow is affecting the entire host galaxy of the quasar. These sources may be the signposts of the most extreme form of quasar feedback at the peak epoch of galaxy formation, and may represent an active `blow-out' phase of quasar evolution.
Pushing the boundaries of viability: the economic impact of extreme preterm birth.
Petrou, Stavros; Henderson, Jane; Bracewell, Melanie; Hockley, Christine; Wolke, Dieter; Marlow, Neil
2006-02-01
Previous assessments of the economic impact of preterm birth focussed on short term health service costs across the broad spectrum of prematurity. To estimate the societal costs of extreme preterm birth during the sixth year after birth. Unit costs were applied to estimates of health, social and broader resource use made by 241 children born at 20 through 25 completed weeks of gestation in the United Kingdom and Republic of Ireland and a comparison group of 160 children born at full term. Societal costs per child during the sixth year after birth were estimated and subjected to a rigorous sensitivity analysis. The effects of gestational age at birth on annual societal costs were analysed, first in a simple linear regression and then in a multiple linear regression. Mean societal costs over the 12-month period were £9541 (standard deviation £11,678) for the extreme preterm group and £3883 (£1098) for the term group, generating a mean cost difference of £5658 (bootstrap 95% confidence interval: £4203 to £7256) that was statistically significant (P<0.001). After adjustment for clinical and sociodemographic covariates, sex-specific extreme preterm birth was a strong predictor of high societal costs. The results of this study should facilitate the effective planning of services and may be used to inform the development of future economic evaluations of interventions aimed at preventing extreme preterm birth or alleviating its effects.
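A bootstrap confidence interval of the kind reported above (used because cost data are strongly skewed) can be illustrated with a percentile bootstrap. The cost figures below are invented and only loosely echo the study's group sizes and means:

```python
import random
import statistics

random.seed(42)

# Invented annual societal costs (pounds): wide preterm group, narrow term group
preterm = [max(0.0, random.gauss(9500, 3000)) for _ in range(241)]
term = [max(0.0, random.gauss(3900, 1100)) for _ in range(160)]

observed_diff = statistics.mean(preterm) - statistics.mean(term)

# Percentile bootstrap: resample each group with replacement, recompute the
# mean difference, and read the CI off the 2.5th / 97.5th percentiles
reps = []
for _ in range(2000):
    p = [random.choice(preterm) for _ in preterm]
    t = [random.choice(term) for _ in term]
    reps.append(statistics.mean(p) - statistics.mean(t))
reps.sort()
ci_low = reps[int(0.025 * len(reps))]
ci_high = reps[int(0.975 * len(reps)) - 1]
print(round(observed_diff), round(ci_low), round(ci_high))
```

A CI that excludes zero, as here and in the study, indicates a statistically significant cost difference without assuming normally distributed costs.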
The National Extreme Events Data and Research Center (NEED)
NASA Astrophysics Data System (ADS)
Gulledge, J.; Kaiser, D. P.; Wilbanks, T. J.; Boden, T.; Devarakonda, R.
2014-12-01
The Climate Change Science Institute at Oak Ridge National Laboratory (ORNL) is establishing the National Extreme Events Data and Research Center (NEED), with the goal of transforming how the United States studies and prepares for extreme weather events in the context of a changing climate. NEED will encourage the myriad, distributed extreme events research communities to move toward the adoption of common practices and will develop a new database compiling global historical data on weather- and climate-related extreme events (e.g., heat waves, droughts, hurricanes) and related information about impacts, costs, recovery, and available research. Currently, extreme event information is not easy to access and is largely incompatible and inconsistent across web sites. NEED's database development will take into account differences in time frames, spatial scales, treatments of uncertainty, and other parameters and variables, and leverage informatics tools developed at ORNL (i.e., the Metadata Editor [1] and Mercury [2]) to generate standardized, robust documentation for each database along with a web-searchable catalog. In addition, NEED will facilitate convergence on commonly accepted definitions and standards for extreme events data and will enable integrated analyses of coupled threats, such as hurricanes/sea-level rise/flooding and droughts/wildfires. Our goal and vision is that NEED will become the premier integrated resource for the general study of extreme events. References: [1] Devarakonda, Ranjeet, et al. "OME: Tool for generating and managing metadata to handle BigData." Big Data (Big Data), 2014 IEEE International Conference on. IEEE, 2014. [2] Devarakonda, Ranjeet, et al. "Mercury: reusable metadata management, data discovery and access system." Earth Science Informatics 3.1-2 (2010): 87-94.
Kropatsch, Regina; Melis, Claudia; Stronen, Astrid V; Jensen, Henrik; Epplen, Joerg T
2015-01-01
The Norwegian Lundehund breed of dog has undergone a severe loss of genetic diversity as a result of inbreeding and epizootics of canine distemper. As a consequence, the breed is extremely homogeneous and accurate sex identification is not always possible by standard screening of X-chromosomal loci. To improve our genetic understanding of the breed we genotyped 17 individuals using a genome-wide array of 170 000 single nucleotide polymorphisms (SNPs). Standard analyses based on expected homozygosity of X-chromosomal loci failed in assigning individuals to the correct sex, as determined initially by physical examination and confirmed with the Y-chromosomal marker, amelogenin. This demonstrates that identification of sex using standard SNP assays can be erroneous in highly inbred individuals. © The American Genetic Association 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Extreme Programming: Maestro Style
NASA Technical Reports Server (NTRS)
Norris, Jeffrey; Fox, Jason; Rabe, Kenneth; Shu, I-Hsiang; Powell, Mark
2009-01-01
"Extreme Programming: Maestro Style" is the name of a computer programming methodology that has evolved as a custom version of extreme programming, a methodology practiced in the software industry since the late 1990s. The name of this version reflects its origin in the work of the Maestro team at NASA's Jet Propulsion Laboratory, which develops software for Mars exploration missions. Extreme programming is oriented toward agile development of software and rests on the values of simplicity, communication, testing, and aggressiveness. It involves methods for rapidly building and disseminating institutional knowledge among the members of a programming team, giving all the members a shared view that matches the view of the customers for whom the software system is to be developed. Its practices include frequent planning by programmers in collaboration with customers; continual examination and rewriting of code in striving for the simplest workable software designs; a system metaphor (basically, an abstraction of the system that provides easy-to-remember software-naming conventions and insight into the architecture of the system); programmers working in pairs; adherence to a set of coding standards; frequent verbal communication; frequent releases of software in small increments of development; repeated testing of the developmental software by both programmers and customers; and continuous interaction between the team and the customers. The environment in which the Maestro team works requires the team to adapt quickly to the changing needs of its customers. In addition, the team cannot afford to accept unnecessary development risk. Extreme programming enables the Maestro team to remain agile and provide high-quality software and service to its customers.
However, several factors in the Maestro environment have made it necessary to modify some of the conventional extreme-programming practices. The single most influential of these factors is that continuous interaction between customers and programmers is not feasible.
14 CFR 27.27 - Center of gravity limits.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 14 Aeronautics and Space 1 2013-01-01 2013-01-01 false Center of gravity limits. 27.27 Section 27... AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Flight General § 27.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity must be...
14 CFR 27.27 - Center of gravity limits.
Code of Federal Regulations, 2011 CFR
2011-01-01
... 14 Aeronautics and Space 1 2011-01-01 2011-01-01 false Center of gravity limits. 27.27 Section 27... AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Flight General § 27.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity must be...
14 CFR 27.27 - Center of gravity limits.
Code of Federal Regulations, 2010 CFR
2010-01-01
... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Center of gravity limits. 27.27 Section 27... AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Flight General § 27.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity must be...
14 CFR 27.27 - Center of gravity limits.
Code of Federal Regulations, 2012 CFR
2012-01-01
... 14 Aeronautics and Space 1 2012-01-01 2012-01-01 false Center of gravity limits. 27.27 Section 27... AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Flight General § 27.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity must be...
14 CFR 27.27 - Center of gravity limits.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 14 Aeronautics and Space 1 2014-01-01 2014-01-01 false Center of gravity limits. 27.27 Section 27... AIRWORTHINESS STANDARDS: NORMAL CATEGORY ROTORCRAFT Flight General § 27.27 Center of gravity limits. The extreme forward and aft centers of gravity and, where critical, the extreme lateral centers of gravity must be...
Extremely cold events and sudden air temperature drops during winter season in the Czech Republic
NASA Astrophysics Data System (ADS)
Crhová, Lenka; Valeriánová, Anna; Holtanová, Eva; Müller, Miloslav; Kašpar, Marek; Stříž, Martin
2014-05-01
Today, great attention is devoted to the analysis of extreme weather events and the frequency of their occurrence under a changing climate. In most cases, these studies focus on extremely warm events in the summer season. However, extremely low air temperatures during winter can have serious impacts on many sectors as well (e.g. power engineering, transportation, industry, agriculture, human health). Therefore, in the present contribution we focus on extremely and abnormally cold air temperature events in the winter season in the Czech Republic. Besides the seasonal extremes of minimum air temperature determined from station data, standardized data with the annual cycle removed are used as well. The distribution of extremely cold events over the season and the temporal evolution of their frequency of occurrence during the period 1961-2010 are analyzed. Furthermore, the connection of cold events with extreme sudden temperature drops is studied. The extreme air temperature events and events of extreme sudden temperature drop are assessed using the Weather Extremity Index, which evaluates the extremity (based on return periods) and spatial extent of the meteorological extreme event of interest. The parameters of the generalized extreme value distribution are used to estimate return periods of daily temperature values. The work has been supported by grant P209/11/1990 funded by the Czech Science Foundation.
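The return-period step can be sketched with a Gumbel fit (the GEV type I special case, chosen here for a closed form) by the method of moments. The annual extreme series below is synthetic; operational work would fit the full three-parameter GEV to station data:

```python
import math
import random
import statistics

random.seed(1)

# Synthetic annual extremes (e.g. negated winter minima, so larger = colder)
annual = [random.gauss(22.0, 4.0) for _ in range(50)]

# Method-of-moments Gumbel fit: scale from the std. dev., location from the mean
EULER = 0.5772156649
beta = statistics.stdev(annual) * math.sqrt(6) / math.pi
mu = statistics.mean(annual) - EULER * beta

def return_level(T):
    # Value exceeded on average once every T years under the fitted Gumbel
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T))

def return_period(x):
    # Inverse mapping: expected recurrence interval of an observed extreme x
    F = math.exp(-math.exp(-(x - mu) / beta))
    return 1.0 / (1.0 - F)

print(round(return_level(10), 1), round(return_level(100), 1))
```

The two functions are exact inverses, so a computed 50-year return level maps back to a 50-year return period, which is a handy sanity check on any implementation.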
The space shuttle program from challenge to achievement: Space exploration rolling on tires
NASA Technical Reports Server (NTRS)
Felder, G. L.
1985-01-01
The Space Shuttle Transportation System is the first space program to employ the pneumatic tire as a part of space exploration. For aircraft tires, this program establishes new expectations as to what constitutes acceptable performance within a set of tough environmental and operational conditions. Tire design stresses the usual low-weight, high-load, high-speed, and excellent air-retention features, but at extremes well outside industry standards. Tires will continue to be an integral part of the Shuttle's landing phase in the immediate future, since they afford a unique combination of directional control, braking traction, flotation and shock absorption not available from other systems.
Neutron reflecting supermirror structure
Wood, James L.
1992-01-01
An improved neutron reflecting supermirror structure comprising a plurality of stacked sets of bilayers of neutron reflecting materials. The improved neutron reflecting supermirror structure is adapted to provide extremely good performance at high incidence angles, i.e. up to four times the critical angle of standard neutron mirror structures. The reflection of neutrons striking the supermirror structure at these high angles provides enhanced neutron throughput, and hence more efficient and economical use of neutron sources. One layer of each set of bilayers consists of titanium, and the second layer of each set of bilayers consists of an alloy of nickel with carbon interstitially present in the nickel alloy.
Science Objectives of the JEM EUSO Mission on International Space Station
NASA Technical Reports Server (NTRS)
Takahashi, Yoshiyuki
2007-01-01
The JEM-EUSO space observatory is planned with a very large exposure factor, which will exceed the critical exposure required for observing most of the sources within the propagational horizon of about one hundred Mpc. The main science objective of JEM-EUSO is source-identifying astronomy in the particle channel with extremely high-energy particles. Quasi-linear tracking of the source objects through the galactic magnetic field should become feasible at energies > 10^20 eV for the all-sky survey. The individual GZK profile in high-statistics experiments should differ from source to source due to different distances, unless Lorentz invariance is somehow limited. In addition, JEM-EUSO has three exploratory test observations: (i) extremely high-energy neutrinos beginning at E > 10^19 eV, neutrinos being expected to have a slowly increasing cross section in the Standard Model and, in particular, hundreds of times larger in extra-dimension models; (ii) fundamental physics at extreme super-LHC (Large Hadron Collider) energies with the hierarchical unified energy much below the GUT scale; and (iii) global atmospheric observation, including large-scale and local plasma discharges, night-glow, meteors, and others.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Close, Devin W.; Paul, Craig Don; Langan, Patricia S.
In this paper, we describe the engineering and X-ray crystal structure of Thermal Green Protein (TGP), an extremely stable, highly soluble, non-aggregating green fluorescent protein. TGP is a soluble variant of the fluorescent protein eCGP123, which, despite being highly stable, has proven to be aggregation-prone. The X-ray crystal structure of eCGP123, also determined within the context of this paper, was used to carry out rational surface engineering to improve its solubility, leading to TGP. The approach involved simultaneously eliminating crystal lattice contacts while increasing the overall negative charge of the protein. Despite intentional disruption of lattice contacts and introduction of high-entropy glutamate side chains, TGP crystallized readily in a number of different conditions and the X-ray crystal structure of TGP was determined to 1.9 Å resolution. The structural reasons for the enhanced stability of TGP and eCGP123 are discussed. We demonstrate the utility of using TGP as a fusion partner in various assays and, significantly, in amyloid assays in which the standard fluorescent protein, EGFP, is undesirable because of aberrant oligomerization.
Close, Devin W.; Paul, Craig Don; Langan, Patricia S.; ...
2015-05-08
NASA Astrophysics Data System (ADS)
Yan, Hongxiang; Sun, Ning; Wigmosta, Mark; Skaggs, Richard; Hou, Zhangshuan; Leung, Ruby
2018-02-01
There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial overestimation/underestimation of design basis events and subsequent overdesign/underdesign of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to underdesign, many with significant underestimation of 100 year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for underdesign were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for overdesign at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.
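The under-design argument can be made concrete with a toy comparison between annual-maximum precipitation and annual-maximum water available for runoff (precipitation plus snowmelt during rain-on-snow events). The series below are invented, and the T-year design value is taken from a simple empirical ranking rather than a fitted distribution:

```python
import random

random.seed(3)

YEARS = 100
# Invented annual maxima (mm/day): precipitation alone, plus snowmelt released
# during rain-on-snow events in the same storm (zero in snow-free years)
precip = [random.uniform(20, 80) for _ in range(YEARS)]
melt = [random.uniform(0, 40) if random.random() < 0.3 else 0.0
        for _ in range(YEARS)]
wafr = [p + m for p, m in zip(precip, melt)]   # water available for runoff

def design_value(series, T):
    # Empirical T-year value: the k-th largest, exceeded about len(series)/T times
    k = max(1, round(len(series) / T))
    return sorted(series, reverse=True)[k - 1]

T = 25
d_precip = design_value(precip, T)
d_wafr = design_value(wafr, T)
underestimate_pct = 100.0 * (d_wafr - d_precip) / d_precip
print(round(d_precip, 1), round(d_wafr, 1), round(underestimate_pct, 1))
```

Since water available for runoff can never be less than precipitation alone, a precipitation-only design value can only underestimate, which is the NG-IDF argument in miniature.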
Pütter, Carolin; Pechlivanis, Sonali; Nöthen, Markus M; Jöckel, Karl-Heinz; Wichmann, Heinz-Erich; Scherag, André
2011-01-01
Genome-wide association studies have identified robust associations between single nucleotide polymorphisms and complex traits. As the proportion of phenotypic variance explained is still limited for most of the traits, larger and larger meta-analyses are being conducted to detect additional associations. Here we investigate the impact of the study design and the underlying assumption about the true genetic effect, in a bimodal mixture situation, on the power to detect associations. We performed simulations of quantitative phenotypes analysed by standard linear regression, and of dichotomized case-control data sets drawn from the extremes of the quantitative trait and analysed by standard logistic regression. Using linear regression, markers with an effect confined to the extremes of the trait were almost undetectable, whereas analysing the extremes with a case-control design had superior power even for much smaller sample sizes. Two real data examples are provided to support our theoretical findings and to explore our mixture and parameter assumptions. Our findings support the idea of re-analysing the available meta-analysis data sets to detect new loci in the extremes. Moreover, our investigation offers an explanation for discrepant findings when analysing quantitative traits in the general population and in the extremes. Copyright © 2011 S. Karger AG, Basel.
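The simulation contrast can be sketched as follows. This is a toy model, not the authors' parameterization: here the marker raises the probability of belonging to an upper-tail mixture component, so its marginal effect on the trait is diluted while the extremes are strongly enriched for the risk allele:

```python
import math
import random
import statistics

random.seed(11)

n = 6000
maf = 0.3
g = [(random.random() < maf) + (random.random() < maf) for _ in range(n)]

y = []
for gi in g:
    if random.random() < 0.02 + 0.10 * gi:     # marker acts on tail membership
        y.append(random.gauss(4.0, 1.0))       # upper-tail mixture component
    else:
        y.append(random.gauss(0.0, 1.0))       # bulk: no marker effect

# Design 1: linear regression on the full quantitative trait (z ~ r * sqrt(n))
gf = [float(x) for x in g]
mg, my = statistics.mean(gf), statistics.mean(y)
sg, sy = statistics.pstdev(gf), statistics.pstdev(y)
r = sum((a - mg) * (b - my) for a, b in zip(gf, y)) / (n * sg * sy)
z_linear = abs(r) * math.sqrt(n)

# Design 2: compare allele counts between the top and bottom deciles of y
order = sorted(range(n), key=lambda i: y[i])
k = n // 10
bot = [gf[i] for i in order[:k]]
top = [gf[i] for i in order[-k:]]
se = math.sqrt(statistics.variance(top) / k + statistics.variance(bot) / k)
z_extremes = abs(statistics.mean(top) - statistics.mean(bot)) / se

print(round(z_linear, 1), round(z_extremes, 1))
```

The extremes design genotypes only 2k = 1200 individuals yet yields a far larger test statistic per genotyped sample, which is the paper's central point.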
NASA Astrophysics Data System (ADS)
Walker, D.; Ayyub, B. M.
2017-12-01
According to the U.S. Census Bureau, new construction spending in the U.S. for 2014 was $993 billion (roughly 6 percent of U.S. GDP). Informing the development of standards of engineering practice related to design and maintenance thus represents a significant opportunity to promote climate adaptation and mitigation, as well as community resilience. The climate science community informs us that extremes of climate and weather are changing from historical values and that the changes are driven substantially by emissions of greenhouse gases caused by human activities. Civil infrastructure systems traditionally have been designed, constructed, operated and maintained for appropriate probabilities of functionality, durability and safety while exposed to climate and weather extremes during their full service lives. Because of uncertainties in future greenhouse gas emissions and in the models for future climate and weather extremes, neither the climate science community nor the engineering community can presently define the statistics of future climate and weather extremes. The American Society of Civil Engineers' (ASCE) Committee on Adapting to a Changing Climate is actively involved in efforts internal and external to ASCE to promote understanding of the challenges climate change presents for engineering practice and to promote a re-examination of those practices that may need to change in light of a changing climate. In addition to producing an ASCE e-book and a number of ASCE webinars, the Committee is currently developing a Manual of Practice intended to provide guidance for the development or enhancement of standards for infrastructure analysis and design in a world in which risk profiles are changing (non-stationarity) and climate change is a reality but cannot be projected with a high degree of certainty. This presentation will explore both the need for such guidance and some of the challenges and opportunities facing its implementation.
Montgomery, Melissa M; Shultz, Sandra J; Schmitz, Randy J
2014-08-01
Less lean mass and strength may result in greater relative task demands on females compared to males when landing from a standardized height and could explain sex differences in energy absorption strategies. We compared the magnitude of sex differences in energy absorption when task demands were equalized relative to the amount of lower extremity lean mass available to dissipate kinetic energy upon landing. Male-female pairs (n=35) were assessed for lower extremity lean mass with dual-energy X-ray absorptiometry. Relative task demands were calculated when landing from a standardized height. Based on the difference in lower extremity lean mass within each pair, task demands were equalized by increasing the drop height for males. Joint energetics were measured while landing from the two heights. Multivariate repeated measures ANOVAs compared the magnitude of sex differences in joint energetics between conditions. The multivariate test for absolute energy absorption was significant (P<0.01). The magnitude of sex difference in energy absorption was greater at the hip and knee (both P<0.01), but not the ankle (P=0.43) during the equalized condition compared to the standardized and exaggerated conditions (all P<0.01). There was no difference in the magnitude of sex differences between equalized, standardized and exaggerated conditions for relative energy absorption (P=0.18). Equalizing task demands increased the difference in absolute hip and knee energy absorption between sexes, but had no effect on relative joint contributions to total energy absorption. Sex differences in energy absorption are likely influenced by factors other than differences in relative task demands. Copyright © 2014 Elsevier Ltd. All rights reserved.
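The height-equalization step can be formalized as matching landing kinetic energy per kilogram of lower-extremity lean mass across each pair (a plausible reading of the protocol; the masses and the 0.45 m baseline height below are invented for illustration):

```python
G = 9.81  # gravitational acceleration, m/s^2

def equalized_height(h_ref, mass_ref, lean_ref, mass_other, lean_other):
    """Drop height for the second subject such that kinetic energy at
    touchdown per kg of lower-extremity lean mass matches the reference:
        mass_other * G * h / lean_other == mass_ref * G * h_ref / lean_ref
    """
    return h_ref * (mass_ref / lean_ref) * (lean_other / mass_other)

h_female = 0.45                       # m, standardized drop height (invented)
m_female, lean_female = 60.0, 15.0    # kg body mass, kg lower-extremity lean mass
m_male, lean_male = 75.0, 21.0

h_male = equalized_height(h_female, m_female, lean_female, m_male, lean_male)
demand_f = m_female * G * h_female / lean_female   # J per kg lean mass
demand_m = m_male * G * h_male / lean_male
print(round(h_male, 3))   # raised drop height for the male partner
```

With these example numbers the male partner, who carries proportionally more lean mass per kilogram of body mass, must drop from a greater height before the relative task demands match.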
DOE Office of Scientific and Technical Information (OSTI.GOV)
Green, Tyler; Kuznetsov, Ilya; Willingham, David
The purpose of this research was to characterize Extreme Ultraviolet Time-of-Flight (EUV TOF) Laser Ablation Mass Spectrometry for high spatial resolution elemental and isotopic analysis. We compare EUV TOF results with Secondary Ionization Mass Spectrometry (SIMS) to orient the EUV TOF method within the overall field of analytical mass spectrometry. Using the well-characterized NIST 61x glasses, we show that the EUV ionization approach produces relatively few molecular ion interferences in comparison to TOF SIMS. We demonstrate that the ratio of element ion to element oxide ion is adjustable with EUV laser pulse energy and that the EUV TOF instrument has a sample utilization efficiency of 0.014%. The EUV TOF system also achieves a lateral resolution of 80 nm, and we demonstrate this lateral resolution with isotopic imaging of closely spaced particles of uranium isotopic standard materials.
An accreting pulsar with extreme properties drives an ultraluminous x-ray source in NGC 5907.
Israel, Gian Luca; Belfiore, Andrea; Stella, Luigi; Esposito, Paolo; Casella, Piergiorgio; De Luca, Andrea; Marelli, Martino; Papitto, Alessandro; Perri, Matteo; Puccetti, Simonetta; Castillo, Guillermo A Rodríguez; Salvetti, David; Tiengo, Andrea; Zampieri, Luca; D'Agostino, Daniele; Greiner, Jochen; Haberl, Frank; Novara, Giovanni; Salvaterra, Ruben; Turolla, Roberto; Watson, Mike; Wilms, Joern; Wolter, Anna
2017-02-24
Ultraluminous x-ray sources (ULXs) in nearby galaxies shine brighter than any x-ray source in our Galaxy. ULXs are usually modeled as stellar-mass black holes (BHs) accreting at very high rates or intermediate-mass BHs. We present observations showing that NGC 5907 ULX is instead an x-ray accreting neutron star (NS) with a spin period evolving from 1.43 seconds in 2003 to 1.13 seconds in 2014. It has an isotropic peak luminosity of approximately 1000 times the Eddington limit for a NS at 17.1 megaparsecs. Standard accretion models fail to explain its luminosity, even assuming beamed emission, but a strong multipolar magnetic field can describe its properties. These findings suggest that other extreme ULXs (x-ray luminosity ≳ 10^41 erg s^-1) might harbor NSs. Copyright © 2017, American Association for the Advancement of Science.
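For scale, the Eddington limit quoted above can be reproduced from first principles. A minimal sketch, assuming spherical accretion of ionized hydrogen and CGS constants rounded to four figures:

```python
import math

# CGS constants (rounded)
G = 6.674e-8          # gravitational constant [cm^3 g^-1 s^-2]
M_SUN = 1.989e33      # solar mass [g]
M_P = 1.6726e-24      # proton mass [g]
C = 2.998e10          # speed of light [cm s^-1]
SIGMA_T = 6.652e-25   # Thomson cross-section [cm^2]

def eddington_luminosity(mass_msun):
    """Eddington luminosity [erg s^-1]: L_Edd = 4*pi*G*M*m_p*c / sigma_T."""
    return 4.0 * math.pi * G * (mass_msun * M_SUN) * M_P * C / SIGMA_T

l_edd = eddington_luminosity(1.4)   # ~1.8e38 erg/s for a 1.4 solar-mass NS
```

A peak luminosity of ~1000 times this value is a few times 10^41 erg/s, consistent with the extreme-ULX luminosity threshold quoted in the abstract.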
NASA Astrophysics Data System (ADS)
Borisov, V. M.; Vinokhodov, A. Yu; Ivanov, A. S.; Kiryukhin, Yu B.; Mishchenko, V. A.; Prokof'ev, A. V.; Khristoforov, O. B.
2009-10-01
The development of high-power discharge sources emitting in the 13.5±0.135-nm spectral band is of current interest because they are promising for applications in industrial EUV (extreme ultraviolet) lithography for manufacturing integrated circuits according to technological precision standards of 22 nm and smaller. The parameters of EUV sources based on a laser-induced discharge in tin vapours between rotating disc electrodes are investigated. The properties of the discharge initiation by laser radiation at different wavelengths are established and the laser pulse parameters providing the maximum energy characteristics of the EUV source are determined. The EUV source developed in the study emits an average power of 276 W in the 13.5±0.135-nm spectral band on conversion to the solid angle 2π sr in the stationary regime at a pulse repetition rate of 3000 Hz.
Zürch, Michael; Foertsch, Stefan; Matzas, Mark; Pachmann, Katharina; Kuth, Rainer; Spielmann, Christian
2014-01-01
In cancer treatment, it is highly desirable to classify single cancer cells in real time. The standard method is polymerase chain reaction requiring a substantial amount of resources and time. Here, we present an innovative approach for rapidly classifying different cell types: we measure the diffraction pattern of a single cell illuminated with coherent extreme ultraviolet (XUV) laser-generated radiation. These patterns allow distinguishing different breast cancer cell types in a subsequent step. Moreover, the morphology of the object can be retrieved from the diffraction pattern with submicron resolution. In a proof-of-principle experiment, we prepared single MCF7 and SKBR3 breast cancer cells on gold-coated silica slides. The output of a laser-driven XUV light source is focused onto a single unstained and unlabeled cancer cell. With the resulting diffraction pattern, we could clearly identify the different cell types. With an improved setup, it will not only be feasible to classify circulating tumor cells with a high throughput, but also to identify smaller objects such as bacteria or even viruses. PMID:26158049
High-resolution spectroscopy of the extremely iron-poor post-AGB star CC Lyr
NASA Astrophysics Data System (ADS)
Aoki, Wako; Matsuno, Tadafumi; Honda, Satoshi; Parthasarathy, Mudumba; Li, Haining; Suda, Takuma
2017-04-01
High-resolution optical spectroscopy was conducted for the metal-poor post-AGB star CC Lyr to determine its chemical abundances and spectral line profiles. Our standard abundance analysis confirms its extremely low metallicity ([Fe/H] < -3.5) and a clear correlation between abundance ratios and the condensation temperature for 11 elements, indicating that dust depletion is the cause of the abundance anomaly of this object. The very low abundances of Sr and Ba, which are detected for the first time for this object, suggest that heavy neutron-capture elements are not significantly enhanced in this object by the s-process during its evolution through the AGB phase. The radial velocity of this object and profiles of some atomic absorption lines show variations depending on pulsation phases, which could be formed by dynamics of the atmosphere rather than by binarity or contributions of circumstellar absorption. On the other hand, the Hα emission with double peaks shows no evident velocity shift, suggesting that the emission is originating from the circumstellar matter, presumably the rotating disk around the object.
Rich but poor: life in the Roman period with extreme rheumatoid arthritis.
Bašić, Željana; Jerković, Ivan; Kružić, Ivana; Anđelinović, Šimun
2017-01-01
In a Sidonian sarcophagus from the Late Antique/early Christian period, the skeletal remains of two persons were found. One of them, a male aged 30-50 years, was found almost completely ankylosed, with highly osteoporotic bones and prominent erosion of the joint surfaces. We diagnosed rheumatoid arthritis based on the eroded odontoid process, mandibular condyles, distal humerus, proximal and distal ulna, as well as the ankylosed hand and foot bones. Although the ankyloses of the vertebrae and sacroiliac joint could point towards ankylosing spondylitis, the lack of typical vertebral ankyloses and new bone formation led to its exclusion. In practical terms, due to the advanced stage of the disease, the man was fixed in the supine position, on the left, with his head turned to the right. Apparently, he could not move and had problems with chewing and breathing. However, the high standard of healthcare provided to him probably enabled him to survive into the advanced stages of the disease. This case sheds light on the antiquity of the disease and its medical and social context, and provides an example of the most extreme osteological changes reported in the paleopathological and medical literature.
Gapeev, A B; Mikhaĭlik, E N; Rubanik, A V; Cheremis, N K
2007-01-01
A pronounced anti-inflammatory effect of high peak-power pulsed electromagnetic radiation of extremely high frequency was shown for the first time in a model of zymosan-induced footpad edema in mice. Exposure to radiation of specific parameters (35, 27 GHz, peak power 20 kW, pulse widths 400-600 ns, pulse repetition frequency 5-500 Hz) decreased the exudative edema and local hyperthermia by 20% compared to the control. The kinetics and the magnitude of the anti-inflammatory effect were comparable with those induced by sodium diclofenac at a dose of 3 mg/kg. It was found that the anti-inflammatory effect linearly increased with increasing pulse width at a fixed pulse repetition frequency and had threshold dependence on the average incident power density of the radiation at a fixed pulse width. When animals were whole-body exposed in the far-field zone of radiator, the optimal exposure duration was 20 min. Increasing the average incident power density upon local exposure of the inflamed paw accelerated both the development of the anti-inflammatory effect and the reactivation time. The results obtained will undoubtedly be of great importance in the hygienic standardization of pulsed electromagnetic radiation and in further studies of the mechanisms of its biological action.
Kowalczewski, Jan; Gritsenko, Valeriya; Ashworth, Nigel; Ellaway, Peter; Prochazka, Arthur
2007-07-01
To test the efficacy of functional electric stimulation (FES)-assisted exercise therapy (FES-ET) on a workstation in the subacute phase of recovery from a stroke. Single-blind, randomly controlled comparison of high- and low-intensity treatment. Laboratory in a rehabilitation hospital. Nineteen stroke survivors (10 men, 9 women; mean age +/- standard deviation, 60.6+/-5.8y), with upper-extremity hemiplegia (mean poststroke time, 48+/-17d). The main inclusion criteria were: stroke occurred within 3 months of onset of trial and resulted in severe upper-limb dysfunction, and FES produced adequate hand opening. An FES stimulator and an exercise workstation with instrumented objects were used by 2 groups to perform specific motor tasks with their affected upper extremity. Ten subjects in the high-intensity FES-ET group received FES-ET for 1 hour a day on 15 to 20 consecutive workdays. Nine subjects in the low-intensity FES-ET group received 15 minutes of sensory electric stimulation 4 days a week and on the fifth day they received 1 hour of FES-ET. Primary outcome measure included the Wolf Motor Function Test (WMFT). Secondary outcome measures included the Motor Activity Log (MAL), the upper-extremity portion of the Fugl-Meyer Assessment (FMA), and the combined kinematic score (CKS) derived from workstation measurements. The WMFT, MAL, and FMA were used to assess function in the absence of FES whereas CKS was used to evaluate function assisted by FES. Improvements in the WMFT and CKS were significantly greater in the high-intensity group (post-treatment effect size, 0.95) than the low-intensity group (post-treatment effect size, 1.3). The differences in MAL and FMA were not statistically significant. Subjects performing high-intensity FES-ET showed significantly greater improvements on the WMFT than those performing low-intensity FES-ET.
However, this was not reflected in subjects' self-assessments (MAL) or in their FMA scores, so the clinical significance of the result is open to debate. The CKS data suggest that high-intensity FES-ET may be advantageous in neuroprosthetic applications.
Wheat and ultra high diluted gibberellic acid--further experiments and re-analysis of data.
Endler, Peter Christian; Scherer-Pongratz, Waltraud; Lothaller, Harald; Stephen, Saundra
2015-10-01
Following studies (a) on wheat seedlings and ultra high diluted silver nitrate, and (b) on amphibians and an ultra high diluted hormone, (c) a bio-assay on wheat and extremely diluted gibberellic acid was standardized. This assay was intended to combine the easy-to-handle aspect of (a) and biologically interesting aspects of (b). The purpose of the data analysis presented here was to investigate the influence of an extreme dilution of gibberellic acid on wheat stalk length and to determine the influence of external factors on the experimental outcome. Grains of winter wheat (Triticum aestivum, Capo variety) were observed under the influence of extremely diluted gibberellic acid (10^-30) prepared by stepwise dilution and agitation according to a protocol derived from homeopathy ('G30x'). Analogously prepared water was used for control ('W30x'). 16 experiments including 8000+8000 grains were performed by 9 researchers. Experiments that were performed between January and April showed inconsistent results, whereas most of the experiments performed between September and December showed shorter stalks in the G30x group. This was confirmed by correlation analysis (p<0.01). Thus winter/spring experiments and autumn experiments were analysed separately. When all 10 autumn experiments were pooled, mean stalk lengths (mm) were 48.3±21.4 for the verum group and 52.1±20.4 for control (mean±SD) at grain level (N=5000 per group) and ±5.3 and ±5.1 respectively at dish level. In other words, verum stalk length (92.67%) was 7.33% smaller than control stalk length (100%). The effect size is small when calculation is done on the basis of grains (d=0.18) but, due to the smaller SD at dish level, medium when done on the basis of dishes (d=0.73). The inhibiting effect was observed by 6 of the 6 researchers who performed the autumn experiments.
The model may be useful for further research as there exists a theoretical justification due to previous studies with wheat and extremely diluted silver nitrate, as well as to previous studies with amphibians and diluted hormones, and its methods are well standardized. Data confirm the hypothesis that information can be stored in the test liquid, even at a dilution of the original substance beyond Avogadro's value; and that the wheat bio-assay is sensitive to such information. Copyright © 2015 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.
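The grain-level versus dish-level effect sizes quoted above follow from the standard Cohen's d formula with a pooled standard deviation; a sketch using the means and SDs reported in the abstract (equal group sizes assumed, so the simple pooled SD applies):

```python
import math

def cohens_d(mean_a, mean_b, sd_a, sd_b):
    """Cohen's d with a simple pooled SD (equal group sizes assumed)."""
    pooled_sd = math.sqrt((sd_a ** 2 + sd_b ** 2) / 2.0)
    return (mean_a - mean_b) / pooled_sd

# Control vs verum stalk lengths from the abstract (mm)
d_grain = cohens_d(52.1, 48.3, 20.4, 21.4)  # per-grain SDs -> ~0.18
d_dish  = cohens_d(52.1, 48.3, 5.1, 5.3)    # per-dish SDs  -> ~0.73
```

The identical mean difference of 3.8 mm divided by the much smaller dish-level SD is what lifts the effect size from small to medium.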
Obscenity and Public Morality: Censorship in a Liberal Society.
ERIC Educational Resources Information Center
Clor, Harry M.
Challenging both the extreme libertarians and the extreme moralists, this book explores the public interest in moral norms and moral character and discusses how that interest is best served. Chapters are devoted to (1) "The Evolution of Standards and the 'Roth' Case," (2) "Aftermath of 'Roth'," (3) "The First Amendment and the Free Society:…
Multiresolution Iterative Reconstruction in High-Resolution Extremity Cone-Beam CT
Cao, Qian; Zbijewski, Wojciech; Sisniega, Alejandro; Yorkston, John; Siewerdsen, Jeffrey H; Stayman, J Webster
2016-01-01
Application of model-based iterative reconstruction (MBIR) to high resolution cone-beam CT (CBCT) is computationally challenging because of the very fine discretization (voxel size <100 µm) of the reconstructed volume. Moreover, standard MBIR techniques require that the complete transaxial support for the acquired projections is reconstructed, thus precluding acceleration by restricting the reconstruction to a region-of-interest. To reduce the computational burden of high resolution MBIR, we propose a multiresolution Penalized-Weighted Least Squares (PWLS) algorithm, where the volume is parameterized as a union of fine and coarse voxel grids as well as selective binning of detector pixels. We introduce a penalty function designed to regularize across the boundaries between the two grids. The algorithm was evaluated in simulation studies emulating an extremity CBCT system and in a physical study on a test-bench. Artifacts arising from the mismatched discretization of the fine and coarse sub-volumes were investigated. The fine grid region was parameterized using 0.15 mm voxels and the voxel size in the coarse grid region was varied by changing a downsampling factor. No significant artifacts were found in either of the regions for downsampling factors of up to 4×. For a typical extremities CBCT volume size, this downsampling corresponds to an acceleration of the reconstruction that is more than five times faster than a brute force solution that applies fine voxel parameterization to the entire volume. For certain configurations of the coarse and fine grid regions, in particular when the boundary between the regions does not cross high attenuation gradients, downsampling factors as high as 10× can be used without introducing artifacts, yielding a ~50× speedup in PWLS. 
The proposed multiresolution algorithm significantly reduces the computational burden of high resolution iterative CBCT reconstruction and can be extended to other applications of MBIR where computationally expensive, high-fidelity forward models are applied only to a sub-region of the field-of-view. PMID:27694701
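The PWLS objective underlying the algorithm can be illustrated on a toy 1-D problem. This is not the paper's multiresolution method, only a minimal sketch of penalized weighted least squares, (y - Ax)' W (y - Ax) + beta * x' R x, with a first-difference roughness penalty R, solved by plain gradient descent:

```python
import numpy as np

def pwls_reconstruct(A, y, w, beta, n_iter=500):
    """Minimize (y - Ax)' W (y - Ax) + beta * x' R x by gradient descent."""
    n = A.shape[1]
    W = np.diag(w)
    D = (np.eye(n) - np.eye(n, k=1))[:-1]      # first-difference operator
    R = D.T @ D                                # roughness penalty matrix
    H = 2.0 * (A.T @ W @ A + beta * R)         # constant Hessian
    step = 1.0 / np.linalg.eigvalsh(H).max()   # stable step size
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = 2.0 * (A.T @ W @ (A @ x - y) + beta * R @ x)
        x -= step * grad
    return x
```

With beta = 0 and A the identity, the solution is just the data; increasing beta trades data fidelity for smoothness, which is the same knob the paper's penalty function turns across the fine/coarse grid boundary.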
Gozani, Shai N
2016-01-01
Objective The objective of this study was to determine if fixed-site high-frequency transcutaneous electrical nerve stimulation (FS-TENS) is effective in treating chronic low back and lower extremity pain. Background Transcutaneous electrical nerve stimulation is widely used for treatment of chronic pain. General-purpose transcutaneous electrical nerve stimulation devices are designed for stimulation anywhere on the body and often cannot be used while the user is active or sleeping. FS-TENS devices are designed for placement at a pre-determined location, which enables development of a wearable device for use over extended time periods. Methods Study participants with chronic low back and/or lower extremity pain self-administered an FS-TENS device for 60 days. Baseline, 30-, and 60-day follow-up data were obtained through an online questionnaire. The primary outcome measure was the patient global impression of change. Pain intensity and interference were assessed using the Brief Pain Inventory. Changes in use of concomitant pain medications were evaluated with a single-item global self-rating. Results One hundred and thirty participants were enrolled, with 88 completing the 60-day follow-up questionnaire. Most participants (73.9%) were 50 years of age or older. At baseline, low back pain was identified by 85.3%, lower extremity pain by 71.6%, and upper extremity pain by 62.5%. Participants reported widespread pain, at baseline, with a mean of 3.4 (standard deviation 1.1) pain sites. At the 60-day follow-up, 80.7% of participants reported that their chronic pain had improved and they were classified as responders. Baseline characteristics did not differentiate non-responders from responders. There were numerical trends toward reduced pain interference with walking ability and sleep, and greater pain relief in responders. 
There was a large difference in use of concomitant pain medications, with 80.3% of responders reporting a reduction compared to 11.8% of non-responders. Conclusion FS-TENS is a safe and effective option for treating chronic low back and lower extremity pain. These results motivate the use of FS-TENS in development of wearable analgesic devices. PMID:27418854
7 CFR 29.1168 - Nondescript (N Group).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Nondescript (N Group). 29.1168 Section 29.1168 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.1168 Nondescript (N Group). Extremely common tobacco...
7 CFR 29.3156 - Nondescript (N Group).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Nondescript (N Group). 29.3156 Section 29.3156 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.3156 Nondescript (N Group). Extremely common tobacco...
7 CFR 29.3651 - Nondescript (N Group).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Nondescript (N Group). 29.3651 Section 29.3651 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.3651 Nondescript (N Group). Extremely common tobacco...
7 CFR 29.2665 - Nondescript (N Group).
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false Nondescript (N Group). 29.2665 Section 29.2665 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.2665 Nondescript (N Group). Extremely common tobacco...
7 CFR 29.1168 - Nondescript (N Group).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Nondescript (N Group). 29.1168 Section 29.1168 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.1168 Nondescript (N Group). Extremely common tobacco...
7 CFR 29.3156 - Nondescript (N Group).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Nondescript (N Group). 29.3156 Section 29.3156 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.3156 Nondescript (N Group). Extremely common tobacco...
7 CFR 29.3651 - Nondescript (N Group).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Nondescript (N Group). 29.3651 Section 29.3651 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.3651 Nondescript (N Group). Extremely common tobacco...
7 CFR 29.2665 - Nondescript (N Group).
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false Nondescript (N Group). 29.2665 Section 29.2665 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards... REGULATIONS TOBACCO INSPECTION Standards Grades § 29.2665 Nondescript (N Group). Extremely common tobacco...
Impact of possible climate changes on river runoff under different natural conditions
NASA Astrophysics Data System (ADS)
Gusev, Yeugeniy M.; Nasonova, Olga N.; Kovalev, Evgeny E.; Ayzel, Georgy V.
2018-06-01
The present study was carried out within the framework of the International Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP) for 11 large river basins located in different continents of the globe under a wide variety of natural conditions. The aim of the study was to investigate possible changes in various characteristics of annual river runoff (mean values, standard deviations, frequency of extreme annual runoff) up to 2100 on the basis of application of the land surface model SWAP and meteorological projections simulated by five General Circulation Models (GCMs) according to four RCP scenarios. Analysis of the obtained results has shown that changes in climatic runoff are different (both in magnitude and sign) for the river basins located in different regions of the planet due to differences in natural (primarily climatic) conditions. The climatic elasticities of river runoff to changes in air temperature and precipitation were estimated, which makes it possible, as a first approximation, to project changes in climatic values of annual runoff using the projected changes in mean annual air temperature and annual precipitation for the river basins. It was found that for most rivers under study, the frequency of occurrence of extreme runoff values increases. This is true both for extremely high runoff (when the projected climatic runoff increases) and for extremely low values (when the projected climatic runoff decreases).
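The elasticity-based first-approximation projection mentioned above reduces to a one-line linear model; the elasticity values in the example are hypothetical, not the study's estimates for any particular basin:

```python
def projected_runoff_change(eps_t, eps_p, d_temp, d_precip_frac):
    """First-order fractional change in climatic annual runoff.

    eps_t         : runoff elasticity to air temperature [per deg C]
    eps_p         : runoff elasticity to precipitation [dimensionless]
    d_temp        : projected change in mean annual air temperature [deg C]
    d_precip_frac : projected fractional change in annual precipitation
    """
    return eps_t * d_temp + eps_p * d_precip_frac

# Hypothetical basin: warming alone cuts runoff 5% per deg C, while a
# precipitation elasticity of 2 means a 5% precipitation increase adds 10%.
change = projected_runoff_change(-0.05, 2.0, 2.0, 0.05)  # -> 0.0 (effects cancel)
```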
Quantitative Evaluation of Hard X-ray Damage to Biological Samples using EUV Ptychography
NASA Astrophysics Data System (ADS)
Baksh, Peter; Odstrcil, Michal; Parsons, Aaron; Bailey, Jo; Deinhardt, Katrin; Chad, John E.; Brocklesby, William S.; Frey, Jeremy G.
2017-06-01
Coherent diffractive imaging (CDI) has become a standard method on a variety of synchrotron beam lines. The high brilliance short wavelength radiation from these sources can be used to reconstruct the attenuation and relative phase of a sample with nanometre resolution via CDI methods. However, the interaction between the sample and high energy ionising radiation can cause degradation of the sample structure. We demonstrate imaging of a sample of hippocampal neurons by the ptychography method, using a laboratory-based extreme ultraviolet (EUV) source driven by high harmonic generation (HHG). The significantly increased contrast of the sample in EUV light allows identification of damage induced by exposure to 7.3 keV photons, without causing any damage to the sample itself.
Flood hazard assessment in areas prone to flash flooding
NASA Astrophysics Data System (ADS)
Kvočka, Davor; Falconer, Roger A.; Bray, Michaela
2016-04-01
Contemporary climate projections suggest that there will be an increase in the occurrence of high-intensity rainfall events in the future. These precipitation extremes are usually the main cause for the emergence of extreme flooding, such as flash flooding. Flash floods are among the most unpredictable, violent and fatal natural hazards in the world. Furthermore, it is expected that flash flooding will occur even more frequently in the future due to more frequent development of extreme weather events, which will greatly increase the danger to people caused by flash flooding. This being the case, there will be a need for high resolution flood hazard maps in areas susceptible to flash flooding. This study investigates what type of flood hazard assessment methods should be used for assessing the flood hazard to people caused by flash flooding. Two different types of flood hazard assessment methods were tested: (i) a widely used method based on an empirical analysis, and (ii) a new, physically based and experimentally calibrated method. Two flash flood events were considered herein, namely: the 2004 Boscastle flash flood and the 2007 Železniki flash flood. The results obtained in this study suggest that in the areas susceptible to extreme flooding, the flood hazard assessment should be conducted using methods based on a mechanics-based analysis. In comparison to standard flood hazard assessment methods, these physically based methods: (i) take into account all of the physical forces, which act on a human body in floodwater, (ii) successfully adapt to abrupt changes in the flow regime, which often occur for flash flood events, and (iii) rapidly assess a flood hazard index in a relatively short period of time.
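As a concrete instance of the first, empirical class of methods compared above, UK guidance (DEFRA/Environment Agency) rates flood hazard to people from depth, velocity and a debris factor. This is illustrative only and is not the mechanics-based method the study recommends:

```python
def flood_hazard_rating(depth_m, velocity_ms, debris_factor=0.5):
    """Empirical flood hazard rating HR = d * (v + 0.5) + DF.

    depth_m       : floodwater depth [m]
    velocity_ms   : flow velocity [m/s]
    debris_factor : additive allowance for debris load (guidance-dependent)
    """
    return depth_m * (velocity_ms + 0.5) + debris_factor

# 1 m of water moving at 1.5 m/s with a moderate debris factor
hr = flood_hazard_rating(1.0, 1.5, 0.5)   # -> 2.5
```

The study's criticism is visible in the formula itself: it contains no explicit force balance on a human body, so it cannot adapt to the abrupt flow-regime changes typical of flash floods.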
Changes in the frequency of extreme air pollution events over the Eastern United States and Europe
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Fiore, A. M.; Fang, Y.; Staehelin, J.
2011-12-01
Over the past few decades, thresholds for national air quality standards, intended to protect public health and welfare, have been lowered repeatedly. At the same time observations over Europe and the Eastern U.S. demonstrate that extreme air pollution events (high O3 and PM2.5) are typically associated with stagnation events. Recent work showed that in a changing climate high air pollution events are likely to increase in frequency and duration. Within this work we examine meteorological and surface ozone observations from CASTNet over the U.S. and EMEP over Europe and "idealized" simulations with the GFDL AM3 chemistry-climate model, which isolate the role of climate change on air quality. Specifically, we examine an "idealized 1990s" simulation, forced with 20-year mean monthly climatologies for sea surface temperatures and sea ice from observations for 1981-2000, and an "idealized 2090s" simulation forced by the observed climatologies plus the multi-model mean changes in sea surface temperature and sea ice simulated by 19 IPCC AR-4 models under the A1B scenario for 2081-2100. With innovative statistical tools (empirical orthogonal functions (EOFs) and extreme value theory (EVT)), we analyze the frequency distribution of past, present and future extreme air pollution events over the Eastern United States and Europe. The upper tail of observed values at individual stations (e.g., within CASTNet), i.e., the extremes (maximum daily 8-hour average (MDA8) O3 > 60 ppb), is poorly described by a Gaussian distribution. However, further analysis showed that applying Peaks-Over-Threshold models better captures the extremes and allows us to estimate return levels of pollution events above threshold values of interest. We next apply EOF analysis to identify regions that vary coherently within the ground-based monitoring networks.
Over the United States, the first EOF obtained from the model in both the 1990s and 2090s idealized simulations identifies the Northeast as a region that varies coherently. Correlation analysis reveals that this EOF pattern is most strongly expressed in association with high surface temperature and high surface pressure conditions, consistent with previous work showing that observed O3 episodes over this area reflect the combined impacts of stagnation and increased chemical production. Next steps include the extension of this analysis applying EVT tools to the principal component time series associated with this EOF. The combination of EOF and EVT tools applied to the GFDL AM3 1990s vs. 2090s idealized simulations will enable us to quantify changes in the return levels of air pollution extremes. Therefore the combination of observational data and numerical and statistical models should allow us to identify key driving forces between high air pollution events and to estimate changes in the frequency of such events under different climate change scenarios.
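The Peaks-Over-Threshold return levels discussed above have a closed form once a generalized Pareto distribution has been fitted to the threshold exceedances. A sketch of the standard return-level formula; the parameter values in the example are hypothetical, not fitted to CASTNet or EMEP data:

```python
import math

def gpd_return_level(u, sigma, xi, zeta_u, m):
    """m-observation return level for a Peaks-Over-Threshold model.

    u      : threshold (e.g. 60 ppb MDA8 O3)
    sigma  : GPD scale parameter fitted to exceedances of u
    xi     : GPD shape parameter
    zeta_u : probability that a single observation exceeds u
    m      : return period expressed in number of observations
    """
    if abs(xi) < 1e-12:                           # exponential limit, xi -> 0
        return u + sigma * math.log(m * zeta_u)
    return u + (sigma / xi) * ((m * zeta_u) ** xi - 1.0)

# Ozone level exceeded on average once per decade of daily data (hypothetical fit)
level_10yr = gpd_return_level(u=60.0, sigma=5.0, xi=0.1, zeta_u=0.05, m=3650)
```

Comparing such return levels between the 1990s and 2090s simulations is exactly the kind of quantification of changing pollution extremes the abstract proposes.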
NASA Astrophysics Data System (ADS)
Guan, Wen; Li, Li; Jin, Weiqi; Qiu, Su; Zou, Yan
2015-10-01
Extreme-low-light CMOS sensors have been widely applied in the field of night vision as a new type of solid-state image sensor. However, when scene illumination changes drastically or is too strong, an extreme-low-light CMOS sensor cannot clearly render both the highlight and low-light regions. To address this partial-saturation problem in night vision, an HDR image fusion algorithm based on the Laplacian pyramid was investigated. Because the overall gray level and contrast of a low-light image are very low, we choose a fusion strategy based on regional average gradient for the top pyramid layer of the long- and short-exposure images, which carries rich brightness and texture features. The remaining layers, which represent the edge information of the target, use a fusion strategy based on regional energy. During reconstruction of the source image from the Laplacian pyramid, we compare the fusion results obtained with four kinds of base images. The algorithm is tested in Matlab and compared with different fusion strategies, using three objective evaluation parameters: information entropy, average gradient, and standard deviation. Experiments in different low-illumination environments show that the proposed algorithm rapidly achieves a wide dynamic range while keeping entropy high, suggesting a promising prospect for further application of the optimized algorithm. Keywords: high dynamic range imaging, image fusion, multi-exposure image, weight coefficient, information fusion, Laplacian pyramid transform.
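The pyramid-based fusion described above can be sketched in a few lines of NumPy. This is a simplified stand-in, not the paper's Matlab implementation: the box-average decimation replaces a proper Gaussian blur, the energy rule is per-pixel rather than over a regional window, and the top layer is simply averaged instead of fused by regional average gradient.

```python
import numpy as np

def down(img):
    """2x decimation by box averaging -- a crude stand-in for Gaussian blur."""
    return (img[0::2, 0::2] + img[1::2, 0::2] +
            img[0::2, 1::2] + img[1::2, 1::2]) / 4.0

def up(img, shape):
    """Nearest-neighbour 2x expansion back to `shape`."""
    out = np.repeat(np.repeat(img, 2, axis=0), 2, axis=1)
    return out[:shape[0], :shape[1]]

def laplacian_pyramid(img, levels):
    pyr, cur = [], img
    for _ in range(levels):
        nxt = down(cur)
        pyr.append(cur - up(nxt, cur.shape))   # band-pass detail layer
        cur = nxt
    pyr.append(cur)                            # top (coarsest) layer
    return pyr

def fuse(long_exp, short_exp, levels=3):
    pa = laplacian_pyramid(long_exp, levels)
    pb = laplacian_pyramid(short_exp, levels)
    fused = []
    for la, lb in zip(pa[:-1], pb[:-1]):
        # Energy rule: keep the coefficient with the larger squared magnitude.
        fused.append(np.where(la ** 2 >= lb ** 2, la, lb))
    # Top layer: plain average as a stand-in for the gradient-based rule.
    fused.append(0.5 * (pa[-1] + pb[-1]))
    # Reconstruct by upsampling and adding the detail layers back.
    img = fused[-1]
    for detail in reversed(fused[:-1]):
        img = up(img, detail.shape) + detail
    return img

a = np.random.default_rng(1).random((64, 64))   # stand-in long exposure
b = np.random.default_rng(2).random((64, 64))   # stand-in short exposure
out = fuse(a, b)
```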
Estimating missing daily temperature extremes in Jaffna, Sri Lanka
NASA Astrophysics Data System (ADS)
Thevakaran, A.; Sonnadara, D. U. J.
2018-04-01
The accuracy of reconstructing missing daily temperature extremes at the Jaffna climatological station, situated in the northern part of the dry zone of Sri Lanka, is presented. The adopted method uses standard departures of daily maximum and minimum temperature values at four neighbouring stations, Mannar, Anuradhapura, Puttalam and Trincomalee, to estimate the standard departures of daily maximum and minimum temperatures at the target station, Jaffna. Daily maximum and minimum temperatures from 1966 to 1980 (15 years) were used to test the validity of the method. The accuracy of the estimation is higher for daily maximum temperature than for daily minimum temperature: about 95% of the estimated daily maximum temperatures are within ±1.5 °C of the observed values, against about 92% for daily minimum temperature. By calculating the standard deviation of the difference between estimated and observed values, we show that the errors in estimating the daily maximum and minimum temperatures are ±0.7 and ±0.9 °C, respectively. To obtain the best accuracy when estimating the missing daily temperature extremes, it is important to include Mannar, the station nearest to the target station, Jaffna. We conclude from the analysis that the method can be applied successfully to reconstruct the missing daily temperature extremes at Jaffna, where data are unavailable due to frequent disruptions caused by civil unrest and hostilities in the region during the period 1984 to 2000.
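The standard-departure method described above can be sketched as follows: each neighbouring series is converted to z-scores, the z-scores are averaged across stations, and the result is back-transformed to the target station's mean and standard deviation. The synthetic "regional signal" data and the equal weighting of stations are illustrative assumptions, not the paper's calibration.

```python
import numpy as np

def fill_missing(target, neighbours):
    """Fill NaNs in `target` (1-D) using the mean standard departure of
    `neighbours` (2-D, one station per row), back-transformed to the
    target station's units. A simplified sketch of the abstract's method."""
    t_mean, t_std = np.nanmean(target), np.nanstd(target)
    # Standard departure (z-score) of each neighbouring station series.
    z = (neighbours - np.nanmean(neighbours, axis=1, keepdims=True)) / \
        np.nanstd(neighbours, axis=1, keepdims=True)
    z_bar = np.nanmean(z, axis=0)            # average departure over stations
    estimate = t_mean + z_bar * t_std        # back-transform to target units
    return np.where(np.isnan(target), estimate, target)

rng = np.random.default_rng(0)
signal = rng.normal(30.0, 2.0, 365)                    # shared regional signal
target = signal + rng.normal(0.0, 0.3, 365)            # target station (Tmax, say)
neighbours = signal + rng.normal(0.0, 0.3, (4, 365))   # four nearby stations
target[100] = np.nan                                   # simulate a missing day
filled = fill_missing(target, neighbours)
```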
Balaguer Martínez, Josep Vicent; Del Castillo Aguas, Guadalupe; Gallego Iborra, Ana
2017-12-30
To assess whether antibiotic prescription and the performance of complementary tests are related to frequency of use and loyalty in Primary Care. An analytical descriptive study performed through a network of Primary Care sentinel paediatricians (PAPenRed). Each paediatrician reviewed the spontaneous visits (in Primary Care and in Emergency Departments) of 15 patients, randomly chosen from their quota, over 12 months. The antibiotic prescriptions and complementary tests performed on these patients were also collected. A total of 212 paediatricians took part and reviewed 2,726 patients. It was found that 8.3% were moderate over-users (mean + 1-2 standard deviations) and 5.2% extreme over-users (more than mean + 2 standard deviations). Almost half (49.6%) were high-loyalty patients (more than 75% of visits with their own doctor). The incidence ratio of antibiotic prescription was 2.13 (1.74-2.62) for moderate over-users and 3.25 (2.55-4.13) for extreme over-users, compared to non-over-user children. The incidence ratios for diagnostic tests were 2.25 (1.86-2.73) and 3.48 (2.78-4.35), respectively. The incidence ratios for antibiotic prescription were 1.34 (1.16-1.55) in patients with medium-high loyalty, 1.45 (1.15-1.83) for medium-low loyalty, and 1.08 (0.81-1.44) for low loyalty, compared to patients with high loyalty. The incidence ratios for diagnostic tests were 1.46 (1.27-1.67), 1.60 (1.28-2.00), and 0.84 (0.63-1.12), respectively. Antibiotic prescription and complementary tests were significantly related to medical overuse. They were also related to loyalty, but less strongly. Copyright © 2017. Publicado por Elsevier España, S.L.U.
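The over-user categories above are defined purely by standard-deviation cut-offs on visit counts, which can be sketched directly; the example visit counts are made up for illustration.

```python
import numpy as np

def classify_use(visits):
    """Label patients by visit frequency with the abstract's cut-offs:
    moderate over-user = mean + 1 to 2 SD, extreme over-user = > mean + 2 SD."""
    m, s = visits.mean(), visits.std()
    labels = np.full(visits.shape, "non-over-user", dtype=object)
    labels[(visits > m + s) & (visits <= m + 2 * s)] = "moderate over-user"
    labels[visits > m + 2 * s] = "extreme over-user"
    return labels

# Hypothetical annual visit counts for nine patients.
labels = classify_use(np.array([1, 2, 3, 2, 1, 2, 3, 2, 20]))
```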
Synergetic approach for simple and rapid conjugation of gold nanoparticles with oligonucleotides.
Li, Jiuxing; Zhu, Binqing; Yao, Xiujie; Zhang, Yicong; Zhu, Zhi; Tu, Song; Jia, Shasha; Liu, Rudi; Kang, Huaizhi; Yang, Chaoyong James
2014-10-08
Attaching thiolated DNA on gold nanoparticles (AuNPs) has been extremely important in nanobiotechnology because DNA-AuNPs combine the programmability and molecular recognition properties of the biopolymers with the optical, thermal, and catalytic properties of the inorganic nanomaterials. However, current standard protocols to attach thiolated DNA on AuNPs involve time-consuming, tedious steps and do not perform well for large AuNPs, thereby greatly restricting applications of DNA-AuNPs. Here we demonstrate a rapid and facile strategy to attach thiolated DNA on AuNPs based on the excellent stabilization effect of mPEG-SH on AuNPs. AuNPs are first protected by mPEG-SH in the presence of Tween 20, which results in excellent stability of AuNPs in high ionic strength environments and at extreme pHs. A high concentration of NaCl can be applied to the mixture of DNA and AuNPs directly, allowing highly efficient DNA attachment to the AuNP surface by minimizing electrostatic repulsion. The entire DNA loading process can be completed in 1.5 h with only a few simple steps. DNA-loaded AuNPs are stable for more than 2 weeks at room temperature, and they can precisely hybridize with the complementary sequence, which was applied to prepare core-satellite nanostructures. Moreover, a cytotoxicity assay confirmed that the DNA-AuNPs synthesized by this method exhibit lower cytotoxicity than those prepared by current standard methods. The proposed method provides a new way to stabilize AuNPs for rapid and facile loading of thiolated DNA onto AuNPs and will find wide applications in many areas requiring DNA-AuNPs, including diagnosis, therapy, and imaging.
Bayer, F L
1997-01-01
Recycled plastics have been used in food-contact applications since 1990 in various countries around the world. To date, there have been no reported issues concerning health or off-taste resulting from the use of recycled plastics in food-contact applications. This is due to the fact that the criteria that have been established regarding safety and processing are based on extremely high standards that render the finished recycled material equivalent in virtually all aspects to virgin polymers. The basis for this conclusion is detailed in this document.
Characterization of the IEC 61000-4-6 Electromagnetic Clamp for Conducted-Immunity Testing
NASA Astrophysics Data System (ADS)
Grassi, F.; Pignari, S. A.; Spadacini, G.; Toscani, N.; Pelissou, P.
2016-05-01
A multiconductor transmission line (MTL) model is used to investigate the operation of the IEC 61000-4-6 electromagnetic (EM) clamp in a conducted-immunity test setup for aerospace applications. Aspects of interest include the performance of such a coupling device at very high frequencies (up to 1 GHz) and for extreme values of the common-mode impedance of equipment (short circuits, open circuits). The MTL model is finally exploited to predict the frequency response of the coupling and decoupling factors defined in the IEC 61000-4-6 standard.
Ultrasonographic identification of the anatomical landmarks that define cervical lymph nodes spaces.
Lenghel, Lavinia Manuela; Baciuţ, Grigore; Botar-Jid, Carolina; Vasilescu, Dan; Bojan, Anca; Dudea, Sorin M
2013-03-01
The localization of cervical lymph nodes is extremely important in practice for the positive and differential diagnosis as well as the staging of cervical lymphadenopathies. Ultrasonography represents the first line imaging method in the diagnosis of cervical lymphadenopathies due to its excellent resolution and high diagnosis accuracy. The present paper aims to illustrate the ultrasonographic identification of the anatomical landmarks used for the definition of cervical lymphatic spaces. The application of standardized views allows a delineation of clear anatomical landmarks and an accurate localization of the cervical lymph nodes.
Tidally Heated Terrestrial Exoplanets
NASA Astrophysics Data System (ADS)
Henning, Wade Garrett
This work models the surface and internal temperatures for hypothetical terrestrial planets in situations involving extreme tidal heating. The feasibility of such planets is evaluated in terms of the orbital perturbations that may give rise to them, their required proximity to a host star, and the potential for the input tidal heating to cause significant partial melting of the mantle. Trapping terrestrial planets into 2:1 resonances with migrating Hot Jupiters is considered as a reasonable way for Earth-like worlds to both maintain high eccentricities and to move to short enough orbital periods (1-20 days) for extreme tidal heating to occur. Secular resonance and secular orbital perturbations may support moderate tidal heating at a low equilibrium eccentricity. At orbital periods below 10-30 days, with eccentricities from 0.01 to 0.1, tidal heat may greatly exceed radiogenic heat production. It is unlikely to exceed insolation, except when orbiting very low luminosity hosts, and thus will have limited surface temperature expression. Observations of such bodies may not be able to detect tidal surface enhancements given a few percent uncertainty in albedo, except on the nightside of spin-synchronous airless objects. Otherwise detection may occur via spectral detection of hotspots or high volcanic gas concentrations including sulfur dioxide and hydrogen sulfide. The most extreme cases may be able to produce magma oceans, or magma slush mantles with up to 40-60% melt fractions. Tides may alter the habitable zones for smaller red dwarf stars, but are generally detrimental. Multiple viscoelastic models, including the Maxwell, Voigt-Kelvin, Standard Anelastic Solid, and Burgers rheologies, are explored and applied to objects such as Io and the super-Earth planet GJ 876d.
The complex-valued Love number for the Burgers rheology is derived and found to be a useful improvement when modeling the low-temperature behavior of tidal bodies, particularly during low-eccentricity excursions. Viscoelastic solutions for GJ 876d are typical of extreme short-period, high-eccentricity objects, with tidal-convective equilibrium heat rates between ~10,000 and 500,000 terawatts.
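For the simplest of the rheologies listed above, the Maxwell model, the complex Love number and tidal heating rate can be sketched with the standard homogeneous, incompressible-body expressions. This is a toy calculation, not the dissertation's multi-rheology code: the Io-like parameter values (rigidity, viscosity, density, gravity, radius, mean motion, eccentricity) are illustrative assumptions.

```python
import numpy as np

G = 6.674e-11  # gravitational constant (SI)

def maxwell_k2(mu, eta, omega, rho, g, R):
    """Complex degree-2 Love number of a homogeneous, incompressible Maxwell
    body: k2 = 1.5 / (1 + 19*mu_c / (2*rho*g*R)), with complex rigidity
    mu_c(omega) = mu * i*omega*tau / (1 + i*omega*tau), tau = eta/mu."""
    tau = eta / mu
    mu_c = mu * (1j * omega * tau) / (1.0 + 1j * omega * tau)
    return 1.5 / (1.0 + 19.0 * mu_c / (2.0 * rho * g * R))

def tidal_heat(mu, eta, rho, g, R, n, e):
    """Homogeneous-body tidal dissipation rate (W) at low eccentricity:
    dE/dt = (21/2) * |Im(k2)| * n**5 * R**5 * e**2 / G."""
    k2 = maxwell_k2(mu, eta, n, rho, g, R)
    return 10.5 * abs(k2.imag) * n**5 * R**5 * e**2 / G

# Illustrative Io-like numbers, not values taken from the dissertation.
heat = tidal_heat(6e10, 1e15, 3530.0, 1.8, 1.82e6, 4.11e-5, 0.0041)
```

Swapping `maxwell_k2` for a Burgers or Andrade complex compliance changes only the `mu_c` line, which is why the choice of rheology matters most at low temperatures where the viscous term dominates.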
Cintas, Holly Lea; Parks, Rebecca; Don, Sarah; Gerber, Lynn
2011-01-01
Content validity and reliability of the Brief Assessment of Motor Function (BAMF) Upper Extremity Gross Motor Scale (UEGMS) were evaluated in this prospective, descriptive study. The UEGMS is one of five ordinal scales designed for quick documentation of gross, fine and oral motor skill levels. Designed to be independent of age and diagnosis, it is intended for use for infants through young adults. An expert panel of 17 physical therapists and 13 occupational therapists refined the content by responding to a standard questionnaire asking whether each item should be included, is clearly worded, should be reordered higher or lower, is functionally relevant, and is easily discriminated. Ratings of content validity exceeded the criterion except for two items, which may represent different perspectives of physical and occupational therapists. The UEGMS was modified using the quantitative and qualitative feedback from the questionnaires. For reliability, five raters scored videotaped motor performances of ten children. Coefficients for inter-rater (0.94) and intra-rater (0.95) reliability were high. The results provide evidence of content validity and reliability of the UEGMS for assessment of upper extremity gross motor skill. PMID:21599568
Santos, Celso Augusto Guimarães; Brasil Neto, Reginaldo Moura; Passos, Jacqueline Sobral de Araújo; da Silva, Richarde Marques
2017-06-01
In this work, the use of Tropical Rainfall Measuring Mission (TRMM) rainfall data and the Standardized Precipitation Index (SPI) for monitoring spatial and temporal drought variability in the Upper São Francisco River basin is investigated, characterizing the spatiotemporal behavior of droughts and identifying clusters of regions with similar behavior. The joint analysis of clusters, dendrograms, and the spatial distribution of SPI values proved to be a powerful tool for identifying homogeneous regions. The results showed that the northeast region of the basin has the lowest rainfall indices and the southwest region has the highest rainfall depths, and that the region has well-defined dry and rainy seasons, from June to August and November to January, respectively. An analysis of drought and rain conditions showed that the studied region was homogeneous and well-distributed; however, the number of extreme and severe drought events in short-, medium-, and long-term analyses was higher than expected in regions with high rainfall depths, particularly in the south/southwest and southeast areas. Thus, an alternative classification is proposed to characterize drought, which spatially categorizes the drought type (short-, medium-, or long-term) according to the analyzed drought event type (extreme, severe, moderate, or mild).
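The SPI underlying this analysis can be sketched as follows: fit a gamma distribution to the accumulated rainfall series and map its cumulative probabilities onto the standard normal, so that SPI values of roughly -2 or below indicate extreme drought. This sketch omits the zero-rainfall mixture term and the multi-timescale (short-, medium-, long-term) accumulation used in the full SPI, and the synthetic rainfall record is illustrative, not TRMM data.

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index for a 1-D accumulated-rainfall series:
    fit a gamma distribution, then transform cumulative probabilities to
    standard-normal quantiles. Zero-rainfall handling is omitted."""
    a, _, scale = stats.gamma.fit(precip, floc=0.0)
    cdf = stats.gamma.cdf(precip, a, loc=0.0, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(0)
monthly_rain = rng.gamma(2.0, 50.0, 360)   # synthetic 30-year monthly record (mm)
index = spi(monthly_rain)
```

By construction the index is approximately standard normal, which is what makes SPI values comparable across the wet southwest and the dry northeast of the basin.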
NASA Astrophysics Data System (ADS)
Bourdine, Anton V.; Zhukov, Alexander E.
2017-04-01
High-bit-rate laser-based data transmission over silica optical fibers with core diameters enlarged relative to standard single-mode fibers has found a variety of infocommunication applications. Since the IEEE 802.3z standard was ratified in 1998, this technique has been widely used for short-range, in-premises, distributed multi-gigabit networks based on new-generation laser-optimized 50/125 multimode fibers of Cat. OM2-OM4. Nowadays it is also in demand for on-board cable systems and industrial network applications requiring bit rates of 1 Gbps and more over fibers with core diameters extremely enlarged, up to 100 μm. This work presents an alternative method for designing special refractive index profiles of silica few-mode fibers with extremely enlarged core diameter that enhance modal bandwidth under a few-mode regime of laser-based optical data transmission. Results are presented on refractive index profile synthesis for few-mode fibers with reduced differential mode delay in the central region of the O-band, along with computed differential-mode-delay spectral curves corresponding to the profiles of 50/125 and 100/125 fibers for in-premises and on-board/industrial cable systems.
The other side of the coin: urban heat islands as shields from extreme cold
NASA Astrophysics Data System (ADS)
Yang, J.; Bou-Zeid, E.
2017-12-01
Extensive studies focusing on urban heat islands (UHIs) during hot periods create a perception that UHIs are invariably hazardous to human health and the sustainability of cities. Consequently, cities have invested substantial resources to try to mitigate UHIs. These urban policies can have serious repercussions since the health risks associated with cold weather are in fact higher than for heat episodes, yet wintertime UHIs have hardly been explored. We combine ground observations from 12 U.S. cities and high-resolution simulations to show that UHIs not only warm urban areas in the winter, but also further intensify during cold waves by up to 1.32 ± 0.78 °C (mean ± standard deviation) at night. Urban heat islands serve as shelters against extreme cold and provide invaluable benefits by reducing health risks and heating demand. More importantly, our simulations indicate that standard UHI mitigation measures such as green or cool roofs reduce these cold-time amenities to different extents. Cities, particularly in cool and cold temperate climates, should hence revisit policies and efforts that are designed only for hot periods. A paradigm shift is urgently needed to give equal weight to the wintertime benefits of UHIs in the sustainability and resilience blueprints of cities.
Low Cost Desktop Image Analysis Workstation With Enhanced Interactive User Interface
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Huang, H. K.
1989-05-01
A multimodality picture archiving and communication system (PACS) is in routine clinical use in the UCLA Radiology Department. Several types of workstations are currently implemented for this PACS. Among them, the Apple Macintosh II personal computer was recently chosen to serve as a desktop workstation for display and analysis of radiological images. This personal computer was selected mainly because of its extremely friendly user interface, its popularity among the academic and medical community, and its low cost. In comparison to other microcomputer-based systems, the Macintosh II offers the following advantages: the extreme standardization of its user interface, file system, and networking, and the availability of a very large variety of commercial software packages. In the current configuration the Macintosh II operates as a stand-alone workstation where images are imported from a centralized PACS server through an Ethernet network using the standard TCP/IP protocol and stored locally on magnetic disk. The high-resolution screens (1024x768 pixels x 8 bits) offer sufficient performance for image display and analysis. We focused our project on the design and implementation of a variety of image analysis algorithms, ranging from automated structure and edge detection to sophisticated dynamic analysis of sequential images. Specific analysis programs were developed for ultrasound images, digitized angiograms, MRI and CT tomographic images, and scintigraphic images.
Wolfe, Edward W; McGill, Michael T
2011-01-01
This article summarizes a simulation study of the performance of five item quality indicators (the weighted and unweighted versions of the mean square and standardized mean square fit indices and the point-measure correlation) under conditions of relatively high and low amounts of missing data under both random and conditional patterns of missing data for testing contexts such as those encountered in operational administrations of a computerized adaptive certification or licensure examination. The results suggest that weighted fit indices, particularly the standardized mean square index, and the point-measure correlation provide the most consistent information between random and conditional missing data patterns and that these indices perform more comparably for items near the passing score than for items with extreme difficulty values.
TH-A-BRF-09: Integration of High-Resolution MRSI Into Glioblastoma Treatment Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schreibmann, E; Cordova, J; Shu, H
2014-06-15
Purpose: Identification of a metabolite signature that shows significant tumor cell infiltration into normal brain in regions that do not appear abnormal on standard MRI scans would be extremely useful for radiation oncologists to choose optimal regions of brain to treat, and to quantify response beyond the MacDonald criteria. We report on integration of high-resolution magnetic resonance spectroscopic imaging (HR-MRSI) with radiation dose escalation treatment planning to define and target regions at high risk for recurrence. Methods: We propose to supplement standard MRI with a special technique performed on an MRI scanner to measure the metabolite levels within defined volumes. Metabolite imaging was acquired using an advanced MRSI technique combining 3D echo-planar spectroscopic imaging (EPSI) with parallel acquisition (GRAPPA) using a multichannel head coil that allows acquisition of whole brain metabolite maps with 108 μl resolution in 12 minutes implemented on a 3T MR scanner. Elevation in the ratio of two metabolites, choline (Cho, elevated in proliferating high-grade gliomas) and N-acetyl aspartate (NAA, a normal neuronal metabolite), was used to image infiltrating high-grade glioma cells in vivo. Results: The metabolite images were co-registered with standard contrast-enhanced T1-weighted MR images using in-house registration software and imported into the treatment-planning system. Regions with tumor infiltration are identified on the metabolic images and used to create adaptive IMRT plans that deliver a standard dose of 60 Gy to the standard target volume and an escalated dose of 75 Gy (or higher) to the most suspicious regions, identified as areas with elevated Cho/NAA ratio.
Conclusion: We have implemented a state-of-the-art HR-MRSI technology that can generate metabolite maps of the entire brain in a clinically acceptable scan time, coupled with the introduction of an imaging co-registration/analysis program that combines MRSI data with standard imaging studies in a clinically useful fashion.
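The voxel-level dose-painting logic described above (60 Gy standard, 75 Gy where Cho/NAA is elevated) can be sketched as a simple ratio threshold on co-registered metabolite maps. The `ratio_cutoff` value and the toy arrays are illustrative assumptions, not the clinical criterion or data from the study.

```python
import numpy as np

def dose_map(cho, naa, base_dose=60.0, boost_dose=75.0, ratio_cutoff=2.0):
    """Toy dose-painting sketch: voxels with an elevated Cho/NAA ratio
    receive the escalated dose; all others receive the standard dose."""
    ratio = cho / np.maximum(naa, 1e-9)   # guard against division by zero
    return np.where(ratio > ratio_cutoff, boost_dose, base_dose)

# Hypothetical 2x2 metabolite maps: one voxel has elevated choline.
cho = np.array([[1.0, 3.0], [1.0, 1.0]])
naa = np.ones((2, 2))
plan = dose_map(cho, naa)
```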
Structural Extremes in a Cretaceous Dinosaur
Sereno, Paul C.; Wilson, Jeffrey A.; Witmer, Lawrence M.; Whitlock, John A.; Maga, Abdoulaye; Ide, Oumarou; Rowe, Timothy A.
2007-01-01
Fossils of the Early Cretaceous dinosaur, Nigersaurus taqueti, document for the first time the cranial anatomy of a rebbachisaurid sauropod. Its extreme adaptations for herbivory at ground-level challenge current hypotheses regarding feeding function and feeding strategy among diplodocoids, the larger clade of sauropods that includes Nigersaurus. We used high resolution computed tomography, stereolithography, and standard molding and casting techniques to reassemble the extremely fragile skull. Computed tomography also allowed us to render the first endocast for a sauropod preserving portions of the olfactory bulbs, cerebrum and inner ear, the latter permitting us to establish habitual head posture. To elucidate evidence of tooth wear and tooth replacement rate, we used photographic-casting techniques and crown thin sections, respectively. To reconstruct its 9-meter postcranial skeleton, we combined and size-adjusted multiple partial skeletons. Finally, we used maximum parsimony algorithms on character data to obtain the best estimate of phylogenetic relationships among diplodocoid sauropods. Nigersaurus taqueti shows extreme adaptations for a dinosaurian herbivore including a skull of extremely light construction, tooth batteries located at the distal end of the jaws, tooth replacement as fast as one per month, an expanded muzzle that faces directly toward the ground, and hollow presacral vertebral centra with more air sac space than bone by volume. A cranial endocast provides the first reasonably complete view of a sauropod brain including its small olfactory bulbs and cerebrum. Skeletal and dental evidence suggests that Nigersaurus was a ground-level herbivore that gathered and sliced relatively soft vegetation, the culmination of a low-browsing feeding strategy first established among diplodocoids during the Jurassic. PMID:18030355
Hedman, Travis L; Chapman, Ted T; Dewey, William S; Quick, Charles D; Wolf, Steven E; Holcomb, John B
2007-01-01
Burn therapists routinely are tasked to position the lower extremities of burn patients for pressure ulcer prevention, skin graft protection, donor site ventilation, and edema reduction. We developed two durable and low-maintenance devices that allow effective positioning of the lower extremities. The high-profile and low-profile leg net devices were simple to fabricate and maintain. The frame was assembled using a three-quarter-inch diameter copper pipe and copper fittings (45 degrees, 90 degrees, and tees). A double layer of elasticized tubular netting was pulled over the frame and doubled back for leg support to complete the devices. The devices can be placed on any bed surface. The netting can be exchanged when soiled and the frame can be disinfected between patients using standard techniques. Both devices were used on approximately 250 patients for a total of 1200 treatment days. No incidence of pressure ulcer was observed, and graft take was not adversely affected. The devices have not required repairs or replacement. Medical providers reported they are easy to apply and effectively maintain proper positioning throughout application. Neither device interfered with the application of other positioning devices. Both devices were found to be an effective method of positioning lower extremities to prevent pressure ulcer, minimize graft loss and donor site morbidity, and reduce edema. The devices allowed for proper wound ventilation and protected grafted lower extremities on any bed surface. The devices are simple to fabricate and maintain. Both devices can be effectively used simultaneously with other positioning devices.
Shan, Tong; Liu, Yulong; Tang, Xiangyang; Bai, Qing; Gao, Yu; Gao, Zhao; Li, Jinyu; Deng, Jian; Yang, Bing; Lu, Ping; Ma, Yuguang
2016-10-26
Great efforts have been devoted to developing efficient deep blue organic light-emitting diode (OLED) materials that meet the European Broadcasting Union (EBU) standard, with Commission Internationale de l'Eclairage (CIE) coordinates of (0.15, 0.06), for flat-panel displays and solid-state lighting. However, high-performance deep blue OLEDs are still rare in applications. Herein, two efficient deep blue emitters, PIMNA and PyINA, are designed and synthesized by coupling naphthalene with phenanthreneimidazole and pyreneimidazole, respectively. Their balanced ambipolar transport is demonstrated by single-carrier devices. Their nondoped OLEDs show deep blue emission with extremely small CIE y values of 0.034 for PIMNA and 0.084 for PyINA, with negligible efficiency roll-off. To take advantage of the high photoluminescence quantum efficiency of PIMNA and the large fraction of singlet exciton formation of PyINA, doped devices are fabricated by dispersing PyINA into PIMNA. A significantly improved maximum external quantum efficiency (EQE) of 5.05% is obtained through very effective energy transfer, with CIE coordinates of (0.156, 0.060), and the EQE remains 4.67% at 1000 cd m-2, among the best of reported deep blue OLEDs matching the stringent EBU standard.
Constructing An Event Based Aerosol Product Under High Aerosol Loading Conditions
NASA Astrophysics Data System (ADS)
Levy, R. C.; Shi, Y.; Mattoo, S.; Remer, L. A.; Zhang, J.
2016-12-01
High aerosol loading events, such as Indonesia's forest fires in Fall 2015 or the persistent wintertime haze near Beijing, attract tremendous interest due to their large impact on regional visibility and air quality. Understanding the optical properties of these events, and being able to simulate and predict them, is beneficial. However, it is a great challenge to consistently identify and then retrieve aerosol optical depth (AOD) from passive sensors during heavy aerosol events. Reasons include: 1) large differences between the optical properties of high-loading aerosols and those under normal conditions; 2) spectral signals of optically thick aerosols can be mistaken for the surface, depending on aerosol type; and 3) extremely optically thick aerosol plumes can be misidentified as clouds because of their high optical thickness. Thus, even under clear-sky conditions, the global distribution of extreme aerosol events is not well captured in datasets such as the MODIS Dark-Target (DT) aerosol product. In this study, through the combined use of the OMI Aerosol Index, the MODIS cloud product, and the operational DT product, heavy smoke events over the seven sea region are identified and retrieved over the dry season. An event-based aerosol product that complements the standard "global" aerosol retrieval will be created and evaluated, and the impact of missing high-AOD retrievals on the regional aerosol climatology will be studied using this newly developed research product.
Testing and the Testing Industry: A Third View.
ERIC Educational Resources Information Center
Williams, John D.
Different viewpoints regarding educational testing are described. While some people advocate continuing reliance upon standardized tests, others favor the discontinuation of such achievement and intelligence tests. The author recommends a moderate view somewhere between these two extremes. Problems associated with standardized testing in the…
Code of Federal Regulations, 2011 CFR
2011-01-01
... 7 Agriculture 2 2011-01-01 2011-01-01 false (N Group). 29.2440 Section 29.2440 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Grades § 29.2440 (N Group). Extremely common tobacco which does not meet the minimum...
Code of Federal Regulations, 2010 CFR
2010-01-01
... 7 Agriculture 2 2010-01-01 2010-01-01 false (N Group). 29.2440 Section 29.2440 Agriculture Regulations of the Department of Agriculture AGRICULTURAL MARKETING SERVICE (Standards, Inspections, Marketing... INSPECTION Standards Grades § 29.2440 (N Group). Extremely common tobacco which does not meet the minimum...
NASA Technical Reports Server (NTRS)
Munasinghe, L.; Jun, T.; Rind, D. H.
2012-01-01
Consensus on global warming is the result of multiple and varying lines of evidence, and one key ramification is the increase in frequency of extreme climate events, including record high temperatures. Here we develop a metric, called "record equivalent draws" (RED), based on record high (low) temperature observations, and show that changes in RED approximate changes in the likelihood of extreme high (low) temperatures. Since we also show that this metric is independent of the specifics of the underlying temperature distributions, RED estimates can be aggregated across different climates to provide a genuinely global assessment of climate change. Using data on monthly average temperatures across the global landmass, we find that the frequency of extreme high temperatures increased 10-fold between the first three decades of the last century (1900-1929) and the most recent decade (1999-2008). A more disaggregated analysis shows that the increase in frequency of extreme high temperatures is greater in the tropics than in higher latitudes, a pattern that is not indicated by changes in mean temperature. Our RED estimates also suggest concurrent increases in the frequency of both extreme high and extreme low temperatures during 2002-2008, a period when we observe a plateauing of global mean temperature. Using daily extreme temperature observations, we find that the frequency of extreme high temperatures is greater in the daily minimum temperature time series than in the daily maximum temperature time series. There is no such observable difference in the frequency of extreme low temperatures between the daily minimum and daily maximum.
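The raw ingredient of a record-based metric like RED is the count of record-high observations in a series, which is distribution-free under stationarity: for i.i.d. draws the expected number of records after n observations is the harmonic number 1 + 1/2 + ... + 1/n, regardless of the underlying distribution. The counting step (not the authors' full RED construction) can be sketched as:

```python
import numpy as np

def count_records(series):
    """Count record-high observations: the first value is a record, and a
    later value is a record if it exceeds every earlier one. Excess records
    relative to the i.i.d. baseline signal a shift toward warm extremes."""
    series = np.asarray(series, dtype=float)
    running_max = np.maximum.accumulate(series)
    is_record = np.concatenate(([True], series[1:] > running_max[:-1]))
    return int(is_record.sum())
```

Applying the same function to the negated series counts record lows, allowing the high/low asymmetry discussed in the abstract to be measured with one tool.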
FastID: Extremely Fast Forensic DNA Comparisons
2017-05-19
FastID: Extremely Fast Forensic DNA Comparisons. Darrell O. Ricke, PhD, Bioengineering Systems & Technologies, Massachusetts Institute of Technology Lincoln Laboratory, Lexington, MA, USA. Darrell.Ricke@ll.mit.edu. Abstract—Rapid analysis of DNA forensic samples can have a critical impact on time-sensitive investigations. Analysis of forensic DNA samples by massively parallel sequencing is creating the next gold standard for DNA
Eating Problems at Age 6 Years in a Whole Population Sample of Extremely Preterm Children
ERIC Educational Resources Information Center
Samara, Muthanna; Johnson, Samantha; Lamberts, Koen; Marlow, Neil; Wolke, Dieter
2010-01-01
Aim: The aim of this study was to investigate the prevalence of eating problems and their association with neurological and behavioural disabilities and growth among children born extremely preterm (EPC) at age 6 years. Method: A standard questionnaire about eating was completed by parents of 223 children (125 males [56.1%], 98 females [43.9%])…
ERIC Educational Resources Information Center
Sharma, Kshitij; Chavez-Demoulin, Valérie; Dillenbourg, Pierre
2017-01-01
The statistics used in education research are based on central trends such as the mean or standard deviation, discarding outliers. This paper adopts another viewpoint that has emerged in statistics, called extreme value theory (EVT). EVT claims that the bulk of normal distribution is comprised mainly of uninteresting variations while the most…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eckert-Gallup, Aubrey C.; Sallaberry, Cédric J.; Dallman, Ann R.
Environmental contours describing extreme sea states are generated as the input for numerical or physical model simulations as a part of the standard current practice for designing marine structures to survive extreme sea states. These environmental contours are characterized by combinations of significant wave height (Hs) and either energy period (Te) or peak period (Tp) values calculated for a given recurrence interval using a set of data based on hindcast simulations or buoy observations over a sufficient period of record. The use of the inverse first-order reliability method (I-FORM) is a standard design practice for generating environmental contours. This paper develops enhanced methodologies for data analysis prior to the application of the I-FORM, including the use of principal component analysis (PCA) to create an uncorrelated representation of the variables under consideration as well as new distribution and parameter fitting techniques. As a result, these modifications better represent the measured data and, therefore, should contribute to the development of more realistic representations of environmental contours of extreme sea states for determining design loads for marine structures.
2016-01-06
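The PCA decorrelation step described above can be sketched in a few lines (synthetic, correlated Hs/Te data standing in for a hindcast record; the subsequent distribution fitting and the I-FORM itself are omitted):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic, positively correlated sea-state data: significant wave
# height Hs (m) and energy period Te (s).
n = 5000
hs = rng.gamma(shape=2.0, scale=1.5, size=n)
te = 4.0 + 1.2 * np.sqrt(hs) + rng.normal(scale=0.4, size=n)
data = np.column_stack([hs, te])

# PCA step: rotate into uncorrelated principal components, the
# representation on which the parameter fitting and I-FORM would operate.
centered = data - data.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(centered, rowvar=False))
components = centered @ eigvecs  # sample-uncorrelated coordinates

print(np.corrcoef(components, rowvar=False)[0, 1])  # ~0 up to float noise
```

Working in the rotated coordinates removes the sample correlation between Hs and Te, which is what lets univariate marginal fits be applied before mapping the contour back to physical variables.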
High pressure studies using two-stage diamond micro-anvils grown by chemical vapor deposition
Vohra, Yogesh K.; Samudrala, Gopi K.; Moore, Samuel L.; ...
2015-06-10
Ultra-high static pressures have been achieved in the laboratory using two-stage micro-ball nanodiamond anvils as well as two-stage micro-paired diamond anvils machined using a focused ion-beam system. The two-stage diamond anvil designs implemented thus far suffer from a limitation: one diamond anvil slides past the other at extreme conditions. We describe a new method of fabricating two-stage diamond micro-anvils using a tungsten mask on a standard diamond anvil followed by microwave plasma chemical vapor deposition (CVD) homoepitaxial diamond growth. A prototype two-stage diamond anvil with a 300 μm culet and a CVD diamond second stage 50 μm in diameter was fabricated. We have carried out preliminary high-pressure X-ray diffraction studies on a sample of the rare-earth metal lutetium with a copper pressure standard to 86 GPa. Furthermore, the micro-anvil grown by CVD remained intact during indentation of the gasket as well as on decompression from the highest pressure of 86 GPa.
Ciotlos, Serban; Mao, Qing; Zhang, Rebecca Yu; Li, Zhenyu; Chin, Robert; Gulbahce, Natali; Liu, Sophie Jia; Drmanac, Radoje; Peters, Brock A
2016-01-01
The cell line BT-474 is a popular cell line for studying the biology of cancer and developing novel drugs. However, there is no complete, published genome sequence for this highly utilized scientific resource. In this study we sought to provide a comprehensive and useful data set for the scientific community by generating a whole genome sequence for BT-474. Five μg of genomic DNA, isolated from an early passage of the BT-474 cell line, was used to generate a whole genome sequence (114X coverage) using Complete Genomics' standard sequencing process. To provide additional variant phasing and structural variation data we also processed and analyzed two separate libraries of 5 and 6 individual cells to depths of 99X and 87X, respectively, using Complete Genomics' Long Fragment Read (LFR) technology. BT-474 is a highly aneuploid cell line with an extremely complex genome sequence. This ~300X total coverage genome sequence provides a more complete understanding of this highly utilized cell line at the genomic level.
Radon gamma-ray spectrometry with YAP:Ce scintillator
NASA Astrophysics Data System (ADS)
Plastino, Wolfango; De Felice, Pierino; de Notaristefani, Francesco
2002-06-01
The detection properties of a YAP:Ce scintillator (YAlO3:Ce crystal) optically coupled to a Hamamatsu H5784 photomultiplier with a standard bialkali photocathode have been analyzed. In particular, the application to radon and radon-daughter gamma-ray spectrometry was investigated. The crystal response has been studied under extreme conditions to simulate environments of geophysical interest, particularly those found in geothermal and volcanic areas. Tests in water up to a temperature of 100°C and in acid solutions such as HCl (37%), H2SO4 (48%) and HNO3 (65%) have been performed. Measurements with standard radon sources provided by the National Institute for Metrology of Ionizing Radiations (ENEA) have emphasized the non-hygroscopic properties of the scintillator and a small dependence of the light yield on temperature and HNO3. The data collected in this first step of our research indicate that the YAP:Ce scintillator can provide high response stability for radon gamma-ray spectrometry in environments with large temperature gradients and high acid concentrations.
The Relationships Between the Trends of Mean and Extreme Precipitation
NASA Technical Reports Server (NTRS)
Zhou, Yaping; Lau, William K.-M.
2017-01-01
This study provides a better understanding of the relationships between the trends of mean and extreme precipitation in two observed precipitation data sets: the Climate Prediction Center Unified daily precipitation data set and the Global Precipitation Climatology Program (GPCP) pentad data set. The study employs three definitions of extreme precipitation: (1) percentile, (2) standard deviation, and (3) generalized extreme value (GEV) distribution analysis for extreme events based on local statistics. The relationship between trends in the mean and extreme precipitation is identified with a novel metric, the area-aggregated matching ratio (AAMR), computed on regional and global scales. Generally, more (fewer) extreme events are likely to occur in regions with a positive (negative) mean trend. The match between the mean and extreme trends deteriorates for increasingly heavy precipitation events. The AAMR is higher in regions with negative mean trends than in regions with positive mean trends, suggesting a higher likelihood of severe dry events than of heavy rain events in a warming climate. AAMR is found to be higher in the tropics and over oceans than in the extratropics and over land, reflecting a higher degree of randomness and more important dynamical, rather than thermodynamical, contributions to extreme events in the latter regions.
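The first two threshold definitions above can be sketched directly (synthetic daily precipitation, and a >1 mm wet-day convention that is my assumption, not stated in the abstract; the GEV-based definition (3) would require a separate block-maxima fit and is omitted):

```python
import numpy as np

rng = np.random.default_rng(2)
# 30 years of synthetic daily precipitation (mm); right-skewed like rainfall.
daily = rng.gamma(shape=0.6, scale=8.0, size=365 * 30)
wet = daily[daily > 1.0]  # wet-day convention (>1 mm), an assumption here

# Definition (1): percentile threshold, e.g. the 95th percentile of wet days.
p95 = np.percentile(wet, 95)
# Definition (2): standard-deviation threshold, mean + 2 sigma of wet days.
sd_thresh = wet.mean() + 2.0 * wet.std()

extreme_pct = daily > p95
extreme_sd = daily > sd_thresh
print(p95, sd_thresh, extreme_pct.mean(), extreme_sd.mean())
```

The two definitions generally flag different fractions of days on a skewed distribution, which is one reason the paper compares trends under all three.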
Peripheral Quantitative CT (pQCT) Using a Dedicated Extremity Cone-Beam CT Scanner
Muhit, A. A.; Arora, S.; Ogawa, M.; Ding, Y.; Zbijewski, W.; Stayman, J. W.; Thawait, G.; Packard, N.; Senn, R.; Yang, D.; Yorkston, J.; Bingham, C.O.; Means, K.; Carrino, J. A.; Siewerdsen, J. H.
2014-01-01
Purpose We describe the initial assessment of the peripheral quantitative CT (pQCT) imaging capabilities of a cone-beam CT (CBCT) scanner dedicated to musculoskeletal extremity imaging. The aim is to accurately measure and quantify bone and joint morphology using information automatically acquired with each CBCT scan, thereby reducing the need for a separate pQCT exam. Methods A prototype CBCT scanner providing isotropic, sub-millimeter spatial resolution and soft-tissue contrast resolution comparable or superior to standard multi-detector CT (MDCT) has been developed for extremity imaging, including the capability for weight-bearing exams and multi-mode (radiography, fluoroscopy, and volumetric) imaging. Assessment of pQCT performance included measurement of bone mineral density (BMD), morphometric parameters of subchondral bone architecture, and joint space analysis. Measurements employed phantoms, cadavers, and patients from an ongoing pilot study imaged with the CBCT prototype (at various acquisition, calibration, and reconstruction techniques) in comparison to MDCT (using pQCT protocols for analysis of BMD) and micro-CT (for analysis of subchondral morphometry). Results The CBCT extremity scanner yielded BMD measurement within ±2–3% error in both phantom studies and cadaver extremity specimens. Subchondral bone architecture (bone volume fraction, trabecular thickness, degree of anisotropy, and structure model index) exhibited good correlation with gold standard micro-CT (error ~5%), surpassing the conventional limitations of spatial resolution in clinical MDCT scanners. Joint space analysis demonstrated the potential for sensitive 3D joint space mapping beyond that of qualitative radiographic scores in application to non-weight-bearing versus weight-bearing lower extremities and assessment of phalangeal joint space integrity in the upper extremities. 
Conclusion The CBCT extremity scanner demonstrated promising initial results in accurate pQCT analysis from images acquired with each CBCT scan. Future studies will include improved x-ray scatter correction and image reconstruction techniques to further improve accuracy and to correlate pQCT metrics with known pathology. PMID:25076823
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yan, Hongxiang; Sun, Ning; Wigmosta, Mark
There is a renewed focus on the design of infrastructure resilient to extreme hydrometeorological events. While precipitation-based intensity-duration-frequency (IDF) curves are commonly used as part of infrastructure design, a large percentage of peak runoff events in snow-dominated regions are caused by snowmelt, particularly during rain-on-snow (ROS) events. In these regions, precipitation-based IDF curves may lead to substantial over-/under-estimation of design basis events and subsequent over-/under-design of infrastructure. To overcome this deficiency, we proposed next-generation IDF (NG-IDF) curves, which characterize the actual water reaching the land surface. We compared NG-IDF curves to standard precipitation-based IDF curves for estimates of extreme events at 376 Snowpack Telemetry (SNOTEL) stations across the western United States that each had at least 30 years of high-quality records. We found standard precipitation-based IDF curves at 45% of the stations were subject to under-design, many with significant under-estimation of 100-year extreme events, for which the precipitation-based IDF curves can underestimate water potentially available for runoff by as much as 125% due to snowmelt and ROS events. The regions with the greatest potential for under-design were in the Pacific Northwest, the Sierra Nevada Mountains, and the Middle and Southern Rockies. We also found the potential for over-design at 20% of the stations, primarily in the Middle Rockies and Arizona mountains. These results demonstrate the need to consider snow processes in the development of IDF curves, and they suggest use of the more robust NG-IDF curves for hydrologic design in snow-dominated environments.
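The core NG-IDF idea, doing frequency analysis on water reaching the land surface rather than on precipitation alone, can be sketched with hypothetical daily inputs (all numbers invented; the paper's actual snow model and SNOTEL data are not reproduced):

```python
import numpy as np

rng = np.random.default_rng(3)
years, days = 30, 365
# Hypothetical daily inputs (mm): rain every day, plus occasional large
# snowmelt pulses (e.g. rain-on-snow) on roughly 5% of days.
rain = rng.gamma(0.5, 6.0, size=(years, days))
melt = np.where(rng.random((years, days)) < 0.05,
                rng.gamma(2.0, 8.0, size=(years, days)), 0.0)
water = rain + melt  # water actually reaching the land surface

# Annual maxima feeding the frequency analysis behind each curve type:
amax_precip = rain.max(axis=1)   # standard precipitation-based IDF input
amax_water = water.max(axis=1)   # NG-IDF input
print(amax_precip.mean(), amax_water.mean())
```

Because water available for runoff can never be less than rainfall alone, the water-based annual maxima dominate the precipitation-based ones whenever snowmelt coincides with the peak day, which is the under-design mechanism the abstract quantifies.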
Residential exposure from extremely low frequency electromagnetic field (ELF EMF) radiation
NASA Astrophysics Data System (ADS)
Parthasarathy, Shamesh Raj; Tukimin, Roha
2018-01-01
ELF EMF radiation have received considerable attention as a potential threat to the safety and health of people living in the vicinity of high voltage transmission lines, electric distribution substations, power stations and even in close proximity to electronics and electrical household appliances. The paper highlights the study on the ELF EMF safety assessment performed at residences comprising of an owner-occupied house, a completed vacant house and an under construction condominium. The objectives of this study were to determine the ELF EMF radiation exposure level from the high voltage transmission line, electric distribution substation, power station and electrical household appliances in the residences, and to assess the potential exposure received by the occupants at the assessed locations. The results were logged in the electric and magnetic field strength with the units of volt per meter (V/m) and miliGauss (mG) respectively. The instrument setup and measurement protocols during the assessment were adopted from standard measurement method and procedures stipulated under the Institute of Electrical and Electronics Engineers (IEEE) Standard. The results were compared with the standards recommended in the International Commission on Non-Ionizing Radiation Protection (ICNIRP) guidelines.
Weights, growth, and survival of timber wolf pups in Minnesota
Van Ballenberghe, V.; Mech, L.D.
1975-01-01
Weights, growth rates, canine tooth lengths, and survival data were obtained from 73 wild wolf (Canis lupus) pups that were 8 to 28 weeks old when live-trapped in three areas of northern Minnesota from 1969 to 1972. Relative weights of wild pups are expressed as percentages of a standard weight curve based on data from captive pups of similar age. These relative weights varied greatly within litters, between litters, and between years; extremes of 31 to 144 percent of the standard were observed. Growth rates ranging from 0.05 to 0.23 kilograms per day were observed, and similar variations in general development and in replacement and growth of canine teeth were noted. Survival data based on radio-tracking and tag returns indicated that pups with relative weights less than 65 percent of standard have a poor chance of survival, whereas pups of at least 80 percent of standard weight have a high survivability. Pups born in 1972 were especially underweight, probably a result of declining white-tailed deer (Odocoileus virginianus) densities in the interior of the Superior National Forest study area.
Lee, Seung-Heon; Lu, Jian; Lee, Seung-Jun; Han, Jae-Hyun; Jeong, Chan-Uk; Lee, Seung-Chul; Li, Xian; Jazbinšek, Mojca; Yoon, Woojin; Yun, Hoseop; Kang, Bong Joo; Rotermund, Fabian; Nelson, Keith A; Kwon, O-Pil
2017-08-01
Highly efficient nonlinear optical organic crystals are very attractive for various photonic applications including terahertz (THz) wave generation. Up to now, only two classes of ionic crystals based on either pyridinium or quinolinium with extremely large macroscopic optical nonlinearity have been developed. This study reports on a new class of organic nonlinear optical crystals introducing electron-accepting benzothiazolium, which exhibit higher electron-withdrawing strength than pyridinium and quinolinium in benchmark crystals. The benzothiazolium crystals consisting of new acentric core HMB (2-(4-hydroxy-3-methoxystyryl)-3-methylbenzo[d]thiazol-3-ium) exhibit extremely large macroscopic optical nonlinearity with optimal molecular ordering for maximizing the diagonal second-order nonlinearity. HMB-based single crystals prepared by simple cleaving method satisfy all required crystal characteristics for intense THz wave generation such as large crystal size with parallel surfaces, moderate thickness and high optical quality with large optical transparency range (580-1620 nm). Optical rectification of 35 fs pulses at the technologically very important wavelength of 800 nm in 0.26 mm thick HMB crystal leads to one order of magnitude higher THz wave generation efficiency with remarkably broader bandwidth compared to standard inorganic 0.5 mm thick ZnTe crystal. Therefore, newly developed HMB crystals introducing benzothiazolium with extremely large macroscopic optical nonlinearity are very promising materials for intense broadband THz wave generation and other nonlinear optical applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Mirror therapy for motor function of the upper extremity in patients with stroke: A meta-analysis.
Zeng, Wen; Guo, Yonghong; Wu, Guofeng; Liu, Xueyan; Fang, Qian
2018-01-10
To evaluate the mean treatment effect of mirror therapy on motor function of the upper extremity in patients with stroke. Electronic databases, including the Cochrane Library, PubMed, MEDLINE, Embase and CNKI, were searched for relevant studies published in English between 1 January 2007 and 22 June 2017. Randomized controlled trials and pilot randomized controlled trials that compared mirror therapy/mirror box therapy with other rehabilitation approaches were selected. Two authors independently evaluated the searched studies based on the inclusion/exclusion criteria and appraised the quality of included studies according to the criteria of the updated version 5.1.0 of the Cochrane Handbook for Systematic Review of Interventions. Eleven trials, with a total of 347 patients, were included in the meta-analysis. A moderate effect of mirror therapy (standardized mean difference 0.51, 95% confidence interval (CI) 0.29, 0.73) on motor function of the upper extremity was found. However, a high degree of heterogeneity (χ2 = 25.65, p = 0.004; I2 = 61%) was observed. The heterogeneity decreased substantially (χ2 = 6.26, p = 0.62; I2 = 0%) after 2 trials were excluded through sensitivity analysis. Although the included studies had high heterogeneity, the meta-analysis provided some evidence that mirror therapy may significantly improve motor function of the upper limb in patients with stroke. Further well-designed studies are needed.
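The pooled standardized mean difference and I² heterogeneity statistic reported above come from standard inverse-variance meta-analysis, which can be sketched as follows (invented SMDs and standard errors, NOT the eleven trials in the review):

```python
import numpy as np

# Illustrative fixed-effect pooling of standardized mean differences.
smd = np.array([0.1, 0.6, 0.5, 0.9, 0.2])
se = np.array([0.20, 0.25, 0.15, 0.30, 0.22])

w = 1.0 / se**2                              # inverse-variance weights
pooled = np.sum(w * smd) / np.sum(w)
pooled_se = np.sqrt(1.0 / np.sum(w))
ci = (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

q = np.sum(w * (smd - pooled) ** 2)          # Cochran's Q
i2 = max(0.0, (q - (smd.size - 1)) / q) * 100.0  # I^2 heterogeneity (%)
print(round(pooled, 2), ci, round(i2, 1))
```

A sensitivity analysis of the kind described simply repeats this pooling with candidate trials removed and checks how Q and I² respond.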
Statistic analysis of annual total ozone extremes for the period 1964-1988
NASA Technical Reports Server (NTRS)
Krzyscin, Janusz W.
1994-01-01
Annual extremes of the total column amount of ozone (in the period 1964-1988) from a network of 29 Dobson stations have been examined using extreme value analysis. The extremes have been calculated as the highest deviation of daily mean total ozone from its long-term monthly mean, normalized by the monthly standard deviations. The extremes have been selected from the direct-Sun total ozone observations only. Extremes resulting from abrupt changes in ozone (day-to-day changes greater than 20 percent) have not been considered. The ordered extremes (maxima in ascending order, minima in descending order) have been fitted to one of three forms of the Fisher-Tippett extreme value distribution by the nonlinear least-squares method (Levenberg-Marquardt). We have found that the ordered extremes from a majority of Dobson stations lie close to Fisher-Tippett type III. Extreme value analysis of the composite annual extremes (combined from averages of the annual extremes selected at individual stations) has shown that the composite maxima are fitted by Fisher-Tippett type III and the composite minima by Fisher-Tippett type I. The difference between the Fisher-Tippett types of the composite extremes seems to be related to the downward trend in ozone. Extreme value prognoses for the period 1964-2014 (derived from data taken at all analyzed stations, the North American stations, and the European stations) have revealed that the prognostic extremes are close to the largest annual extremes in the period 1964-1988, and there are only small regional differences among the prognoses.
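The three Fisher-Tippett forms are unified by the generalized extreme value (GEV) family, so a fit to block maxima classifies the type through the sign of the shape parameter. A minimal sketch with synthetic data follows; note that it uses scipy's maximum-likelihood fit rather than the paper's nonlinear least squares on ordered extremes, and the data are a stand-in, not Dobson observations:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Synthetic stand-in for annual extremes of normalized deviations:
# block maxima of 365 Gaussian days per year, for 200 "station-years".
maxima = np.array([rng.normal(size=365).max() for _ in range(200)])

# Fit the GEV family, which unifies the three Fisher-Tippett types
# (shape near 0 is type I/Gumbel; in scipy's sign convention a positive
# shape c corresponds to the bounded-tail type III).
shape, loc, scale = stats.genextreme.fit(maxima)
print(shape, loc, scale)
```

Fitting the single GEV family and inspecting the shape parameter is the modern equivalent of choosing among the three separate Fisher-Tippett forms.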
NASA Astrophysics Data System (ADS)
Parey, S.
2014-12-01
F. J. Acero (Dpto. Física, Universidad de Extremadura, Avda. de Elvas s/n, 06006 Badajoz), S. Parey, T. T. H. Hoang (EDF/R&D, 6 quai Watier, 78401 Chatou Cedex, France), D. Dacunha-Castelle (Laboratoire de Mathématiques, Université Paris 11, Orsay, France). Trends can already be detected in daily rainfall amounts in the Iberian Peninsula (IP), and these will have an impact on extreme levels. In this study, we compare different ways to estimate future return levels for heavy rainfall, based on statistical extreme value theory. Both peaks over threshold (POT) and block maxima with the generalized extreme value (GEV) distribution are used and their results compared when linear trends are assumed in the parameters: threshold and scale parameter for POT, and location and scale parameter for GEV. Rainfall over the IP is a special variable in that a large number of the values are 0, so the impact of taking this into account is discussed too. Another approach is then tested, based on the evolutions of the mean and variance obtained from the time series of rainy days only, and of the number of rainy days. A statistical test, similar to that designed for temperature in Parey et al. (2013), is used to assess whether the trends in extremes can be considered as mostly due to these evolutions when considering only rainy days. The results show that this is mainly the case: the extremes of the residuals, after removing the trends in mean and standard deviation, cannot be differentiated from those of a stationary process. Thus, future return levels can be estimated from the stationary return level of these residuals and an estimation of the future mean and standard deviation. Moreover, an estimation of the future number of rainy days is used to retrieve the return levels for all days.
All of these comparisons are made for an ensemble of high quality rainfall time series observed in the Iberian Peninsula over the period 1961-2010, from which we want to estimate a 20-year return level expected in 2020. The evolutions and the impact of the different approaches will be discussed for 3 seasons: fall, spring and winter. Parey S., Hoang T.T.H., Dacunha-Castelle D.: The importance of mean and variance in predicting changes in temperature extremes, Journal of Geophysical Research: Atmospheres, Vol. 118, 1-12, 2013.
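The detrend-then-stationary-fit approach described above can be sketched end to end (hypothetical maxima with an invented linear trend; this is an illustration of the general recipe, not the authors' exact procedure, which also handles the trend in variance and the number of rainy days):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
years = np.arange(1961, 2011)
# Hypothetical seasonal rainfall maxima (mm) with a linear trend in the mean.
maxima = 20.0 + 0.3 * (years - 1961) + 5.0 * rng.gumbel(size=years.size)

# Step 1: remove the trend in the mean and standardize the residuals.
fit = stats.linregress(years, maxima)
resid = maxima - (fit.intercept + fit.slope * years)
sigma = resid.std()
z = resid / sigma

# Step 2: fit a stationary GEV to the standardized residuals and take
# its 20-year return level (non-exceedance probability 1 - 1/20).
c, loc, scale = stats.genextreme.fit(z)
z20 = stats.genextreme.ppf(1.0 - 1.0 / 20.0, c, loc, scale)

# Step 3: re-inflate with the mean extrapolated to the target year (2020)
# and the residual spread.
level_2020 = fit.intercept + fit.slope * 2020 + sigma * z20
print(level_2020)
```

The key point tested in the paper is Step 2's premise: once mean and standard deviation trends are removed, the residual extremes behave as a stationary process, so a single stationary fit suffices.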
Hirata, Aya; Sugiyama, Daisuke; Watanabe, Makoto; Tamakoshi, Akiko; Iso, Hiroyasu; Kotani, Kazuhiko; Kiyama, Masahiko; Yamada, Michiko; Ishikawa, Shizukiyo; Murakami, Yoshitaka; Miura, Katsuyuki; Ueshima, Hirotsugu; Okamura, Tomonori
2018-02-08
The effect of very high or extremely high levels of high-density lipoprotein cholesterol (HDL-C) on cardiovascular disease (CVD) is not well described. Although a few recent studies have reported the adverse effects of extremely high levels of HDL-C on CVD events, these did not show a statistically significant association between extremely high levels of HDL-C and cause-specific CVD mortality. In addition, Asian populations have not been studied. We examine the impact of extremely high levels of HDL-C on cause-specific CVD mortality using pooled data of Japanese cohort studies. We performed a large-scale pooled analysis of 9 Japanese cohorts including 43,407 participants aged 40-89 years, dividing the participants into 5 groups by HDL-C levels, including extremely high levels of HDL-C ≥2.33 mmol/L (≥90 mg/dL). We estimated the adjusted hazard ratio of each HDL-C category for all-cause death and cause-specific deaths compared with HDL-C 1.04-1.55 mmol/L (40-59 mg/dL) using a cohort-stratified Cox proportional hazards model. During a 12.1-year follow-up, 4995 all-cause deaths and 1280 deaths due to overall CVD were identified. Extremely high levels of HDL-C were significantly associated with increased risk of atherosclerotic CVD mortality (hazard ratio = 2.37, 95% confidence interval: 1.37-4.09 for total) and increased risk for coronary heart disease and ischemic stroke. In addition, the risk for extremely high HDL-C was more evident among current drinkers. We showed extremely high levels of HDL-C had an adverse effect on atherosclerotic CVD mortality in a pooled analysis of Japanese cohorts. Copyright © 2018 National Lipid Association. Published by Elsevier Inc. All rights reserved.
Lopez, David H.; Rabbani, Michael R.; Crosbie, Ewan; Raman, Aishwarya; Arellano, Avelino F.; Sorooshian, Armin
2016-01-01
This study uses more than a decade's worth of data across Arizona to characterize the spatiotemporal distribution, frequency, and sources of extreme aerosol events, defined as days on which the concentration of a species exceeds the average plus two standard deviations for that month. Depending on the site, between 5% and 7% of the total days at the eight sites studied exhibited an extreme aerosol event due to extreme levels of PM10, PM2.5, and/or fine soil. Grand Canyon exhibited the most extreme event days (120, i.e., 7% of its total days). Fine soil is the pollutant type that most frequently impacted multiple sites at once at an extreme level. PM10, PM2.5, fine soil, non-Asian dust, and elemental carbon extreme events occurred most frequently in August. Nearly all Asian dust extreme events occurred between March and June. Extreme elemental carbon events have decreased over time with statistical significance, while other pollutant categories showed no significant change. Extreme events were most frequent for the various pollutant categories on either Wednesday or Thursday, but there was no statistically significant difference in the number of events on any particular day or on weekends versus weekdays. PMID:27088005
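The paper's event definition, a day exceeding its calendar month's mean plus two standard deviations, is straightforward to implement; a sketch with hypothetical concentration data (synthetic lognormal PM10, not the Arizona network's measurements):

```python
import numpy as np

rng = np.random.default_rng(6)
# Hypothetical 10 years of daily PM10 (ug/m3), 30 days per calendar month.
months = np.tile(np.repeat(np.arange(1, 13), 30), 10)
pm10 = rng.lognormal(mean=3.0, sigma=0.5, size=months.size)

# A day is an extreme event when its concentration exceeds that calendar
# month's climatological mean plus two standard deviations.
extreme = np.zeros(pm10.size, dtype=bool)
for m in range(1, 13):
    sel = months == m
    extreme[sel] = pm10[sel] > pm10[sel].mean() + 2.0 * pm10[sel].std()

print(extreme.mean())  # fraction of days flagged as extreme events
```

Computing the threshold per calendar month, as here, is what removes the seasonal cycle so that, for example, monsoon-season dust days are judged against other monsoon days rather than against the annual mean.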
Dry seasons identified in oak tree-ring chronology in the Czech Lands over the last millennium
NASA Astrophysics Data System (ADS)
Dobrovolny, Petr; Brazdil, Rudolf; Büntgen, Ulf; Rybnicek, Michal; Kolar, Tomas; Reznickova, Ladislava; Valasek, Hubert; Kotyza, Oldrich
2015-04-01
There is growing evidence of an amplification of hydrological regimes as a consequence of rising temperatures, increasing evaporation and changes in circulation patterns. These processes may be responsible for a higher probability of hydroclimatic extremes at the regional scale. Extreme events such as floods or droughts are rare by definition, and for a better understanding of possible changes in the frequency and intensity of their occurrence, long-term proxy archives may be analysed. Recently, several tree-ring width chronologies have been compiled from hardwood species occurring in lowland positions, and their analysis proved that they are moisture-sensitive and suitable for hydroclimate reconstructions. Here, we introduce a new oak (Quercus sp.) ring width (RW) dataset for the Czech Republic and the last 1250 years. We explain the process of oak chronology standardization, which was based on several only slightly different de-trending techniques and subsequent chronology development steps. We hypothesize that the most severe RW increment reductions (negative extremes) reflect extremely dry spring-summer conditions. Negative extremes were assigned to years in which transformed oak RWs were more than 1.5 standard deviations below the mean. To verify our hypothesis, we compare typical climatic conditions in negative extreme years with the climatology of the reference period 1961-1990. The comparison was done for various instrumental measurements (1805-2012), existing proxy reconstructions (1500-1804) and also for documentary evidence from historical archives (before 1500). We found that years of negative extremes are characterized by distinctly above-average spring (MAM) and summer (JJA) air temperatures and below-average precipitation amounts.
The typical sea level pressure distribution in those years shows a positive pressure anomaly over the British Isles and the North Sea, a pattern that synoptically corresponds to a blocking anticyclone bringing warm air from the southwest to Central Europe and low precipitation totals, with a higher probability of drought. Our results provide a consistent physical explanation of extremely dry seasons occurring in Central Europe. However, direct comparisons of individual RW extreme seasons with existing documentary evidence show the complexity of the problem, as some extremes identified in the oak RW chronology were not confirmed in documentary archives and vice versa. We discuss possible causes of such differences, related to the fact that various proxies may fail to record the real intensity or duration of extreme events, e.g. due to a non-linear response of proxy data to climate drivers or due to a shift in seasonality.
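Flagging negative extremes with the minus-1.5-standard-deviation rule used in this chronology takes only a few lines (hypothetical standardized ring-width indices, not the actual oak dataset):

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical standardized oak ring-width indices for 1250 years.
rwi = rng.normal(loc=1.0, scale=0.2, size=1250)

# Negative (dry) extremes: years whose transformed ring width lies more
# than 1.5 standard deviations below the mean of the chronology.
threshold = rwi.mean() - 1.5 * rwi.std()
dry_years = np.flatnonzero(rwi < threshold)
print(dry_years.size, dry_years.size / rwi.size)
```

For an approximately normal index this one-sided rule flags on the order of 7% of years, a sample large enough to build the composite climatology that the study compares against instrumental and documentary evidence.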
Mahmoud, Amal H; El Anany, Ayman Mohammed
2014-12-01
Childhood malnutrition is a common disorder in developing countries. To formulate a complementary food from rice, germinated-decoated faba bean, orange-fleshed sweet potato flour, and peanut oil (RFPP formula) for infants aged 6 to 24 months. The nutritional and sensory characteristics of the RFPP complementary food in comparison with those of a commercial complementary food were determined using standard official procedures. The levels of protein (17.89 g/100 g), fat (10.35 g/100 g), carbohydrate (67.82 g/100 g), and energy (435.99 kcal/100 g) of the RFPP complementary food met the specifications of the Codex standard (1991) and the Egyptian Standard No. 3284 (2005). The essential amino acid contents of the RFPP complementary food were higher than the amino acid profile of the Food and Agriculture Organization/World Health Organization/United Nations University (2002) reference protein for children 0.5 to 1 and 1 to 2 years of age. The RFPP complementary food had high levels (54.00%) of monounsaturated fatty acids. However, the highest level of saturated fatty acids (51.10%) was recorded for the commercial complementary food. The sensory evaluation results, using a nine-point hedonic scale ranging from 1 (dislike extremely) to 9 (like extremely), show that the RFPP complementary food was acceptable in appearance (7.20), color (6.35), aroma (6.75), taste (7.25), and mouthfeel (7.10) and had an overall acceptability of 6.40. The RFPP formulated complementary food was acceptable and adequate in nutrients for weaning purposes.
Extreme Kinematics in Selected Hip Hop Dance Sequences.
Bronner, Shaw; Ojofeitimi, Sheyi; Woo, Helen
2015-09-01
Hip hop dance has many styles including breakdance (breaking), house, popping and locking, funk, streetdance, krumping, Memphis jookin', and voguing. These movements combine the complexity of dance choreography with the challenges of gymnastics and acrobatic movements. Despite high injury rates in hip hop dance, particularly in breakdance, to date there are no published biomechanical studies in this population. The purpose of this study was to compare representative hip hop steps found in breakdance (toprock and breaking) and house and provide descriptive statistics of the angular displacements that occurred in these sequences. Six expert female hip hop dancers performed three choreographed dance sequences, top rock, breaking, and house, to standardized music-based tempos. Hip, knee, and ankle kinematics were collected during sequences that were 18 to 30 sec long. Hip, knee, and ankle three-dimensional peak joint angles were compared in repeated measures ANOVAs with post hoc tests where appropriate (p<0.01). Peak angles of the breaking sequence, which included floorwork, exceeded the other two sequences in the majority of planes and joints. Hip hop maximal joint angles exceeded reported activities of daily living and high injury sports such as gymnastics. Hip hop dancers work at weight-bearing joint end ranges where muscles are at a functional disadvantage. These results may explain why lower extremity injury rates are high in this population.
Fidelity of the representation of value in decision-making
Dowding, Ben A.
2017-01-01
The ability to make optimal decisions depends on evaluating the expected rewards associated with different potential actions. This process is critically dependent on the fidelity with which reward value information can be maintained in the nervous system. Here we directly probe the fidelity of value representation following a standard reinforcement learning task. The results demonstrate a previously unrecognized bias in the representation of value: extreme reward values, both low and high, are stored significantly more accurately and precisely than intermediate rewards. The symmetry between low and high rewards persisted despite a substantially higher frequency of exposure to high rewards, resulting from preferential exploitation of more rewarding options. The observed variation in fidelity of value representation retrospectively predicted performance on the reinforcement learning task, demonstrating that the bias in representation has an impact on decision-making. A second experiment, in which one or other extreme-valued option was omitted from the learning sequence, showed that representational fidelity is primarily determined by the relative position of an encoded value on the scale of rewards experienced during learning. Both variability and guessing decreased with the reduction in the number of options, consistent with allocation of a limited representational resource. These findings have implications for existing models of reward-based learning, which typically assume error-free representation of reward value. PMID:28248958
Comparison of iSTAT and EPOC Blood Analyzers
2017-10-25
requires accurate blood analysis across a range of environmental conditions and, in extreme circumstances, use beyond the expiration date. We compared gold standard laboratory...temperatures for either device can result in spurious results, particularly for blood gases. 2.0 BACKGROUND Blood analysis is a critical aspect of
Caple, Jodi; Stephan, Carl N
2017-05-01
Graphic exemplars of cranial sex and ancestry are essential to forensic anthropology for standardizing casework, training analysts, and communicating group trends. To date, graphic exemplars have comprised hand-drawn sketches, or photographs of individual specimens, which risks bias/subjectivity. Here, we performed quantitative analysis of photographic data to generate new photo-realistic and objective exemplars of skull form. Standardized anterior and left lateral photographs of skulls for each sex were analyzed in the computer graphics program Psychomorph for the following groups: South African Blacks, South African Whites, American Blacks, American Whites, and Japanese. The average cranial form was calculated for each photographic view, before the color information for every individual was warped to the average form and combined to produce statistical averages. These mathematically derived exemplars, and their statistical exaggerations or extremes, retain the high-resolution detail of the original photographic dataset, making them the ideal casework and training reference standards. © 2016 American Academy of Forensic Sciences.
NASA Technical Reports Server (NTRS)
Woodring, D. G.; Nichols, S. A.; Swanson, R.
1979-01-01
During 1978 and 1979, an Air Force C-135 test aircraft was flown to various locations in the North and South Atlantic and Pacific Oceans for satellite communications experiments. A part of the equipment tested on the aircraft was the SEACOM spread spectrum modem. The SEACOM modem operated at X band frequency from the aircraft via the DSCS II satellite to a ground station. For data to be phased successfully, it was necessary to maintain independent time and frequency accuracy over relatively long periods of time (up to two weeks) on the aircraft and at the ground station. To achieve this goal, two Efratom atomic frequency standards were used. The performance of these frequency standards as used in the spread spectrum modem is discussed, including the effects of high relative velocity and synchronization, and the effect of the frequency standards on data performance. The aircraft environment, which includes extremes of temperature as well as long periods of shutdown followed by rapid warmup requirements, is also discussed.
Achieving Innovation and Affordability Through Standardization of Materials Development and Testing
NASA Technical Reports Server (NTRS)
Bray, M. H.; Zook, L. M.; Raley, R. E.; Chapman, C.
2011-01-01
The successful expansion of development, innovation, and production within the aeronautics industry during the 20th century was facilitated by collaboration of government agencies with commercial aviation companies. One of the initial products of that collaboration was the ANC-5 Bulletin, first published in 1937, which was intended to standardize the requirements of various government agencies in the design of aircraft structure. Its subsequent revisions and conversion to MIL-HDBK-5 and then MMPDS-01 established, and then expanded to contain, standardized mechanical property design values and other related design information for metallic materials used in aircraft, missiles, and space vehicles; the handbook also includes guidance on standardization of composition, processing, and analytical methods for presentation and inclusion. This standardization enabled an expansion of the technologies to provide efficiency and reliability to consumers. The national space policy shift in priority for NASA, with its emphasis on transferring low-Earth-orbit travel to commercial space providers, highlights an opportunity and a need for the national and global space industries. The same collaboration and standardization documented and maintained by the industry within MIL-HDBK-5 (MMPDS-01) and MIL-HDBK-17 (nonmetallic mechanical properties) can also be exploited to standardize the thermal performance properties, processing methods, test methods, and analytical methods used in aircraft and spacecraft design and associated propulsion systems. In addition to the definition and standardization of thermal performance descriptions, standardized test and analysis methods for extreme environments (high temperature, cryogenics, deep space radiation, etc.) would also be highly valuable to the industry.
It can be established that many individual programs within the government agencies have been burdened with development costs generated by these nonstandard requirements. Without industry standardization and acceptance, programs are driven to shoulder the costs of determining design requirements and performance criteria, and then of material qualification and certification. A significant investment the industry could make, both to reduce individual program development costs and schedules and to expand commercial space flight capabilities, would be to standardize material performance properties for high temperature, cryogenic, and deep space environments for both metallic and nonmetallic materials.
Multimodality management of soft tissue tumors in the extremity
Crago, Aimee M.; Lee, Ann Y.
2016-01-01
Most extremity soft tissue sarcomas present as a painless mass. Workup should generally involve cross-sectional imaging with MRI, as well as a core biopsy for pathologic diagnosis. Limb-sparing surgery is the standard of care, and may be supplemented with radiation for histologic subtypes at higher risk for local recurrence and chemotherapy for those at higher risk for distant metastases. This article reviews the work-up and surgical approach to extremity soft tissue sarcomas, as well as the role for radiation and chemotherapy, with particular attention given to the distinguishing characteristics of some of the most common subtypes. PMID:27542637
Spectrophotometric determination of traces of boron in high purity silicon
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parashar, D.C.; Sarkar, A.K.; Singh, N.
1989-07-01
A reddish brown complex is formed between boron and curcumin in a concentrated sulfuric acid and glacial acetic acid mixture (1:1). The colored complex is highly selective, stable for about 3 hours, and has its absorbance maximum at 545 nm. The sensitivity of the method is extremely high, with a detection limit of 3 parts per billion based on an absorbance value of 0.004. The interference of some of the important cations and anions relevant to silicon was studied, and it was found that a 100-fold excess of most of these cations and anions does not interfere in the determination of boron. The method was successfully employed for the determination of boron in silicon used in semiconductor devices. The results were verified by the standard addition method.
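The standard addition method mentioned above works by spiking aliquots with known analyte additions, fitting a line to absorbance versus added concentration, and reading the unknown concentration off the magnitude of the x-intercept. A hedged sketch with hypothetical readings (not the paper's data):

```python
# Illustrative standard-addition extrapolation. The absorbance readings
# below are hypothetical, chosen only to show the mechanics.
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

added_ppb = [0.0, 5.0, 10.0, 20.0]          # boron added to each aliquot (ppb)
absorbance = [0.040, 0.060, 0.080, 0.120]   # hypothetical readings at 545 nm
slope, intercept = fit_line(added_ppb, absorbance)
c_unknown = intercept / slope  # magnitude of x-intercept = original concentration
print(round(c_unknown, 1))  # 10.0 ppb for these hypothetical data
```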
Lithium Ion Batteries in Electric Drive Vehicles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pesaran, Ahmad A.
2016-05-16
This research focuses on the technical issues that are critical to the adoption of high-energy lithium-ion batteries. In addition to high energy density and high power density, this publication considers the performance requirements necessary to assure lithium-ion technology as the battery format of choice for electrified vehicles. Prime topics include: long calendar life (greater than 10 years); sufficient cycle life; reliable operation under hot and cold temperatures; safe performance under extreme conditions; and end-of-life recycling. To achieve aggressive fuel economy standards, carmakers are developing technologies to reduce fuel consumption, including hybridization and electrification. Cost and affordability will be determined by these technical issues, whose resolution will enable the successful implementation of lithium-ion batteries in future generations of electrified vehicles.
Strangeness Production in the ALICE Experiment at the LHC
NASA Astrophysics Data System (ADS)
Johnson, Harold; Fenner, Kiara; Harton, Austin; Garcia-Solis, Edmundo; Soltz, Ron
2015-04-01
The study of strange particle production is an important tool in understanding the properties of a hot and dense medium, the quark-gluon plasma, created in heavy-ion collisions at ultra-relativistic energies. This quark-gluon plasma (QGP) is believed to have been present just after the big bang. The standard model of physics contains six types of quarks. Strange quarks are not among the valence quarks found in protons and neutrons. Strange quark production is sensitive to the extremely high temperatures of the QGP. CERN's Large Hadron Collider accelerates particles to nearly the speed of light before colliding them to create this QGP state. In the results of high-energy particle collisions, hadrons are formed out of quarks and gluons when cooling from extremely high temperatures. A jet is a highly collimated cone of particles coming from the hadronization of a single quark or gluon. Understanding jet interactions may give us clues about the QGP. Using FastJet (a popular jet finder algorithm), we extracted strangeness, i.e., strange particle characteristics of jets, contained within proton-proton collisions during our research at CERN. We have identified jets with and without strange particles in proton-proton collisions and we will present a comparison of pT spectra in both cases. This material is based upon work supported by the National Science Foundation under grants PHY-1305280 and PHY-1407051.
NASA Astrophysics Data System (ADS)
Delachat, F.; Le Drogoff, B.; Constancias, C.; Delprat, S.; Gautier, E.; Chaker, M.; Margot, J.
2016-01-01
In this work, we demonstrate a full process for fabricating high-aspect-ratio diffraction optics for extreme ultraviolet lithography. The transmissive optics consist of nanometer-scale tungsten patterns standing on flat, ultrathin (100 nm), and highly transparent (>85% at 13.5 nm) silicon membranes (diameter of 1 mm). These tungsten patterns were achieved using an innovative pseudo-Bosch etching process based on an inductively coupled plasma ignited in a mixture of SF6 and C4F8. Circular ultra-thin Si membranes were fabricated through a state-of-the-art method using direct-bonding with thermal difference. The silicon membranes were sputter-coated with a few hundred nanometers (100-300 nm) of stress-controlled tungsten and a very thin layer of chromium. Nanoscale features were written in a thin resist layer by electron beam lithography and transferred onto tungsten by plasma etching of both the chromium hard mask and the tungsten layer. This etching process results in highly anisotropic tungsten features at room temperature. The homogeneity and the aspect ratio of the advanced pattern transfer on the membranes were characterized with scanning electron microscopy after focused ion beam milling. An aspect ratio of about 6 for a 35 nm pattern size is successfully obtained on a 1 mm diameter, 100 nm thick Si membrane. The whole fabrication process is fully compatible with standard industrial semiconductor technology.
Explosive X-point collapse in relativistic magnetically dominated plasma
NASA Astrophysics Data System (ADS)
Lyutikov, Maxim; Sironi, Lorenzo; Komissarov, Serguei S.; Porth, Oliver
2017-12-01
The extreme properties of the gamma-ray flares in the Crab nebula present a clear challenge to our ideas on the nature of particle acceleration in relativistic astrophysical plasma. It seems highly unlikely that standard mechanisms of stochastic type are at work here and hence the attention of theorists has switched to linear acceleration in magnetic reconnection events. In this series of papers, we attempt to develop a theory of explosive magnetic reconnection in highly magnetized relativistic plasma which can explain the extreme parameters of the Crab flares. In the first paper, we focus on the properties of the X-point collapse. Using analytical and numerical methods (fluid and particle-in-cell simulations) we extend Syrovatsky's classical model of such collapse to the relativistic regime. We find that the collapse can lead to the reconnection rate approaching the speed of light on macroscopic scales. During the collapse, the plasma particles are accelerated by charge-starved electric fields, which can reach (and even exceed) values of the local magnetic field. The explosive stage of reconnection produces non-thermal power-law tails with slopes that depend on the average magnetization. For sufficiently high magnetizations and vanishing guide field, the non-thermal particle spectrum consists of two components: a low-energy population with soft spectrum that dominates the number census; and a high-energy population with hard spectrum that possesses all the properties needed to explain the Crab flares.
A new method for calculating ecological flow: Distribution flow method
NASA Astrophysics Data System (ADS)
Tan, Guangming; Yi, Ran; Chang, Jianbo; Shu, Caiwen; Yin, Zhi; Han, Shasha; Feng, Zhiyong; Lyu, Yiwei
2018-04-01
A distribution flow method (DFM), together with an ecological flow index and an evaluation grade standard, is proposed for studying the ecological flow of rivers, based on a broadened kernel density estimation. The proposed DFM and its index and grade standard are applied to the calculation of ecological flow in the middle reaches of the Yangtze River and compared with traditional hydrological methods of ecological flow calculation, a flow evaluation method, and calculated fish ecological flows. Results show that the DFM accounts for the intra- and inter-annual variations in natural runoff, thereby reducing the influence of extreme flows and uneven flow distributions within the year. The method also satisfies the actual runoff demand of river ecosystems, outperforms the traditional hydrological methods, and shows high space-time applicability and application value.
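The abstract does not give the DFM's equations; as an illustration only, the kernel-density ingredient can be sketched with a plain Gaussian KDE, whose density mode (unlike the mean) is barely moved by a single extreme flow. All flow values below are hypothetical, and the paper's broadening-kernel refinement and its index/grade standard are not reproduced:

```python
# Hedged sketch: Gaussian kernel density estimate over daily flows, with
# the density mode read off a grid. The lone extreme value (2400) pulls the
# mean to ~1136 m^3/s but leaves the mode near the bulk of the data.
import math

def gaussian_kde(samples, bandwidth):
    """Return a callable density estimate with a fixed Gaussian kernel."""
    def density(x):
        n = len(samples)
        return (sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2) for s in samples)
                / (n * bandwidth * math.sqrt(2 * math.pi)))
    return density

flows = [820, 900, 950, 980, 1000, 1020, 1050, 1100, 2400]  # hypothetical (m^3/s)
f = gaussian_kde(flows, bandwidth=80.0)
mode = max(range(500, 3000, 5), key=f)  # grid search for the density peak
print(mode)
```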
Test-retest and interrater reliability of the functional lower extremity evaluation.
Haitz, Karyn; Shultz, Rebecca; Hodgins, Melissa; Matheson, Gordon O
2014-12-01
Repeated-measures clinical measurement reliability study. To establish the reliability and face validity of the Functional Lower Extremity Evaluation (FLEE). The FLEE is a 45-minute battery of 8 standardized functional performance tests that measures 3 components of lower extremity function: control, power, and endurance. The reliability and normative values for the FLEE in healthy athletes are unknown. A face validity survey for the FLEE was sent to sports medicine personnel to evaluate the level of importance and frequency of clinical usage of each test included in the FLEE. The FLEE was then administered and rated for 40 uninjured athletes. To assess test-retest reliability, each athlete was tested twice, 1 week apart, by the same rater. To assess interrater reliability, 3 raters scored each athlete during 1 of the testing sessions. Intraclass correlation coefficients were used to assess the test-retest and interrater reliability of each of the FLEE tests. In the face validity survey, the FLEE tests were rated as highly important by 58% to 71% of respondents but frequently used by only 26% to 45% of respondents. Interrater reliability intraclass correlation coefficients ranged from 0.83 to 1.00, and test-retest reliability ranged from 0.71 to 0.95. The FLEE tests are considered clinically important for assessing lower extremity function by sports medicine personnel but are underused. The FLEE also is a reliable assessment tool. Future studies are required to determine if use of the FLEE to make return-to-play decisions may reduce reinjury rates.
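The FLEE study reports intraclass correlation coefficients for test-retest and interrater reliability. A minimal sketch of a two-way random-effects, absolute-agreement ICC(2,1) in the Shrout-Fleiss convention, computed from a subjects × raters score matrix; the scores below are hypothetical, not the study's data:

```python
# Hedged sketch of ICC(2,1): two-way random effects, absolute agreement,
# single measurement. Input is a list of rows (subjects) of scores (raters
# or sessions). Hypothetical data only.
def icc_2_1(scores):
    n = len(scores)      # subjects
    k = len(scores[0])   # raters (or test sessions)
    grand = sum(sum(row) for row in scores) / (n * k)
    row_means = [sum(row) / k for row in scores]
    col_means = [sum(scores[i][j] for i in range(n)) / n for j in range(k)]
    msr = k * sum((m - grand) ** 2 for m in row_means) / (n - 1)   # subjects
    msc = n * sum((m - grand) ** 2 for m in col_means) / (k - 1)   # raters
    sse = sum((scores[i][j] - row_means[i] - col_means[j] + grand) ** 2
              for i in range(n) for j in range(k))
    mse = sse / ((n - 1) * (k - 1))                                 # residual
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

ratings = [[17, 18, 17], [22, 21, 22], [14, 15, 15], [25, 25, 24], [19, 18, 19]]
print(round(icc_2_1(ratings), 2))
```

With perfect agreement across raters the formula returns 1.0, and it falls toward 0 as rater disagreement grows relative to between-subject spread.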
Ekstrand, Elisabeth; Lexell, Jan; Brogårdh, Christina
2015-09-01
To evaluate the test-retest reliability of isometric and isokinetic muscle strength measurements in the upper extremity after stroke. A test-retest design. Forty-five persons with mild to moderate paresis in the upper extremity > 6 months post-stroke. Isometric arm strength (shoulder abduction, elbow flexion), isokinetic arm strength (elbow extension/flexion) and isometric grip strength were measured with electronic dynamometers. Reliability was evaluated with intra-class correlation coefficients (ICC), changes in the mean, standard error of measurements (SEM) and smallest real differences (SRD). Reliability was high (ICCs: 0.92-0.97). The absolute and relative (%) SEM ranged from 2.7 Nm (5.6%) to 3.0 Nm (9.4%) for isometric arm strength, 2.6 Nm (7.4%) to 2.9 Nm (12.6%) for isokinetic arm strength, and 22.3 N (7.6%) to 26.4 N (9.2%) for grip strength. The absolute and relative (%) SRD ranged from 7.5 Nm (15.5%) to 8.4 Nm (26.1%) for isometric arm strength, 7.1 Nm (20.6%) to 8.0 Nm (34.8%) for isokinetic arm strength, and 61.8 N (21.0%) to 73.3 N (25.6%) for grip strength. Muscle strength in the upper extremity can be reliably measured in persons with chronic stroke. Isometric measurements yield smaller measurement errors than isokinetic measurements and might be preferred, but the choice depends on the research question.
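The SEM and SRD values above follow from the ICC and the between-subject variability via the standard formulas SEM = SD·√(1−ICC) and SRD = 1.96·√2·SEM. A small sketch with hypothetical inputs chosen to land near the reported magnitudes:

```python
# Hedged sketch of the standard SEM/SRD relationships; SD and ICC inputs
# are hypothetical, not the study's raw data.
import math

def sem(sd, icc):
    """Standard error of measurement from between-subject SD and ICC."""
    return sd * math.sqrt(1.0 - icc)

def srd(sem_value):
    """Smallest real difference (95%): change exceeding measurement error."""
    return 1.96 * math.sqrt(2.0) * sem_value

s = sem(sd=15.0, icc=0.96)  # hypothetical: SD 15 Nm, ICC 0.96
print(round(s, 2))        # 3.0
print(round(srd(s), 2))   # 8.32
```

This also shows why the abstract's isokinetic measures, with slightly lower ICCs, carry larger SEMs and SRDs than the isometric ones.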
An Incremental Type-2 Meta-Cognitive Extreme Learning Machine.
Pratama, Mahardhika; Zhang, Guangquan; Er, Meng Joo; Anavatti, Sreenatha
2017-02-01
Existing extreme learning algorithms have not taken into account four issues: 1) complexity; 2) uncertainty; 3) concept drift; and 4) high dimensionality. A novel incremental type-2 meta-cognitive extreme learning machine (ELM) called evolving type-2 ELM (eT2ELM) is proposed to cope with the four issues in this paper. The eT2ELM presents three main pillars of human meta-cognition: 1) what-to-learn; 2) how-to-learn; and 3) when-to-learn. The what-to-learn component selects important training samples for model updates by virtue of the online certainty-based active learning method, which renders eT2ELM as a semi-supervised classifier. The how-to-learn element develops a synergy between extreme learning theory and the evolving concept, whereby the hidden nodes can be generated and pruned automatically from data streams with no tuning of hidden nodes. The when-to-learn constituent makes use of the standard sample reserved strategy. A generalized interval type-2 fuzzy neural network is also put forward as a cognitive component, in which a hidden node is built upon the interval type-2 multivariate Gaussian function while exploiting a subset of Chebyshev series in the output node. The efficacy of the proposed eT2ELM is numerically validated in 12 data streams containing various concept drifts. The numerical results are confirmed by thorough statistical tests, where the eT2ELM demonstrates the most encouraging numerical results in delivering reliable prediction, while sustaining low complexity.
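The eT2ELM builds on the standard ELM idea: hidden-layer weights are drawn at random and only the linear output weights are solved analytically. As a rough, hedged baseline (none of the type-2 fuzzy, meta-cognitive, or online machinery described above), a minimal ELM regressor solved via ridge-regularized normal equations:

```python
# Hedged sketch of a *standard* ELM, not eT2ELM: random tanh hidden layer,
# output weights from (H^T H + ridge*I) beta = H^T y via Gaussian elimination.
import math, random

def elm_train(X, y, hidden=10, ridge=1e-6, seed=0):
    rng = random.Random(seed)
    d = len(X[0])
    W = [[rng.uniform(-1, 1) for _ in range(d)] for _ in range(hidden)]
    b = [rng.uniform(-1, 1) for _ in range(hidden)]
    H = [[math.tanh(sum(w[j] * x[j] for j in range(d)) + bi)
          for w, bi in zip(W, b)] for x in X]
    A = [[sum(H[r][i] * H[r][j] for r in range(len(H))) + (ridge if i == j else 0.0)
          for j in range(hidden)] for i in range(hidden)]
    v = [sum(H[r][i] * y[r] for r in range(len(H))) for i in range(hidden)]
    for i in range(hidden):                       # partial-pivot elimination
        p = max(range(i, hidden), key=lambda r: abs(A[r][i]))
        A[i], A[p] = A[p], A[i]
        v[i], v[p] = v[p], v[i]
        for r in range(i + 1, hidden):
            f = A[r][i] / A[i][i]
            A[r] = [a - f * c for a, c in zip(A[r], A[i])]
            v[r] -= f * v[i]
    beta = [0.0] * hidden
    for i in reversed(range(hidden)):             # back-substitution
        beta[i] = (v[i] - sum(A[i][j] * beta[j] for j in range(i + 1, hidden))) / A[i][i]

    def predict(x):
        h = [math.tanh(sum(w[j] * x[j] for j in range(d)) + bi) for w, bi in zip(W, b)]
        return sum(bb * hh for bb, hh in zip(beta, h))
    return predict

# Fit a simple linear target as a demonstration.
X = [[i / 10.0] for i in range(-10, 11)]
y = [0.5 * x[0] + 0.2 for x in X]
model = elm_train(X, y, hidden=12)
print(round(model([0.35]), 2))
```

The "extreme" speed of ELM comes from never iterating on the hidden weights; only the linear solve above is performed.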
NASA Astrophysics Data System (ADS)
Oikonomou, Foteini; Murase, Kohta; Kotera, Kumiko
2014-08-01
High-frequency-peaked, high-redshift blazars are extreme in the sense that their spectra are particularly hard and peak at TeV energies. Standard leptonic scenarios require peculiar source parameters and/or a special setup in order to account for these observations. Electromagnetic cascades seeded by ultra-high energy cosmic rays (UHECRs) in the intergalactic medium have also been invoked, assuming a very low intergalactic magnetic field (IGMF). Here we study the synchrotron emission of UHECR secondaries produced in blazars located in magnetised environments, and show that it can provide an alternative explanation to these challenged channels, for sources embedded in structured regions with magnetic field strengths of the order of 10^-7 G. To demonstrate this, we focus on three extreme blazars: 1ES 0229+200, RGB J0710+591, and 1ES 1218+304. We model the expected gamma-ray signal from these sources through a combination of numerical Monte Carlo simulations and solutions of the particle kinetic equations, and explore the UHECR source and intergalactic medium parameter space to test the robustness of the emission. We show that the generated synchrotron-pair halo and echo flux at the peak energy is not sensitive to variations in the overall IGMF strength. This signal is unavoidable in contrast to the inverse Compton-pair halo and echo intensity, which is appealing in view of the large uncertainties on the IGMF in voids of large scale structure. It is also shown that the variability of blazar gamma-ray emission can be accommodated by the synchrotron emission of secondary products of UHE neutral beams if these are emitted by UHECR accelerators inside magnetised regions.
Risk factors for upper-extremity musculoskeletal disorders in the working population
Roquelaure, Yves; Ha, Catherine; Rouillon, Clarisse; Fouquet, Natacha; Leclerc, Annette; Descatha, Alexis; Touranchet, Annie; Goldberg, Marcel; Imbernon, Ellen
2009-01-01
SUMMARY Objective The study aimed to assess the relative importance of personal and occupational risk factors for upper-extremity musculoskeletal disorders (UEMSDs) in the working population. Methods A total of 3,710 workers (58% men) participating in a surveillance program of MSDs in a French region in 2002–2005 were included. UEMSDs were diagnosed by 83 trained occupational physicians performing a standardized physical examination. Personal factors and work exposure were assessed by a self-administered questionnaire. Statistical associations between MSDs, personal and occupational factors were analyzed using logistic regression modeling. Results A total of 472 workers suffered from at least one UEMSD. The risk of UEMSDs increased with age for both genders (P<0.001) (OR up to 4.9 in men and 5.0 in women) and in cases of prior history of UEMSDs (OR 3.1 and 5.0, P<0.001). In men, UEMSDs were associated with obesity (OR 2.2, P=0.014), high level of physical demand (OR 2.0, P<0.001), high repetitiveness of the task (OR 1.5, P=0.027), postures with the arms at or above shoulder level (OR 1.7, P=0.009) or with full elbow flexion (OR 1.6, P=0.006), and high psychological demand (OR 1.5, P=0.005). In women, UEMSDs were associated with diabetes mellitus (OR 4.9, P=0.001), postures with extreme wrist bending (OR 2.0, P<0.001), use of vibrating hand tools (OR 2.2, P=0.025) and low level of decision authority (OR 1.4, P=0.042). Conclusion The study showed that personal and work-related physical and psychosocial factors were strongly associated with clinically-diagnosed UEMSDs. PMID:19790112
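The associations above are reported as odds ratios from logistic regression. As a hedged reminder of what such an OR means, here is the 2×2-table version with Woolf's logit confidence interval; for a binary exposure, the exponentiated logistic coefficient equals this table OR. The counts are hypothetical, not the study's data:

```python
# Hedged sketch: odds ratio and Woolf's 95% CI from a 2x2 table.
# Counts are hypothetical.
import math

def odds_ratio(exp_cases, exp_noncases, unexp_cases, unexp_noncases):
    return (exp_cases / exp_noncases) / (unexp_cases / unexp_noncases)

def ci95(or_value, a, b, c, d):
    """Woolf's logit-scale 95% confidence interval for a 2x2 odds ratio."""
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return (or_value * math.exp(-1.96 * se), or_value * math.exp(1.96 * se))

orv = odds_ratio(40, 60, 20, 80)  # hypothetical exposed/unexposed counts
print(round(orv, 2))              # 2.67
print(tuple(round(x, 2) for x in ci95(orv, 40, 60, 20, 80)))
```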
Sakari, Ritva; Rantakokko, Merja; Portegijs, Erja; Iwarsson, Susanne; Sipilä, Sarianna; Viljanen, Anne; Rantanen, Taina
2017-06-01
The aim of this study was to analyze whether the associations between perceived environmental and individual characteristics and perceived walking limitations in older people differ between those with intact and those with poorer lower extremity performance. Persons aged 75 to 90 (N = 834) participated in interviews and performance tests in their homes. Standard questionnaires were used to assess walking difficulties; environmental barriers to, and facilitators of, mobility; and perceived individual hindrances to outdoor mobility. Lower extremity performance was tested using the Short Physical Performance Battery (SPPB). Among those with poorer lower extremity performance, the likelihood of advanced walking limitations was, in particular, related to perceived poor safety in the environment, and among those with intact performance to perceived social issues, such as lack of company, as well as to long distances. The environmental correlates of walking limitations seem to depend on the level of lower extremity performance.
NASA Technical Reports Server (NTRS)
Rhoads, James E.; Rigby, Jane Rebecca; Malhotra, Sangeeta; Allam, Sahar; Carilli, Chris; Combes, Francoise; Finkelstein, Keely; Finkelstein, Steven; Frye, Brenda; Gerin, Maryvonne;
2014-01-01
We report on two regularly rotating galaxies at redshift z approx. = 2, using high-resolution spectra of the bright [C microns] 158 micrometers emission line from the HIFI instrument on the Herschel Space Observatory. Both SDSS090122.37+181432.3 ("S0901") and SDSSJ120602.09+514229.5 ("the Clone") are strongly lensed and show the double-horned line profile that is typical of rotating gas disks. Using a parametric disk model to fit the emission line profiles, we find that S0901 has a rotation speed of v sin(i) approx. = 120 +/- 7 kms(sup -1) and a gas velocity dispersion of (standard deviation)g < 23 km s(sup -1) (1(standard deviation)). The best-fitting model for the Clone is a rotationally supported disk having v sin(i) approx. = 79 +/- 11 km s(sup -1) and (standard deviation)g 4 kms(sup -1) (1(standard deviation)). However, the Clone is also consistent with a family of dispersion-dominated models having (standard deviation)g = 92 +/- 20 km s(sup -1). Our results showcase the potential of the [C microns] line as a kinematic probe of high-redshift galaxy dynamics: [C microns] is bright, accessible to heterodyne receivers with exquisite velocity resolution, and traces dense star-forming interstellar gas. Future [C microns] line observations with ALMA would offer the further advantage of spatial resolution, allowing a clearer separation between rotation and velocity dispersion.
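The double-horned profile invoked above arises because, for a rotating ring or disk, the line-of-sight velocity v_los = v_rot·sin(i)·cos(φ) piles up near ±v_rot·sin(i). A hedged toy sketch (uniform ring, parameters illustrative rather than the fitted values for S0901 or the Clone):

```python
# Toy demonstration: histogram the line-of-sight velocities of a uniform
# rotating ring and check that the two most-populated bins sit at the
# extreme velocities (the "horns"). v_sini is illustrative only.
import math

v_sini = 120.0   # km/s, projected rotation speed
n = 100000
bins = [0] * 24  # histogram over [-150, 150] km/s, 12.5 km/s per bin
for k in range(n):
    phi = 2 * math.pi * k / n            # azimuth sampled uniformly
    v_los = v_sini * math.cos(phi)
    b = int((v_los + 150.0) / 300.0 * len(bins))
    bins[min(max(b, 0), len(bins) - 1)] += 1
horns = sorted(range(len(bins)), key=lambda i: bins[i])[-2:]
print(sorted(horns))  # [2, 21]: the bins containing -120 and +120 km/s
```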
DOE Office of Scientific and Technical Information (OSTI.GOV)
Küchler, R.; Experimental Physics VI, Center for Electronic Correlations and Magnetism, University of Augsburg, Universitätsstrasse 2, 86135 Augsburg; Stingl, C.
2016-07-15
Thermal expansion and magnetostriction are direction-dependent thermodynamic quantities. For the characterization of novel quantum phases of matter, materials must be studied under multi-extreme conditions, in particular down to very low temperatures, in very high magnetic fields, or under high pressure. We developed a miniaturized capacitive dilatometer suitable for temperatures down to 20 mK and for use in high magnetic fields, which exerts a large spring force of 40 to 75 N on the sample. This corresponds to a uniaxial stress of up to 3 kbar for a sample with a cross section of (0.5 mm)². We describe the design and performance tests of the dilatometer, which resolves length changes with a high resolution of 0.02 Å at low temperatures. The miniaturized device can be utilized in any standard cryostat, including dilution refrigerators and the commercial physical property measurement system.
Pseudo-differential CMOS analog front-end circuit for wide-bandwidth optical probe current sensor
NASA Astrophysics Data System (ADS)
Uekura, Takaharu; Oyanagi, Kousuke; Sonehara, Makoto; Sato, Toshiro; Miyaji, Kousuke
2018-04-01
In this paper, we present a pseudo-differential analog front-end (AFE) circuit for a novel optical probe current sensor (OPCS) aimed at high-frequency power electronics. It employs a regulated cascode transimpedance amplifier (RGC-TIA) to achieve a high gain and a large bandwidth without using an extremely high performance operational amplifier. The AFE circuit is designed in a 0.18 µm standard CMOS technology, achieving a high transimpedance gain of 120 dBΩ and a high cutoff frequency of 16 MHz. The measured slew rate is 70 V/µs and the input-referred current noise is 1.02 pA/√Hz. The magnetic resolution and bandwidth of the OPCS are estimated to be 1.29 mTrms and 16 MHz, respectively; the bandwidth is higher than that of the reported Hall effect current sensor.
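The gain figure quoted above is in dBΩ, i.e. 20·log10 of the transimpedance in ohms, so 120 dBΩ corresponds to a 1 MΩ transimpedance; a one-line check:

```python
# Convert a transimpedance gain in dB-ohms back to ohms: Z = 10^(dB/20).
def dbohm_to_ohm(db):
    return 10 ** (db / 20.0)

print(dbohm_to_ohm(120.0))  # 1 MΩ transimpedance
```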
Measurement Properties of the Lower Extremity Functional Scale: A Systematic Review.
Mehta, Saurabh P; Fulton, Allison; Quach, Cedric; Thistle, Megan; Toledo, Cesar; Evans, Neil A
2016-03-01
Systematic review of measurement properties. Many primary studies have examined the measurement properties, such as reliability, validity, and sensitivity to change, of the Lower Extremity Functional Scale (LEFS) in different clinical populations. A systematic review summarizing these properties for the LEFS may provide an important resource. To locate and synthesize evidence on the measurement properties of the LEFS and to discuss the clinical implications of the evidence. A literature search was conducted in 4 databases (PubMed, MEDLINE, Embase, and CINAHL), using predefined search terms. Two reviewers performed a critical appraisal of the included studies using a standardized assessment form. A total of 27 studies were included in the review, of which 18 achieved a very good to excellent methodological quality level. The LEFS scores demonstrated excellent test-retest reliability (intraclass correlation coefficients ranging between 0.85 and 0.99) and demonstrated the expected relationships with measures assessing similar constructs (Pearson correlation coefficient values of greater than 0.7). The responsiveness of the LEFS scores was excellent, as suggested by consistently high effect sizes (greater than 0.8) in patients with different lower extremity conditions. Minimal detectable change at the 90% confidence level (MDC90) for the LEFS scores varied between 8.1 and 15.3 across different reassessment intervals in a wide range of patient populations. The pooled estimate of the MDC90 was 6 points and the minimal clinically important difference was 9 points in patients with lower extremity musculoskeletal conditions, which are indicative of true change and clinically meaningful change, respectively. The results of this review support the reliability, validity, and responsiveness of the LEFS scores for assessing functional impairment in a wide array of patient groups with lower extremity musculoskeletal conditions.
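The MDC90 values pooled above follow from test-retest statistics via the standard relation MDC90 = 1.645·√2·SEM, with SEM = SD·√(1−ICC). A hedged sketch with hypothetical SD/ICC inputs (the review reports pooled MDC90 directly, not these raw quantities):

```python
# Hedged sketch of the MDC90 formula; SD and ICC inputs are hypothetical
# LEFS-scale values, not figures from the review.
import math

def mdc90(sd, icc):
    sem = sd * math.sqrt(1.0 - icc)  # standard error of measurement
    return 1.645 * math.sqrt(2.0) * sem

print(round(mdc90(sd=8.0, icc=0.92), 1))  # 5.3 LEFS points
```

A change smaller than the MDC90 cannot be distinguished from measurement error with 90% confidence, which is why the review's 9-point minimal clinically important difference sits above its 6-point pooled MDC90.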
2010-04-01
Water Kit (dry diaphragm system) installed as standard; Abyss second stage with integrated 30-inch braided intermediate pressure hose as standard; no user adjustments... 1st Stage Regulator with Abyss 2nd Stage and Integrated Intermediate Pressure Hose ... A-2 A3 Modified Mares Proton Ice Extreme V32
Future changes in hydro-climatic extremes in the Upper Indus, Ganges, and Brahmaputra River basins
Lutz, Arthur F.; Nepal, Santosh; Khanal, Sonu; Pradhananga, Saurav; Shrestha, Arun B.; Immerzeel, Walter W.
2017-01-01
Future hydrological extremes, such as floods and droughts, may pose serious threats to livelihoods in the upstream domains of the Indus, Ganges, and Brahmaputra. For this reason, the impacts of climate change on future hydrological extremes are investigated in these river basins. We use a fully distributed cryospheric-hydrological model to simulate current and future hydrological fluxes and force the model with an ensemble of 8 downscaled General Circulation Models (GCMs) selected from the RCP4.5 and RCP8.5 scenarios. The model is calibrated on observed daily discharge and geodetic mass balances. The climate forcing and the outputs of the hydrological model are used to evaluate future changes in climatic and hydrological extremes, focusing on high and low flows. The outcomes show an increase in the magnitude of climatic means and extremes towards the end of the 21st century, with climatic extremes tending to increase more strongly than climatic means. Future mean discharge and high flow conditions will very likely increase. These increases might mainly be the result of increasing precipitation extremes. Temperature extremes might also contribute to increasing discharge extremes, although this is highly dependent on the magnitude of change in temperature extremes. Low flow conditions may occur less frequently, although the uncertainties in low flow projections can be high. The results of this study may contribute to an improved understanding of the implications of climate change for the occurrence of future hydrological extremes in the Hindu Kush-Himalayan region. PMID:29287098
Probabilistic forecasting of extreme weather events based on extreme value theory
NASA Astrophysics Data System (ADS)
Van De Vyver, Hans; Van Schaeybroeck, Bert
2016-04-01
Extreme events in weather and climate, such as high wind gusts, heavy precipitation, or extreme temperatures, are commonly associated with high impacts on both environment and society. Forecasting extreme weather events is difficult, and very high-resolution models are needed to describe extreme weather phenomena explicitly. A prediction system for such events should therefore preferably be probabilistic in nature. Probabilistic forecasts and state estimations are nowadays common in the numerical weather prediction community. In this work, we develop a new probabilistic framework based on extreme value theory that aims to provide early warnings up to several days in advance. We consider pairs (X, Y) of extreme events, where X represents a deterministic forecast and Y the observation variable (for instance wind speed), and both exceed high thresholds. More specifically, two problems are addressed: (1) Given a high forecast X = x_0, what is the probability that Y > y? In other words, provide inference on the conditional probability Pr{Y > y | X = x_0}. (2) Given a probabilistic model for Problem 1, what is the impact on the verification analysis of extreme events? These problems can be solved with bivariate extremes (Coles, 2001) and the verification analysis in Ferro (2007). We apply the Ramos and Ledford (2009) parametric model for bivariate tail estimation of the pair (X, Y). The model accommodates different types of extremal dependence and asymmetry within a parsimonious representation. Results are presented using the ensemble reforecast system of the European Centre for Medium-Range Weather Forecasts (Hagedorn, 2008). Coles, S. (2001) An Introduction to Statistical Modelling of Extreme Values. Springer-Verlag.
Ferro, C.A.T. (2007) A probability model for verifying deterministic forecasts of extreme events. Wea. Forecasting 22, 1089-1100. Hagedorn, R. (2008) Using the ECMWF reforecast dataset to calibrate EPS forecasts. ECMWF Newsletter 117, 8-13. Ramos, A. and Ledford, A. (2009) A new class of models for bivariate joint tails. J. R. Statist. Soc. B 71, 219-241.
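The conditional exceedance probability Pr{Y > y | X = x_0} at the heart of this abstract can, in its crudest form, be estimated by counting joint threshold exceedances. The sketch below is a simplified empirical stand-in, not the Ramos-Ledford parametric tail model used in the work; all thresholds and data are hypothetical:

```python
def conditional_exceedance(pairs, x_thresh, y_thresh):
    """Empirical Pr(Y > y | X > x): the fraction of cases where the
    observation exceeded its threshold, among the cases where the
    forecast exceeded its threshold."""
    exceed_x = [(x, y) for x, y in pairs if x > x_thresh]
    if not exceed_x:
        return float("nan")  # no conditioning events observed
    hits = sum(1 for _, y in exceed_x if y > y_thresh)
    return hits / len(exceed_x)

# Toy forecast/observation pairs (hypothetical wind speeds, m/s)
data = [(22, 25), (24, 18), (30, 28), (15, 12), (26, 27)]
print(conditional_exceedance(data, 20, 24))
```

In practice such empirical counts become unusable at very high thresholds (few or no exceedances), which is precisely why the abstract turns to a parametric bivariate tail model.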
A Tandetron as proton injector for the eye tumor therapy in Berlin
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roehrich, J.; Damerow, T.; Hahn, W.
2012-02-15
The therapy of eye tumors with fast protons is an excellent tool, giving very high local control rates. At the Helmholtz-Zentrum Berlin (HZB), almost 1800 patients have been treated since 1998. A 2 MV Tandetron was installed as injector for the k = 132 HZB cyclotron. Using the standard duoplasmatron ion source with direct extraction of negative hydrogen ions, an extremely stable proton beam can be delivered on both the short-term and the long-term scale. The hairpin filaments made from thoriated tungsten wire have safe operation times of more than 1000 h.
Kaposi’s sarcoma (KS), a disease characterized by the development of malignant tumors usually in the lower extremities, is a major complication of HIV/AIDS. KS continues to be a problem even with the use of highly active antiretroviral therapy (HAART), today’s standard of care for patients with HIV/AIDS. CCR investigators recently investigated the effects of interleukin-12 (IL-12) on this malignancy in HIV/AIDS patients when combined with pegylated liposomal doxorubicin, an anthracycline. The findings appear in the September 10, 2007, online edition of Blood.
Endler, Peter Christian; Matzer, Wolfgang; Reich, Christian; Reischl, Thomas; Hartmann, Anna Maria; Thieves, Karin; Pfleger, Andrea; Hofäcker, Jürgen; Lothaller, Harald; Scherer-Pongratz, Waltraud
2011-01-01
The influence of a homeopathic high dilution of gibberellic acid on wheat growth was studied at different seasons of the year. Seedlings were allowed to develop under standardized conditions for 7 days; plants were then harvested and stalk lengths measured. The data obtained confirm previous findings that ultra-highly diluted, potentized gibberellic acid affects stalk growth. Furthermore, the outcome of the study suggests that experiments utilizing the bioassay presented are best performed in the autumn season; in winter and spring, no reliable effects were found. PMID:22125426
NASA Astrophysics Data System (ADS)
Krause, O.; Bouchiat, V.; Bonnot, A. M.
2007-03-01
Due to their extreme aspect ratios and exceptional mechanical properties, carbon-nanotube-terminated silicon probes have proven to be the "ideal" probe for atomic force microscopy. However, the manufacture and use of single-walled carbon nanotubes in particular still pose serious problems that remain unsolved today. Here, single- and double-wall carbon nanotubes, batch processed and used as deposited by chemical vapor deposition without any post-processing, are compared with standard and high-resolution silicon probes in terms of resolution, scanning speed, and lifetime behavior.
Unique thermocouple to measure the temperatures of squibs, igniters, propellants, and rocket nozzles
NASA Astrophysics Data System (ADS)
Nanigian, Jacob; Nanigian, Dan
2006-05-01
The temperatures produced by the various components in the propulsion system of rockets and missiles determine the performance of the rocket. Since these temperatures develop very rapidly and under extreme conditions, standard thermocouples fail before any meaningful temperatures are measured. This paper describes the features of a special family of high-performance thermocouples that can measure these transient temperatures with millisecond response times under the most severe conditions of erosion. Examples of igniter, propellant, and rocket nozzle temperatures are included, as are heat-flux measurements made by these sensors in rocket applications.
NASA Astrophysics Data System (ADS)
Gouweleeuw, Ben; Kvas, Andreas; Gruber, Christian; Mayer-Gürr, Torsten; Flechtner, Frank; Hasan, Mehedi; Güntner, Andreas
2017-04-01
Since April 2002, the Gravity Recovery and Climate Experiment (GRACE) satellite mission has produced water storage anomaly data, which have been shown to be a unique descriptor of large-scale hydrological extreme events. Nonetheless, efforts to assess the comprehensive information from GRACE on total water storage variations for near real-time flood or drought monitoring have been limited so far, primarily due to its coarse temporal (weekly to monthly) and spatial (>150,000 km²) resolution and the roughly 2-month latency of standard products. Given the status of the aging GRACE satellite mission, the Horizon 2020 funded EGSIEM (European Gravity Service for Improved Emergency Management) project is scheduled to launch a 6-month near real-time test run of GRACE gravity field data from April 2017 onward, which will provide daily gridded data with a latency of 5 days. This fast availability allows the monitoring of total water storage variations related to hydrological extreme events as they occur, as opposed to the current situation of confirmation after occurrence. This contribution proposes a global GRACE-derived gridded wetness indicator, expressed as a gravity anomaly in dimensionless units of standard deviation. Results of a retrospective evaluation (April 2002-December 2015) of the proposed index against databases of hydrological extremes are presented. Signals of large extreme floods related to heavy or monsoonal rainfall are picked up well in the Southern Hemisphere and lower Northern Hemisphere (Africa, South America, Australia, South Asia), while extreme floods in the higher Northern Hemisphere (Russia) related to snow melt often are not. The latter is possibly related to a lack of mass movement over longer distances, e.g. when melt water is not drained due to river ice blocking.
A Tool for Rating the Resilience of Critical Infrastructures in Extreme Fires
2014-05-01
provide a tool for NRC to help the Canadian industry to develop extreme fire protection materials and technologies for critical infrastructures. Future...supported by the Canadian Safety and Security Program (CSSP) which is led by Defence Research and Development Canada’s Centre for Security Science, in...in oil refinery and chemical industry facilities. The only available standard in North America that addresses the transportation infrastructure is
NASA Astrophysics Data System (ADS)
Tejedor, E.; Saz, M. A.; Esper, J.; Cuadrat, J. M.; de Luis, M.
2017-08-01
Drought recurrence in the Mediterranean is regarded as a fundamental factor for socioeconomic development and the resilience of natural systems in the context of global change. However, knowledge of past droughts has been hampered by the absence of high-resolution proxies. We present a drought reconstruction for the northeast of the Iberian Peninsula based on a new dendrochronology network and the Standardized Precipitation Evapotranspiration Index (SPEI). A total of 774 latewood width series from 387 trees of P. sylvestris and P. uncinata were combined into an interregional chronology. The new chronology, calibrated against gridded climate data, reveals a robust relationship with the SPEI, representing drought conditions of July and August. We developed a summer drought reconstruction for the period 1734-2013 representative of the northeastern and central Iberian Peninsula. We identified 16 extremely dry and 17 extremely wet summers, as well as four decadal-scale dry and wet periods, including 2003-2013 as the driest episode of the reconstruction.
Kuban, Karl C. K.; Allred, Elizabeth N.; O’Shea, T. Michael; Paneth, Nigel; Pagano, Marcello; Dammann, Olaf; Leviton, Alan; Du Plessis, Adré; Westra, Sjirk J.; Miller, Cindy R.; Bassan, Haim; Krishnamoorthy, Kalpathy; Junewick, Joseph; Olomu, Nicholas; Romano, Elaine; Seibert, Joanna; Engelke, Steve; Karna, Padmani; Batton, Daniel; O’Connor, Sunila E.; Keller, Cecelia E.
2009-01-01
Our prospective cohort study of extremely low gestational age newborns evaluated the association of neonatal head ultrasound abnormalities with cerebral palsy at age 2 years. Cranial ultrasounds in 1053 infants were read with respect to intraventricular hemorrhage, ventriculomegaly, and echolucency, by multiple sonologists. Standardized neurological examinations classified cerebral palsy, and functional impairment was assessed. Forty-four percent with ventriculomegaly and 52% with echolucency developed cerebral palsy. Compared with no ultrasound abnormalities, children with echolucency were 24 times more likely to have quadriparesis and 29 times more likely to have hemiparesis. Children with ventriculomegaly were 17 times more likely to have quadriparesis or hemiparesis. Forty-three percent of children with cerebral palsy had normal head ultrasound. Focal white matter damage (echolucency) and diffuse damage (late ventriculomegaly) are associated with a high probability of cerebral palsy, especially quadriparesis. Nearly half the cerebral palsy identified at 2 years is not preceded by a neonatal brain ultrasound abnormality. PMID:19168819
Does nonstationarity in rainfall require nonstationary intensity-duration-frequency curves?
NASA Astrophysics Data System (ADS)
Ganguli, Poulomi; Coulibaly, Paulin
2017-12-01
In Canada, risk of flooding due to heavy rainfall has risen in recent decades; the most notable recent examples include the July 2013 storm in the Greater Toronto region and the May 2017 flood of the Toronto Islands. We investigate nonstationarity and trends in the short-duration precipitation extremes in selected urbanized locations in Southern Ontario, Canada, and evaluate the potential of nonstationary intensity-duration-frequency (IDF) curves, which form an input to civil infrastructural design. Despite apparent signals of nonstationarity in precipitation extremes in all locations, the stationary vs. nonstationary models do not exhibit any significant differences in the design storm intensity, especially for short recurrence intervals (up to 10 years). The signatures of nonstationarity in rainfall extremes do not necessarily imply the use of nonstationary IDFs for design considerations. When comparing the proposed IDFs with current design standards, for return periods (10 years or less) typical for urban drainage design, current design standards require an update of up to 7 %, whereas for longer recurrence intervals (50-100 years), ideal for critical civil infrastructural design, updates ranging between ~2 and 44 % are suggested. We further emphasize that the above findings need re-evaluation in the light of climate change projections since the intensity and frequency of extreme precipitation are expected to intensify due to global warming.
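IDF curves relate a design storm's return period T to its intensity through an extreme value fit to annual maxima. As a hedged illustration (the abstract does not specify the study's actual fitting procedure), a stationary Gumbel fit by the method of moments gives the return level x_T = μ − σ·ln(−ln(1 − 1/T)); all data below are hypothetical:

```python
import math

def gumbel_return_level(annual_maxima, T):
    """Return level for period T from a Gumbel fit by method of moments:
    scale = s*sqrt(6)/pi, loc = mean - 0.5772*scale,
    x_T = loc - scale*ln(-ln(1 - 1/T))."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    scale = math.sqrt(var) * math.sqrt(6.0) / math.pi
    loc = mean - 0.5772 * scale
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Hypothetical annual-maximum 1 h rainfall intensities (mm/h)
maxima = [22, 31, 28, 40, 25, 35, 30, 45, 27, 33]
x10 = gumbel_return_level(maxima, 10)
```

A nonstationary variant would let the location (and possibly scale) parameter vary with time or a climate covariate, which is exactly the modelling choice the study evaluates against the stationary baseline.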
Goliwas, Magdalena; Kocur, Piotr; Furmaniuk, Lech; Majchrzycki, Marian; Wiernicka, Marzena; Lewandowski, Jacek
2015-01-01
[Purpose] To assess the effects of sensorimotor foot stimulation on the symmetry of weight distribution on the feet of patients in the chronic post-stroke phase. [Subjects and Methods] This study was a prospective, single blind, randomized controlled trial. In the study we examined patients with chronic stroke (post-stroke duration > 1 year). They were randomly allocated to the study group (n=8) or to the control group (n=12). Both groups completed a standard six-week rehabilitation programme. In the study group, the standard rehabilitation programme was supplemented with sensorimotor foot stimulation training. Each patient underwent two assessments of symmetry of weight distribution on the lower extremities with and without visual control, on a treadmill, with stabilometry measurements, and under static conditions. [Results] Only the study group demonstrated a significant increase in the weight placed on the leg directly affected by stroke, and a reduction in asymmetry of weight-bearing on the lower extremities. [Conclusion] Sensorimotor stimulation of the feet enhanced weight bearing on the foot on the side of the body directly affected by stroke and decreased asymmetry of weight distribution on the lower extremities of patients in the chronic post-stroke phase. PMID:26504326
NASA Astrophysics Data System (ADS)
Wadey, M. P.; Brown, J. M.; Haigh, I. D.; Dolphin, T.; Wisse, P.
2015-04-01
The extreme sea levels and waves experienced around the UK's coast during the 2013/2014 winter caused extensive coastal flooding and damage. In such circumstances, coastal managers seek to place such extremes in relation to the anticipated standards of flood protection, and the long-term recovery of the natural system. In this context, return periods are often used as a form of guidance. We therefore provide these levels for the winter storms, as well as discussing their application to the given data sets and case studies (two UK case study sites: Sefton, northwest England; and Suffolk, east England). We use tide gauge records and wave buoy data to compare the 2013/2014 storms with return periods from a national dataset, and also generate joint probabilities of sea level and waves, incorporating the recent events. The UK was hit at a national scale by the 2013/2014 storms, although the return periods differ with location. We also note that the 2013/2014 high water and waves were extreme due to the number of events, as well as the extremity of the 5 December 2013 "Xaver" storm, which had a very high return period at both case study sites. Our return period analysis shows that the national scale impact of this event is due to its coincidence with spring high tide at multiple locations as the tide and storm propagated across the continental shelf. Given that this event is such an outlier in the joint probability analyses of these observed data sets, and that the season saw several events in close succession, coastal defences appear to have provided a good level of protection. This type of assessment should be recorded alongside details of defence performance and upgrade, with other variables (e.g. river levels at estuarine locations) included and appropriate offsetting for linear trends (e.g. mean sea level rise) so that the storm-driven component of coastal flood events can be determined. 
Local offsetting of the mean trends in sea level allows long-term comparison of storm severity and also enables an assessment of how sea level rise is influencing return levels over time, which is important when considering long-term coastal resilience in strategic management plans.
In the United States, regional-scale air quality models are being used to identify emissions reductions needed to comply with the ozone National Ambient Air Quality Standard. Previous work has demonstrated that ozone extreme values (i.e., 4th highest ozone or Design Value) are c...
Cunningham, K.J.; Sukop, M.C.; Huang, H.; Alvarez, P.F.; Curran, H.A.; Renken, R.A.; Dixon, J.F.
2009-01-01
A combination of cyclostratigraphic, ichnologic, and borehole geophysical analyses of continuous core holes; tracer-test analyses; and lattice Boltzmann flow simulations was used to quantify biogenic macroporosity and permeability of the Biscayne aquifer, southeastern Florida. Biogenic macroporosity largely manifests as: (1) ichnogenic macroporosity primarily related to postdepositional burrowing activity by callianassid shrimp and fossilization of components of their complex burrow systems (Ophiomorpha); and (2) biomoldic macroporosity originating from dissolution of fossil hard parts, principally mollusk shells. Ophiomorpha-dominated ichnofabric provides the greatest contribution to hydrologic characteristics in the Biscayne aquifer in a 345 km2 study area. Stratiform, tabular-shaped units of thalassinidean-associated macroporosity are commonly confined to the lower part of upward-shallowing high-frequency cycles, throughout aggradational cycles, and, in one case, they stack vertically within the lower part of a high-frequency cycle set. Broad continuity of many of the macroporous units concentrates groundwater flow in extremely permeable passageways, thus making the aquifer vulnerable to long-distance transport of contaminants. Ichnogenic macroporosity represents an alternative pathway for concentrated groundwater flow that differs considerably from standard karst flow-system paradigms, which describe groundwater movement through fractures and cavernous dissolution features. Permeabilities were calculated using lattice Boltzmann methods (LBMs) applied to computer renderings assembled from X-ray computed tomography scans of various biogenic macroporous limestone samples. The highest simulated LBM permeabilities were about five orders of magnitude greater than standard laboratory measurements using air-permeability methods, which are limited in their application to extremely permeable macroporous rock samples.
Based on their close conformance to analytical solutions for pipe flow, LBMs offer a new means of obtaining accurate permeability values for such materials. We suggest that the stratiform ichnogenic groundwater flow zones have permeabilities even more extreme (~2-5 orders of magnitude higher) than the Jurassic "super-K" zones of the giant Ghawar oil field. The flow zones of the Pleistocene Biscayne aquifer provide examples of ichnogenic macroporosity for comparative analysis of origin and evolution in other carbonate aquifers, as well as petroleum reservoirs. © 2008 Geological Society of America.
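The pipe-flow benchmark mentioned above has a closed-form answer against which LBM output can be checked: for laminar flow through a single cylindrical tube, the Hagen-Poiseuille solution implies an intrinsic permeability of k = R²/8 (referenced to the tube cross-section). A minimal sketch, with a hypothetical conduit radius:

```python
def pipe_permeability(radius_m):
    """Intrinsic permeability (m^2) of a single cylindrical tube,
    from the Hagen-Poiseuille solution: k = R^2 / 8, where the
    flow area is taken as the tube cross-section."""
    return radius_m ** 2 / 8.0

# Hypothetical 1 cm radius dissolution conduit
k = pipe_permeability(0.01)
```

The quadratic dependence on radius is why even modest biogenic macropores can raise permeability by orders of magnitude over the surrounding matrix, consistent with the contrast the abstract reports between LBM and air-permeability values.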
NASA Astrophysics Data System (ADS)
Wang, Zhu; Shi, Peijun; Zhang, Zhao; Meng, Yongchang; Luan, Yibo; Wang, Jiwei
2017-09-01
Separating out the influence of climatic trend, fluctuations, and extreme events on crop yield is of paramount importance to climate change adaptation, resilience, and mitigation. Previous studies lack a systematic and explicit assessment of these three fundamental aspects of climate change on crop yield. This research attempts to separate out the impacts on rice yields of the climatic trend (linear trend change related to the mean value), fluctuations (variability surpassing the "fluctuation threshold", defined as one standard deviation (1 SD) of the residual between the original data series and the linear trend value for each climatic variable), and extreme events (identified by an absolute criterion for each kind of extreme event related to crop yield). The main idea of the research method was to construct climate scenarios combined with a crop system simulation model. Comparable climate scenarios were designed to express the impact of each climate change component and were input to the crop system model (CERES-Rice), which calculated the related simulated yield gap to quantify the percentage impacts of climatic trend, fluctuations, and extreme events. Six Agro-Meteorological Stations (AMS) in Hunan province were selected to study quantitatively the impact of climatic trend, fluctuations, and extreme events involving climatic variables (air temperature, precipitation, and sunshine duration) on early rice yield during 1981-2012. The results showed that extreme events had the greatest impact on early rice yield (-2.59 to -15.89%), followed by climatic fluctuations (-2.60 to -4.46%) and then the climatic trend (4.91-2.12%). Furthermore, the influence of the climatic trend on early rice yield presented "trade-offs" among the various climate variables and AMS.
Climatic trend and extreme events associated with air temperature showed larger effects on early rice yield than the other climatic variables, particularly high-temperature events (-2.11 to -12.99%). Finally, the methodology used to separate out the influences of the climatic trend, fluctuations, and extreme events on crop yield proved feasible and robust. Designing different climate scenarios and feeding them into a crop system model is a viable way to evaluate the quantitative impact of each climate variable.
Lecina-Diaz, Judit; Alvarez, Albert; Retana, Javier
2014-01-01
Crown fires associated with extreme fire severity are extremely difficult to control. We have assessed fire severity using differenced Normalized Burn Ratio (dNBR) from Landsat imagery in 15 historical wildfires of Pinus halepensis Mill. We have considered a wide range of innovative topographic, fuel and fire behavior variables with the purposes of (1) determining the variables that influence fire severity patterns among fires (considering the 15 wildfires together) and (2) ascertaining whether different variables affect extreme fire severity within the three fire types (topographic, convective and wind-driven fires). The among-fires analysis showed that fires in less arid climates and with steeper slopes had more extreme severity. In less arid conditions there was more crown fuel accumulation and closer forest structures, promoting high vertical and horizontal fuel continuity and extreme fire severity. The analyses carried out for each fire separately (within fires) showed more extreme fire severity in areas in northern aspects, with steeper slopes, with high crown biomass and in climates with more water availability. In northern aspects solar radiation was lower and fuels had less water limitation to growth which, combined with steeper slopes, produced more extreme severity. In topographic fires there was more extreme severity in northern aspects with steeper slopes and in areas with more water availability and high crown biomass; in convection-dominated fires there was also more extreme fire severity in northern aspects with high biomass; while in wind-driven fires there was only a slight interaction between biomass and water availability. This latter pattern could be related to the fact that wind-driven fires spread with high wind speed, which could have minimized the effect of other variables. 
In the future, and as a consequence of climate change, new zones with high crown biomass accumulated in non-common drought areas will be available to burn as extreme severity wildfires. PMID:24465492
High-resolution near real-time drought monitoring in South Asia
NASA Astrophysics Data System (ADS)
Aadhar, Saran; Mishra, Vimal
2017-10-01
Droughts in South Asia affect food and water security and pose challenges for millions of people. For policy-making, planning, and management of water resources at sub-basin or administrative levels, high-resolution datasets of precipitation and air temperature are required in near real time. We develop a high-resolution (0.05°) bias-corrected precipitation and temperature dataset that can be used to monitor near real-time drought conditions over South Asia. Moreover, the dataset can be used to monitor climatic extremes (heat and cold waves, dry and wet anomalies) in South Asia. A distribution mapping method was applied to correct bias in precipitation and air temperature; it performed well compared with another bias correction method based on linear scaling. Bias-corrected precipitation and temperature data were used to estimate the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI) to assess historical and current drought conditions in South Asia. We evaluated drought severity and extent against satellite-based Normalized Difference Vegetation Index (NDVI) anomalies and the satellite-driven Drought Severity Index (DSI) at 0.05°. The bias-corrected high-resolution data can effectively capture observed drought conditions, as shown by the satellite-based drought estimates. This high-resolution near real-time dataset can provide valuable information for decision-making at district and sub-basin levels.
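The SPI used above standardizes a precipitation record so that drought severity is expressed in standard-deviation units. Operationally, SPI fits a gamma distribution and maps it through the standard normal; the sketch below is only a rough z-score stand-in for that procedure, using hypothetical monthly totals:

```python
import statistics

def spi_zscore(precip_series, current):
    """Simplified SPI-like index: z-score of the current precipitation
    total against the historical record. (The operational SPI fits a
    gamma distribution and transforms through the standard normal;
    this z-score version is only a rough stand-in.)"""
    mu = statistics.mean(precip_series)
    sigma = statistics.stdev(precip_series)
    return (current - mu) / sigma

history = [80, 95, 110, 70, 100, 90, 85, 120, 75, 105]  # hypothetical mm/month
print(round(spi_zscore(history, 55), 2))  # strongly dry relative to the record
```

By convention, SPI values below about -1.5 indicate severe drought, which is the kind of threshold a near real-time monitor would flag.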
Qureshi, Muhammad Naveed Iqbal; Min, Beomjun; Jo, Hang Joon; Lee, Boreom
2016-01-01
The classification of neuroimaging data for the diagnosis of certain brain diseases is one of the main research goals of the neuroscience and clinical communities. In this study, we performed multiclass classification using a hierarchical extreme learning machine (H-ELM) classifier. We compared the performance of this classifier with that of a support vector machine (SVM) and basic extreme learning machine (ELM) for cortical MRI data from attention deficit/hyperactivity disorder (ADHD) patients. We used 159 structural MRI images of children from the publicly available ADHD-200 MRI dataset. The data consisted of three types, namely, typically developing (TDC), ADHD-inattentive (ADHD-I), and ADHD-combined (ADHD-C). We carried out feature selection by using standard SVM-based recursive feature elimination (RFE-SVM) that enabled us to achieve good classification accuracy (60.78%). In this study, we found the RFE-SVM feature selection approach in combination with H-ELM to effectively enable the acquisition of high multiclass classification accuracy rates for structural neuroimaging data. In addition, we found that the most important features for classification were the surface area of the superior frontal lobe, and the cortical thickness, volume, and mean surface area of the whole cortex. PMID:27500640
Abuasbi, Falastine; Lahham, Adnan; Abdel-Raziq, Issam Rashid
2018-05-01
In this study, levels of extremely low-frequency electric and magnetic fields originating from overhead power lines were investigated in the outdoor environment of Ramallah city, Palestine. Spot measurements were used to record field intensities over a 6-min period. The NF-5035 spectrum analyzer was used to perform measurements at 1 m above ground level, directly underneath 40 randomly selected power lines distributed fairly evenly within the city. Electric field levels varied with the line's category (power line, transformer, or distributor): a minimum mean electric field of 3.9 V/m was found under a distributor line, and a maximum of 769.4 V/m under a high-voltage (66 kV) power line. The electric fields followed a log-normal distribution, with a geometric mean of 35.9 V/m and a geometric standard deviation of 2.8. Magnetic fields measured at the power lines, in contrast, were not log-normally distributed; the minimum and maximum mean magnetic fields under power lines were 0.89 and 3.5 μT, respectively. None of the measured fields exceeded the ICNIRP guidelines recommended for general public exposure to extremely low-frequency fields.
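The geometric mean and geometric standard deviation reported above are the natural summary statistics for log-normally distributed field measurements: one takes logs, computes the ordinary mean and standard deviation, and exponentiates. A minimal computation (function name illustrative):

```python
import math

def geometric_stats(values):
    """Geometric mean and geometric standard deviation of positive data,
    the standard summary for log-normally distributed measurements.
    Returns (GM, GSD); the GM carries the data's units, the GSD is
    dimensionless."""
    logs = [math.log(v) for v in values]
    n = len(logs)
    mu = sum(logs) / n
    sigma = math.sqrt(sum((x - mu) ** 2 for x in logs) / n)
    return math.exp(mu), math.exp(sigma)
```

Note that the GSD is a dimensionless multiplicative factor, which is why the abstract's "35.9 and 2.8 V/m" phrasing is corrected above to attach V/m only to the geometric mean.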
NASA Astrophysics Data System (ADS)
Kallache, M.
2012-04-01
Droughts cause substantial losses. On the Iberian Peninsula, for example, non-irrigated agriculture and the tourism sector are affected at regular intervals. The goal of this study is to describe droughts and their dependence structure in the Duero basin in Central Spain. To do so, daily or monthly precipitation data are used. Here, cumulative precipitation deficits below a threshold define meteorological droughts. This drought indicator is similar to the commonly used standardized precipitation index; however, the focus here lies on modeling severe droughts, which is done by applying multivariate extreme value theory (MEVT) to extreme drought events. Data from several stations are assessed jointly, which reduces the uncertainty of the results. Droughts are a complex phenomenon: their severity, spatial extent, and duration all have to be taken into account. Our approach captures severity and spatial extent. In general, we find a high correlation between deficit volumes and drought duration, so duration is not modeled explicitly. We apply a MEVT model with an asymmetric logistic dependence function, which is capable of modeling both asymptotic dependence and independence (cf. Ramos and Ledford, 2009). To summarize the information on dependence in the joint tail of extreme drought events, we use the fragility index (Geluk et al., 2007). Results show that droughts also occur frequently in winter. Moreover, it is very common for one site to suffer dry conditions while neighboring areas experience normal or even humid conditions; interpolation is thus difficult. Bivariate extremal dependence is present in the data, but most station pairs are at least asymptotically independent. The corresponding fragility indices are important inputs for risk calculations. The emerging spatial patterns of bivariate dependence are mostly shaped by topography.
When the dependence between more than two stations is examined, joint extremes can occur more often than at random for up to six stations, depending on the distance between the stations.
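The drought indicator used in this study, cumulative precipitation deficit below a threshold, can be sketched as a running sum that surplus periods pay down (an illustrative definition; the authors' exact accounting may differ):

```python
def running_deficit(precip, threshold):
    """Running cumulative precipitation deficit below `threshold`.
    Dry periods (p < threshold) grow the deficit; wet periods pay it
    down; the indicator floors at zero (no drought)."""
    out, d = [], 0.0
    for p in precip:
        d = max(0.0, d + (threshold - p))
        out.append(d)
    return out
```

For a monthly series [50, 20, 10, 80] mm against a 30 mm threshold, the deficit builds through the two dry months and is fully repaid by the wet final month.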
Data-assisted reduced-order modeling of extreme events in complex dynamical systems
Wan, Zhong Yi; Vlachas, Pantelis; Koumoutsakos, Petros; Sapsis, Themistoklis
2018-01-01
The prediction of extreme events, from avalanches and droughts to tsunamis and epidemics, depends on the formulation and analysis of relevant, complex dynamical systems. Such systems are characterized by high intrinsic dimensionality, with extreme events taking the form of rare transitions several standard deviations away from the mean. They are not amenable to classical order-reduction methods based on projection of the governing equations, owing to the large intrinsic dimensionality of the underlying attractor and the complexity of the transient events. Alternatively, data-driven techniques aim to quantify the dynamics of specific, critical modes by utilizing data streams and by expanding the dimensionality of the reduced-order model using delayed coordinates. In turn, these methods have major limitations in regions of the phase space with sparse data, which is precisely the case for extreme events. In this work, we develop a novel hybrid framework that complements an imperfect reduced-order model with data streams integrated through a recurrent neural network (RNN) architecture. The reduced-order model consists of the governing equations projected onto a low-dimensional subspace that still contains important dynamical information about the system, and it is augmented with a long short-term memory (LSTM) regularization. The LSTM-RNN is trained on the mismatch between the imperfect model and the data streams, projected onto the reduced-order space. The data-driven model assists the imperfect model in regions where data are available, while in regions where data are sparse the imperfect model still provides a baseline prediction of the system state. We assess the developed framework on two challenging prototype systems exhibiting extreme events. We show that the blended approach outperforms methods that use either the data streams or the imperfect model alone.
Notably, the improvement is more significant in regions associated with extreme events, where data are sparse. PMID:29795631
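The blending idea, an imperfect model plus a correction learned from the model-data mismatch, can be illustrated on a toy scalar system. Here an ordinary linear regression stands in for the paper's LSTM-RNN, and all names and dynamics below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# True (unknown) dynamics and an imperfect reduced-order model that
# misses the constant forcing term.
true_step = lambda x: 0.5 * x + 0.3
rom_step = lambda x: 0.5 * x

# Learn the model-data mismatch from observed transitions; a linear
# least-squares fit stands in here for the LSTM-RNN correction.
x = rng.uniform(-1.0, 1.0, 200)
mismatch = true_step(x) - rom_step(x)
A = np.column_stack([x, np.ones_like(x)])
w, *_ = np.linalg.lstsq(A, mismatch, rcond=None)

def hybrid_step(x0):
    """Imperfect model plus the learned mismatch correction."""
    return rom_step(x0) + w[0] * x0 + w[1]
```

Where training data exist, the correction closes the gap to the true dynamics; where they do not, the hybrid falls back toward the imperfect model's baseline, which is the division of labor the abstract describes.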
Damage Effects Identified By Scatter Evaluation Of Supersmooth Surfaces
NASA Astrophysics Data System (ADS)
Stowell, W. K.; Orazio, Fred D.
1983-12-01
The surface quality of optics used in an extremely sensitive laser instrument, such as a Ring Laser Gyro (RLG), is critical. This has led to the development of a Variable Angle Scatterometer at the Air Force Wright Aeronautical Laboratories at Wright-Patterson Air Force Base, which can detect low-level light scatter from the high-quality optics used in RLGs without first overcoating them with metals. With this instrument we have been able to identify damage effects that occur during the typical processing and handling of optics and that cause wide variation in subsequent measurements, depending on when in the process one takes data. These measurements indicate that techniques such as Total Integrated Scatter (TIS) may be inadequate as standards for extremely low-scatter optics because of the method's lack of sensitivity on such surfaces. The general term for optical surfaces better than the lowest level of the scratch-dig standards has become "supersmooth", and is seen in the technical literature as well as in advertising. A performance number, such as the Bidirectional Reflectance Distribution Function (BRDF), which can be measured from the uncoated optical surface by equipment such as the Variable Angle Scatterometer (VAS), is proposed as a method of generating better optical surface specifications. Data show that surfaces with average BRDF values near 10 parts per billion per steradian (0.010 PPM/sr) for 0-(301 = 0.5 are now possible and measurable.
Damage Effects Identified By Scatter Evaluation Of Supersmooth Surfaces
NASA Astrophysics Data System (ADS)
Stowell, W. K.
1984-10-01
The surface quality of optics used in an extremely sensitive laser instrument, such as a Ring Laser Gyro (RLG), is critical. This has led to the development of a Variable Angle Scatterometer at the Air Force Wright Aeronautical Laboratories at Wright-Patterson Air Force Base, which can detect low-level light scatter from the high-quality optics used in RLGs without first overcoating them with metals. With this instrument we have been able to identify damage effects that occur during the typical processing and handling of optics and that cause wide variation in subsequent measurements, depending on when in the process one takes data. These measurements indicate that techniques such as Total Integrated Scatter (TIS) may be inadequate as standards for extremely low-scatter optics because of the method's lack of sensitivity on such surfaces. The general term for optical surfaces better than the lowest level of the scratch-dig standards has become "supersmooth", and is seen in the technical literature as well as in advertising. A performance number, such as the Bidirectional Reflectance Distribution Function (BRDF), which can be measured from the uncoated optical surface by equipment such as the Variable Angle Scatterometer (VAS), is proposed as a method of generating better optical surface specifications. Data show that surfaces with average BRDF values near 10 parts per billion per steradian (0.010 PPM/sr) for 0-(301 = 0.5 are now possible and measurable.
Extreme events, trends, and variability in Northern Hemisphere lake-ice phenology (1855-2005)
Benson, Barbara J.; Magnuson, John J.; Jensen, Olaf P.; Card, Virginia M.; Hodgkins, Glenn; Korhonen, Johanna; Livingstone, David M.; Stewart, Kenton M.; Weyhenmeyer, Gesa A.; Granin, Nick G.
2012-01-01
Extreme events, more than changes in mean conditions, often have the greatest impact on the environment and human well-being. Here we examine changes in the occurrence of extremes in the timing of the annual formation and disappearance of lake ice in the Northern Hemisphere. Both changes in the mean condition and changes in variability around the mean can alter the probability of extreme events. Using long-term ice phenology data covering two periods, 1855–56 to 2004–05 and 1905–06 to 2004–05, for a total of 75 lakes, we examined patterns in long-term trends and variability in the context of understanding the occurrence of extreme events. We also examined patterns in trends for a 30-year subset (1975–76 to 2004–05) of the 100-year data set. Trends for ice variables in the recent 30-year period were steeper than those in the 100- and 150-year periods, and trends in the 150-year period were steeper than those in the 100-year period. Rates of change (days per decade) among time periods based on linear regression ranged from 0.3 to 1.6 later for freeze, 0.5 to 1.9 earlier for breakup, and 0.7 to 4.3 shorter for duration. For the most part, the standard deviation did not change, or it decreased, in the 150-year and 100-year periods. During the recent 50-year period, the standard deviation calculated in 10-year windows increased for all ice measures. For the 150-year and 100-year periods, changes in the mean ice dates, rather than changes in variability, most strongly influenced the significant increases in the frequency of extreme lake-ice events associated with warmer conditions and the decreases in the frequency of extreme events associated with cooler conditions.
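The two quantities tracked above, linear trends expressed in days per decade and standard deviations in 10-year moving windows, can be sketched as follows (function names are illustrative, not the authors' code):

```python
import numpy as np

def decadal_trend(years, dates):
    """Linear trend in an ice date (day of year), in days per decade,
    from a least-squares fit over the chosen period."""
    slope, _intercept = np.polyfit(years, dates, 1)
    return 10.0 * slope  # days/year -> days/decade

def rolling_std(values, window=10):
    """Sample standard deviation in sliding windows (here 10 years),
    used to track changes in variability around the mean condition."""
    v = np.asarray(values, dtype=float)
    return np.array([v[i:i + window].std(ddof=1)
                     for i in range(len(v) - window + 1)])
```

A negative decadal trend for breakup date, for instance, corresponds to the "earlier breakup" rates quoted in the abstract.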
Greven, Corina U; Merwood, Andrew; van der Meer, Jolanda M J; Haworth, Claire M A; Rommelse, Nanda; Buitelaar, Jan K
2016-04-01
Although attention deficit hyperactivity disorder (ADHD) is thought to reflect a continuously distributed quantitative trait, it is assessed through binary diagnosis or skewed measures biased towards its high, symptomatic extreme. A growing trend is to study the positive tail of normally distributed traits; studying high intelligence, for example, increases power for gene-hunting for intelligence. However, the emergence of such a 'positive genetics' model has been tempered for ADHD by poor phenotypic resolution at the low extreme. Overcoming this methodological limitation, we conduct the first study to assess the aetiologies of low extreme ADHD traits. In a population-representative sample of 2,143 twins, the Strengths and Weaknesses of ADHD Symptoms and Normal Behaviour (SWAN) questionnaire was used to assess ADHD traits on a continuum from low to high. Aetiological influences on extreme ADHD traits were estimated using DeFries-Fulker extremes analysis. ADHD traits were related to behavioural, cognitive, and home environmental outcomes using regression. Low extreme ADHD traits were significantly influenced by shared environmental factors (23-35%) but were not significantly heritable. In contrast, high extreme ADHD traits showed significant heritability (39-51%) but no shared environmental influences. Compared with individuals with high extreme or average levels of ADHD traits, individuals with low extreme ADHD traits showed fewer internalizing and externalizing behaviour problems, better cognitive performance, more positive behaviours, and more positive home environmental outcomes. Shared environmental influences on low extreme ADHD traits may reflect passive gene-environment correlation, which arises because parents provide environments as well as passing on genes. Studying the low extreme opens new avenues for studying mechanisms underlying previously neglected positive behaviours.
This is different from the current deficit-based model of intervention, but congruent with a population-level approach to improving youth wellbeing. © 2015 The Authors. Journal of Child Psychology and Psychiatry published by John Wiley & Sons Ltd on behalf of Association for Child and Adolescent Mental Health.
Highly Accreting Quasars at High Redshift
NASA Astrophysics Data System (ADS)
Martínez-Aldama, Mary L.; Del Olmo, Ascensión; Marziani, Paola; Sulentic, Jack W.; Negrete, C. Alenka; Dultzin, Deborah; Perea, Jaime; D'Onofrio, Mauro
2017-12-01
We present preliminary results of a spectroscopic analysis for a sample of type 1 highly accreting quasars (L/LEdd > 0.2) at high redshift, z ≈ 2-3. The quasars were observed with the OSIRIS spectrograph on the GTC 10.4 m telescope located at the Observatorio del Roque de los Muchachos in La Palma. The highly accreting quasars were identified using the 4D Eigenvector 1 formalism, which is able to organize type 1 quasars over a broad range of redshift and luminosity. The kinematic and physical properties of the broad-line region have been derived by fitting the profiles of strong UV emission lines such as AlIII, SiIII] and CIII]. The majority of our sources show strong blueshifts in the high-ionization lines and high Eddington ratios, which are related to the production of outflows. The importance of highly accreting quasars goes beyond a detailed understanding of their physics: their extreme Eddington ratios make them candidate standard candles for cosmological studies.
Almonroeder, Thomas Gus; Kernozek, Thomas; Cobb, Stephen; Slavens, Brooke; Wang, Jinsung; Huddleston, Wendy
2018-05-01
Study Design Cross-sectional study. Background The drop vertical jump task is commonly used to screen for anterior cruciate ligament injury risk; however, its predictive validity is limited. The limited predictive validity of the drop vertical jump task may be due to not imposing the cognitive demands that reflect sports participation. Objectives To investigate the influence of additional cognitive demands on lower extremity mechanics during execution of the drop vertical jump task. Methods Twenty uninjured women (age range, 18-25 years) were required to perform the standard drop vertical jump task, as well as drop vertical jumps that included additional cognitive demands. The additional cognitive demands were related to attending to an overhead goal (ball suspended overhead) and/or temporal constraints on movement selection (decision making). Three-dimensional ground reaction forces and lower extremity mechanics were compared between conditions. Results The inclusion of the overhead goal resulted in higher peak vertical ground reaction forces and lower peak knee flexion angles in comparison to the standard drop vertical jump task. In addition, participants demonstrated greater peak knee abduction angles when trials incorporated temporal constraints on decision making and/or required participants to attend to an overhead goal, in comparison to the standard drop vertical jump task. Conclusion Imposing additional cognitive demands during execution of the drop vertical jump task influenced lower extremity mechanics in a manner that suggested increased loading of the anterior cruciate ligament. Tasks utilized in anterior cruciate ligament injury risk screening may benefit from more closely reflecting the cognitive demands of the sports environment. J Orthop Sports Phys Ther 2018;48(5):381-387. Epub 10 Jan 2018. doi:10.2519/jospt.2018.7739.
Arctic daily temperature and precipitation extremes: Observed and simulated physical behavior
NASA Astrophysics Data System (ADS)
Glisan, Justin Michael
Simulations using a six-member ensemble of Pan-Arctic WRF (PAW) were produced on two Arctic domains with 50-km resolution to analyze precipitation and temperature extremes for various periods. The first study used a domain developed for the Regional Arctic Climate Model (RACM). Initial simulations revealed deep atmospheric circulation biases over the northern Pacific Ocean, manifested in the pressure, geopotential height, and temperature fields. Possible remedies for these large biases, such as modifying the physical domain or using different initial/boundary conditions, were unsuccessful. Spectral (interior) nudging was introduced as a way of constraining the model to be more consistent with observed behavior. However, such control over numerical model behavior raises concerns over how much nudging may affect unforced variability and extremes. Strong nudging may reduce or filter out extreme events, since the nudging pushes the model toward a relatively smooth, large-scale state. The question then becomes: what is the minimum spectral nudging needed to correct biases while not limiting the simulation of extreme events? To determine this, we used varying degrees of spectral nudging, with WRF's standard nudging as a reference point, during January and July 2007. Results suggest a marked lack of sensitivity to the degree of nudging. Moreover, given that nudging is an artificial forcing applied in the model, an important outcome of this work is that the nudging strength apparently can be considerably smaller than WRF's standard strength and still produce reliable simulations. In the remaining studies, we used the same PAW setup to analyze daily precipitation extremes simulated over a 19-year period on the CORDEX Arctic domain for winter and summer. We defined these seasons as the three-month periods leading up to and including the climatological sea-ice maximum and minimum, respectively.
Analysis focused on four North American regions defined using climatological records, regional weather patterns, and geographical/topographical features. We compared simulated extremes with those occurring at corresponding observing stations in the U.S. National Climatic Data Center's (NCDC) Global Summary of the Day. Our analysis focused on variations in features of the extremes such as magnitudes, spatial scales, and temporal regimes. Using composites of extreme events, we also analyzed the processes producing these extremes, comparing circulation, pressure, temperature, and humidity fields from the ERA-Interim reanalysis and the model output. The analysis revealed the importance of atmospheric convection in the Arctic for some extreme precipitation events and the overall importance of topographic precipitation. It established the physical credibility of the simulations for extreme behavior, laying a foundation for examining projected changes in extreme precipitation, and it highlighted the utility of the model for extracting behavior that cannot be discerned directly from observations, such as summer convective precipitation.
The effect of prenatal support on birth outcomes in an urban midwestern county.
Schlenker, Thomas; Dresang, Lee T; Ndiaye, Mamadou; Buckingham, William R; Leavitt, Judith W
2012-12-01
In Dane County, Wisconsin, the black-white infant mortality gap began decreasing in 2000 and was eliminated from 2004 to 2007. Unfortunately, it has reappeared since 2008. This paper examines risk factors and levels of prenatal care to identify key contributors to the dramatic decline and recent increase in black infant mortality and extremely premature birth rates. This retrospective cohort study analyzed approximately 100,000 Dane County birth, fetal, and infant death records from 1990 to 2007. Levels of prenatal care received were categorized as "less-than-standard," "standard routine," or "intensive." US Census data analysis identified demographic and socioeconomic changes. Infant mortality rates and extremely premature (≤28 weeks' gestation) birth rates were the main outcome measures. Contributions to improved outcomes were measured by calculating relative risk, risk difference, and population attributable fraction (PAF). Mean income and food stamp use by race were analyzed as indicators of the general socioeconomic changes suspected of being responsible for worsening outcomes since 2008. The risk of extremely premature delivery for black women receiving standard routine care and intensive care decreased from 1990-2000 to 2001-2007 by 77.8% (95% CI, 49.9-90.1%) and 57.3% (95% CI, 27.6-74.8%), respectively. Women receiving less-than-standard care showed no significant improvement over time. Racial gaps in mean income and food stamp use narrowed from 2002 to 2007 and have widened since 2008. Prenatal support played an important role in improving black birth outcomes and eliminating the Dane County black-white infant mortality gap. Increasing socioeconomic disparities with the worsening US economy since 2008 likely contributed to the gap's reappearance.
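The relative risk, risk difference, and population attributable fraction used in this analysis can all be computed from cohort counts; a minimal sketch (illustrative function, not the authors' code):

```python
def risk_measures(exposed_cases, exposed_total, unexposed_cases, unexposed_total):
    """Relative risk, risk difference, and population attributable
    fraction from 2x2 cohort counts."""
    r_e = exposed_cases / exposed_total      # risk in the exposed group
    r_u = unexposed_cases / unexposed_total  # risk in the unexposed group
    rr = r_e / r_u                           # relative risk
    rd = r_e - r_u                           # risk difference
    r_all = (exposed_cases + unexposed_cases) / (exposed_total + unexposed_total)
    paf = (r_all - r_u) / r_all              # fraction of all cases attributable to exposure
    return rr, rd, paf
```

For example, 20 cases among 100 exposed versus 10 among 100 unexposed gives a relative risk of 2.0, a risk difference of 0.10, and a PAF of one third.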
The promise and challenge of high-throughput sequencing of the antibody repertoire
Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R
2014-01-01
Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474
NASA Astrophysics Data System (ADS)
Horton, R. M.; Coffel, E.; Kushnir, Y.
2014-12-01
Recent years have seen an increasing focus on extreme high-temperature events as our understanding of societal vulnerability to such extremes has grown. Less climate research has been devoted to heat indices that consider the joint hazard posed by high temperatures and high humidity, even though heat indices are being prioritized by utility providers and public health officials. This paper evaluates how well CMIP5 models are able to reproduce the large-scale features and surface conditions associated with joint high-heat-and-humidity events in the Northeast U.S. Projected changes in heat indices are also shown, both for the full set of CMIP5 models and for a subset of models that best reproduce the statistics of historical high-heat-index events. The importance of considering (1) the relationship between temperature and humidity extremes and (2) projected changes in temperature and humidity extremes jointly, rather than investigating each variable independently, will be emphasized. Potential impacts of the findings on human mortality and energy consumption will be briefly discussed.
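A heat index of the kind discussed above combines temperature and humidity into a single hazard measure. The sketch below uses the Rothfusz regression employed by the US National Weather Service; this is one common choice, and the abstract does not specify which index the authors use:

```python
def heat_index_f(temp_f, rh_pct):
    """NWS heat index (degrees F) via the Rothfusz regression, valid
    roughly for temp_f >= 80 and rh_pct >= 40; outside that range the
    NWS applies adjustments not included in this sketch."""
    t, r = temp_f, rh_pct
    return (-42.379 + 2.04901523 * t + 10.14333127 * r
            - 0.22475541 * t * r - 6.83783e-3 * t * t
            - 5.481717e-2 * r * r + 1.22874e-3 * t * t * r
            + 8.5282e-4 * t * r * r - 1.99e-6 * t * t * r * r)
```

At 90 °F and 70% relative humidity the index is around 106 °F, illustrating how humidity pushes the perceived hazard well above the air temperature alone.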
Chiba, Satoshi
1999-04-01
The endemic land snail genus Mandarina of the oceanic Bonin (Ogasawara) Islands shows exceptionally rapid evolution not only of morphological and ecological traits but also of DNA sequence. A phylogeny based on mitochondrial DNA (mtDNA) sequences suggests that morphological differences equivalent to those between families were produced between Mandarina and its ancestor during the Pleistocene. The inferred phylogeny shows that species with similar morphologies and life habitats appeared repeatedly and independently in different lineages and on different islands at different times. Sequential adaptive radiations occurred on different islands of the Bonin Islands, and species occupying arboreal, semiarboreal, and terrestrial habitats arose independently on each island. Because of the close relationship between shell morphology and life habitat, independent evolution of the same life habitat on different islands created species possessing the same shell morphology on different islands and in different lineages. This rapid evolution produced some incongruences between the phylogenetic relationships and species taxonomy. The level of sequence divergence of mtDNA among the species of Mandarina is extremely high: the maximum levels of sequence divergence in the 16S and 12S ribosomal RNA sequences within Mandarina are 18.7% and 17.7%, respectively, suggesting that the evolution of mtDNA in Mandarina is extremely rapid, more than 20 times faster than the standard rate in other animals. The present examination reveals that evolution of morphological and ecological traits occurs at extremely high rates during adaptive radiation, especially in fragmented environments. © 1999 The Society for the Study of Evolution.
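The divergence percentages quoted above are pairwise sequence distances. The simplest, uncorrected version (the p-distance) can be computed as below; published estimates typically also apply a substitution-model correction, which is omitted here:

```python
def p_distance(seq1, seq2):
    """Uncorrected pairwise divergence: the fraction of differing sites
    between two aligned sequences, skipping alignment gaps."""
    pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
    diffs = sum(a != b for a, b in pairs)
    return diffs / len(pairs)
```

A divergence of 0.187 between two aligned 16S sequences corresponds to the 18.7% maximum reported for Mandarina.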
The NuSTAR view on hard-TeV BL Lacs
NASA Astrophysics Data System (ADS)
Costamante, L.; Bonnoli, G.; Tavecchio, F.; Ghisellini, G.; Tagliaferri, G.; Khangulyan, D.
2018-07-01
Hard-TeV BL Lacs are a new type of blazar characterized by a hard intrinsic TeV spectrum, which locates the peak of their gamma-ray emission in the spectral energy distribution (SED) above 2-10 TeV. Such high energies are problematic for Compton emission in a standard one-zone leptonic model. We study six examples of this new type of BL Lac in the hard X-ray band with NuSTAR. Together with simultaneous observations with the Neil Gehrels Swift Observatory, we fully constrain the peak of the synchrotron emission in their SED and test the leptonic synchrotron self-Compton (SSC) model. We confirm the extreme nature of five objects also in their synchrotron emission. We do not find evidence of additional emission components in the hard X-ray band. We find that a one-zone SSC model can in principle reproduce the extreme properties of both peaks in the SED, from X-ray up to TeV energies, but at the cost of (i) extreme electron energies with very low radiative efficiency, (ii) conditions heavily out of equipartition (by three to five orders of magnitude), and (iii) not accounting for the simultaneous UV data, which should then belong to a different emission component, possibly the same one as the far-IR (WISE) data. We find evidence of this separation of the UV and X-ray emission in at least two objects. In any case, the TeV electrons must not `see' the UV or lower-energy photons, even if these come from different zones/populations, or the increased radiative cooling would steepen the very-high-energy spectrum.
NASA Astrophysics Data System (ADS)
Shouquan Cheng, Chad; Li, Qian; Li, Guilong
2010-05-01
The synoptic weather typing approach has become popular for evaluating the impacts of climate change on a variety of environmental problems. One reason is its ability to categorize a complex set of meteorological variables as a coherent index, which can facilitate analyses of local climate change impacts. The weather typing method has been applied successfully at Environment Canada in several research projects analyzing climate change impacts on a number of extreme weather events, such as freezing rain, heavy rainfall, high-/low-flow events, air pollution, and human health impacts. These studies comprise three major parts: (1) historical simulation modeling to verify the extreme weather events, (2) statistical downscaling to provide station-scale future hourly/daily climate data, and (3) projections of changes in the frequency and intensity of future extreme weather events in this century. To achieve these goals, in addition to synoptic weather typing, modeling conceptualizations from meteorology and hydrology and a number of linear/nonlinear regression techniques were applied. Furthermore, a formal model verification process was built into each of the three parts of the projects. The verification results, based on historical observations of the outcome variables predicted by the models, showed very good agreement. The modeled results from these projects indicate that the frequency and intensity of future extreme weather events are projected to increase significantly under a changing climate in this century. This talk will introduce these research projects and outline the modeling exercise and result verification process. The major findings on future projections will be summarized in the presentation as well. One of the major conclusions is that the procedures used in these studies (including synoptic weather typing) are useful for climate change impact analysis of future extreme weather events.
The implications of the significant increases in frequency and intensity of future extreme weather events would be useful to consider when revising engineering infrastructure design standards and developing adaptation strategies and policies.
Risk factors for lower extremity injuries in elite female soccer players.
Nilstad, Agnethe; Andersen, Thor Einar; Bahr, Roald; Holme, Ingar; Steffen, Kathrin
2014-04-01
The incidence of lower extremity injuries in female soccer players is high, but the risk factors for injuries are unknown. To investigate risk factors for lower extremity injuries in elite female soccer players. Cohort study; Level of evidence, 3. Players in the Norwegian elite female soccer league (N = 12 teams) participated in baseline screening tests before the 2009 competitive soccer season. The screening included tests assessing maximal lower extremity strength, dynamic balance, knee valgus angles in a drop-jump landing, knee joint laxity, generalized joint laxity, and foot pronation. Also included was a questionnaire to collect information on demographic data, elite-level experience, and injury history. Time-loss injuries and exposure in training and matches were recorded prospectively in the subsequent soccer season using weekly text messaging. Players reporting an injury were contacted to collect data regarding injury circumstances. Univariate and multivariate regression analyses were used to calculate odds ratios (ORs) and 95% confidence intervals (CIs) for ±1 standard deviation of change. In total, 173 players underwent complete screening tests and registration of injuries and exposure throughout the season. A total of 171 injuries in 107 players (62%) were recorded; ligament and muscle injuries were the most frequent. Multivariate analyses showed that a greater body mass index (BMI) (OR, 1.51; 95% CI, 1.21-1.90; P = .001) was the only factor significantly associated with new lower extremity injuries. A greater BMI was associated with new thigh injuries (OR, 1.51; 95% CI, 1.08-2.11; P = .01), a lower knee valgus angle in a drop-jump landing was associated with new ankle injuries (OR, 0.64; 95% CI, 0.41-1.00; P = .04), and a previous knee injury was associated with new lower leg and foot injuries (OR, 3.57; 95% CI, 1.27-9.99; P = .02), whereas none of the factors investigated influenced the risk of new knee injuries. 
A greater BMI was associated with lower extremity injuries in elite female soccer players. Increased knowledge of risk factors for lower extremity injuries enables more targeted prevention strategies aimed at reducing injury rates in female soccer players.
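The odds ratios in the study above are reported per 1-standard-deviation change in each screening measure. As a minimal sketch of that conversion (using hypothetical coefficient and SD values, not the study's data):

```python
import math

def odds_ratio_per_sd(beta_per_unit: float, sd: float) -> float:
    """Odds ratio for a one-standard-deviation increase in a predictor,
    given the fitted logistic-regression coefficient per unit of that predictor."""
    return math.exp(beta_per_unit * sd)

# Hypothetical values: coefficient 0.21 per kg/m^2 of BMI, BMI SD of 2.0 kg/m^2
print(round(odds_ratio_per_sd(0.21, 2.0), 2))  # → 1.52
```

An OR of 1.52 per SD would mean the odds of injury rise by about half for each SD increase in BMI, holding other covariates fixed.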
Functional outcomes and life satisfaction in long-term survivors of pediatric sarcomas.
Gerber, Lynn H; Hoffman, Karen; Chaudhry, Usha; Augustine, Elizabeth; Parks, Rebecca; Bernad, Martha; Mackall, Crystal; Steinberg, Seth; Mansky, Patrick
2006-12-01
Objective: To describe the inter-relationships among impairments, performance, and disabilities in survivors of pediatric sarcoma and to identify measurements that profile survivors at risk for functional loss. Design: Prospective, cross-sectional. Setting: Research facility. Participants: Thirty-two participants in National Cancer Institute clinical trials. Interventions: Not applicable. Main outcome measures: Range of motion (ROM), strength, limb volume, grip strength, walk velocity, Assessment of Motor and Process Skills (AMPS); Human Activity Profile (HAP), Sickness Impact Profile (SIP), standard form of the Medical Outcomes Study 36-Item Short-Form Health Survey (SF-36); and vocational attitudes and leisure satisfaction. Results: Twenty of 30 survivors tested had moderate or severe loss of ROM; 13 of 31 tested had 90% or less of predicted walk velocity, all of whom had trunk or lower-extremity lesions. Women with decreased ROM (r=.50, P=.06) or strength (r=.74, P=.002) had slow gait velocity. Sixteen of 31 tested were more than 1 standard deviation below normal grip strength. Eighteen had increased limb volume. These 18 had low physical competence (SF-36) (r=-.70, P=.001) and high SIP scores (r=.73, P=.005). AMPS scores were lower than those of the matched normed sample (P<.001). HAP identified 15 of 30 who had moderately or severely reduced activity. Leisure satisfaction was higher in the subjects (P<.001). Eight reported cancer had negatively impacted work and 17 reported that it negatively impacted vocational plans. Conclusions: Survivors with lower-extremity or truncal lesions and women with decreased ROM and strength likely have slow walk velocity, low exercise tolerance, and high risk for functional loss. They should be identified using ROM, strength, limb volume, and walk time measures.
Liang, Eryuan; Eckstein, Dieter
2009-09-01
Shrubs and dwarf shrubs are wider spread on the Tibetan Plateau than trees and hence offer a unique opportunity to expand the present dendrochronological network into extreme environments beyond the survival limit of trees. Alpine shrublands on the Tibetan Plateau are characterized by rhododendron species. The dendrochronological potential of one alpine rhododendron species and its growth response to the extreme environment on the south-east Tibetan Plateau were investigated. Twenty stem discs of the alpine snowy rhododendron (Rhododendron nivale) were collected close to the tongue of the Zuoqiupu Glacier in south-east Tibet, China. The skeleton plot technique was used for inter-comparison between samples to detect the growth pattern of each stem section. The ring-width chronology was developed by fitting a negative exponential function or a straight line of any slope. Bootstrapping correlations were calculated between the standard chronology and monthly climate data. The wood of snowy rhododendron is diffuse-porous with evenly distributed small-diameter vessels. It has well-defined growth rings. Most stem sections can be visually and statistically cross-dated. The resulting 75-year-long standard ring-width chronology is highly correlated with a timberline fir chronology about 200 km apart, providing a high degree of confidence in the cross-dating. The climate/growth association of alpine snowy rhododendron and of this timberline fir is similar, reflecting an impact of monthly mean minimum temperatures in November of the previous year and in July during the year of ring formation. The alpine snowy rhododendron offers new research directions to investigate the environmental history of the Tibetan Plateau in those regions where up to now there was no chance of applying dendrochronology.
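The bootstrapped climate/growth correlations used above can be sketched generically; this illustration uses synthetic data, not the rhododendron chronology:

```python
import numpy as np

rng = np.random.default_rng(42)

def bootstrap_corr_ci(x, y, n_boot=2000, alpha=0.05):
    """Percentile-bootstrap confidence interval for the Pearson correlation
    between a ring-width chronology x and a monthly climate series y."""
    n = len(x)
    idx = rng.integers(0, n, size=(n_boot, n))  # resample years with replacement
    rs = np.array([np.corrcoef(x[i], y[i])[0, 1] for i in idx])
    return np.percentile(rs, [100 * alpha / 2, 100 * (1 - alpha / 2)])

# Synthetic 75-year chronology correlated with a temperature series
temp = rng.normal(size=75)
rings = 0.7 * temp + rng.normal(scale=0.5, size=75)
lo, hi = bootstrap_corr_ci(rings, temp)
print(lo > 0)  # a CI excluding zero marks a significant climate/growth association
```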
NASA Astrophysics Data System (ADS)
Matsuno, Tadafumi; Aoki, Wako; Beers, Timothy C.; Lee, Young Sun; Honda, Satoshi
2017-08-01
We present elemental abundances for eight unevolved extremely metal-poor (EMP) stars with T_eff > 5500 K, among which seven have [Fe/H] < -3.5. The sample is selected from the Sloan Digital Sky Survey/Sloan Extension for Galactic Understanding and Exploration (SDSS/SEGUE) and our previous high-resolution spectroscopic follow-up with the Subaru Telescope. Several methods to derive stellar parameters are compared, and no significant offset in the derived parameters is found in most cases. From an abundance analysis relative to the standard EMP star G64-12, an average Li abundance for stars with [Fe/H] < -3.5 is A(Li) = 1.90, with a standard deviation of σ = 0.10 dex. This result confirms that lower Li abundances are found at lower metallicity, as suggested by previous studies, and demonstrates that the star-to-star scatter is small. The small observed scatter could be a strong constraint on Li-depletion mechanisms proposed for explaining the low Li abundance at lower metallicity. Our analysis for other elements obtained the following results: (i) a statistically significant scatter in [X/Fe] for Na, Mg, Cr, Ti, Sr, and Ba, and an apparent bimodality in [Na/Fe] with a separation of ~0.8 dex; (ii) an absence of a sharp drop in the metallicity distribution; and (iii) the existence of a CEMP-s star at [Fe/H] ≃ -3.6 and possibly at [Fe/H] ≃ -4.0, which may provide a constraint on the mixing efficiency of unevolved stars during their main-sequence phase. Based on data collected with the Subaru Telescope, which is operated by the National Astronomical Observatory of Japan.
A Stellar-mass Black Hole in the Ultra-luminous X-ray Source M82 X-1
NASA Technical Reports Server (NTRS)
Okajima, Takashi; Ebisawa, Ken; Kawaguchi, Toshihiro
2007-01-01
We have analyzed the archival XMM-Newton data of the archetypal Ultra-Luminous X-ray Source (ULX) M82 X-1 with a 105 ksec exposure when the source was in the steady state. Thanks to the high photon statistics from the large effective area and long exposure, we were able to discriminate different X-ray continuum spectral models. Neither the standard accretion disk model (where the radial dependence of the disk effective temperature is T(r) ∝ r^(-3/4)) nor a power-law model gives a satisfactory fit. In fact, the observed curvature of the M82 X-1 spectrum was just between those of the two models. When the exponent of the radial dependence (p in T(r) ∝ r^(-p)) of the disk temperature is allowed to be free, we obtained p = 0.61 (+0.03/-0.02). Such a reduction of p from the standard value 3/4 under extremely high mass accretion rates is predicted from accretion disk theory as a consequence of radial energy advection. Thus, the accretion disk in M82 X-1 is considered to be in the slim disk state, where an optically thick Advection Dominated Accretion Flow (ADAF) is taking place. We have applied a theoretical slim disk spectral model to M82 X-1, and estimated the black hole mass to be approximately 19-32 solar masses. We conclude that M82 X-1 is a stellar-mass black hole which has been produced through the evolution of an extremely massive star, shining at several times its Eddington luminosity.
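The effect of a smaller temperature exponent p can be illustrated directly; the numbers below are illustrative only, with radius measured in units of the inner disk radius:

```python
def disk_temperature(r: float, t_in: float, p: float) -> float:
    """Disk effective temperature profile T(r) = T_in * (r / r_in)**(-p),
    with r expressed in units of the inner radius r_in."""
    return t_in * r ** (-p)

# At r = 16 r_in, a standard disk (p = 3/4) has cooled to 1/8 of T_in,
# while a flatter profile (p = 0.61) stays noticeably hotter.
print(round(disk_temperature(16.0, 1.0, 0.75), 3))  # → 0.125
print(round(disk_temperature(16.0, 1.0, 0.61), 3))  # → 0.184
```

A flatter profile keeps the outer disk hotter, broadening the spectrum relative to the standard model, which is why a free-p fit can distinguish the slim-disk state.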
Effects of Extreme Temperatures on Cause-Specific Cardiovascular Mortality in China
Wang, Xuying; Li, Guoxing; Liu, Liqun; Westerdahl, Dane; Jin, Xiaobin; Pan, Xiaochuan
2015-01-01
Objective: Limited evidence is available for the effects of extreme temperatures on cause-specific cardiovascular mortality in China. Methods: We collected data from Beijing and Shanghai, China, during 2007–2009, including the daily mortality of cardiovascular disease, cerebrovascular disease, ischemic heart disease and hypertensive disease, as well as air pollution concentrations and weather conditions. We used Poisson regression with a distributed lag non-linear model to examine the effects of extremely high and low ambient temperatures on cause-specific cardiovascular mortality. Results: For all cause-specific cardiovascular mortality, Beijing had stronger cold and hot effects than those in Shanghai. The cold effects on cause-specific cardiovascular mortality reached the strongest at lag 0–27, while the hot effects reached the strongest at lag 0–14. The effects of extremely low and high temperatures differed by mortality types in the two cities. Hypertensive disease in Beijing was particularly susceptible to both extremely high and low temperatures; while for Shanghai, people with ischemic heart disease showed the greatest relative risk (RRs = 1.16, 95% CI: 1.03, 1.34) to extremely low temperature. Conclusion: People with hypertensive disease were particularly susceptible to extremely low and high temperatures in Beijing. People with ischemic heart disease in Shanghai showed greater susceptibility to extremely cold days.
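Relative risks like the one above are estimated on the log scale and exponentiated together with their confidence bounds; a minimal sketch with hypothetical coefficient values (not the study's estimates):

```python
import math

def rr_with_ci(log_rr: float, se: float, z: float = 1.96):
    """Point estimate and 95% CI for a relative risk fitted on the log scale."""
    return (math.exp(log_rr),
            math.exp(log_rr - z * se),
            math.exp(log_rr + z * se))

# Hypothetical log-RR of 0.15 with standard error 0.07
rr, lo, hi = rr_with_ci(0.15, 0.07)
print(round(rr, 2), round(lo, 2), round(hi, 2))  # → 1.16 1.01 1.33
```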
Hayes, Heather A; Gappmaier, Eduard; LaStayo, Paul C
2011-03-01
Resistance exercise via negative, eccentrically induced work (RENEW) has been shown to be associated with improvements in strength, mobility, and balance in multiple clinical populations. However, RENEW has not been reported for individuals with multiple sclerosis (MS). Nineteen individuals with MS (8 men, 11 women; age mean = 49 ± 11 years; Expanded Disability Status Scale [EDSS] mean = 5.2 ± 0.9) were randomized into either standard exercise (STAND) or standard exercise plus RENEW training (RENEW), 3×/week for 12 weeks. Outcome measures were lower extremity strength (hip/knee flexion and extension, ankle plantarflexion and dorsiflexion, and the sum of these individual values [sum strength]); Timed Up and Go (TUG); 10-m walk at self-selected (TMWSS) and maximal (TMWMP) pace; stair ascent (S-A) and descent (S-D); 6-Minute Walk Test (6MWT); Berg Balance Scale (BBS); and Fatigue Severity Scale (FSS). No significant time effects or interactions were observed for strength, TUG, TMWSS, TMWMP, or 6MWT. However, the mean difference in sum strength was 38.60 (a 15% increase) in the RENEW group, compared with 5.58 (a 2% increase) in the STAND group. A significant interaction was observed for S-A, S-D, and BBS: the STAND group improved, whereas the RENEW group did not. Contrary to results in other populations, the addition of eccentric training to standard exercises did not result in significantly greater lower extremity strength gains in this group of individuals with MS. Further, this training was not as effective as standard exercise alone in improving balance or the ability to ascend and descend stairs. Following data collection, reassessment of the required sample size indicated that the study was likely underpowered to detect strength differences between groups.
Dudek, Dominika; Siwek, Marcin; Jaeschke, Rafał; Drozdowicz, Katarzyna; Styczeń, Krzysztof; Arciszewska, Aleksandra; Chrobak, Adrian A; Rybakowski, Janusz K
2016-06-01
We hypothesised that men and women who engage in extreme or high-risk sports would score higher on standardised measures of bipolarity and impulsivity compared with age- and gender-matched controls. Four hundred and eighty extreme or high-risk athletes (255 males and 225 females) and 235 age-matched control persons (107 males and 128 females) were enrolled into the web-based case-control study. The Mood Disorder Questionnaire (MDQ) and Barratt Impulsiveness Scale (BIS-11) were administered to screen for bipolarity and impulsive behaviours, respectively. Results indicated that extreme or high-risk athletes had significantly higher bipolarity and impulsivity scores, and lower scores on the cognitive complexity component of the BIS-11, compared with controls. Further, there were positive correlations between the MDQ and BIS-11 scores. These results showed greater rates of bipolarity and impulsivity in the extreme or high-risk athletes, suggesting these measures are sensitive to high-risk behaviours.
Yin, Qian; Wang, Jinfeng
2017-02-23
Although many studies have examined the effects of heat waves on the excess mortality risk (ER) posed by cardiovascular disease (CVD), scant attention has been paid to the effects of various combinations of differing heat wave temperatures and durations. We investigated such effects in Beijing, a city of over 20 million residents. A generalized additive model (GAM) was used to analyze the ER of consecutive days' exposure to extremely high temperatures. A key finding was that when extremely high temperatures occur continuously, the adverse effects on CVD mortality vary significantly with the temperature threshold and duration: the longer the heat wave lasts, the greater the mortality risk. When the daily maximum temperature exceeded 35 °C from the fourth day onward, the ER attributed to consecutive days' high-temperature exposure rose to about 10% (p < 0.05), and by the fifth day the ER reached 51%. For the thresholds of 32 °C, 33 °C, and 34 °C, from the fifth day onward, the ER also rose sharply (16, 29, and 31%, respectively; p < 0.05). In addition, extremely high temperatures appeared to contribute to a higher proportion of CVD deaths among elderly persons, females, and outdoor workers. When the daily maximum temperature was higher than 33 °C from the tenth consecutive day onward, the ER of CVD death among these groups was 94, 104, and 149%, respectively (p < 0.05), considerably higher than the ER for the overall population (87%; p < 0.05). The results of this study may assist governments in setting standards for heat waves, creating more accurate heat alerts, and taking measures to prevent or reduce temperature-related deaths, especially against the backdrop of global warming.
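The ER percentages above are the usual (RR - 1) * 100 conversion of a model relative risk; as a small sketch:

```python
def excess_risk_percent(rr: float) -> float:
    """Excess risk (%) implied by a relative risk: ER = (RR - 1) * 100."""
    return (rr - 1.0) * 100.0

# A relative risk of 1.51 corresponds to a 51% excess risk
print(round(excess_risk_percent(1.51)))  # → 51
```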
Gao, Yuan; Zhang, Haijun; Zou, Lili; Wu, Ping; Yu, Zhengkun; Lu, Xianbo; Chen, Jiping
2016-04-05
Analysis of short-chain chlorinated paraffins (SCCPs) is extremely difficult because of their complex compositions, with thousands of isomers and homologues. A novel analytical method, deuterodechlorination combined with high-resolution gas chromatography-high-resolution mass spectrometry (HRGC-HRMS), was developed. In this protocol, SCCPs are deuterodechlorinated with LiAlD4, and the resulting deuterated n-alkanes of different chain lengths can be readily distinguished from each other on the basis of their retention time and fragment mass ([M](+)) by HRGC-HRMS. Internal standard quantification of individual SCCP congeners was achieved, in which branched C10-CPs and branched C12-CPs were used as the extraction and reaction internal standards, respectively. Concentrations determined by this method deviated from target SCCP concentrations by a factor of at most 1.26, and the relative standard deviations for quantification of total SCCPs were within 10%. After method validation, this method was applied to determine the congener compositions of SCCPs in commercial chlorinated paraffins and in environmental and biota samples. Low-chlorinated SCCP congeners (Cl1-4) were found to account for 32.4%-62.4% of the total SCCPs. The present method provides an attractive perspective for further studies on the toxicological and environmental characteristics of SCCPs.
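Internal-standard quantification of this kind follows the usual response-ratio arithmetic; a generic sketch with hypothetical peak areas and response factor (not the paper's calibration data):

```python
def internal_standard_conc(area_analyte: float, area_is: float,
                           conc_is: float, rrf: float) -> float:
    """Analyte concentration from internal-standard calibration:
    c_analyte = (A_analyte / A_IS) * c_IS / RRF,
    where RRF is the relative response factor (analyte vs. internal standard)."""
    return (area_analyte / area_is) * conc_is / rrf

# Hypothetical run: analyte peak 2.4e5, IS peak 3.0e5, IS spiked at 50 ng/mL, RRF 0.8
print(round(internal_standard_conc(2.4e5, 3.0e5, 50.0, 0.8), 2))  # → 50.0
```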
Evaluation of ground-based particulate matter in association with measurements from space
NASA Astrophysics Data System (ADS)
Nakata, Makiko; Yoshida, Akihito; Sano, Itaru; Mukai, Sonoyo
2017-10-01
Air pollution is a problem of deep concern for human health. In Japan, the air pollution levels experienced during the period of rapid economic growth have been reduced. However, fine particulate matter (PM2.5) has not yet met the environmental standard at many monitoring stations. The Japanese environmental quality standard for PM2.5, ratified in 2009, lags about four decades behind those for other air pollutants, including sulfur dioxide, nitrogen dioxide, carbon monoxide, photochemical oxidants, and suspended particulate matter. Recently, transnational air pollutants have been observed to cause high concentrations of PM2.5 in Japan. To obtain the wide-area distribution of PM2.5, satellite-based PM2.5 products are extremely useful. We investigate PM2.5 concentrations measured using ground samplers in Japan together with the satellite-based PM2.5 products, taking into consideration various geographical and weather conditions.
Druzhinin, V N; Shardakova, É F; Cherniĭ, A N
2014-01-01
Multiple X-ray methods were used to study the influence of combined work-process and occupational-environment factors on the locomotor apparatus of the upper limbs and cervical spine in female seamers engaged in various productions. The comparative analysis covered results of regular (standard X-ray) and special X-ray methods (stereoroentgenography, high-definition roentgenography, roentgen densitometry, roentgenogrammetry) in 370 examinees with early and moderate clinical symptoms of occupationally mediated diseases of the stated areas. X-ray studies of the locomotor apparatus of the upper limbs and cervical spine in clothing-manufacture workers, using the special diagnostic methods, made it possible to determine the incidence and severity of functional and structural changes more reliably than standard examination. The changes revealed were mostly assigned to the "early" and "moderate" categories and correlated with the occupational characteristics of the workers examined.
NASA Astrophysics Data System (ADS)
Santillan, Julius Joseph; Itani, Toshiro
2013-06-01
The characterization of resist dissolution is one fundamental area of research that has been continuously investigated. This paper focuses on preliminary work applying the high-speed atomic force microscope (HS-AFM) to the in situ dissolution analysis of half-pitch (hp) lines and spaces (L/S) at standard developer concentration. In earlier works this had been difficult, but through extensive optimization and the use of carbon-nanofiber-tipped cantilevers, the dissolution characterization of a 32 nm hp L/S pattern in 0.26 N aqueous tetramethylammonium hydroxide developer (standard developer concentration) was successfully achieved. Based on the results obtained using the EIDEC standard resist (ESR1), it was found that regardless of analysis conditions such as resist pattern configuration (isolated or L/S pattern) and developer concentration (diluted or standard), similar dissolution characteristics in the form of resist swelling of exposed areas were observed. Moreover, further investigations using other model resist polymer platforms, such as poly(hydroxystyrene) (PHS)-based and hybrid (PHS-methacryl)-based model resists, have confirmed that the dissolution behavior is not affected by the analysis conditions applied.
A biographical study of food choice capacity: standards, circumstances, and food management skills.
Bisogni, Carole A; Jastran, Margaret; Shen, Luana; Devine, Carol M
2005-01-01
Objective: Conceptual understanding of how management of food and eating is linked to life course events and experiences. Design: Individual qualitative interviews with adults in upstate New York. Participants: Fourteen men and 11 women with moderate to low incomes. Phenomenon of interest: Food choice capacity. Analysis: Constant comparative method. Results: A conceptual model of food choice capacity emerged. Food choice capacity represented participants' confidence in meeting their standards for food and eating given their food management skills and circumstances. Standards (expectations for how participants felt they should eat) were based on life course events and experiences. Food management skills (mental and physical talents to keep food costs down and prepare meals) were sources of self-esteem for many participants. Most participants had faced challenging and changing circumstances (income, employment, social support, roles, health conditions). Participants linked strong food management skills with high levels of food choice capacity, except in the case of extreme financial circumstances or the absence of strong standards. Conclusions and implications: Recognizing people's experiences and perspectives in food choice is important. Characterizing food management skills as durable, adaptive resources positions them conceptually for researchers and in a way that practitioners can apply in developing programs for adults.
How Historical Information Can Improve Extreme Value Analysis of Coastal Water Levels
NASA Astrophysics Data System (ADS)
Le Cozannet, G.; Bulteau, T.; Idier, D.; Lambert, J.; Garcin, M.
2016-12-01
The knowledge of extreme coastal water levels is useful for coastal flooding studies or the design of coastal defences. While deriving such extremes with standard analyses using tide gauge measurements, one often needs to deal with limited effective duration of observation which can result in large statistical uncertainties. This is even truer when one faces outliers, those particularly extreme values distant from the others. In a recent work (Bulteau et al., 2015), we investigated how historical information of past events reported in archives can reduce statistical uncertainties and relativize such outlying observations. We adapted a Bayesian Markov Chain Monte Carlo method, initially developed in the hydrology field (Reis and Stedinger, 2005), to the specific case of coastal water levels. We applied this method to the site of La Rochelle (France), where the storm Xynthia in 2010 generated a water level considered so far as an outlier. Based on 30 years of tide gauge measurements and 8 historical events since 1890, the results showed a significant decrease in statistical uncertainties on return levels when historical information is used. Also, Xynthia's water level no longer appeared as an outlier and we could have reasonably predicted the annual exceedance probability of that level beforehand (predictive probability for 2010 based on data until the end of 2009 of the same order of magnitude as the standard estimative probability using data until the end of 2010). Such results illustrate the usefulness of historical information in extreme value analyses of coastal water levels, as well as the relevance of the proposed method to integrate heterogeneous data in such analyses.
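Whatever the inference scheme (a standard fit or the Bayesian MCMC with historical information used here), return levels follow from the fitted GEV parameters. A minimal sketch of the return-level formula, with illustrative parameter values rather than the La Rochelle fit:

```python
import math

def gev_return_level(mu: float, sigma: float, xi: float, t_years: float) -> float:
    """T-year return level of a GEV(mu, sigma, xi) distribution (xi != 0):
    z_T = mu + (sigma / xi) * ((-ln(1 - 1/T))**(-xi) - 1)."""
    y = -math.log(1.0 - 1.0 / t_years)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# Illustrative parameters (metres): the 100-year water level
print(round(gev_return_level(mu=4.0, sigma=0.3, xi=0.1, t_years=100.0), 2))  # → 5.75
```

Historical events enter the likelihood as additional (often censored) observations, tightening the estimates of mu, sigma, and xi and hence narrowing the uncertainty on z_T.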
Chang, Hyung Lan; Jung, Jin Hee; Kwak, Young Ho; Kim, Do Kyun; Lee, Jin Hee; Jung, Jae Yun; Kwon, Hyuksool; Paek, So Hyun; Park, Joong Wan; Shin, Jonghwan
2018-03-01
The aim of this study was to investigate the effectiveness of a quality improvement activity for pain management in patients with extremity injury in the emergency department (ED). This was a retrospective interventional study. The patient group consisted of those at least 19 years of age who visited the ED and were diagnosed with International Classification of Diseases codes S40-S99 (extremity injuries). The quality improvement activity consisted of three measures: a survey regarding activities, education, and the triage nurse's pain assessment, including a change of pain documentation in the electronic medical records. The intervention was conducted from January to April 2014, and outcomes were compared between May and August of 2013 and 2014. The primary outcome was the rate of analgesic prescription, and the secondary outcome was the time to analgesic prescription. A total of 1,739 patients were included; 20.3% of 867 patients in the pre-intervention period and 28.8% of 872 patients in the post-intervention period received analgesics (P < 0.001). The prescription rate of analgesics for moderate-to-severe injuries was 36.4% in 2013 and 44.5% in 2014 (P = 0.026). The time to analgesic prescription for all extremity injuries was 116.6 minutes (standard deviation 225.6) in 2013 and 64 minutes (standard deviation 75.5) in 2014. Documentation of pain scores increased from 1.4% to 51.6%. ED-based quality improvement activities, including education and a change of pain score documentation, can improve the rate of analgesic prescription and the time to prescription for patients with extremity injury in the ED.
NASA Astrophysics Data System (ADS)
Menz, Christoph
2016-04-01
Climate change interferes with various aspects of the socio-economic system. One important aspect is its influence on animal husbandry, especially dairy farming. Dairy cows are usually kept in naturally ventilated barns (NVBs), which are particularly vulnerable to extreme events due to their low adaptation capabilities. Effective adaptation to high outdoor temperatures, for example, is only possible under certain wind and humidity conditions. High temperature extremes are expected to increase in number and strength under climate change. To assess the impact of this change on NVBs and dairy cows, the changes in wind and humidity also need to be considered. Hence we need to consider the multivariate structure of future temperature extremes. The OptiBarn project aims to develop sustainable adaptation strategies for dairy housings under climate change for Europe by considering the multivariate structure of high temperature extremes. In a first step, we identify various multivariate high temperature extremes for three core regions in Europe. With respect to dairy cows in NVBs, we will focus on the wind and humidity fields during high temperature events. In a second step, we will use the CORDEX-EUR-11 ensemble to evaluate the capability of the RCMs to model such events and assess their future change potential. By transferring the outdoor conditions to indoor climate and animal wellbeing, the results of this assessment can be used to develop technical, architectural, and animal-specific adaptation strategies for high temperature extremes.
NASA Astrophysics Data System (ADS)
Varanasi, Rao; Mesawich, Michael; Connor, Patrick; Johnson, Lawrence
2017-03-01
Two versions of a specific 2 nm rated filter, containing filtration medium and all other components produced from high-density polyethylene (HDPE), one subjected to standard cleaning and the other to specialized ultra-cleaning, were evaluated in terms of their cleanliness characteristics and the defectivity of wafers processed with photoresist filtered through each. With respect to inherent cleanliness, the ultraclean version exhibited a 70% reduction in total metal extractables and a 90% reduction in organic extractables compared with the standard clean version. In terms of particulate cleanliness, the ultraclean version achieved stability of effluent particles 30 nm and larger in about half the time required by the standard clean version, also exhibiting effluent levels at stability almost 90% lower. In evaluating the defectivity of blanket wafers processed with photoresist filtered through either version, initial defect density while using the ultraclean version was about half that observed when the standard clean version was in service, with defectivity also falling more rapidly during subsequent usage of the ultraclean version. Similar behavior was observed for patterned wafers, where the enhanced defect reduction was primarily of bridging defects. The filter evaluation and process-oriented results demonstrate the extreme value of using filtration designed to possess optimal intrinsic characteristics, with further improvements possible through enhanced cleaning processes.
NASA Astrophysics Data System (ADS)
Sun, Qiaohong; Miao, Chiyuan; Qiao, Yuanyuan; Duan, Qingyun
2017-12-01
The El Niño-Southern Oscillation (ENSO) and local temperature are important drivers of extreme precipitation. Understanding the impact of ENSO and temperature on the risk of extreme precipitation over global land will provide a foundation for risk assessment and climate-adaptive design of infrastructure in a changing climate. In this study, nonstationary generalized extreme value distributions were used to model extreme precipitation over global land for the period 1979-2015, with an ENSO indicator and temperature as covariates. Risk factors were estimated to quantify the contrast between the influence of different ENSO phases and temperature. The results show that extreme precipitation is dominated by ENSO over 22% of global land and by temperature over 26% of global land. With a warming climate, the risk of high-intensity daily extreme precipitation increases at high latitudes but decreases in tropical regions. For ENSO, large parts of North America, southern South America, and southeastern and northeastern China are shown to suffer greater risk in El Niño years, with more than double the chance of intense extreme precipitation in El Niño years compared with La Niña years. Moreover, regions with more intense precipitation are more sensitive to ENSO. Global climate models were used to investigate the changing relationship between extreme precipitation and the covariates. The risk of extreme, high-intensity precipitation increases across high latitudes of the Northern Hemisphere but decreases in middle and lower latitudes under a warming climate scenario, changes that will likely trigger increases in severe flooding and droughts across the globe. However, there are some uncertainties associated with the influence of ENSO on predictions of future extreme precipitation, with the spatial extent and risk varying among the different models.
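A minimal sketch of such a nonstationary GEV fit, with the location parameter linear in a standardized ENSO-like covariate; this runs on synthetic data and is not the authors' code or the observational record:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Synthetic "annual maximum" sample whose GEV location shifts with an
# ENSO-like standardized covariate (all parameter values are illustrative).
n = 3000
enso = rng.normal(size=n)
mu0_true, mu1_true, sigma_true, xi_true = 10.0, 1.5, 2.0, 0.1
u = rng.uniform(size=n)
x = (mu0_true + mu1_true * enso
     + sigma_true * ((-np.log(u)) ** (-xi_true) - 1.0) / xi_true)  # GEV inverse CDF

def nll(params):
    """Negative log-likelihood of a GEV with location mu0 + mu1 * covariate."""
    mu0, mu1, log_sigma, xi = params
    if abs(xi) < 1e-8:                 # skip the Gumbel limit for simplicity
        return np.inf
    sigma = np.exp(log_sigma)
    z = (x - (mu0 + mu1 * enso)) / sigma
    t = 1.0 + xi * z
    if np.any(t <= 0.0):               # outside the GEV support
        return np.inf
    return np.sum(np.log(sigma) + (1.0 + 1.0 / xi) * np.log(t) + t ** (-1.0 / xi))

# Crude starting values: sample moments plus a least-squares slope
x0 = [np.mean(x), np.polyfit(enso, x, 1)[0], np.log(np.std(x)), 0.1]
fit = minimize(nll, x0, method="Nelder-Mead",
               options={"maxiter": 20000, "maxfev": 20000})
mu0_hat, mu1_hat, log_sigma_hat, xi_hat = fit.x
print(round(mu1_hat, 2))  # close to the true covariate effect of 1.5
```

The fitted mu1 quantifies how much the location of the extreme-precipitation distribution shifts per standard deviation of the covariate, which is the quantity the risk factors above contrast between ENSO phases.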
Crick, Alex J; Cammarota, Eugenia; Moulang, Katie; Kotar, Jurij; Cicuta, Pietro
2015-01-01
Live optical microscopy has become an essential tool for studying the dynamical behaviors and variability of single cells, as well as cell-cell interactions. However, experiments and data analysis in this area are often extremely labor intensive, and it has often not been achievable or practical to perform properly standardized experiments on a statistically viable scale. We have addressed this challenge by developing automated live imaging platforms that help standardize experiments, increase throughput, and unlock previously impossible experiments. Our real-time cell tracking programs communicate in feedback with microscope and camera control software, and they are highly customizable, flexible, and efficient. As examples of our current research utilizing these automated platforms, we describe two quite different applications: egress-invasion interactions of malaria parasites and red blood cells, and imaging of immune cells with high motility and internal dynamics. The automated imaging platforms are able to track a large number of motile cells simultaneously, over hours or even days at a time, greatly increasing data throughput and opening up new experimental possibilities. Copyright © 2015 Elsevier Inc. All rights reserved.
The influence of selective chemical doping on clean, low-carrier density SiC epitaxial graphene
NASA Astrophysics Data System (ADS)
Chuang, Chiashain; Yang, Yanfei; Huang, Lung-I.; Liang, Chi-Te; Elmquist, Randolph E.; National Institute of Standards and Technology Collaboration; National Taiwan University, Department of Physics Collaboration
2015-03-01
The charge-transfer effect of ambient air on magneto-transport in polymer-free SiC graphene was investigated. Interestingly, adsorption of atmospheric gas molecules on clean epitaxial graphene can reduce the carrier density to near charge neutrality, allowing observation of highly precise v = 2 quantum Hall plateaus. The atmospheric adsorbates were reproducibly removed, and pure gases (N2, O2, CO2, H2O) were used to form new individual adsorbates on SiC graphene. Our experimental results (τt/τq ~ 2) support the theoretical predictions for the ratio of transport relaxation time τt to quantum lifetime τq in clean graphene. The analysis of Shubnikov-de Haas oscillations at intermediate doping levels indicates that carrier scattering is reduced by water and oxygen so as to increase both the classical and quantum mobility. This study points to the key dopant gases in ambient air and also paves the way towards extremely precise quantized Hall resistance standards in epitaxial graphene systems with carrier density tuned by exposure to highly pure gases and vacuum annealing treatment. National Institute of Standards and Technology.
A retrospective analysis of American football hyperthermia deaths in the United States
NASA Astrophysics Data System (ADS)
Grundstein, Andrew J.; Ramseyer, Craig; Zhao, Fang; Pesses, Jordan L.; Akers, Pete; Qureshi, Aneela; Becker, Laura; Knox, John A.; Petro, Myron
2012-01-01
Over the period 1980-2009, there were 58 documented hyperthermia deaths of American-style football players in the United States. This study examines the geography, timing, and meteorological conditions present during the onset of hyperthermia, using the most complete dataset available. Deaths are concentrated in the eastern quadrant of the United States and are most common during August. Over half the deaths occurred during morning practices when high humidity levels were common. The athletes were typically large (79% with a body mass index >30) and mostly (86%) played linemen positions. Meteorological conditions were atypically hot and humid by local standards on most days with fatalities. Further, all deaths occurred under conditions defined as high or extreme by the American College of Sports Medicine using the wet bulb globe temperature (WBGT), but under lower threat levels using the heat index (HI). Football-specific thresholds based on clothing (full football uniform, practice uniform, or shorts) were also examined. The thresholds matched well with data from athletes wearing practice uniforms but poorly for those in shorts only. Too few cases of athletes in full pads were available to draw any broad conclusions. We recommend that coaches carefully monitor players, particularly large linemen, early in the pre-season on days with wet bulb globe temperatures that are categorized as high or extreme. Also, as most of the deaths were among young athletes, longer acclimatization periods may be needed.
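The WBGT referenced above combines three temperature measurements with a fixed weighting; a minimal sketch of the standard outdoor formula is shown below. The risk-category cutoffs in the sketch are illustrative placeholders only, not the exact ACSM thresholds (which also depend on acclimatization and clothing and should be taken from the guideline itself).

```python
def wbgt_outdoor(t_wet_bulb, t_globe, t_dry_bulb):
    """Outdoor wet bulb globe temperature (deg C), standard weighting:
    70% natural wet-bulb, 20% black-globe, 10% dry-bulb temperature."""
    return 0.7 * t_wet_bulb + 0.2 * t_globe + 0.1 * t_dry_bulb

# Illustrative risk categories (assumed cutoffs, not the exact ACSM values).
def heat_risk_category(wbgt_c):
    if wbgt_c >= 28.0:
        return "extreme"
    if wbgt_c >= 23.0:
        return "high"
    if wbgt_c >= 18.0:
        return "moderate"
    return "low"
```

The heavy weighting of the wet-bulb term is why the humid morning practices noted in the abstract can be classified as high or extreme risk even when the dry-bulb temperature, and hence the heat index, looks less threatening.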
Microbial diversity of extreme habitats in human homes.
Savage, Amy M; Hills, Justin; Driscoll, Katherine; Fergus, Daniel J; Grunden, Amy M; Dunn, Robert R
2016-01-01
High-throughput sequencing techniques have opened up the world of microbial diversity to scientists, and a flurry of studies in the most remote and extreme habitats on earth have begun to elucidate the key roles of microbes in ecosystems with extreme conditions. These same environmental extremes can also be found closer to humans, even in our homes. Here, we used high-throughput sequencing techniques to assess bacterial and archaeal diversity in the extreme environments inside human homes (e.g., dishwashers, hot water heaters, washing machine bleach reservoirs, etc.). We focused on habitats in the home with extreme temperature, pH, and chemical environmental conditions. We found a lower diversity of microbes in these extreme home environments compared to less extreme habitats in the home. However, we were nonetheless able to detect sequences from a relatively diverse array of bacteria and archaea. Habitats with extreme temperatures alone appeared to be able to support a greater diversity of microbes than habitats with extreme pH or extreme chemical environments alone. Microbial diversity was lowest when habitats had both extreme temperature and one of these other extremes. In habitats with both extreme temperatures and extreme pH, taxa with known associations with extreme conditions dominated. Our findings highlight the importance of examining interactive effects of multiple environmental extremes on microbial communities. Inasmuch as taxa from extreme environments can be both beneficial and harmful to humans, our findings also suggest future work to understand both the threats and opportunities posed by the life in these habitats.
A new synoptic scale resolving global climate simulation using the Community Earth System Model
NASA Astrophysics Data System (ADS)
Small, R. Justin; Bacmeister, Julio; Bailey, David; Baker, Allison; Bishop, Stuart; Bryan, Frank; Caron, Julie; Dennis, John; Gent, Peter; Hsu, Hsiao-ming; Jochum, Markus; Lawrence, David; Muñoz, Ernesto; diNezio, Pedro; Scheitlin, Tim; Tomas, Robert; Tribbia, Joseph; Tseng, Yu-heng; Vertenstein, Mariana
2014-12-01
High-resolution global climate modeling holds the promise of capturing planetary-scale climate modes and small-scale (regional and sometimes extreme) features simultaneously, including their mutual interaction. This paper discusses a new state-of-the-art high-resolution Community Earth System Model (CESM) simulation that was performed with these goals in mind. The atmospheric component was at 0.25° grid spacing, and ocean component at 0.1°. One hundred years of "present-day" simulation were completed. Major results were that annual mean sea surface temperature (SST) in the equatorial Pacific and El-Niño Southern Oscillation variability were well simulated compared to standard resolution models. Tropical and southern Atlantic SST also had much reduced bias compared to previous versions of the model. In addition, the high resolution of the model enabled small-scale features of the climate system to be represented, such as air-sea interaction over ocean frontal zones, mesoscale systems generated by the Rockies, and Tropical Cyclones. Associated single component runs and standard resolution coupled runs are used to help attribute the strengths and weaknesses of the fully coupled run. The high-resolution run employed 23,404 cores, costing 250 thousand processor-hours per simulated year and made about two simulated years per day on the NCAR-Wyoming supercomputer "Yellowstone."
Pelotti, P; Ciminari, R; Bacci, G; Avella, M; Briccoli, A
1988-01-01
The value of stratigraphy and pulmonary CT in the initial work-up of osteosarcoma of the extremities is assessed with reference to 217 patients encountered in the Bone Tumour Centre of the Rizzoli Orthopaedic Institute between May 1983 and May 1986. Stratigraphy revealed lung metastases not identified by standard radiography in 4 patients (1.8%), while CT revealed metastases not identified by either standard X-rays or stratigraphy in a further 6 cases (2.7%). It is concluded that the increase in the percentage of cures (about 30%) reported in the last 10 years in osteosarcoma cases given adjuvant chemotherapy cannot be explained by any difference in initial selection due to the use of these techniques, which were not adopted in the historical series.
NASA Astrophysics Data System (ADS)
Cadoni, Ezio
2018-03-01
The aim of this paper is to describe the mechanical characterization of alloys under extreme conditions of temperature and loading. In the frame of the COST Action CA15102, “Solutions for Critical Raw Materials Under Extreme Conditions (CRM-EXTREME)”, this aspect is crucial, and many industrial applications have to consider the dynamic response of materials. Indeed, to reduce or substitute CRMs in alloys, it is necessary to design new materials and understand whether they behave better or whether the substitution or reduction adversely affects their performance. For this reason, deep knowledge of the mechanical behaviour of the considered materials at high strain rates is required. In general, the machinery-manufacturing, transport, and energy industries involve important dynamic phenomena that are simultaneously affected by extended strain, high strain rate, damage, and pressure, as well as conspicuous temperature gradients. Experimental results under extreme conditions of high strain rate and high temperature are presented for an austenitic stainless steel and for Eurofer97, a high-chromium tempered-martensitic reduced-activation steel.
New options for vascularized bone reconstruction in the upper extremity.
Houdek, Matthew T; Wagner, Eric R; Wyles, Cody C; Nanos, George P; Moran, Steven L
2015-02-01
Originally described in the 1970s, vascularized bone grafting has become a critical component in the treatment of bony defects and non-unions. Although well established in the lower extremity, recent years have seen many novel techniques described to treat a variety of challenging upper extremity pathologies. Here the authors review the use of different techniques of vascularized bone grafts for the upper extremity bone pathologies. The vascularized fibula remains the gold standard for the treatment of large bone defects of the humerus and forearm, while also playing a role in carpal reconstruction; however, two other important options for larger defects include the vascularized scapula graft and the Capanna technique. Smaller upper extremity bone defects and non-unions can be treated with the medial femoral condyle (MFC) free flap or a vascularized rib transfer. In carpal non-unions, both pedicled distal radius flaps and free MFC flaps are viable options. Finally, in skeletally immature patients, vascularized fibular head epiphyseal transfer can provide growth potential in addition to skeletal reconstruction.
de los Reyes-Guzmán, Ana; Dimbwadyo-Terrer, Iris; Trincado-Alonso, Fernando; Monasterio-Huelin, Félix; Torricelli, Diego; Gil-Agudo, Angel
2014-08-01
Quantitative measures of human movement quality are important for discriminating healthy and pathological conditions and for expressing the outcomes and clinically important changes in subjects' functional state. However, the instruments most frequently used for upper extremity functional assessment are clinical scales, which have been standardized and validated but retain a highly subjective component that depends on the observer scoring the test. They are not sufficient to assess the motor strategies used during movements, and their use in combination with other, more objective measures is necessary. The objective of the present review is to provide an overview of objective metrics found in the literature that quantify upper extremity performance during functional tasks, regardless of the equipment or system used for registering kinematic data. A search in the Medline, Google Scholar, and IEEE Xplore databases was performed using a combination of keywords. Full scientific papers that fulfilled the inclusion criteria were included in the review. A set of kinematic metrics was found in the literature relating to joint displacements, analysis of hand trajectories, and velocity profiles. These metrics were classified into different categories according to the movement characteristic being measured. They provide the starting point for proposed objective metrics for the functional assessment of the upper extremity in people with movement disorders resulting from neurological injuries. Potential areas of future and further research are presented in the Discussion section. Copyright © 2014 Elsevier Ltd. All rights reserved.
Factoring socioeconomic status into cardiac performance profiling for hospitals: does it matter?
Alter, David A; Austin, Peter C; Naylor, C David; Tu, Jack V
2002-01-01
Critics of "scorecard medicine" often highlight the incompleteness of risk-adjustment methods used when accounting for baseline patient differences. Although socioeconomic status is a highly important determinant of adverse outcome for patients admitted to the hospital with acute myocardial infarction, it has not been used in most risk-adjustment models for cardiovascular report cards. The objective was to determine the incremental impact of adjusting for socioeconomic status, over and above age, sex, and illness severity, on hospital-specific 30-day mortality rates after acute myocardial infarction. The authors compared the absolute and relative hospital-specific 30-day acute myocardial infarction mortality rates in 169 hospitals throughout Ontario between April 1, 1994 and March 31, 1997. Patient socioeconomic status was characterized by median neighborhood income using postal codes and 1996 Canadian census data. They examined two risk-adjustment models: the first adjusted for age, sex, and illness severity (standard), whereas the second adjusted for age, sex, illness severity, and median neighborhood income level (socioeconomic status). There was an extremely strong correlation between 'standard' and 'socioeconomic status' risk-adjusted mortality rates (r = 0.99). Absolute differences in 30-day risk-adjusted mortality rates between the socioeconomic status and standard risk-adjustment models were small (median, 0.1%; 25th-75th percentile, 0.1-0.2). The agreement in the quintile rankings of hospitals between the socioeconomic status and standard risk-adjustment models was high (weighted kappa = 0.93). Despite its importance as a determinant of patient outcomes, the effect of socioeconomic status on hospital-specific mortality rates over and above standard risk-adjustment methods for acute myocardial infarction hospital profiling in Ontario was negligible.
Parra-Robles, Juan; Cross, Albert R; Santyr, Giles E
2005-05-01
Hyperpolarized noble gases (HNGs) provide exciting possibilities for MR imaging at ultra-low magnetic field strengths (<0.15 T) due to the extremely high polarizations available from optical pumping. The fringe field of many superconductive magnets used in clinical MR imaging can provide a stable magnetic field for this purpose. In addition to offering the benefit of HNG MR imaging alongside conventional high field proton MRI, this approach offers the other useful advantage of providing different field strengths at different distances from the magnet. However, the extremely strong field gradients associated with the fringe field present a major challenge for imaging since impractically high active shim currents would be required to achieve the necessary homogeneity. In this work, a simple passive shimming method based on the placement of a small number of ferromagnetic pieces is proposed to reduce the fringe field inhomogeneities to a level that can be corrected using standard active shims. The method explicitly takes into account the strong variations of the field over the volume of the ferromagnetic pieces used to shim. The method is used to obtain spectra in the fringe field of a high-field (1.89 T) superconducting magnet from hyperpolarized 129Xe gas samples at two different ultra-low field strengths (8.5 and 17 mT). The linewidths of spectra measured from imaging phantoms (30 Hz) indicate a homogeneity sufficient for MRI of the rat lung.
[Increased glucose uptake by seborrheic keratosis on PET scan].
Merklen-Djafri, C; Truntzer, P; Hassler, S; Cribier, B
2017-05-01
Positron emission tomography (PET) is an examination based upon the uptake of a radioactive tracer by hypermetabolic cells. It is primarily used in tandem with tomodensitometry (PET-TDM) for cancer staging because of its high sensitivity and specificity for the detection of metastases. However, unusually high uptake may occur with benign tumours, including skin tumours. Herein, we report an extremely rare case of pathological uptake levels resulting from seborrhoeic keratosis. A 55-year-old male patient with oesophageal squamous-cell carcinoma was referred to us following the discovery on PET-TDM of an area of high marker uptake corresponding to a pigmented skin lesion. No other areas of suspect high uptake were seen. The lesion was surgically excised, and histological examination indicated seborrhoeic keratosis. The histological appearance was that of standard seborrhoeic keratosis without any notable mitotic activity. PET-TDM is an examination that enables diagnosis of malignancy. However, rare cases have been described of increased marker uptake by benign cutaneous tumours such as histiocytofibroma, pilomatricoma and condyloma. To date, there have been only very few cases of increased uptake due to seborrhoeic keratosis. This extremely unusual case of increased glucose uptake in PET-TDM due to seborrhoeic keratosis confirms that the hypermetabolic activity detected by this examination is not necessarily synonymous with malignancy and that confirmation by clinical and histological findings is essential. The reasons for increased metabolic activity within such benign tumours are not known. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Choi, Youngmin; Hwang, Yujin; Park, Minchan; Lee, Jaekeun; Choi, Cheol; Jung, Mihee; Oh, Jemyung; Lee, Jung Eun
2011-01-01
The tribological behavior of graphite and Ag nanoparticles as solid additives to a base oil was evaluated on a four-ball test machine and a disc-on-disc tribotester. Extreme-pressure and anti-wear results were obtained according to the ASTM D4172 and D2783 standard methods. Ag nanoparticles were found to exhibit better anti-wear behavior, with the smaller nanoparticles performing best.
Towards a well-founded and reproducible snow load map for Austria
NASA Astrophysics Data System (ADS)
Winkler, Michael; Schellander, Harald
2017-04-01
"EN 1991-1-3 Eurocode 1: Part 1-3: Snow Loads" provides the standard for determining the snow load to be used for the structural design of buildings and other structures. Since 2006, national specifications for Austria have defined a snow load map with four "load zones", allowing the calculation of the characteristic ground snow load sk for locations below 1500 m asl. A quadratic regression between altitude and sk is used, as suggested by EN 1991-1-3. The current snow load map is based on best meteorological practice, but it remains somewhat subjective and non-reproducible. The underlying snow data series often end in the 1980s; in the best case, data until about 2005 is used. Moreover, the extreme value statistics rely only on the Gumbel distribution, and the way in which snow depths were converted to snow loads is generally unknown. These might be reason enough to rethink the snow load standard for Austria, all the more since today's situation differs from that of some 15 years ago. Firstly, Austria is rich in multi-decadal, high-quality snow depth measurements; these data are not well represented in the current standard. Secondly, semi-empirical snow models allow sufficiently precise calculation of snow water equivalents and snow loads from snow depth measurements, without the need for other parameters, such as temperature, that often are not available at the snow measurement sites. With the help of these tools, modelling of daily snow load series from daily snow depth measurements is possible. Finally, extreme value statistics nowadays offers convincing methods to calculate snow depths and loads with a return period of 50 years, which is the basis of sk, and allows reproducible spatial extrapolation. The project introduced here will investigate these issues in order to update the Austrian snow load standard by providing a well-founded and reproducible snow load map for Austria.
Not least, we seek contact with the standards bodies of neighboring countries to find intersections and to avoid inconsistencies and duplication of effort.
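The 50-year return level underlying sk can be estimated from a Gumbel fit to annual-maximum snow loads. A minimal sketch using method-of-moments parameter estimates is shown below; this is one common estimator, assumed here for illustration, and not necessarily the one the project will adopt.

```python
import numpy as np

def gumbel_return_level(annual_maxima, return_period=50):
    """T-year return level from a Gumbel fit to annual maxima,
    using method-of-moments estimates of the Gumbel parameters."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = np.sqrt(6.0) * x.std(ddof=1) / np.pi   # scale parameter
    mu = x.mean() - 0.5772156649 * beta           # location (Euler-Mascheroni)
    p = 1.0 - 1.0 / return_period                 # annual non-exceedance prob.
    return mu - beta * np.log(-np.log(p))
```

Applied to a station's series of annual-maximum snow loads (kN/m²), this returns the load exceeded on average once every 50 years, the quantity that the quadratic altitude regression in the standard is meant to reproduce.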
NASA Technical Reports Server (NTRS)
Prasad, Narasimha; Trivedi, Sudhir; Chen, Henry; Kutcher, Susan; Zhang, Dajie; Singh, Jogender
2017-01-01
Advances in radiation shielding technologies are needed to protect humans and electronic components from all threats of space radiation over long durations. In this paper, we report on the use of the innovative and novel fabrication technology known as Field Assisted Sintering Technology (FAST) to fabricate lightweight material with enhanced radiation shielding strength to safeguard humans and electronics, suitable for next-generation space exploration missions. The base materials we investigated were aluminum (Al), the current standard material for space hardware, and Ultra-High Molecular Weight Polyethylene (UHMWPE), which has high hydrogen content and resistance to nuclear reaction from neutrons, making it a good shielding material for both gamma radiation and particles. UHMWPE also has high resistance to corrosive chemicals, extremely low moisture sensitivity, a very low coefficient of friction, and high resistance to abrasion. We reinforced the base materials by adding high-density (i.e., high-atomic-weight) metallic materials into the composite. These filler materials included boron carbide (B4C), tungsten (W), tungsten carbide (WC), and gadolinium (Gd).
Modeling and evaluation of a high-resolution CMOS detector for cone-beam CT of the extremities.
Cao, Qian; Sisniega, Alejandro; Brehler, Michael; Stayman, J Webster; Yorkston, John; Siewerdsen, Jeffrey H; Zbijewski, Wojciech
2018-01-01
Quantitative assessment of trabecular bone microarchitecture in extremity cone-beam CT (CBCT) would benefit from the high spatial resolution, low electronic noise, and fast scan time provided by complementary metal-oxide semiconductor (CMOS) x-ray detectors. We investigate the performance of CMOS sensors in extremity CBCT, in particular with respect to potential advantages of thin (<0.7 mm) scintillators offering higher spatial resolution. A cascaded systems model of a CMOS x-ray detector incorporating the effects of CsI:Tl scintillator thickness was developed. Simulation studies were performed using nominal extremity CBCT acquisition protocols (90 kVp, 0.126 mAs/projection). A range of scintillator thickness (0.35-0.75 mm), pixel size (0.05-0.4 mm), focal spot size (0.05-0.7 mm), magnification (1.1-2.1), and dose (15-40 mGy) was considered. The detectability index was evaluated for both CMOS and a-Si:H flat-panel detector (FPD) configurations for a range of imaging tasks emphasizing spatial frequencies associated with feature size aobj. Experimental validation was performed on a CBCT test bench in the geometry of a compact orthopedic CBCT system (SAD = 43.1 cm, SDD = 56.0 cm, matching that of the Carestream OnSight 3D system). The test-bench studies involved a 0.3 mm focal spot x-ray source and two CMOS detectors (Dalsa Xineos-3030HR, 0.099 mm pixel pitch): one with the standard CsI:Tl thickness of 0.7 mm (C700) and one with a custom 0.4 mm thick scintillator (C400). Measurements of modulation transfer function (MTF) and detective quantum efficiency (DQE), along with CBCT scans of a cadaveric knee (15 mGy), were obtained for each detector. Detectability for high-frequency tasks (feature size of ~0.06 mm, consistent with the size of trabeculae) was ~4× higher for the C700 CMOS detector compared to the a-Si:H FPD at the nominal system geometry of extremity CBCT.
This is due to the ~5× lower electronic noise of a CMOS sensor, which enables input quantum-limited imaging at smaller pixel size. The optimal pixel size for high-frequency tasks was <0.1 mm for a CMOS, compared to ~0.14 mm for an a-Si:H FPD. For this fine pixel pitch, detectability of fine features could be improved by using a thinner scintillator to reduce light-spread blur. A 22% increase in detectability of 0.06 mm features was found for the C400 configuration compared to C700. An improvement in the frequency at 50% modulation (f50) of the MTF was measured, increasing from 1.8 lp/mm for C700 to 2.5 lp/mm for C400. The C400 configuration also achieved equivalent or better DQE than C700 for frequencies above ~2 mm⁻¹. Images of cadaver specimens confirmed improved visualization of trabeculae with the C400 sensor. The small pixel size of CMOS detectors yields improved performance in high-resolution extremity CBCT compared to a-Si:H FPDs, particularly when coupled with a custom 0.4 mm thick scintillator. The results indicate that adoption of a CMOS detector in extremity CBCT can benefit applications in quantitative imaging of trabecular microstructure in humans. © 2017 American Association of Physicists in Medicine.
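The f50 metric quoted above (the spatial frequency at which the MTF drops to 50% modulation) can be read off a sampled MTF curve by linear interpolation. A minimal sketch, assuming an MTF already measured and normalized to 1 at zero frequency (not the authors' analysis code), is:

```python
import numpy as np

def f50(freqs, mtf):
    """Spatial frequency (lp/mm) at 50% modulation, by linear interpolation
    of a sampled, monotonically decreasing MTF curve normalized to 1 at 0."""
    freqs = np.asarray(freqs, dtype=float)
    mtf = np.asarray(mtf, dtype=float)
    idx = int(np.argmax(mtf < 0.5))        # first sample below 50% modulation
    f0, f1 = freqs[idx - 1], freqs[idx]    # bracket the 0.5 crossing
    m0, m1 = mtf[idx - 1], mtf[idx]
    return f0 + (0.5 - m0) * (f1 - f0) / (m1 - m0)
```

With a sampled curve from a slanted-edge or wire measurement, this yields a single summary number comparable across detectors, like the 1.8 lp/mm (C700) vs 2.5 lp/mm (C400) values reported above.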
Virtual reality technology prevents accidents in extreme situations
NASA Astrophysics Data System (ADS)
Badihi, Y.; Reiff, M. N.; Beychok, S.
2012-03-01
This research is aimed at examining the added value of using Virtual Reality (VR) in a driving simulator to prevent road accidents, specifically by improving drivers' skills when confronted with extreme situations. In an experiment, subjects completed a driving scenario using two platforms: a 3-D virtual reality display system using a head-mounted display (HMD), and a standard computerized display system based on a standard computer monitor. The results show that the average rate of errors (deviating from the driving path) in the VR environment is significantly lower than in the standard one. In addition, there was no trade-off between speed and accuracy in completing the driving mission. On the contrary, the average speed was even slightly faster in the VR simulation than in the standard environment. Thus, despite the lower rate of deviation in the VR setting, this is not achieved by driving more slowly. When the subjects were asked about their personal experiences from the training session, most responded that, among other things, the VR session caused them to feel a higher sense of commitment to the task and their performance. Some even stated that the VR session gave them a real sensation of driving.
Smartphone assessment of knee flexion compared to radiographic standards.
Dietz, Matthew J; Sprando, Daniel; Hanselman, Andrew E; Regier, Michael D; Frye, Benjamin M
2017-03-01
Measuring knee range of motion (ROM) is an important assessment for the outcomes of total knee arthroplasty. Recent technological advances have led to the development and use of accelerometer-based smartphone applications to measure knee ROM. The purpose of this study was to develop, standardize, and validate methods of utilizing smartphone accelerometer technology compared to radiographic standards, visual estimation, and goniometric evaluation. Participants used visual estimation, a long-arm goniometer, and a smartphone accelerometer to determine range of motion of a cadaveric lower extremity; these results were compared to radiographs taken at the same angles. The optimal smartphone position was determined to be on top of the leg at the distal femur and proximal tibia location. Between methods, it was found that the smartphone and goniometer were comparably reliable in measuring knee flexion (ICC=0.94; 95% CI: 0.91-0.96). Visual estimation was found to be the least reliable method of measurement. The results suggested that the smartphone accelerometer was non-inferior when compared to the other measurement techniques, demonstrated similar deviations from radiographic standards, and did not appear to be influenced by the person performing the measurements or the girth of the extremity. Copyright © 2016 Elsevier B.V. All rights reserved.
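Accelerometer-based goniometry apps of the kind evaluated above typically infer a joint angle from the tilt of each limb segment relative to gravity in a static pose. A simplified sketch of that model (an assumed illustration, not the app used in the study) is:

```python
import math

def tilt_angle_deg(ax, ay, az):
    """Inclination of the device's z-axis relative to gravity, in degrees,
    from a static 3-axis accelerometer reading (any consistent units)."""
    return math.degrees(math.atan2(math.hypot(ax, ay), az))

def knee_flexion_deg(femur_xyz, tibia_xyz):
    """Knee flexion as the difference between the tilt of a phone held on
    the distal femur and one on the proximal tibia (simplified model)."""
    return abs(tilt_angle_deg(*femur_xyz) - tilt_angle_deg(*tibia_xyz))
```

Because each segment's tilt is measured against the same gravity reference, the method is insensitive to who holds the phone, consistent with the study's finding that results did not depend on the person performing the measurements.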
NASA Astrophysics Data System (ADS)
Li, Donghuan; Zhou, Tianjun; Zou, Liwei; Zhang, Wenxia; Zhang, Lixia
2018-02-01
Extreme high-temperature events have large socioeconomic and human health impacts. East Asia (EA) is a populous region, and it is crucial to assess the changes in extreme high-temperature events in this region under different climate change scenarios. The Community Earth System Model low-warming experiment data were applied to investigate the changes in the mean and extreme high temperatures in EA under 1.5°C and 2°C warming conditions above preindustrial levels. The results show that the magnitude of warming in EA is approximately 0.2°C higher than the global mean. Most populous subregions, including eastern China, the Korean Peninsula, and Japan, will see more intense, more frequent, and longer-lasting extreme temperature events under 1.5°C and 2°C warming. The 0.5°C lower warming will help avoid 35%-46% of the increases in extreme high-temperature events in terms of intensity, frequency, and duration in EA with maximal avoidance values (37%-49%) occurring in Mongolia. Thus, it is beneficial for EA to limit the warming target to 1.5°C rather than 2°C.
2008-10-01
the standard model characterization procedure is based on creep and recovery tests, where loading and unloading occur at a fast rate of 1.0 MPa/s… σ − g[ε], and on dg̊[ε]/dε = E, where g̊ is defined as the equilibrium stress g[ε] for extremely fast loading. For this case, the stress-strain curves… (Figure 2.10: stress-strain curve schematic at slow, medium, and fast strain rates, with plastic flow fully established.)
Use of Magnetic Resonance Imaging to Monitor Iron Overload
Wood, John C.
2014-01-01
SYNOPSIS Treatment of iron overload requires robust estimates of total body iron burden and its response to iron chelation therapy. Compliance with chelation therapy varies considerably among patients and individual reporting is notoriously unreliable. Even with perfect compliance, intersubject variability in chelator effectiveness is extremely high, necessitating reliable iron estimates to guide dose titration. In addition, each chelator has a unique profile with respect to clearing iron stores from different organs. This chapter will present the tools available to clinicians monitoring their patients, focusing on non-invasive magnetic resonance imaging methods because they have become the de-facto standard of care. PMID:25064711
Flexible displays as key for high-value and unique automotive design
NASA Astrophysics Data System (ADS)
Isele, Robert
2011-03-01
Within the last few years the car industry has changed very fast. Information and communication have become more important, and displays are now standard in nearly every car. But this is not the only trend that can be recognized in this industry. CO2 emissions, fuel prices, and increasing traffic inside the mega-cities have initiated a big change in customer behavior. The big battle for the car industry will enter the interior extremely fast, and premium cars need more innovative design icons. Flexible displays are one big step that enables totally different designs and a new value of driver experience.
CO2 laser oscillators for laser radar applications
NASA Technical Reports Server (NTRS)
Freed, C.
1990-01-01
This paper reviews the spectral purity, frequency stability, and long-term stabilization of newly developed CO2 isotope lasers. Extremely high spectral purity, and short-term stability of less than 1.5 x 10 to the -13th have been achieved. A brief description on using CO2 isotope lasers as secondary frequency standards and in optical radar is given. The design and output characteristics of a single frequency, TEM00q mode, variable pulse width, hybrid TE CO2 laser system is also described. The frequency chirp in the output has been measured and almost completely eliminated by means of a novel technique.
Lohrmann, David; YoussefAgha, Ahmed; Jayawardene, Wasantha
2014-04-01
We determined current trends and patterns in overweight, obesity, and extreme high obesity among Pennsylvania pre-kindergarten (pre-K) to 12th grade students and simulated future trends. We analyzed body mass index (BMI) of pre-K to 12th grade students from 43 of 67 Pennsylvania counties in 2007 to 2011 to determine trends and to discern transition patterns among BMI status categories for 2009 to 2011. Vinsem simulation, confirmed by Markov chain modeling, generated future prevalence trends. Combined rates of overweight, obesity, and extreme high obesity decreased among secondary school students across the 5 years and, among elementary students, first increased and then markedly decreased. BMI status remained constant for approximately 80% of normal-weight and extreme-high-obesity students, but both decreased and increased among students who initially were overweight or obese; the increase in BMI remained significant. Overall trends in child and adolescent BMI status seemed positive. BMI transition patterns indicated that although overweight and obesity prevalence leveled off, extreme high obesity, especially among elementary students, is projected to increase substantially over time. If current transition patterns continue, the prevalence of overweight, obesity, and extreme high obesity among Pennsylvania students in 2031 is projected to be 16.0%, 6.6%, and 23.2%, respectively.
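The Markov-chain projection mentioned above iterates a matrix of category-to-category transition probabilities on the current prevalence vector. A toy sketch with hypothetical transition probabilities, loosely inspired by the ~80% persistence reported for the end categories but not the study's fitted values:

```python
# States: normal, overweight, obese, extreme high obesity.
# Rows: current state; columns: next state (hypothetical probabilities).
P = [
    [0.90, 0.07, 0.02, 0.01],
    [0.20, 0.55, 0.20, 0.05],
    [0.05, 0.15, 0.60, 0.20],
    [0.01, 0.02, 0.17, 0.80],
]

def step(x, P):
    """One Markov transition: next prevalence vector x' = x P."""
    return [sum(x[i] * P[i][j] for i in range(len(x)))
            for j in range(len(P[0]))]

x = [0.70, 0.15, 0.10, 0.05]   # illustrative current prevalences
for _ in range(10):            # project 10 measurement intervals ahead
    x = step(x, P)
print([round(v, 3) for v in x])  # prevalences still sum to 1
```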
Local intensity adaptive image coding
NASA Technical Reports Server (NTRS)
Huck, Friedrich O.
1989-01-01
The objective of preprocessing for machine vision is to extract intrinsic target properties. The most important properties ordinarily are structure and reflectance. Illumination in space, however, is a significant problem as the extreme range of light intensity, stretching from deep shadow to highly reflective surfaces in direct sunlight, impairs the effectiveness of standard approaches to machine vision. To overcome this critical constraint, an image coding scheme is being investigated which combines local intensity adaptivity, image enhancement, and data compression. It is very effective under the highly variant illumination that can exist within a single frame or field of view, and it is very robust to noise at low illuminations. Some of the theory and salient features of the coding scheme are reviewed. Its performance is characterized in a simulated space application, and the research and development activities are described.
High Accuracy Temperature Measurements Using RTDs with Current Loop Conditioning
NASA Technical Reports Server (NTRS)
Hill, Gerald M.
1997-01-01
To measure temperatures with a greater degree of accuracy than is possible with thermocouples, RTDs (Resistive Temperature Detectors) are typically used. Calibration standards use specialized high-precision RTD probes with accuracies approaching 0.001 F. These are extremely delicate devices, and far too costly to be used in test facility instrumentation. Less costly sensors which are designed for aeronautical wind tunnel testing are available and can be readily adapted to probes, rakes, and test rigs. With proper signal conditioning of the sensor, temperature accuracies of 0.1 F are obtainable. For reasons that will be explored in this paper, the Anderson current loop is the preferred method used for signal conditioning. This scheme has been used in NASA Lewis Research Center's 9 x 15 Low Speed Wind Tunnel, and is detailed here.
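Converting an RTD reading to temperature commonly uses the Callendar-Van Dusen relation from IEC 60751, which can be inverted in closed form above 0 degC. A generic Pt100 sketch; the paper's specific sensors and conditioning electronics are not reproduced here:

```python
import math

# IEC 60751 Callendar-Van Dusen coefficients for a standard Pt100 RTD,
# valid for T >= 0 degC. This is a generic conversion sketch, not the
# facility's actual signal-conditioning chain.
R0 = 100.0       # ohms at 0 degC
A = 3.9083e-3
B = -5.775e-7

def rtd_resistance(t_degc):
    """Resistance of the RTD at t_degc (valid 0..850 degC)."""
    return R0 * (1.0 + A * t_degc + B * t_degc ** 2)

def rtd_temperature(r_ohm):
    """Invert R(T) = R0*(1 + A*T + B*T^2) via the quadratic formula."""
    return (-A + math.sqrt(A * A - 4.0 * B * (1.0 - r_ohm / R0))) / (2.0 * B)

print(round(rtd_resistance(100.0), 4))      # ~138.5055 ohms at 100 degC
print(round(rtd_temperature(138.5055), 2))  # ~100.0 degC
```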
Jorgensen, Martin Gronbech; Paramanathan, Sentha; Ryg, Jesper; Masud, Tahir; Andersen, Stig
2015-07-10
Reaction time (RT) has been associated with falls in older adults, but is not routinely tested in clinical practice. A simple, portable, inexpensive and reliable method for measuring RT is desirable for clinical settings. We therefore developed custom software, which utilizes the portable and low-cost standard Nintendo Wii board (NWB) to record RT. The aims of the study were to (1) explore whether the test could differentiate old and young adults, (2) study learning effects between test-sessions, and (3) examine reproducibility. A young (n = 25, age 20-35 years, mean BMI of 22.6) and an old (n = 25, age ≥65 years, mean BMI of 26.3) study-population were enrolled in this within- and between-day reproducibility study. A standard NWB was used along with the custom software to obtain RT from participants in milliseconds. A mixed effect model was initially used to explore systematic differences associated with age and test-session. Reproducibility was then expressed by Intraclass Correlation Coefficients (ICC), Coefficient of Variation (CV), and Typical Error (TE). The RT test was able to differentiate the old group from the young group in both the upper extremity test (p < 0.001; -170.7 ms (95%CI -209.4; -132.0)) and the lower extremity test (p < 0.001; -224.3 ms (95%CI -274.6; -173.9)). Moreover, the mixed effect model showed no significant learning effect between sessions, with the exception of the lower extremity test between sessions one and three for the young group (-35.5 ms; 4.6%; p = 0.02). Good within- and between-day reproducibility (ICC: 0.76-0.87; CV: 8.5-12.9; TE: 45.7-95.1 ms) was achieved for both the upper and lower extremity tests with the fastest of three trials in both groups. A low-cost and portable reaction test utilizing a standard Nintendo Wii board showed good reproducibility, little or no systematic learning effect across test-sessions, and could differentiate between young and older adults in both upper and lower extremity tests.
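Typical error and CV are easily computed from paired test-retest data: TE is the standard deviation of the between-session differences divided by the square root of 2, and CV expresses TE relative to the grand mean. A sketch with hypothetical reaction times, not the study's measurements:

```python
import statistics as st

def typical_error(trial1, trial2):
    """Typical error of measurement from two test sessions:
    TE = SD of between-session differences / sqrt(2)."""
    diffs = [b - a for a, b in zip(trial1, trial2)]
    return st.stdev(diffs) / 2 ** 0.5

def cv_percent(trial1, trial2):
    """Coefficient of variation (%) = 100 * TE / grand mean."""
    grand_mean = st.mean(trial1 + trial2)
    return 100.0 * typical_error(trial1, trial2) / grand_mean

# Hypothetical reaction times (ms) for five participants, two sessions:
s1 = [520.0, 480.0, 610.0, 550.0, 495.0]
s2 = [510.0, 492.0, 598.0, 561.0, 500.0]
print(round(typical_error(s1, s2), 1), "ms")
print(round(cv_percent(s1, s2), 1), "%")
```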
An efficient Bayesian data-worth analysis using a multilevel Monte Carlo method
NASA Astrophysics Data System (ADS)
Lu, Dan; Ricciuto, Daniel; Evans, Katherine
2018-03-01
Improving the understanding of subsurface systems and thus reducing prediction uncertainty requires collection of data. As the collection of subsurface data is costly, it is important that the data collection scheme is cost-effective. Design of a cost-effective data collection scheme, i.e., data-worth analysis, requires quantifying model parameter, prediction, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface hydrological model simulations using standard Monte Carlo (MC) sampling or surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian data-worth analysis using a multilevel Monte Carlo (MLMC) method. Compared to the standard MC that requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, the MLMC can substantially reduce computational costs using multifidelity approximations. Since the Bayesian data-worth analysis involves a great deal of expectation estimation, the cost saving of the MLMC in the assessment can be outstanding. While the proposed MLMC-based data-worth analysis is broadly applicable, we use it for a highly heterogeneous two-phase subsurface flow simulation to select an optimal candidate data set that gives the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data, and consistent with the standard MC estimation. But compared to the standard MC, the MLMC greatly reduces the computational costs.
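The MLMC idea of replacing one expensive expectation with a telescoping sum of cheap level corrections, E[P_L] = E[P_0] + Σ_l E[P_l − P_{l−1}], can be sketched in a few lines. The toy problem below (estimating E[U²] with quantized approximations of U) stands in for the expensive subsurface model; the coupling of fine and coarse levels on the same random draw is the essential ingredient:

```python
import random

def mlmc_estimate(sampler, L, N):
    """Multilevel Monte Carlo sketch of the telescoping sum
    E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]. `sampler(l)` must return
    (P_l, P_{l-1}) computed from the SAME random input (P_{-1} := 0);
    N[l] samples are drawn on level l."""
    total = 0.0
    for l in range(L + 1):
        acc = 0.0
        for _ in range(N[l]):
            fine, coarse = sampler(l)   # coupled pair, one random draw
            acc += fine - coarse
        total += acc / N[l]
    return total

def quantized_square(u, l):
    """Toy level-l approximation: square of u quantized to 2^-l cells."""
    n = 2 ** l
    u_l = (int(u * n) + 0.5) / n
    return u_l * u_l

def sampler(l):
    u = random.random()
    fine = quantized_square(u, l)
    coarse = quantized_square(u, l - 1) if l > 0 else 0.0
    return fine, coarse

random.seed(0)
# Many samples on cheap coarse levels, few on expensive fine levels:
est = mlmc_estimate(sampler, L=5, N=[4000, 2000, 1000, 500, 250, 125])
print(round(est, 2))  # close to E[U^2] = 1/3
```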
49 CFR 213.367 - Special inspections.
Code of Federal Regulations, 2010 CFR
2010-10-01
..., DEPARTMENT OF TRANSPORTATION TRACK SAFETY STANDARDS Train Operations at Track Classes 6 and Higher § 213.367 Special inspections. In the event of fire, flood, severe storm, temperature extremes or other occurrence...
Roof Overhangs for Solar Houses
NASA Technical Reports Server (NTRS)
Gracey, W.
1985-01-01
Convenient graphical method determines both width and vertical position of overhangs for standard wall section having "typical" window arrangement. Overhangs for this wall section are determined for two extremes of latitude in the United States.
Lower extremity muscle activation during baseball pitching.
Campbell, Brian M; Stodden, David F; Nixon, Megan K
2010-04-01
The purpose of this study was to investigate muscle activation levels of select lower extremity muscles during the pitching motion. Bilateral surface electromyography data on 5 lower extremity muscles (biceps femoris, rectus femoris, gluteus maximus, vastus medialis, and gastrocnemius) were collected on 11 highly skilled baseball pitchers and compared with individual maximal voluntary isometric contraction (MVIC) data. The pitching motion was divided into 4 distinct phases: phase 1, initiation of pitching motion to maximum stride leg knee height; phase 2, maximum stride leg knee height to stride foot contact (SFC); phase 3, SFC to ball release; and phase 4, ball release to 0.5 seconds after ball release (follow-through). Results indicated that trail leg musculature elicited moderate to high activity levels during phases 2 and 3 (38-172% of MVIC). Muscle activity levels of the stride leg were moderate to high during phases 2-4 (23-170% of MVIC). These data indicate a high demand for lower extremity strength and endurance. Specifically, coaches should incorporate unilateral and bilateral lower extremity exercises for strength improvement or maintenance and to facilitate dynamic stabilization of the lower extremities during the pitching motion.
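The %MVIC normalization behind these activity levels is a simple ratio of a phase's EMG amplitude to the maximal voluntary isometric contraction; a sketch with hypothetical RMS amplitudes, not the study's recordings:

```python
def percent_mvic(phase_rms_uv, mvic_rms_uv):
    """Normalize a phase's EMG amplitude to the maximal voluntary
    isometric contraction: %MVIC = 100 * phase RMS / MVIC RMS.
    Values above 100% indicate supra-isometric activation."""
    return 100.0 * phase_rms_uv / mvic_rms_uv

# Hypothetical RMS amplitudes (microvolts) for one muscle:
mvic = 250.0
for phase, rms in {"phase 2": 180.0, "phase 3": 430.0}.items():
    print(phase, round(percent_mvic(rms, mvic)), "% MVIC")
```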
Generalist genes and high cognitive abilities.
Haworth, Claire M A; Dale, Philip S; Plomin, Robert
2009-07-01
The concept of generalist genes operating across diverse domains of cognitive abilities is now widely accepted. Much less is known about the etiology of the high extreme of performance. Is there more specialization at the high extreme? Using a representative sample of 4,000 12-year-old twin pairs from the UK Twins Early Development Study, we investigated the genetic and environmental overlap between web-based tests of general cognitive ability, reading, mathematics and language performance for the top 15% of the distribution using DF extremes analysis. Generalist genes are just as evident at the high extremes of performance as they are for the entire distribution of abilities and for cognitive disabilities. However, a smaller proportion of the phenotypic intercorrelations appears to be explained by genetic influences for high abilities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kolda, Christopher
In this talk, I review recent work on using a generalization of the Next-to-Minimal Supersymmetric Standard Model (NMSSM), called the Singlet-extended Minimal Supersymmetric Standard Model (SMSSM), to raise the mass of the Standard Model-like Higgs boson without requiring extremely heavy top squarks or large stop mixing. In so doing, this model solves the little hierarchy problem of the minimal model (MSSM), at the expense of leaving the μ-problem of the MSSM unresolved. This talk is based on work published in Refs. [1, 2, 3].
Quantifying impacts of heat waves on power grid operation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ke, Xinda; Wu, Di; Rice, Jennie S.
Climate change is projected to cause an increase in the severity and frequency of extreme weather events such as heat waves and droughts. Such changes present planning and operating challenges and risks to many economic sectors. In the electricity sector, statistics of extreme events in the past have been used to help plan for future peak loads, determine associated infrastructure requirements, and evaluate operational risks, but industry-standard planning tools have yet to be coupled with or informed by temperature models to explore the impacts of the "new normal" on planning studies. For example, high ambient temperatures during heat waves reduce the output capacity and efficiency of gas-fired combustion turbines just when they are needed most to meet peak demands. This paper describes the development and application of a production cost and unit commitment model coupled to high-resolution, hourly temperature data and a temperature-dependent load model. The coupled system has the ability to represent the impacts of hourly temperatures on load conditions and available capacity and efficiency of combustion turbines, and therefore capture the potential impacts on system reliability and production cost. Ongoing work expands this capability to address the impacts of water availability and temperature on power grid operation.
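One simple way a planning tool can fold ambient temperature into available capacity is a derating curve for combustion turbines. A deliberately crude linear sketch; the reference temperature and slope are illustrative assumptions, not the paper's fitted model:

```python
def derated_capacity_mw(rated_mw, ambient_c, ref_c=15.0, slope=0.006):
    """Hypothetical linear derating of a combustion turbine: output falls
    by `slope` (fraction of rated capacity) per degC above the reference
    temperature; no uplift is credited below it. Coefficients are
    illustrative only."""
    loss = max(0.0, ambient_c - ref_c) * slope
    return rated_mw * max(0.0, 1.0 - loss)

# A 100 MW unit during a 40 degC heat-wave hour:
print(round(derated_capacity_mw(100.0, 40.0), 1))  # 85.0 MW available
```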
A divergence-cleaning scheme for cosmological SPMHD simulations
NASA Astrophysics Data System (ADS)
Stasyszyn, F. A.; Dolag, K.; Beck, A. M.
2013-01-01
In magnetohydrodynamics (MHD), the magnetic field is evolved by the induction equation and coupled to the gas dynamics by the Lorentz force. We perform numerical smoothed particle magnetohydrodynamics (SPMHD) simulations and study the influence of a numerical magnetic divergence. For instabilities arising from ∇·B-related errors, we find the hyperbolic/parabolic cleaning scheme suggested by Dedner et al. to give good results and prevent numerical artefacts from growing. Additionally, we demonstrate that certain current SPMHD implementations of magnetic field regularizations give rise to unphysical instabilities in long-time simulations. We also find this effect when employing Euler potentials (divergenceless by definition), which are not able to follow the winding-up process of magnetic field lines properly. Furthermore, we present cosmological simulations of galaxy cluster formation at extremely high resolution including the evolution of magnetic fields. We show synthetic Faraday rotation maps and derive structure functions to compare them with observations. Comparing all the simulations with and without divergence cleaning, we are able to confirm the results of previous simulations performed with the standard implementation of MHD in SPMHD at normal resolution. However, at extremely high resolution, a cleaning scheme is needed to prevent the growth of numerical ∇·B errors at small scales.
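The hyperbolic/parabolic cleaning scheme referred to above transports divergence errors away as waves and damps them through an auxiliary scalar field ψ. In a commonly used Lagrangian form the evolution equations read (generic notation, with cleaning wave speed c_h and damping time-scale τ; this is a sketch of the Dedner et al. scheme, not the paper's exact implementation):

```latex
\begin{aligned}
\frac{\mathrm{d}\boldsymbol{B}}{\mathrm{d}t}
  &= \left(\boldsymbol{B}\cdot\nabla\right)\boldsymbol{v}
     - \boldsymbol{B}\left(\nabla\cdot\boldsymbol{v}\right)
     - \nabla\psi, \\
\frac{\mathrm{d}\psi}{\mathrm{d}t}
  &= -c_h^{2}\,\nabla\cdot\boldsymbol{B} - \frac{\psi}{\tau}.
\end{aligned}
```

The first equation is the usual induction equation augmented by the −∇ψ cleaning term; the second propagates ∇·B errors hyperbolically at speed c_h while the ψ/τ term damps them parabolically.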
Seasonal forecasting of high wind speeds over Western Europe
NASA Astrophysics Data System (ADS)
Palutikof, J. P.; Holt, T.
2003-04-01
As financial losses associated with extreme weather events escalate, there is interest from end users in the forestry and insurance industries, for example, in the development of seasonal forecasting models with a long lead time. This study uses exceedences of the 90th, 95th, and 99th percentiles of daily maximum wind speed over the period 1958 to present to derive predictands of winter wind extremes. The source data is the 6-hourly NCEP Reanalysis gridded surface wind field. Predictor variables include principal components of Atlantic sea surface temperature and several indices of climate variability, including the NAO and SOI. Lead times of up to a year are considered, in monthly increments. Three regression techniques are evaluated; multiple linear regression (MLR), principal component regression (PCR), and partial least squares regression (PLS). PCR and PLS proved considerably superior to MLR with much lower standard errors. PLS was chosen to formulate the predictive model since it offers more flexibility in experimental design and gave slightly better results than PCR. The results indicate that winter windiness can be predicted with considerable skill one year ahead for much of coastal Europe, but that this deteriorates rapidly in the hinterland. The experiment succeeded in highlighting PLS as a very useful method for developing more precise forecasting models, and in identifying areas of high predictability.
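Principal component regression, one of the two methods found markedly superior to MLR here, regresses the predictand on the leading principal-component scores of the standardized predictor field. A minimal numpy sketch on synthetic data; the variable names (SST components, NAO, SOI stand-ins) and values are illustrative:

```python
import numpy as np

def pcr_fit_predict(X, y, k, X_new):
    """Principal component regression sketch: standardize predictors,
    project onto the first k principal components, regress y on the
    component scores, then predict for new predictor rows."""
    mu, sd = X.mean(axis=0), X.std(axis=0)
    Z = (X - mu) / sd
    _, _, Vt = np.linalg.svd(Z, full_matrices=False)
    T = Z @ Vt[:k].T                                   # component scores
    coef, *_ = np.linalg.lstsq(np.c_[np.ones(len(T)), T], y, rcond=None)
    T_new = ((X_new - mu) / sd) @ Vt[:k].T
    return np.c_[np.ones(len(T_new)), T_new] @ coef

# Synthetic stand-ins for the predictors and a wind-exceedence predictand:
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 3))
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.1, size=40)
# k equals the full rank here for a clean check; in practice k is chosen
# smaller than the number of predictors to stabilize the regression.
pred = pcr_fit_predict(X, y, k=3, X_new=X[:5])
```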
Intra-seasonal Characteristics of Wintertime Extreme Cold Events over South Korea
NASA Astrophysics Data System (ADS)
Park, Taewon; Jeong, Jeehoon; Choi, Jahyun
2017-04-01
The present study reveals changes in the characteristics of extreme cold events over South Korea in boreal winter (November to March) in terms of the intra-seasonal variability of frequency, duration, and atmospheric circulation pattern. Influences of large-scale variabilities such as Siberian High activity, the Arctic Oscillation (AO), and the Madden-Julian Oscillation (MJO) on extreme cold events are also investigated. In the early and late parts of the winter, during November and March, the upper-tropospheric wave train over the life cycle of an extreme cold event tends to pass quickly over East Asia. In addition, compared with the other months, the intensity of the Siberian High is weaker and occurrences of strong negative AO are less frequent. This leads to events with weak amplitude and short duration. On the other hand, an amplified Siberian High and strong negative AO occur more frequently in midwinter, from December to February. These extreme cold events are mainly characterized by well-organized anticyclonic blocking around the Ural Mountains and the Subarctic. These large-scale circulations make midwinter extreme cold events last longer with stronger amplitude. The MJO phases 2-3, which provide a suitable condition for the amplification of extreme cold events, occur frequently from November to January, when their frequencies are more than twice those of February and March. While the extreme cold events during March have the lowest frequency, the weakest amplitude, and the shortest duration due to weak impacts of the abovementioned factors, the strong activities of these factors in January force the extreme cold events to be the most frequent, the strongest, and the longest of the boreal winter. Keywords: extreme cold event, wave train, blocking, Siberian High, AO, MJO
European roadmap on superconductive electronics - status and perspectives
NASA Astrophysics Data System (ADS)
Anders, S.; Blamire, M. G.; Buchholz, F.-Im.; Crété, D.-G.; Cristiano, R.; Febvre, P.; Fritzsch, L.; Herr, A.; Il'ichev, E.; Kohlmann, J.; Kunert, J.; Meyer, H.-G.; Niemeyer, J.; Ortlepp, T.; Rogalla, H.; Schurig, T.; Siegel, M.; Stolz, R.; Tarte, E.; ter Brake, H. J. M.; Toepfer, H.; Villegier, J.-C.; Zagoskin, A. M.; Zorin, A. B.
2010-12-01
Executive Summary: For four decades semiconductor electronics has followed Moore’s law: with each generation of integration the circuit features became smaller, more complex and faster. This development is now reaching a wall, so that smaller is no longer any faster. The clock rate has saturated at about 3-5 GHz and the parallel processor approach will soon reach its limit. The prime reason for the limitation that semiconductor electronics experiences is not the switching speed of the individual transistor, but its power dissipation and thus heat. Digital superconductive electronics is a circuit- and device-technology that is inherently faster at much less power dissipation than semiconductor electronics. It makes use of superconductors and Josephson junctions as circuit elements, which can provide extremely fast digital devices in a frequency range - dependent on the material - of hundreds of GHz: for example, a flip-flop has been demonstrated that operated at 750 GHz. This digital technique is scalable and follows design rules similar to those of semiconductor devices. Its very low power dissipation of only 0.1 μW per gate at 100 GHz opens the possibility of three-dimensional integration. Circuits like microprocessors and analogue-to-digital converters for commercial and military applications have been demonstrated. In contrast to semiconductor circuits, the operation of superconducting circuits is based on naturally standardized digital pulses whose area is exactly the flux quantum Φ0. The flux quantum is also the natural quantization unit for digital-to-analogue and analogue-to-digital converters. The latter application is so precise that it is used as a voltage standard; indeed, the physical unit ‘volt’ is defined by means of this standard. Apart from its outstanding features for digital electronics, superconductive electronics also provides the most sensitive sensor for magnetic fields: the Superconducting Quantum Interference Device (SQUID). 
Amongst many other applications SQUIDs are used as sensors for magnetic heart and brain signals in medical applications, as sensor for geological surveying and food-processing and for non-destructive testing. As amplifiers of electrical signals, SQUIDs can nearly reach the theoretical limit given by Quantum Mechanics. A further important field of application is the detection of very weak signals by ‘transition-edge’ bolometers, superconducting nanowire single-photon detectors, and superconductive tunnel junctions. Their application as radiation detectors in a wide frequency range, from microwaves to X-rays is now standard. The very low losses of superconductors have led to commercial microwave filter designs that are now widely used in the USA in base stations for cellular phones and in military communication applications. The number of demonstrated applications is continuously increasing and there is no area in professional electronics, in which superconductive electronics cannot be applied and surpasses the performance of classical devices. Superconductive electronics has to be cooled to very low temperatures. Whereas this was a bottleneck in the past, cooling techniques have made a huge step forward in recent years: very compact systems with high reliability and a wide range of cooling power are available commercially, from microcoolers of match-box size with milli-Watt cooling power to high-reliability coolers of many Watts of cooling power for satellite applications. Superconductive electronics will not replace semiconductor electronics and similar room-temperature techniques in standard applications, but for those applications which require very high speed, low-power consumption, extreme sensitivity or extremely high precision, superconductive electronics is superior to all other available techniques. 
To strengthen European competitiveness in superconductive electronics, research projects have to be set up in the following fields: ultra-sensitive sensing and imaging; quantum measurement instrumentation; advanced analogue-to-digital converters; and superconductive electronics technology.
High-Resolution Near Real-Time Drought Monitoring in South Asia
NASA Astrophysics Data System (ADS)
Aadhar, S.; Mishra, V.
2017-12-01
Droughts in South Asia affect food and water security and pose challenges for millions of people. For policy-making, planning, and management of water resources at the sub-basin or administrative levels, high-resolution datasets of precipitation and air temperature are required in near-real time. Here we develop high-resolution (0.05 degree) bias-corrected precipitation and temperature data that can be used to monitor near-real-time drought conditions over South Asia. Moreover, the dataset can be used to monitor climatic extremes (heat waves, cold waves, dry and wet anomalies) in South Asia. A distribution mapping method was applied to correct bias in precipitation and air temperature (maximum and minimum), and it performed well compared to an alternative bias-correction method based on linear scaling. Bias-corrected precipitation and temperature data were used to estimate the Standardized Precipitation Index (SPI) and Standardized Precipitation Evapotranspiration Index (SPEI) to assess historical and current drought conditions in South Asia. We evaluated drought severity and extent against satellite-based Normalized Difference Vegetation Index (NDVI) anomalies and the satellite-driven Drought Severity Index (DSI) at 0.05˚. We find that the bias-corrected high-resolution data can effectively capture observed drought conditions as shown by the satellite-based drought estimates. A high-resolution near-real-time dataset can provide valuable information for decision-making at district and sub-basin levels.
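SPI maps accumulated precipitation onto a standard normal scale, so that, for example, SPI below −1 flags moderate drought. Operationally a gamma distribution is fitted first; a rank-based (empirical) variant shows the idea using only the standard library, with hypothetical values:

```python
from statistics import NormalDist

def empirical_spi(series):
    """Empirical SPI sketch: convert each accumulated-precipitation value
    to a cumulative probability with a Weibull plotting position, then to
    a z-score via the inverse normal CDF. Operational SPI fits a gamma
    distribution instead; this simplification assumes distinct values."""
    n = len(series)
    ranks = {v: r for r, v in enumerate(sorted(series), start=1)}
    nd = NormalDist()
    return [nd.inv_cdf(ranks[v] / (n + 1)) for v in series]

# Hypothetical 3-month precipitation totals (mm) for ten years:
precip = [120, 85, 200, 60, 150, 95, 170, 40, 130, 110]
spi = empirical_spi(precip)
print([round(v, 2) for v in spi])  # dry years negative, wet years positive
```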
Summary of Meteorological Observations, Surface (SMOS), Barbers Point, Hawaii.
1984-09-01
available. Also provided are the means and standard deviations for each month and annual (all months). The extremes for a month are not printed nor... January 1964. When 90 or more of the daily observations of peak gust wind data are available for a month, the extreme is selected and printed. These... [illegible table residue: "Percentage Frequency of Wind Direction and Speed (from hourly observations)", Asheville, NC]
Low-cost method for producing extreme ultraviolet lithography optics
Folta, James A [Livermore, CA; Montcalm, Claude [Fort Collins, CO; Taylor, John S [Livermore, CA; Spiller, Eberhard A [Mt. Kisco, NY
2003-11-21
Spherical and non-spherical optical elements produced by standard optical figuring and polishing techniques are extremely expensive. Such surfaces can be cheaply produced by diamond turning; however, ripples in the diamond-turned surface prevent its use for EUV lithography. These ripples are smoothed with a coating of polyimide before applying a 60-period Mo/Si multilayer to reflect a wavelength of 134 Å; peak reflectivities close to 63% have been obtained. The savings in cost are about a factor of 100.
Accession Medical Standards Analysis and Research Activity
2010-01-01
prosthetic implants and diseases of the musculoskeletal system and impairments and diseases of the spine, skull, limbs, and extremities; though... osteoporosis, pathologic fractures, bone cysts, and aseptic necrosis. Please note, when a majority of codes examined out to the fourth digit do not have a... [table residue: case counts and rates for "Injury, including traumatic amputation, scrotum and testes" and "Late effect of fracture of lower extremities"]
Tornado risks and design windspeeds for the Oak Ridge Plant Site
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1975-08-01
The effects of tornadoes and other extreme winds should be considered in establishing design criteria for structures to resist wind loads. Design standards that are incorporated in building codes do not normally include the effects of tornadoes in their wind load criteria. Some tornado risk models ignore the presence of nontornadic extreme winds. The purpose of this study is to determine the probability of tornadic and straight winds exceeding a threshold value in the geographical region surrounding the Oak Ridge, Tennessee plant site.
Extreme values and fat tails of multifractal fluctuations
NASA Astrophysics Data System (ADS)
Muzy, J. F.; Bacry, E.; Kozhemyak, A.
2006-06-01
In this paper we discuss the problem of the estimation of extreme event occurrence probability for data drawn from some multifractal process. We also study the heavy (power-law) tail behavior of the probability density function associated with such data. We show that because of strong correlations, the standard extreme value approach is not valid and classical tail exponent estimators should be interpreted cautiously. Extreme statistics associated with multifractal random processes turn out to be characterized by non-self-averaging properties. Our considerations rely upon some analogy between random multiplicative cascades and the physics of disordered systems and also on recent mathematical results about the so-called multifractal formalism. Applied to financial time series, our findings allow us to propose a unified framework that accounts for the observed multiscaling properties of return fluctuations, the volatility clustering phenomenon and the observed “inverse cubic law” of the return pdf tails.
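One of the classical tail-exponent estimators that, per the abstract, must be interpreted cautiously under strong correlations is the Hill estimator. A sketch on an i.i.d. Pareto sample, the benign case where it does behave well:

```python
import math
import random

def hill_estimator(data, k):
    """Hill estimator of the tail index alpha from the k largest order
    statistics: alpha = k / sum_i log(x_(i) / x_(k+1)). For strongly
    correlated (e.g. multifractal) data this classical estimator can be
    severely biased, as the abstract cautions."""
    x = sorted(data, reverse=True)
    return k / sum(math.log(x[i] / x[k]) for i in range(k))

# i.i.d. Pareto sample with tail index 3 via inverse-transform sampling:
random.seed(1)
sample = [random.random() ** (-1.0 / 3.0) for _ in range(20000)]
print(round(hill_estimator(sample, k=1000), 2))  # close to 3
```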
Montpetit, Kathleen; Haley, Stephen; Bilodeau, Nathalie; Ni, Pengsheng; Tian, Feng; Gorton, George; Mulcahey, M J
2011-02-01
This article reports on the content range and measurement precision of an upper extremity (UE) computer adaptive testing (CAT) platform of physical function in children with cerebral palsy. Upper extremity items representing skills of all abilities were administered to 305 parents. These responses were compared with two traditional standardized measures: Pediatric Outcomes Data Collection Instrument and Functional Independence Measure for Children. The UE CAT correlated strongly with the upper extremity component of these measures and had greater precision when describing individual functional ability. The UE item bank has wider range with items populating the lower end of the ability spectrum. This new UE item bank and CAT have the capability to quickly assess children of all ages and abilities with good precision and, most importantly, with items that are meaningful and appropriate for their age and level of physical function.
Extreme value laws for fractal intensity functions in dynamical systems: Minkowski analysis
NASA Astrophysics Data System (ADS)
Mantica, Giorgio; Perotti, Luca
2016-09-01
Typically, in the dynamical theory of extremal events, the function that gauges the intensity of a phenomenon is assumed to be convex and maximal, or singular, at a single point, or at most a finite collection of points, in phase space. In this paper we generalize this situation to fractal landscapes, i.e. intensity functions characterized by an uncountable set of singularities, located on a Cantor set. This reveals the dynamical rôle of classical quantities like the Minkowski dimension and content, whose definitions we extend to account for singular continuous invariant measures. We also introduce the concept of an extremely rare event, quantified by non-standard Minkowski constants, and study its consequences for extreme value statistics. Limit laws are derived from formal calculations and are verified by numerical experiments. Dedicated to the memory of Joseph Ford, on the twentieth anniversary of his departure.
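The Minkowski (box-counting) dimension invoked above can be computed exactly for the middle-thirds Cantor set, the prototypical singularity support. A sketch in integer arithmetic (so box counts are exact, with no floating-point boundary errors), recovering dim = ln 2 / ln 3:

```python
import math

def cantor_box_counts(level, ks):
    """Box counts for the middle-thirds Cantor set at stage `level`: left
    endpoints are kept as base-3 integers, so counting is exact."""
    pts = [0]
    for _ in range(level):
        # each interval [p, p+1) splits into thirds, keeping the outer two
        pts = [3 * p for p in pts] + [3 * p + 2 for p in pts]
    counts = []
    for k in ks:
        shift = 3 ** (level - k)          # box size 3**-k at this resolution
        counts.append(len({p // shift for p in pts}))
    return counts

ks = range(2, 9)
counts = cantor_box_counts(10, ks)        # exactly 2**k occupied boxes
# Minkowski (box-counting) dimension = slope of log N(eps) vs log(1/eps):
pairs = [(k * math.log(3), math.log(n)) for k, n in zip(ks, counts)]
dim = (pairs[-1][1] - pairs[0][1]) / (pairs[-1][0] - pairs[0][0])
```

Here the slope is exact because N(3^-k) = 2^k at every scale; the paper's extension concerns the analogous constants for singular continuous measures.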
NASA Technical Reports Server (NTRS)
Hock, R. A.; Woods, T. N.; Crotser, D.; Eparvier, F. G.; Woodraska, D. L.; Chamberlin, P. C.; Woods, E. C.
2010-01-01
The NASA Solar Dynamics Observatory (SDO), scheduled for launch in early 2010, incorporates a suite of instruments including the Extreme Ultraviolet Variability Experiment (EVE). EVE has multiple instruments including the Multiple Extreme ultraviolet Grating Spectrographs (MEGS) A, B, and P instruments, the Solar Aspect Monitor (SAM), and the Extreme ultraviolet SpectroPhotometer (ESP). The radiometric calibration of EVE, necessary to convert the instrument counts to physical units, was performed at the National Institute of Standards and Technology (NIST) Synchrotron Ultraviolet Radiation Facility (SURF III) located in Gaithersburg, Maryland. This paper presents the results and derived accuracy of this radiometric calibration for the MEGS A, B, P, and SAM instruments, while the calibration of the ESP instrument is addressed by Didkovsky et al. In addition, solar measurements that were taken on 14 April 2008, during the NASA 36.240 sounding-rocket flight, are shown for the prototype EVE instruments.
Aziz, Faisal; Lehman, Erik; Blebea, John; Lurie, Fedor
2017-01-01
Background Deep venous thrombosis after any surgical operations is considered a preventable complication. Lower extremity bypass surgery is a commonly performed operation to improve blood flow to lower extremities in patients with severe peripheral arterial disease. Despite advances in endovascular surgery, lower extremity arterial bypass remains the gold standard treatment for severe, symptomatic peripheral arterial disease. The purpose of this study is to identify the clinical risk factors associated with development of deep venous thrombosis after lower extremity bypass surgery. Methods The American College of Surgeons' NSQIP database was utilized and all lower extremity bypass procedures performed in 2013 were examined. Patient and procedural characteristics were evaluated. Univariate and multivariate logistic regression analysis was used to determine independent risk factors for the development of postoperative deep venous thrombosis. Results A total of 2646 patients (65% males and 35% females) underwent lower extremity open revascularization during the year 2013. The following factors were found to be significantly associated with postoperative deep venous thrombosis: transfusion >4 units of packed red blood cells (odds ratio (OR) = 5.21, confidence interval (CI) = 1.29-22.81, p = 0.03), postoperative urinary tract infection (OR = 12.59, CI = 4.12-38.48, p < 0.01), length of hospital stay >28 days (OR = 9.30, CI = 2.79-30.92, p < 0.01), bleeding (OR = 2.93, CI = 1.27-6.73, p = 0.01), deep wound infection (OR = 3.21, CI = 1.37-7.56, p < 0.01), and unplanned reoperation (OR = 4.57, CI = 2.03-10.26, p < 0.01). Of these, multivariable analysis identified the factors independently associated with development of deep venous thrombosis after lower extremity bypass surgery to be unplanned reoperation (OR = 3.57, CI = 1.54-8.30, p < 0.01), reintubation (OR = 8.93, CI = 2.66-29.97, p < 0.01), and urinary tract infection (OR = 7.64, CI = 2.27-25.73, p < 0.01). 
Presence of all three factors was associated with a 54% incidence of deep venous thrombosis. Conclusions Development of deep venous thrombosis after lower extremity bypass is a serious but infrequent complication. Patients who require an unplanned return to the operating room, require reintubation, or develop a postoperative urinary tract infection are at high risk for developing postoperative deep venous thrombosis. Increased monitoring of these patients and ensuring adequate deep venous thrombosis prophylaxis are suggested.
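The univariate odds ratios reported above come from 2x2 exposure-outcome tables. A minimal sketch of that computation with the standard Woolf (log-normal) confidence interval; the counts below are hypothetical, not the NSQIP data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf 95% CI from a 2x2 table:
         a = exposed with DVT,      b = exposed without DVT,
         c = non-exposed with DVT,  d = non-exposed without DVT."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log(OR)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, (lo, hi)

# Hypothetical counts for illustration only (not the study's data):
# 8 of 40 reoperated patients had DVT vs 52 of 2606 without reoperation.
or_, (lo, hi) = odds_ratio_ci(8, 32, 52, 2554)
```

A CI excluding 1 corresponds to a significant univariate association; the multivariable ORs in the study additionally adjust each factor for the others.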
Xian, Siyuan; Yin, Jie; Lin, Ning; Oppenheimer, Michael
2018-01-01
Coastal flood protection measures have been widely implemented to improve flood resilience. However, protection levels vary among coastal megacities globally. This study compares the distinct flood protection standards for two coastal megacities, New York City and Shanghai, and investigates potential influences such as risk factors and past flood events. Extreme value analysis reveals that, compared to NYC, Shanghai faces a significantly higher flood hazard. Flood inundation analysis indicates that Shanghai has a higher exposure to extreme flooding. Meanwhile, Shanghai's urban development, population, and economy have increased much faster than NYC's over the last three decades. These risk factors provide part of the explanation for the implementation of a relatively high level of protection (e.g. reinforced concrete sea-wall designed for a 200-year flood return level) in Shanghai and low protection (e.g. vertical brick and stone walls and sand dunes) in NYC. However, individual extreme flood events (typhoons in 1962, 1974, and 1981) seem to have had a greater impact on flood protection decision-making in Shanghai, while NYC responded significantly less to past events (with the exception of Hurricane Sandy). Climate change, sea level rise, and ongoing coastal development are rapidly changing the hazard and risk calculus for both cities and both would benefit from a more systematic and dynamic approach to coastal protection. Copyright © 2017 Elsevier B.V. All rights reserved.
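Extreme value analysis of flood return levels, as used to compare the two cities' protection standards, can be sketched by fitting a generalized extreme value (GEV) distribution to annual maxima and reading off return levels as quantiles. The data below are synthetic, not Shanghai or NYC records; scipy is assumed available:

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic annual-maximum water levels (metres), Gumbel-like for illustration
rng = np.random.default_rng(42)
annual_max = rng.gumbel(loc=2.0, scale=0.5, size=100)

# Fit a GEV distribution; the T-year return level is the (1 - 1/T) quantile
# of the fitted annual-maximum distribution.
shape, loc, scale = genextreme.fit(annual_max)
rl_100 = genextreme.ppf(1 - 1/100, shape, loc=loc, scale=scale)
rl_200 = genextreme.ppf(1 - 1/200, shape, loc=loc, scale=scale)
```

A 200-year design standard, as for Shanghai's sea-wall, corresponds to protecting against the rl_200 quantile of this fitted law.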
NASA Astrophysics Data System (ADS)
Nymmik, Rikho
Space environment models are intended to describe quantitatively the behavior of the natural space environment. Usually they are constructed by generalizing some set of experimental data characteristic of the conditions that prevailed during the measurement period. Such models therefore often state and postulate realities of the past. A typical example is the situation around extreme SEP events. For decades, models of such events have been based on the largest occurrences observed, whose features were measured by instruments whose reliability was not always analyzed. It is obvious that this approach conflicts with reality, because any new extreme event contradicts it. It follows that space environment models cannot be created from numerical observations alone when those data change in time or are probabilistic in nature. The model's goal is not only to describe the average environment characteristics, but also to predict extreme ones. Such a prediction can only result from analyzing the causes that drive environmental change and taking them into account in the model parameters. In this report we present an analysis of the radiation environment formed by solar-generated high-energy particles. The progress and failures of SEP event modeling attempts are also shown and analyzed.
NASA Technical Reports Server (NTRS)
Kascak, Peter; Jansen, Ralph; Dever, Timothy; Nagorny, Aleksandr; Loparo, Kenneth
2013-01-01
In standard motor applications, rotor suspension with traditional mechanical bearings represents the most economical solution. However, in certain high performance applications, rotor suspension without contacting bearings is either required or highly beneficial. Examples include applications requiring very high speed or extreme environment operation, or with limited access for maintenance. This paper expands upon a novel bearingless motor concept, in which two motors with opposing conical air-gaps are used to achieve full five-axis levitation and rotation of the rotor. Force in this motor is created by deliberately leaving the motor's pole-pairs unconnected, which allows the creation of different d-axis flux in each pole pair. This flux imbalance is used to create lateral force. This approach differs from previous bearingless motor designs, which require separate windings for levitation and rotation. This paper examines the predicted and achieved suspension performance of a fully levitated prototype bearingless system.
Micro-Spec: A High Performance Compact Spectrometer for Submillimeter Astronomy
NASA Technical Reports Server (NTRS)
Hsieh, Wen-Ting; Moseley, Harvey; Stevenson, Thomas; Brown, Ari; Patel, Amil; U-Yen, Kongpop; Ehsan, Negar; Cataldo, Giuseppe; Wollack, Edward
2012-01-01
We describe Micro-Spec, an extremely compact, high performance spectrometer for the submillimeter and millimeter spectral ranges. We have designed a fully integrated submillimeter spectrometer based on superconducting microstrip technology and fabricated its critical elements. Using low-loss transmission lines, we can produce a fully integrated high resolution submillimeter spectrometer on a single four-inch Si wafer. A resolution of 500 can readily be achieved with standard fabrication tolerance, and higher with phase trimming. All functions of the spectrometer are integrated: light is coupled to the microstrip circuit with a planar antenna, spectral discrimination is achieved using a synthetic grating, orders are separated using a built-in planar filter, and the light is detected using photon-counting Microwave Kinetic Inductance Detectors (MKIDs). We will discuss the design principles of the instrument, describe its technical advantages, and report progress on its development.
Work-related symptoms and checkstand configuration: an experimental study.
Harber, P; Bloswick, D; Luo, J; Beck, J; Greer, D; Peña, L F
1993-07-01
Supermarket checkers are known to be at risk of upper-extremity cumulative trauma disorders. Forty-two experienced checkers checked a standard "market basket" of items on an experimental checkstand. The counter height could be adjusted (high = 35.5, low = 31.5 inches), and the pre-scan queuing area length (between conveyor belt and laser scanner) could be set to "near" or "far" lengths. Each subject scanned under the high-near, high-far, low-near, and low-far conditions in random order. Seven ordinal symptom scales were used to describe comfort. Analysis showed that both counter height and queuing length had significant effects on symptoms. Furthermore, the height of the subject affected the degree and direction of the impact of the checkstand configuration differences. The study suggests that optimization of design may be experimentally evaluated, that modification of postural as well as frequency loading may be beneficial, and that adjustability for the individual may be advisable.
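The within-subject design described above, in which each checker scans under all four conditions and comfort is rated on ordinal scales, is the classic setting for nonparametric repeated-measures tests. A minimal sketch using the Friedman test on hypothetical discomfort scores; the subject count, scores and scipy's friedmanchisquare are all assumptions for illustration:

```python
import numpy as np
from scipy.stats import friedmanchisquare

# Hypothetical ordinal discomfort scores for 10 checkers under the four
# checkstand conditions; the Friedman test handles repeated ordinal measures.
rng = np.random.default_rng(3)
base = rng.integers(0, 3, size=10)                 # per-subject baseline
high_near = base + rng.integers(0, 2, size=10)
high_far  = base + rng.integers(1, 3, size=10)
low_near  = base + rng.integers(0, 2, size=10)
low_far   = base + rng.integers(2, 4, size=10)     # worst by construction
stat, p = friedmanchisquare(high_near, high_far, low_near, low_far)
```

A small p-value indicates the configurations differ in symptom burden, mirroring the study's finding that both counter height and queuing length mattered.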
Sudakov, S K; Nazarova, G A; Alekseeva, E V; Bashkatova, V G
2013-07-01
We compared individual anxiety assessed by three standard tests, the open-field test, the elevated plus-maze test, and the Vogel conflict drinking test, in the same animals. No significant correlations between the main anxiety parameters were found across these three experimental models. Groups of high- and low-anxiety rats were formed by ranking the animals on a single parameter and selecting the two extreme groups (10% each). It was found that no single test could be used for reliable estimation of individual anxiety in rats. Individual anxiety level could be determined with a high degree of confidence only in high-anxiety and low-anxiety rats whose behavioral parameters lay above or below the mean values in all tests used. Therefore, several tests should be used to evaluate individual anxiety or sensitivity to emotional stress.
Change in mean temperature as a predictor of extreme temperature change in the Asia-Pacific region
NASA Astrophysics Data System (ADS)
Griffiths, G. M.; Chambers, L. E.; Haylock, M. R.; Manton, M. J.; Nicholls, N.; Baek, H.-J.; Choi, Y.; della-Marta, P. M.; Gosai, A.; Iga, N.; Lata, R.; Laurent, V.; Maitrepierre, L.; Nakamigawa, H.; Ouprasitwong, N.; Solofa, D.; Tahani, L.; Thuy, D. T.; Tibig, L.; Trewin, B.; Vediapan, K.; Zhai, P.
2005-08-01
Trends (1961-2003) in daily maximum and minimum temperatures, extremes and variance were found to be spatially coherent across the Asia-Pacific region. The majority of stations exhibited significant trends: increases in mean maximum and mean minimum temperature, decreases in cold nights and cool days, and increases in warm nights. No station showed a significant increase in cold days or cold nights, but a few sites showed significant decreases in hot days and warm nights. Significant decreases were observed in both maximum and minimum temperature standard deviation in China, Korea and some stations in Japan (probably reflecting urbanization effects), but also for some Thailand and coastal Australian sites. The South Pacific convergence zone (SPCZ) region between Fiji and the Solomon Islands showed a significant increase in maximum temperature variability.

Correlations between mean temperature and the frequency of extreme temperatures were strongest in the tropical Pacific Ocean from French Polynesia to Papua New Guinea, Malaysia, the Philippines, Thailand and southern Japan. Correlations were weaker at continental or higher latitude locations, which may partly reflect urbanization.

For non-urban stations, the dominant distribution change for both maximum and minimum temperature involved a change in the mean, impacting on one or both extremes, with no change in standard deviation. This occurred from French Polynesia to Papua New Guinea (except for maximum temperature changes near the SPCZ), in Malaysia, the Philippines, and several outlying Japanese islands. For urbanized stations the dominant change was a change in the mean and variance, impacting on one or both extremes. This result was particularly evident for minimum temperature.

The results presented here, for non-urban tropical and maritime locations in the Asia-Pacific region, support the hypothesis that changes in mean temperature may be used to predict changes in extreme temperatures.
At urbanized or higher latitude locations, changes in variance should be incorporated.
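The hypothesis above, that a shift in the mean with unchanged variance drives changes in extreme-temperature frequency, can be illustrated with a toy normal model. The threshold, mean and standard deviation below are hypothetical values, not from the study:

```python
from math import erf, sqrt

def exceedance_prob(threshold, mean, sd):
    """P(T > threshold) for a normally distributed daily temperature."""
    z = (threshold - mean) / sd
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

# A 1 degree shift in the mean (sd fixed at 2) multiplies the chance of
# exceeding a fixed "hot day" threshold set 2 sd above the old mean:
p_before = exceedance_prob(34.0, mean=30.0, sd=2.0)   # z = 2.0
p_after  = exceedance_prob(34.0, mean=31.0, sd=2.0)   # z = 1.5
```

The exceedance probability roughly triples for a modest mean shift, which is why mean trends predict extreme frequencies well where the variance is stable, and why variance changes must be added at urbanized sites.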
Albadr, Musatafa Abbas Abbood; Tiun, Sabrina; Al-Dhief, Fahad Taha; Sammour, Mahmoud A M
2018-01-01
Spoken Language Identification (LID) is the process of determining and classifying natural language from a given content and dataset. Typically, data must be processed to extract useful features to perform LID. Based on the literature, feature extraction for LID is a mature process: standard features have already been developed using Mel-Frequency Cepstral Coefficients (MFCC), Shifted Delta Cepstral (SDC) coefficients, and the Gaussian Mixture Model (GMM), ending with the i-vector based framework. However, the process of learning from the extracted features remains to be improved (i.e. optimised) to capture all the knowledge embedded in them. The Extreme Learning Machine (ELM) is an effective learning model used to perform classification and regression analysis, and is extremely useful for training a single-hidden-layer neural network. Nevertheless, the learning process of this model is not entirely effective (i.e. optimised) due to the random selection of weights within the input-to-hidden layer. In this study, the ELM is selected as the learning model for LID based on standard feature extraction. One of the optimisation approaches to ELM, the Self-Adjusting Extreme Learning Machine (SA-ELM), is selected as the benchmark and improved by altering the selection phase of the optimisation process. The selection process is performed by incorporating both the Split-Ratio and K-Tournament methods; the improved SA-ELM is named the Enhanced Self-Adjusting Extreme Learning Machine (ESA-ELM). Results are generated on LID datasets created from eight different languages and show the superiority of ESA-ELM LID over SA-ELM LID, with ESA-ELM LID achieving an accuracy of 96.25%, compared with 95.00% for SA-ELM LID.
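The basic ELM training step referred to above (random, fixed input weights followed by a single least-squares solve for the output weights, with no iterative training) can be sketched as follows. The toy two-class task and layer sizes are illustrative assumptions, not the LID setup:

```python
import numpy as np

def elm_train(X, y, n_hidden, rng):
    """Minimal Extreme Learning Machine: input weights are random and fixed;
    only the output weights are learned, via one least-squares solve."""
    W = rng.normal(size=(X.shape[1], n_hidden))    # random input weights
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                         # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)   # pseudoinverse solution
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy two-class problem: label is the sign of x1*x2 (an XOR-like boundary)
rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(400, 2))
y = (X[:, 0] * X[:, 1] > 0).astype(float)
W, b, beta = elm_train(X, y, n_hidden=100, rng=rng)
acc = float(np.mean((elm_predict(X, W, b, beta) > 0.5) == (y > 0.5)))
```

The random choice of W and b is exactly the step that SA-ELM and the proposed ESA-ELM optimise, since a poor draw degrades the least-squares fit.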
Gyorkos, Theresa W; Joseph, Serene A; Casapía, Martin
2009-06-01
Standard indicators are being used worldwide to track progress towards achieving the Millennium Development Goals (MDGs). These are usually at country level and do not accurately reflect within-country variability of progress towards the targets. This may lead to lack of attention and under-resourcing of the most vulnerable populations. Therefore, the objective of this study was to compare selected standard MDG indicators at country level and community level in Peru. As MDG indicators we selected: (i) moderate to severe and severe underweight in children under 5 years old; (ii) immunization against measles in 1-year-olds; (iii) births attended by skilled health professionals; and (iv) youth unemployment. Country-level data for Peru were obtained from United Nations published sources. Community-level data were obtained from a household survey conducted in 2005-2006 in Belén, a community of extreme poverty in the Amazon region. Belén indicators were consistently less favourable than country-level indicators, and indicators even differed between zones of high and low socioeconomic status within Belén itself. Compared to MDG indicators at the national level in Peru, the population of Belén experiences intra-country regional disparities in important health and social outcomes. Improving the coverage and quality of interventions and services in this community is essential. Other vulnerable populations in Peru should also be identified and targeted so that they can benefit from, and ultimately contribute to, progress in achieving the MDGs.
Carrer, Marco; Brunetti, Michele; Castagneri, Daniele
2016-01-01
Extreme climate events are of key importance for forest ecosystems. However, both the inherent infrequency, stochasticity and multiplicity of extreme climate events, and the array of biological responses, challenge investigations. To cope with the long life cycle of trees and the paucity of the extreme events themselves, our inferences should be based on long-term observations. In this context, tree rings and the related xylem anatomical traits represent promising sources of information, due to the wide time perspective and quality of the information they can provide. Here we test, on two high-elevation conifers (Larix decidua and Picea abies sampled at 2100 m a.s.l. in the Eastern Alps), the associations between temperature extremes during the growing season and xylem anatomical traits, specifically the number of cells per ring (CN), cell wall thickness (CWT), and cell diameter (CD). To better track the effect of extreme events over the growing season, tree rings were partitioned into 10 sectors. Climate variability was reconstructed, for 1800–2011 at monthly resolution and for 1926–2011 at daily resolution, by exploiting the excellent availability of very long and high-quality instrumental records for the surrounding area, and taking into account the relationship between meteorological variables and site topographical settings. Summer temperature influenced anatomical traits of both species, and tree-ring anatomical profiles were associated with temperature extremes. Most of the extreme values in anatomical traits occurred under warm (positive extremes) or cold (negative) conditions. However, 0–34% of occurrences did not match a temperature extreme event. Specifically, CWT and CN extremes were more clearly associated with climate than CD, which showed a bias toward tracking cold extremes.
Dendroanatomical analysis, coupled with high-quality daily-resolved climate records, seems a promising approach for studying the effects of extreme events on trees, but further investigations are needed to improve our comprehension of the critical role of such elusive events in forest ecosystems. PMID:27242880
Projected timing of perceivable changes in climate extremes for terrestrial and marine ecosystems.
Tan, Xuezhi; Gan, Thian Yew; Horton, Daniel E
2018-05-26
Human and natural systems have adapted to and evolved within historical climatic conditions. Anthropogenic climate change has the potential to alter these conditions such that onset of unprecedented climatic extremes will outpace evolutionary and adaptive capabilities. To assess whether and when future climate extremes exceed their historical windows of variability within impact-relevant socioeconomic, geopolitical, and ecological domains, we investigate the timing of perceivable changes (time of emergence; TOE) for 18 magnitude-, frequency-, and severity-based extreme temperature (10) and precipitation (8) indices using both multimodel and single-model multirealization ensembles. Under a high-emission scenario, we find that the signal of frequency- and severity-based temperature extremes is projected to rise above historical noise earliest in midlatitudes, whereas magnitude-based temperature extremes emerge first in low and high latitudes. Precipitation extremes demonstrate different emergence patterns, with severity-based indices first emerging over midlatitudes, and magnitude- and frequency-based indices emerging earliest in low and high latitudes. Applied to impact-relevant domains, simulated TOE patterns suggest (a) unprecedented consecutive dry day occurrence in >50% of 14 terrestrial biomes and 12 marine realms prior to 2100, (b) earlier perceivable changes in climate extremes in countries with lower per capita GDP, and (c) emergence of severe and frequent heat extremes well before 2030 for the 590 most populous urban centers. Elucidating extreme-metric and domain-type TOE heterogeneities highlights the challenges adaptation planners face in confronting the consequences of elevated twenty-first century radiative forcing. © 2018 John Wiley & Sons Ltd.
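A common operational definition of time of emergence, sketched below, is the first year a running mean of the index exceeds its baseline mean by a fixed multiple of baseline variability. The synthetic series, window length and 2-sigma threshold are illustrative assumptions, not the paper's exact method:

```python
import numpy as np

def time_of_emergence(years, series, base_slice, threshold=2.0, window=10):
    """First year whose `window`-year running mean exceeds the baseline mean
    by `threshold` baseline standard deviations (a common S/N definition)."""
    base = series[base_slice]
    mu, sigma = base.mean(), base.std()
    running = np.convolve(series, np.ones(window) / window, mode="valid")
    exceed = np.nonzero(running - mu > threshold * sigma)[0]
    return years[exceed[0] + window - 1] if exceed.size else None

# Synthetic extreme-index series: stationary noise plus a trend after 2000
years = np.arange(1950, 2101)
rng = np.random.default_rng(7)
series = rng.normal(0.0, 1.0, years.size) + 0.05 * np.clip(years - 2000, 0, None)
toe = time_of_emergence(years, series, base_slice=slice(0, 50))
```

With a 0.05-per-year signal against unit noise, emergence occurs decades after the trend onset, which is the kind of lag that varies by index type and latitude in the study.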
16 CFR 1207.4 - Recommended standards for materials of manufacture.
Code of Federal Regulations, 2011 CFR
2011-01-01
... exposure to rain, snow, ice, sunlight, local, normal temperature extremes, local normal wind variations... be toxic to man or harmful to the environment under intended use and reasonably foreseeable abuse or...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1978-10-03
This report is a six-part statistical summary of surface weather observations for Torrejon AB, Madrid, Spain. It contains the following parts: (A) Weather Conditions; Atmospheric Phenomena; (B) Precipitation, Snowfall and Snow Depth (daily amounts and extreme values); (C) Surface Winds; (D) Ceiling Versus Visibility; Sky Cover; (E) Psychrometric Summaries (daily maximum and minimum temperatures, extreme maximum and minimum temperatures, psychrometric summary of wet-bulb temperature depression versus dry-bulb temperature, means and standard deviations of dry-bulb, wet-bulb and dew-point temperatures and relative humidity); and (F) Pressure Summary (means, standard deviations, and observation counts of station pressure and sea-level pressure). Data in this report are presented in tabular form, in most cases as percentage frequency of occurrence or cumulative percentage frequency of occurrence tables.
NASA Technical Reports Server (NTRS)
Wang, Guiling; Wang, Dagang; Trenberth, Kevin E.; Erfanian, Amir; Yu, Miao; Bosilovich, Michael G.; Parr, Dana T.
2017-01-01
Theoretical models predict that, in the absence of moisture limitation, extreme precipitation intensity could increase exponentially with temperature at a rate determined by the Clausius-Clapeyron (C-C) relationship. Climate models project a continuous increase of precipitation extremes over the twenty-first century for most of the globe. However, some station observations suggest a negative scaling of extreme precipitation with very high temperatures, raising doubts about a future increase of precipitation extremes. Here we show that, for the present-day climate over most of the globe, the curve relating daily precipitation extremes to local temperature has a peak structure, increasing as expected over the low-to-medium range of temperature variations but decreasing at high temperatures. However, this peak-shaped relationship does not imply a potential upper limit for future precipitation extremes. Climate models project that both the peak of extreme precipitation and the temperature at which it peaks (T(sub peak)) will increase with warming; the two increases generally conform to the C-C scaling rate in mid- and high latitudes, and to a super-C-C scaling in most of the tropics. Because projected increases of local mean temperature (T(sub mean)) far exceed projected increases of T(sub peak) over land, the conventional approach of relating extreme precipitation to T(sub mean) produces a misleading sub-C-C scaling rate.
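The C-C relationship implies that saturation vapour pressure, and hence the moisture available to precipitation extremes, rises by roughly 7% per kelvin near surface temperatures. A minimal sketch checking that nominal 7%/K rate against the August-Roche-Magnus approximation for saturation vapour pressure (the coefficients are the standard Magnus constants; the 3 K increment is an illustrative choice):

```python
from math import exp

def cc_scaling_factor(dT, rate=0.07):
    """Multiplicative change in extreme precipitation intensity for a
    temperature increase dT (K), under ~7 %/K Clausius-Clapeyron scaling."""
    return exp(rate * dT)

def e_sat(T_celsius):
    """Saturation vapour pressure (hPa), August-Roche-Magnus approximation."""
    return 6.112 * exp(17.62 * T_celsius / (243.12 + T_celsius))

factor_3K = cc_scaling_factor(3.0)     # ~1.23, i.e. ~23% more intense
ratio = e_sat(23.0) / e_sat(20.0)      # Magnus ratio over the same +3 K
```

The two factors agree to within a few percent, which is why "C-C scaling" and "7%/K" are used interchangeably; super-C-C scaling in the tropics means rates above this.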
Infrared surface photometry of 3C 65: Stellar evolution and the Tolman signal
NASA Astrophysics Data System (ADS)
Rigler, M. A.; Lilly, S. J.
1994-06-01
We present an analysis of the infrared surface brightness profile of the high-redshift radio galaxy 3C 65 (z = 1.176), which is well fitted by a de Vaucouleurs r^(1/4) law. A model surface fitting routine yields characteristic photometric parameters comparable to those of low-redshift radio galaxies and brightest cluster members (BCMs) in standard cosmologies. The small displacement of this galaxy from the locus of low-redshift systems on the mu_r versus log(r_e) plane suggests that little or no luminosity evolution is required in a cosmological model with (Omega_0, lambda_0) = (1, 0), while a modest degree of luminosity evolution, accountable by passive evolution of the stellar population, is implied in models with (0, 0) or (0.1, 0.9). A nonexpanding cosmology is unlikely because it would require 3C 65 to lie at the extreme end of the distribution of properties of local gE galaxies, and the effects of plausible stellar and/or dynamical evolution would make 3C 65 even more extreme by the present epoch.
Prefrontal brain asymmetry and aggression in imprisoned violent offenders.
Keune, Philipp M; van der Heiden, Linda; Várkuti, Bálint; Konicar, Lilian; Veit, Ralf; Birbaumer, Niels
2012-05-02
Anterior brain asymmetry, assessed through the alpha and beta band in resting-state electroencephalogram (EEG) is associated with approach-related behavioral dispositions, particularly with aggression in the general population. To date, the association between frontal asymmetry and aggression has not been examined in highly aggressive groups. We examined the topographic characteristics of alpha and beta activity, the relation of both asymmetry metrics to trait aggression, and whether alpha asymmetry was extreme in anterior regions according to clinical standards in a group of imprisoned violent offenders. As expected, these individuals were characterized by stronger right than left-hemispheric alpha activity, which was putatively extreme in anterior regions in one third of the cases. We also report that in line with observations made in the general population, aggression was associated with stronger right-frontal alpha activity in these violent individuals. This suggests that frontal alpha asymmetry, as a correlate of trait aggression, might be utilizable as an outcome measure in studies which assess the effects of anti-aggressiveness training in violent offenders. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kumari, Komal; Donzis, Diego
2017-11-01
Highly resolved computational simulations on massively parallel machines are critical to understanding the physics of a vast number of complex phenomena in nature governed by partial differential equations. Simulations at extreme levels of parallelism present many challenges, with communication between processing elements (PEs) being a major bottleneck. In order to fully exploit the computational power of exascale machines, one needs to devise numerical schemes that relax global synchronizations across PEs. These asynchronous computations, however, have a degrading effect on the accuracy of standard numerical schemes. We have developed asynchrony-tolerant (AT) schemes that maintain order of accuracy despite relaxed communications. We show, analytically and numerically, that these schemes retain their numerical properties with multi-step higher-order temporal Runge-Kutta schemes. We also show that for a range of optimized parameters, the computation time and error for AT schemes are less than for their synchronous counterparts. Stability of the AT schemes, which depends on the history and random nature of delays, is also discussed. Support from NSF is gratefully acknowledged.
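The accuracy penalty from relaxed synchronization can be illustrated with a toy experiment; this is not the authors' AT scheme, just a standard FTCS discretization of the 1-D heat equation in which one grid point always reads a one-step-old neighbor value, mimicking a late message at a PE boundary:

```python
import numpy as np

# Toy illustration (not the authors' AT schemes): FTCS for u_t = alpha * u_xx
# on a periodic domain; one point uses a stale neighbor value each step.
alpha, nx = 1.0, 64
dx = 1.0 / nx
dt = 0.2 * dx**2 / alpha
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u_sync = np.sin(2 * np.pi * x)
u_async = u_sync.copy()
u_prev = u_async.copy()          # previous time level for the delayed read
pe_edge = nx // 4                # pretend a PE boundary sits here

for _ in range(200):
    # Fully synchronous update
    lap = np.roll(u_sync, -1) - 2 * u_sync + np.roll(u_sync, 1)
    u_sync = u_sync + alpha * dt / dx**2 * lap

    # Asynchronous update: right neighbor of pe_edge is one time level old
    right = np.roll(u_async, -1)
    right[pe_edge] = u_prev[(pe_edge + 1) % nx]   # stale neighbor value
    lap = right - 2 * u_async + np.roll(u_async, 1)
    u_prev = u_async
    u_async = u_async + alpha * dt / dx**2 * lap

exact = np.exp(-alpha * (2 * np.pi) ** 2 * 200 * dt) * np.sin(2 * np.pi * x)
err_sync = np.abs(u_sync - exact).max()
err_async = np.abs(u_async - exact).max()
print(err_sync, err_async)       # the delayed read degrades accuracy
```

AT schemes, as described in the abstract, modify the stencil so that such delayed reads no longer reduce the order of accuracy.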
Evaluation, management and prevention of lower extremity youth ice hockey injuries
Popkin, Charles A; Schulz, Brian M; Park, Caroline N; Bottiglieri, Thomas S; Lynch, T Sean
2016-01-01
Ice hockey is a fast-paced sport played by increasing numbers of children and adolescents in North America and around the world. Requiring a unique blend of skill, finesse, power and teamwork, ice hockey can become a lifelong recreational activity. Despite the rising popularity of the sport, there is ongoing concern about the high frequency of musculoskeletal injury associated with participation in ice hockey. Injury rates in ice hockey are among the highest in all competitive sports. Numerous research studies have been implemented to better understand the risks of injury. As a result, rule changes were adopted by USA Hockey and Hockey Canada to raise the minimum age at which body checking is permitted to 13-14 years (Bantam level) from 11-12 years (Pee Wee). Continuing the education of coaches, parents and players on rules of safe play, and emphasizing the standards for proper equipment use, are other strategies being implemented to make the game safer to play. The objective of this article was to review the evaluation, management and prevention of common lower extremity youth hockey injuries. PMID:27920584
Hoyt, Anne L; Bushman, Don; Lewis, Nathan; Faber, Robert
2012-01-01
How can a formulator have confidence that a preservative system will perform as expected under adverse conditions? Extreme conditions that can lead to the development of "off odors" in the product can be a serious challenge for companies providing home care products in the global market. Formulation and stability testing occur under controlled parameters that simulate limited environmental conditions and microbial challenges are typically performed with a standard inoculum level. While this is an acceptable and dependable process, it does not necessarily assess how well a preservative system can perform under extreme environmental conditions or against unusually high levels of bacterial challenges. This is especially true when formulations are diluted and stored by the end-user. By modifying microbial challenge testing of a liquid dishwashing product to include unexpected dilution schemes, increased microbial assaults, and elevated temperatures, a pattern of preservative efficacy was established. The resulting approach proved to be a useful tool when developing use directions, recommended dilution levels, the overall surfactant system, preservative type, and storage restrictions.
Fabrication of seamless calandria tubes by cold pilgering route using 3-pass and 2-pass schedules
NASA Astrophysics Data System (ADS)
Saibaba, N.
2008-12-01
A calandria tube is a large-diameter, extremely thin-walled zirconium alloy tube with a diameter-to-wall-thickness ratio as high as 90-95. Such tubes are conventionally produced by the 'welded route', which involves extrusion of slabs followed by a series of hot and cold rolling passes, intermediate anneals, press forming of sheets into circular shape, and closing of the gap by TIG welding. Though pilgering is a well-established process for the fabrication of seamless tubes, production of extremely thin-walled tubes offers several challenges during pilgering. Nuclear Fuel Complex (NFC), Hyderabad, has successfully developed a process for the production of Zircaloy-4 calandria tubes by adopting the 'seamless route', which involves hot extrusion of mother blanks followed by three-pass or two-pass pilgering schedules. This paper deals with standardization of the seamless-route processes for fabrication of calandria tubes, a comparison between tubes produced by the 2-pass and 3-pass pilgering schedules, the role of ultrasonic test charts in control of process parameters, and the development of new testing methods for burst testing and other properties.
Cross-timescale Interference and Rainfall Extreme Events in South Eastern South America
NASA Astrophysics Data System (ADS)
Munoz, Angel G.
The physical mechanisms and predictability associated with extreme daily rainfall in South East South America (SESA) are investigated for the December-February season. Through a k-means analysis, a robust set of daily circulation regimes is identified and then used to link the frequency of rainfall extreme events with large-scale potential predictors at subseasonal-to-seasonal scales. This basic set of daily circulation regimes is related to the continental and oceanic phases of the South Atlantic Convergence Zone (SACZ) and wave train patterns superimposed on the Southern Hemisphere Polar Jet. Some of these recurrent synoptic circulation types are conducive to extreme rainfall events in the region through synoptic control of different meso-scale physical features and, at the same time, are influenced by climate phenomena that could be used as sources of potential predictability. Extremely high rainfall (as measured by the 95th and 99th percentiles) is preferentially associated with two of these weather types, which are characterized by moisture advection intrusions from lower latitudes and the Pacific; another three weather types, characterized by above-normal moisture advection toward lower latitudes or the Andes, are preferentially associated with dry days (days with no rain). The analysis permits the identification of several subseasonal-to-seasonal scale potential predictors that modulate the occurrence of circulation regimes conducive to extreme rainfall events in SESA. It is conjectured that a cross-timescale interference between the different climate drivers improves the predictive skill of extreme precipitation in the region.
The potential and real predictive skill of the frequency of extreme rainfall is then evaluated, finding evidence indicating that mechanisms of climate variability at one timescale contribute to the predictability at another scale, i.e., taking into account the interference of different potential sources of predictability at different timescales increases the predictive skill. This fact is in agreement with the Cross-timescale Interference Conjecture proposed in the first part of the thesis. At seasonal scale, a combination of those weather types tends to outperform all the other potential predictors explored, i.e., sea surface temperature patterns, phases of the Madden-Julian Oscillation, and combinations of both. Spatially averaged Kendall’s τ improvements of 43% for the potential predictability and 23% for real-time predictions are attained with respect to standard models considering sea-surface temperature fields alone. A new subseasonal-to-seasonal predictive methodology for extreme rainfall events is proposed, based on probability forecasts of seasonal sequences of these weather types. The cross-validated real-time skill of the new probabilistic approach, as measured by the Hit Score and the Heidke Skill Score, is on the order of twice that associated with climatological values. The approach is designed to offer useful subseasonal-to-seasonal climate information to decision-makers interested not only in how many extreme events will happen in the season, but also in how, when and where those events will probably occur. In order to gain further understanding about how the cross-timescale interference occurs, an externally-forced Lorenz model is used to explore the impact of different kinds of forcings, at inter-annual and decadal scales, on the establishment of constructive interactions associated with the simulated “extreme events”.
Using a wavelet analysis, it is shown that this simple model is capable of reproducing the same kind of cross-timescale structures observed in the wavelet power spectrum of the Nino3.4 index only when it is externally forced by both inter-annual and decadal signals: the annual cycle and a decadal forcing associated with the natural solar variability. The nature of this interaction is non-linear, and it impacts both mean and extreme values in the time series. No predictive power was found when using metrics like standard deviation and auto-correlation. Nonetheless, it was proposed that an early warning signal for occurrence of extreme rainfall in SESA may be possible via a continuous monitoring of relative phases between the cross-timescale leading components.
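The regime-identification step above (k-means on daily circulation fields) can be sketched as follows; the data are synthetic and the number of regimes is a placeholder, not the set identified in the thesis:

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Stand-in for daily large-scale circulation anomalies: 900 "days" x 50 grid
# points drawn from three synthetic regimes (real inputs would be, e.g.,
# geopotential height anomalies over the SESA domain).
centers = rng.normal(0.0, 1.0, (3, 50))
days = np.vstack([centers[i] + 0.3 * rng.normal(0.0, 1.0, (300, 50))
                  for i in range(3)])

# Assign each day to a circulation regime
regimes = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(days)

# Frequency of each regime: the quantity linked in the study to the
# occurrence of extreme rainfall days
freq = np.bincount(regimes) / regimes.size
print(freq)
```

In the study's framework, these regime frequencies (or seasonal sequences of regimes) become the predictors for extreme-rainfall occurrence.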
Haigh, Ivan D.; Wadey, Matthew P.; Wahl, Thomas; Ozsoy, Ozgun; Nicholls, Robert J.; Brown, Jennifer M.; Horsburgh, Kevin; Gouldby, Ben
2016-01-01
In this paper we analyse the spatial footprint and temporal clustering of extreme sea level and skew surge events around the UK coast over the last 100 years (1915–2014). The vast majority of the extreme sea level events are generated by moderate, rather than extreme skew surges, combined with spring astronomical high tides. We distinguish four broad categories of spatial footprints of events and the distinct storm tracks that generated them. There have been rare events when extreme levels have occurred along two unconnected coastal regions during the same storm. The events that occur in closest succession (<4 days) typically impact different stretches of coastline. The spring/neap tidal cycle prevents successive extreme sea level events from happening within 4–8 days. Finally, the 2013/14 season was highly unusual in the context of the last 100 years from an extreme sea level perspective. PMID:27922630
Knowles, Martyn; Nation, David A; Timaran, David E; Gomez, Luis F; Baig, M Shadman; Valentine, R James; Timaran, Carlos H
2015-01-01
Fenestrated endovascular aortic aneurysm repair (FEVAR) is an alternative to open repair in patients with complex abdominal aortic aneurysms who are neither fit nor suitable for standard open or endovascular repair. Chimney and snorkel grafts are other endovascular alternatives but frequently require bilateral upper extremity access that has been associated with a 3% to 10% risk of stroke. However, upper extremity access is also frequently required for FEVAR because of the caudal orientation of the visceral vessels. The purpose of this study was to assess the use of upper extremity access for FEVAR and the associated morbidity. During a 5-year period, 148 patients underwent FEVAR, and upper extremity access for FEVAR was used in 98 (66%). Outcomes were compared between those who underwent upper extremity access and those who underwent femoral access alone. The primary end point was a cerebrovascular accident or transient ischemic attack, and the secondary end point was local access site complications. The mean number of fenestrated vessels was 3.07 ± 0.81 (median, 3) for a total of 457 vessels stented. Percutaneous upper extremity access was used in 12 patients (12%) and open access in 86 (88%). All patients who required a sheath size >7F underwent high brachial open access, with the exception of one patient who underwent percutaneous axillary access with a 12F sheath. The mean sheath size was 10.59F ± 2.51F (median, 12F), which was advanced into the descending thoracic aorta, allowing multiple wire and catheter exchanges. One hemorrhagic stroke (one of 98 [1%]) occurred in the upper extremity access group, and one ischemic stroke (one of 54 [2%]) occurred in the femoral-only access group (P = .67). The stroke in the upper extremity access group occurred 5 days after FEVAR and was related to uncontrolled hypertension, whereas the stroke in the femoral group occurred on postoperative day 3. Neither patient had signs or symptoms of a stroke immediately after FEVAR. 
The right upper extremity was accessed six times without a stroke (0%) compared with the left being accessed 92 times with one stroke (1%; P = .8). Four patients (4%) had local complications related to upper extremity access. One (1%) required exploration for an expanding hematoma after manual compression for a 7F sheath, one (1%) required exploration for hematoma and neurologic symptoms after open access for a 12F sheath, and two patients (2%) with small hematomas did not require intervention. Two (two of 12 [17%]) of these complications were in the percutaneous access group, which were significantly more frequent than in the open group (two of 86 [2%]; P = .02). Upper extremity access appears to be a safe and feasible approach for patients undergoing FEVAR. Open exposure in the upper extremity may be safer than percutaneous access during FEVAR. Unlike chimney and snorkel grafts, upper extremity access during FEVAR is not associated with an increased risk of stroke, despite the need for multiple visceral vessel stenting. Copyright © 2015 Society for Vascular Surgery. All rights reserved.
NASA Astrophysics Data System (ADS)
Li, Zhanling; Li, Zhanjie; Li, Chengcheng
2014-05-01
Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most such analyses of high-flow extremes concern basins in the humid and semi-humid south and east of China, while for the inland river basins, which occupy about 35% of the country's area, such studies are scarce, partly due to limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high-flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peaks-over-threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions for POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma distributions, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, the stability features of model parameters, the return level plot and the inherent independence assumption of POT series, an optimum threshold of 340 m^3/s is finally determined for high-flow extremes in the Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high-flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain.
The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to 2008, while the intensity of such flow extremes is comparatively increasing especially for the higher return levels.
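The POT/GPD workflow the study applies can be sketched as follows, with a synthetic flow series standing in for the Yingluoxia record and a quantile-based threshold in place of the study's 340 m^3/s:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
# Synthetic daily flow series (m^3/s); a real application would use the
# observed record and the carefully diagnosed threshold described above.
years = 31
flows = rng.gamma(shape=2.0, scale=60.0, size=years * 365)
threshold = np.quantile(flows, 0.98)

# Peaks-over-threshold: fit a Generalized Pareto Distribution to exceedances
exceed = flows[flows > threshold] - threshold
c, loc, scale = genpareto.fit(exceed, floc=0.0)   # location fixed at 0

# T-year return level: the flow exceeded on average once every T years
lam = exceed.size / years            # mean number of exceedances per year
T = 100.0
rl = threshold + genpareto.ppf(1.0 - 1.0 / (lam * T), c, loc=0.0, scale=scale)
print(threshold, rl)
```

The growing width of the confidence interval around `rl` as T increases is the rising uncertainty for long return periods noted in the abstract.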
Very Low Mass Stars with Extremely Low Metallicity in the Milky Way's Halo
NASA Astrophysics Data System (ADS)
Aoki, Wako; Beers, Timothy C.; Suda, Takuma; Honda, Satoshi; Lee, Young Sun
2015-08-01
Large surveys and follow-up spectroscopic studies in the past few decades have been providing chemical abundance data for a growing number of very metal-poor ([Fe/H] <-2) stars. Most of them are red giants or main-sequence turn-off stars having masses near 0.8 solar masses. Lower mass stars with extremely low metallicity ([Fe/H] <-3) have yet to be well explored. Our high-resolution spectroscopic study for very metal-poor stars found with SDSS has identified four cool main-sequence stars with [Fe/H] <-2.5 among 137 objects (Aoki et al. 2013, AJ, 145, 13). The effective temperatures of these stars are 4500--5000 K, corresponding to a mass of around 0.5 solar masses. Our standard analysis of the high-resolution spectra based on 1D-LTE model atmospheres has obtained self-consistent chemical abundances for these objects, assuming small values of micro-turbulent velocities compared with giants and turn-off stars. The low temperature of the atmospheres of these objects enables us to measure their detailed chemical abundances. Interestingly, two of the four stars have extreme chemical abundance patterns: one has the largest excesses of heavy neutron-capture elements associated with the r-process abundance pattern known to date (Aoki et al. 2010, ApJL 723, L201), and the other exhibits low abundances of the alpha-elements and odd-Z elements, suggested to be the signatures of the yields of very massive stars ( >100 solar masses; Aoki et al. 2014, Science 345, 912). Although the sample size is still small, these results indicate the potential of very low-mass stars as probes to study the early stages of the Milky Way's halo formation.
Application of the Haines Index in the fire warning system
NASA Astrophysics Data System (ADS)
Kalin, Lovro; Marija, Mokoric; Tomislav, Kozaric
2016-04-01
Croatia, like all Mediterranean countries, is strongly affected by large wildfires, particularly in the coastal region. In the last two decades the number and intensity of fires have increased significantly, which is widely associated with climate change, i.e. global warming. More extreme fires are observed, and the fire-fighting season has expanded into June and September. Meteorological support for fire protection and planning is therefore even more important. At the Meteorological and Hydrological Service of Croatia a comprehensive monitoring and warning system has been established. It includes standard components, such as short-term forecasts of the Fire Weather Index (FWI), as well as long-range forecasts. However, due to more frequent hot and dry seasons, the FWI often provides little additional information about extremely high fire danger, since it regularly takes the highest values for long periods. Additional tools have therefore been investigated. One widely used meteorological product is the Haines index (HI). It provides information on potential fire growth, taking into account only the vertical instability of the atmosphere and not the state of the fuel. Several analyses and studies carried out at the Service confirmed the correlation of high HI values with large and extreme fires. The Haines index forecast has been used at the Service for several years, employing the European Centre for Medium-Range Weather Forecasts (ECMWF) global prediction model as well as the limited-area Aladin model. Verification results show that these forecasts are reliable when compared to radiosonde measurements. These results supported the introduction of additional fire warnings, which are issued by the Service's Forecast Department.
Very Low-Mass Stars with Extremely Low Metallicity in the Milky Way's Halo
NASA Astrophysics Data System (ADS)
Aoki, Wako; Beers, Timothy C.; Suda, Takuma; Honda, Satoshi; Lee, Young Sun
2016-08-01
Large surveys and follow-up spectroscopic studies in the past few decades have been providing chemical abundance data for a growing number of very metal-poor ([Fe/H] <-2) stars. Most of them are red giants or main-sequence turn-off stars having masses near 0.8 solar masses. Lower mass stars with extremely low metallicity ([Fe/H] <-3) are yet to be explored. Our high-resolution spectroscopic study for very metal-poor stars found with SDSS has identified four cool main-sequence stars with [Fe/H] <-2.5 among 137 objects (Aoki et al. 2013). The effective temperatures of these stars are 4500-5000 K, corresponding to a mass of around 0.5 solar masses. Our standard analysis of the high-resolution spectra based on 1D-LTE model atmospheres has obtained self-consistent chemical abundances for these objects, assuming small values of micro-turbulent velocities compared with giants and turn-off stars. The low temperature of the atmospheres of these objects enables us to measure their detailed chemical abundances. Interestingly, two of the four stars have extreme chemical-abundance patterns: one has the largest excesses of heavy neutron-capture elements associated with the r-process abundance pattern known to date (Aoki et al. 2010), and the other exhibits low abundances of the α-elements and odd-Z elements, suggested to be signatures of the yields of very massive stars (> 100 solar masses; Aoki et al. 2014). Although the sample size is still small, these results indicate the potential of very low-mass stars as probes to study the early stages of the Milky Way's halo formation.
New insights in the bacterial spore resistance to extreme terrestrial and extraterrestrial factors
NASA Astrophysics Data System (ADS)
Moeller, Ralf; Horneck, Gerda; Reitz, Guenther
Based on their unique resistance to various space parameters, Bacillus endospores are one of the model systems used for astrobiological studies. The extremely high resistance of bacterial endospores to environmental stress factors has long intrigued researchers, and many characteristic spore features, especially those involved in the protection of spore DNA, have already been uncovered. The disclosure of the complete genomic sequence of Bacillus subtilis 168, one of the most frequently used astrobiological model systems, and the rapid development of transcriptional microarray techniques have opened new opportunities for gaining further insights into the enigma of spore resistance. Spores of B. subtilis were exposed to various extreme terrestrial and extraterrestrial stressors to reach a better understanding of the DNA protection and repair strategies that enable them to cope with the induced DNA damage. The following physical stress factors of environmental importance (either on Earth or in space) were selected for this thesis: (i) mono- and polychromatic UV radiation, (ii) ionizing radiation, (iii) exposure to ultrahigh vacuum; and (iv) high shock pressures simulating meteorite impacts. To reach a comprehensive understanding of spore resistance to those harsh terrestrial or simulated extraterrestrial conditions, a standardized experimental protocol for the preparation and analysis methods was established, including the determination of the following spore responses: (i) survival, (ii) induced mutations, (iii) DNA damage, (iv) the role of different repair pathways, using a set of repair-deficient mutants, and (v) transcriptional responses during spore germination, using genome-wide transcriptome analyses with confirmation by RT-PCR.
From this comprehensive set of data on spore resistance to a variety of environmental stress parameters a model of a "built-in" transcriptional program of bacterial spores in response to DNA damaging treatments to ensure DNA restoration during germination has been developed.
NASA Astrophysics Data System (ADS)
Wen, Xian-Huan; Gómez-Hernández, J. Jaime
1998-03-01
The macrodispersion of an inert solute in a 2-D heterogeneous porous medium is estimated numerically in a series of fields of varying heterogeneity. Four different random function (RF) models are used to model log-transmissivity (ln T) spatial variability, and for each of these models, ln T variance is varied from 0.1 to 2.0. The four RF models share the same univariate Gaussian histogram and the same isotropic covariance, but differ from one another in terms of the spatial connectivity patterns at extreme transmissivity values. More specifically, model A is a multivariate Gaussian model for which, by definition, extreme values (both high and low) are spatially uncorrelated. The other three models are non-multi-Gaussian: model B with high connectivity of high extreme values, model C with high connectivity of low extreme values, and model D with high connectivities of both high and low extreme values. Residence time distributions (RTDs) and macrodispersivities (longitudinal and transverse) are computed on ln T fields corresponding to the different RF models, for two different flow directions and at several scales. They are compared with each other, as well as with predicted values based on first-order analytical results. Numerically derived RTDs and macrodispersivities for the multi-Gaussian model are in good agreement with analytically derived values using first-order theories for log-transmissivity variance up to 2.0. The results from the non-multi-Gaussian models differ from each other and deviate largely from the multi-Gaussian results even when ln T variance is small. RTDs in non-multi-Gaussian realizations with high connectivity at high extreme values display earlier breakthrough than in multi-Gaussian realizations, whereas later breakthrough and longer tails are observed for RTDs from non-multi-Gaussian realizations with high connectivity at low extreme values.
Longitudinal macrodispersivities in the non-multi-Gaussian realizations are, in general, larger than in the multi-Gaussian ones, while transverse macrodispersivities in the non-multi-Gaussian realizations can be larger or smaller than in the multi-Gaussian ones depending on the type of connectivity at extreme values. Comparing the numerical results for different flow directions, it is confirmed that macrodispersivities in multi-Gaussian realizations with isotropic spatial correlation are not flow direction-dependent. Macrodispersivities in the non-multi-Gaussian realizations, however, are flow direction-dependent although the covariance of ln T is isotropic (the same for all four models). It is important to account for high connectivities at extreme transmissivity values, a likely situation in some geological formations. Some of the discrepancies between first-order-based analytical results and field-scale tracer test data may be due to the existence of highly connected paths of extreme conductivity values.
Can quantile mapping improve precipitation extremes from regional climate models?
NASA Astrophysics Data System (ADS)
Tani, Satyanarayana; Gobiet, Andreas
2015-04-01
The ability of quantile mapping to accurately bias-correct precipitation extremes is investigated in this study. We developed new methods by extending standard quantile mapping (QMα) to improve the quality of bias-corrected extreme precipitation events in regional climate model (RCM) output. The new QM version (QMβ) was developed by combining parametric and nonparametric bias correction methods. The new nonparametric method is tested with and without a controlling shape parameter (QMβ1 and QMβ0, respectively). Bias corrections are applied to hindcast simulations for a small ensemble of RCMs at six different locations over Europe. We examined the quality of the extremes through split-sample and cross-validation approaches for these three bias correction methods. The split-sample approach mimics the application to future climate scenarios. A cross-validation framework with particular focus on new extremes was developed. Error characteristics, q-q plots and Mean Absolute Error (MAEx) skill scores are used for evaluation. We demonstrate the unstable behaviour of the correction function at higher quantiles with QMα, whereas the correction functions for QMβ0 and QMβ1 are smoother, with QMβ1 providing the most reasonable correction values. The q-q plots demonstrate that all bias correction methods are capable of producing new extremes, but QMβ1 reproduces new extremes with lower biases in all seasons than QMα and QMβ0. Our results clearly demonstrate the inherent limitations of empirical bias correction methods employed for extremes, particularly new extremes, and our findings reveal that the new bias correction method (QMβ1) produces more reliable climate scenarios for new extremes. These findings present a methodology that can better capture future extreme precipitation events, which is necessary to improve regional climate change impact studies.
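For context, a minimal sketch of the nonparametric empirical quantile mapping that such QM methods build on (the QMβ variants themselves are not reproduced here, and the wet-bias setup is invented for illustration):

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_fut, n_q=100):
    # Empirical quantile mapping: transform future model values through the
    # hindcast model -> observation quantile relationship.
    q = np.linspace(0.0, 1.0, n_q)
    mq = np.quantile(model_hist, q)
    oq = np.quantile(obs_hist, q)
    corrected = np.interp(model_fut, mq, oq)
    # Constant-offset extrapolation for "new extremes" beyond the calibrated
    # range: exactly the regime where empirical methods are least reliable.
    hi = model_fut > mq[-1]
    corrected[hi] = oq[-1] + (model_fut[hi] - mq[-1])
    lo = model_fut < mq[0]
    corrected[lo] = oq[0] + (model_fut[lo] - mq[0])
    return corrected

rng = np.random.default_rng(3)
obs_hist = rng.gamma(2.0, 5.0, 5000)       # "observed" precipitation
model_hist = rng.gamma(2.0, 7.0, 5000)     # model hindcast with a wet bias
model_fut = rng.gamma(2.0, 7.0, 5000)      # future simulation, same bias
corrected = quantile_map(model_hist, obs_hist, model_fut)
print(corrected.mean(), obs_hist.mean())
```

The ad-hoc handling of values beyond the calibrated range is the instability at high quantiles that motivates the smoother QMβ correction functions.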
Metronidazole as a protector of cells from electromagnetic radiation of extremely high frequencies
NASA Astrophysics Data System (ADS)
Kuznetsov, Pavel E.; Malinina, Ulia A.; Popyhova, Era B.; Rogacheva, Svetlana M.; Somov, Alexander U.
2006-08-01
It is well known that weak electromagnetic fields of extremely high frequencies cause significant modification of the functional status of biological objects at different levels of organization. The aim of this work was to study the combined effect of metronidazole, the drug form of 1-(2'-hydroxyethyl)-2-methyl-5-nitroimidazole, and electromagnetic radiation of extremely high frequencies (52...75 GHz) on the hemolytic stability of erythrocytes and the chemotaxis activity of the infusorian Paramecium caudatum.
NASA Astrophysics Data System (ADS)
Cheng, L.; Du, J.
2015-12-01
The Xiang River, a main tributary of the Yangtze River, has been subject to frequent high floods over the past twenty years. Climate change, including abrupt shifts and fluctuations in precipitation, is an important factor influencing hydrological extremes. In addition, human activities are widely recognized as another driver of high flood risk. Given the effects of climate change and human interventions on the hydrological cycle, several questions need to be addressed. Are floods in the Xiang River basin getting worse? Does the extreme streamflow show an increasing tendency? If so, is it because extreme rainfall events have a predominant effect on floods? To answer these questions, this article detects trends in extreme precipitation and discharge using the Mann-Kendall test. The continuous wavelet transform method was employed to identify the consistency of changes in extreme precipitation and discharge. Pearson correlation analysis was applied to investigate to what degree variations in extreme discharge can be explained by climate change. The results indicate that slightly upward trends can be detected in both extreme rainfall and discharge in the upper region of the Xiang River basin. For most of the middle and lower river basin, extreme rainfall shows significant positive trends, but extreme discharge displays slightly upward trends with no significance at the 90% confidence level. Wavelet transform analysis illustrates that highly similar patterns of signal changes can be seen between extreme precipitation and discharge in the upper section of the basin, while the changes in extreme precipitation for the middle and lower reaches do not always coincide with the extreme streamflow. The correlation coefficients of the wavelet transforms for the precipitation and discharge signals in most of the basin pass the significance test.
The conclusion may be drawn that floods in recent years are not getting worse in Xiang River basin. The similar signal patterns and positive correlation between extreme discharge and precipitation indicate that the variability of extreme precipitation has an important effect on extreme discharge of flood, although the intensity of human impacts in lower section of Xiang River basin has increased markedly.
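The Mann-Kendall test applied above is a rank-based, distribution-free trend test; a minimal sketch in Python (ignoring the tie correction, which the full test includes):

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction): returns S, Z, two-sided p."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S statistic: sum of signs of all pairwise later-minus-earlier differences
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0  # variance of S under no trend
    if s > 0:
        z = (s - 1) / np.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / np.sqrt(var_s)
    else:
        z = 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p
```

A positive S (with small p) indicates an upward trend, as reported for the upper basin.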
Extreme-volatility dynamics in crude oil markets
NASA Astrophysics Data System (ADS)
Jiang, Xiong-Fei; Zheng, Bo; Qiu, Tian; Ren, Fei
2017-02-01
Based on concepts and methods from statistical physics, we investigate extreme-volatility dynamics in the crude oil markets, using the high-frequency data from 2006 to 2010 and the daily data from 1986 to 2016. The dynamic relaxation of extreme volatilities is described by a power law, whose exponents usually depend on the magnitude of extreme volatilities. In particular, the relaxation before and after extreme volatilities is time-reversal symmetric at the high-frequency time scale, but time-reversal asymmetric at the daily time scale. This time-reversal asymmetry is mainly induced by exogenous events. However, the dynamic relaxation after exogenous events exhibits the same characteristics as that after endogenous events. An interacting herding model both with and without exogenous driving forces could qualitatively describe the extreme-volatility dynamics.
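The power-law relaxation described above is usually quantified by fitting the decay exponent in log-log space; a minimal sketch on synthetic volatility data (the exponent 0.35 is illustrative, not a value from the study):

```python
import numpy as np

def relaxation_exponent(t, v):
    """Fit v(t) ~ c * t**(-p) by least squares in log-log space; return p."""
    slope, _ = np.polyfit(np.log(t), np.log(v), 1)
    return -slope

t = np.arange(1, 201, dtype=float)  # time steps after an extreme event (hypothetical)
v = 2.5 * t ** -0.35                # synthetic volatility decaying with exponent 0.35
print(round(relaxation_exponent(t, v), 3))  # → 0.35
```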
Supporting Climatic Trends of Corn and Soybean Production in the USA
NASA Astrophysics Data System (ADS)
Mishra, V.; Cherkauer, K. A.; Verdin, J. P.
2010-12-01
The United States of America (USA) is a major source of corn and soybeans, producing about 39 percent of the world’s corn and 50 percent of the world’s soybean supply. The north central states, including parts of the Midwestern US and the Great Plains, form what is commonly described as the “Corn Belt” and constitute the most productive grain-growing region in the United States. Changes in climate, including precipitation and temperature, are being observed throughout the world, and the Corn Belt region of the US is not immune, posing a potential threat to global food security. We conducted a retrospective analysis of observed climate variables and crop production statistics to evaluate whether observed climatic trends are having a positive or negative effect on corn and soybean production in the US. We selected climate indices based on gridded daily precipitation and maximum and minimum air temperature data from the National Climatic Data Center (NCDC) for the period 1920-2009 and for 13 states in the Corn Belt region. We used the standardized precipitation index (SPI) and standardized precipitation evapotranspiration index (SPEI) for periods overlapping the seasons important for crop growth: planting (April-May), grain filling (June-August), and harvesting (September-October). We estimated the seasonal averages of maximum and minimum daily temperature to identify historic trends and variability in air temperature during the key crop-growth seasons. Extreme warm temperatures can adversely affect crop growth and yields; therefore, cumulative maximum air temperature above the 90th percentile (a cumulative heat index) was estimated for each growing period. We evaluated historic trends and variability in the areal extents of severe or extreme droughts along with the areal extents facing high cumulative heat stress. Our results showed that climatic extremes (e.g. droughts and heat stress) occurring during June-August (JJA) affected corn and soybean yields most severely. High moisture and low heat stress during the JJA period favored crop yields, while low moisture and high heat conditions during the planting season (April-May) increased yields. Results also indicated that this part of the US is trending towards lower heat stress and drought extents and higher moisture conditions during the JJA period. Therefore, if the present trends persist, we expect the climate to be more supportive of increased corn and soybean yields.
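The SPI used above transforms precipitation totals through a fitted gamma distribution into standard-normal quantiles; a minimal single-timescale sketch (no handling of zero-precipitation months, which operational SPI implementations include):

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index: fit a gamma distribution to the
    precipitation totals, then map each value's cumulative probability to a
    standard-normal quantile (negative SPI = drier than typical)."""
    shape, loc, scale = stats.gamma.fit(precip, floc=0)  # fix location at zero
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)
```

SPI values at or below about -1.5 to -2 correspond to the severe/extreme drought classes counted in the areal-extent analysis.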
Bright high-repetition-rate source of narrowband extreme-ultraviolet harmonics beyond 22 eV
Wang, He; Xu, Yiming; Ulonska, Stefan; Robinson, Joseph S.; Ranitovic, Predrag; Kaindl, Robert A.
2015-01-01
Novel table-top sources of extreme-ultraviolet light based on high-harmonic generation yield unique insight into the fundamental properties of molecules, nanomaterials or correlated solids, and enable advanced applications in imaging or metrology. Extending high-harmonic generation to high repetition rates portends great experimental benefits, yet efficient extreme-ultraviolet conversion of correspondingly weak driving pulses is challenging. Here, we demonstrate a highly efficient source of femtosecond extreme-ultraviolet pulses at 50-kHz repetition rate, utilizing the ultraviolet second harmonic focused tightly into Kr gas. In this cascaded scheme, a photon flux beyond ≈3 × 10¹³ s⁻¹ is generated at 22.3 eV, with 5 × 10⁻⁵ conversion efficiency that surpasses similar harmonics directly driven by the fundamental by two orders of magnitude. The enhancement arises from both wavelength scaling of the atomic dipole and improved spatio-temporal phase matching, confirmed by simulations. Spectral isolation of a single 72-meV-wide harmonic renders this bright, 50-kHz extreme-ultraviolet source a powerful tool for ultrafast photoemission, nanoscale imaging and other applications. PMID:26067922
NASA Astrophysics Data System (ADS)
Fix, Miranda J.; Cooley, Daniel; Hodzic, Alma; Gilleland, Eric; Russell, Brook T.; Porter, William C.; Pfister, Gabriele G.
2018-03-01
We conduct a case study of observed and simulated maximum daily 8-h average (MDA8) ozone (O3) in three US cities for summers during 1996-2005. The purpose of this study is to evaluate the ability of a high resolution atmospheric chemistry model to reproduce observed relationships between meteorology and high or extreme O3. We employ regional coupled chemistry-transport model simulations to make three types of comparisons between simulated and observational data, comparing (1) tails of the O3 response variable, (2) distributions of meteorological predictor variables, and (3) sensitivities of high and extreme O3 to meteorological predictors. This last comparison is made using two methods: quantile regression, for the 0.95 quantile of O3, and tail dependence optimization, which is used to investigate even higher O3 extremes. Across all three locations, we find substantial differences between simulations and observational data in both meteorology and meteorological sensitivities of high and extreme O3.
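Quantile regression for the 0.95 quantile of O3, as in the sensitivity comparison above, minimizes the asymmetric pinball (check) loss; a minimal sketch with one hypothetical meteorological predictor (not the authors' implementation):

```python
import numpy as np
from scipy.optimize import minimize

def quantile_fit(x, y, q=0.95):
    """Fit y ≈ b0 + b1*x at quantile q by minimizing the pinball loss:
    positive residuals are weighted q, negative residuals (q - 1)."""
    def pinball(beta):
        r = y - (beta[0] + beta[1] * x)
        return np.mean(np.where(r >= 0, q * r, (q - 1) * r))
    res = minimize(pinball, x0=[0.0, 1.0], method="Nelder-Mead",
                   options={"maxiter": 5000, "fatol": 1e-10})
    return res.x  # intercept, slope
```

The fitted slope at q = 0.95 is the sensitivity of high O3 to the predictor, the quantity compared between simulations and observations.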
Complete 3D kinematics of upper extremity functional tasks.
van Andel, Carolien J; Wolterbeek, Nienke; Doorenbosch, Caroline A M; Veeger, DirkJan H E J; Harlaar, Jaap
2008-01-01
Upper extremity (UX) movement analysis by means of 3D kinematics has the potential to become an important clinical evaluation method. However, no standardized protocol for clinical application that includes the whole upper limb has yet been developed. Standardization problems include the lack of a single representative function, the wide range of motion of the joints and the complexity of the anatomical structures. A useful protocol would focus on the functional status of the arm and particularly the orientation of the hand. The aim of this work was to develop a standardized measurement method for unconstrained movement analysis of the UX, including hand orientation, for a set of functional tasks, and to obtain normative values. Ten healthy subjects performed four representative activities of daily living (ADL). In addition, six standard active range of motion (ROM) tasks were executed. Joint angles of the wrist, elbow, shoulder and scapula were analyzed throughout each ADL task, and minimum/maximum angles were determined from the ROM tasks. Characteristic trajectories were found for the ADL tasks, standard deviations were generally small, and ROM results were consistent with the literature. The results of this study could form the normative basis for the development of a 'UX analysis report' equivalent to the 'gait analysis report' and would allow future comparisons with pediatric and/or pathologic movement patterns.
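A joint angle of the kind analyzed here can be recovered from 3D marker positions as the angle between adjacent segment vectors; a simplified sketch with hypothetical shoulder-elbow-wrist coordinates (full protocols use segment coordinate systems and Euler decompositions rather than a single included angle):

```python
import numpy as np

def joint_angle(proximal, joint, distal):
    """Angle (degrees) at `joint` between the two segments defined by 3D
    marker positions (hypothetical shoulder-elbow-wrist markers)."""
    u = np.asarray(proximal, dtype=float) - np.asarray(joint, dtype=float)
    v = np.asarray(distal, dtype=float) - np.asarray(joint, dtype=float)
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# right angle at the elbow: shoulder directly above, wrist straight ahead
print(joint_angle([0, 0, 1], [0, 0, 0], [1, 0, 0]))  # → 90.0
```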
NASA Astrophysics Data System (ADS)
Dibike, Y. B.; Eum, H. I.; Prowse, T. D.
2017-12-01
Flows originating from alpine-dominated cold-region watersheds typically experience extended winter low flows followed by spring snowmelt and summer rainfall-driven high flows. In a warmer climate, there will be a temperature-induced shift in precipitation from snow towards rain, as well as changes in snowmelt timing affecting the frequency of extreme high- and low-flow events, which could significantly alter ecosystem services. This study examines the potential changes in the frequency and severity of hydrologic extremes in the Athabasca River watershed in Alberta, Canada based on the Variable Infiltration Capacity (VIC) hydrologic model and selected, statistically downscaled climate change scenario data from the latest Coupled Model Intercomparison Project (CMIP5). The sensitivity of these projected changes is also examined by applying different extreme-flow analysis methods. The hydrological model projections show an overall increase in mean annual streamflow in the watershed and a corresponding shift of the freshet timing to an earlier period. Most of the streams are projected to experience increases during the winter and spring seasons and decreases during the summer and early fall, with an overall projected increase in extreme high flows, especially for low-frequency events. While the middle and lower parts of the watershed are characterised by projected increases in extreme high flows, the high-elevation alpine region is mainly characterised by corresponding decreases in extreme low-flow events. However, the magnitude of projected changes in extreme flows varies over a wide range, especially for low-frequency events, depending on the climate scenario and period of analysis, and sometimes in a nonlinear way. Nonetheless, the sensitivity of the projected changes to the statistical method of analysis is found to be relatively small compared with the inter-model variability.
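Extreme-flow frequency analysis of the kind compared here typically fits a generalized extreme value (GEV) distribution to annual maxima and reads off return levels; a minimal sketch (assumed here for illustration, one of several possible methods):

```python
import numpy as np
from scipy import stats

def return_level(annual_maxima, T):
    """T-year return level: the flow exceeded with probability 1/T in any
    year, from a GEV fit to a series of annual maximum flows."""
    c, loc, scale = stats.genextreme.fit(annual_maxima)
    return stats.genextreme.ppf(1.0 - 1.0 / T, c, loc=loc, scale=scale)
```

Low-frequency events (large T) sit far in the fitted tail, which is why their projected changes are the most sensitive to scenario and analysis period.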
Hannon, Joseph; Garrison, J Craig; Conway, John
2014-05-01
Lower extremity balance deficits have been shown to lead to altered kinematics and increased injury risk in lower extremity athletes. The purpose of this study was to compare lower extremity balance in baseball players with an ulnar collateral ligament (UCL) tear pre-operatively and post-operatively at the beginning of the pre-return to throwing program stage of rehabilitation (3 months). Thirty-three competitive high school and collegiate male baseball players (mean age 18.5 ± 3.2 years) with a diagnosed UCL tear volunteered for the study. Of the 33 baseball players, 29 were pitchers, 1 was a catcher, and 3 were infielders. Participants were seen pre-operatively and at 3 months post-operatively. The 3-month point was associated with a follow-up visit to the orthopedic surgeon and subsequent release to begin the pre-return to throwing stage of rehabilitation. Following surgery, each participant followed a standard UCL protocol which included focused lower extremity balance and neuromuscular control exercises. Participants were tested for single-leg balance using the Y-Balance Test™ - Lower Quadrant (YBT-LQ) on both their lead and stance limbs. YBT-LQ composite scores were calculated for the stance and lead limbs pre- and post-operatively and compared over time. Paired t-tests were used to calculate differences between time 1 and time 2 (p < 0.05). Baseball players with diagnosed UCL tears demonstrated significant balance deficits on their stance (p < .001) and lead (p = .009) limbs prior to surgery compared with balance measures at the 3-month follow-up (Stance Pre-Op = 89.4 ± 7.5%; Stance 3 Month = 94.9 ± 9.5%; Lead Pre-Op = 90.2 ± 6.7%; Lead 3 Month = 93.6 ± 7.2%). Based on the results of this study, lower extremity balance is altered in baseball players with UCL tears prior to surgery, and statistically significant improvements in balance measures were seen by the time of return to throwing. Level 2b.
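The paired t-test used for the time 1 vs time 2 comparison can be reproduced with SciPy; the composite scores below are hypothetical, not the study's data:

```python
from scipy import stats

# hypothetical YBT-LQ composite scores (% limb length), pre-op vs 3 months post-op,
# one pair per player
pre  = [89.1, 87.4, 91.0, 85.2, 90.3, 88.7, 92.5, 86.9]
post = [94.6, 92.0, 95.8, 91.1, 96.2, 93.4, 97.0, 92.3]

t, p = stats.ttest_rel(pre, post)  # paired test: each player is their own control
print(p < 0.05)  # → True
```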
Motor and executive function at 6 years of age after extremely preterm birth.
Marlow, Neil; Hennessy, Enid M; Bracewell, Melanie A; Wolke, Dieter
2007-10-01
Studies of very preterm infants have demonstrated impairments in multiple neurocognitive domains. We hypothesized that neuromotor and executive-function deficits may independently contribute to school failure. We studied children who were born at ≤25 completed weeks' gestation in the United Kingdom and Ireland in 1995 at early school age. Children underwent standardized cognitive and neuromotor assessments, including the Kaufman Assessment Battery for Children and NEPSY, and a teacher-based assessment of academic achievement. Of 308 surviving children, 241 (78%) were assessed at a median age of 6 years 4 months. Compared with 160 term classmates, 180 extremely preterm children without cerebral palsy and attending mainstream school performed less well on 3 simple motor tasks: posting coins, heel walking, and 1-leg standing. They more frequently had non-right-hand preferences (28% vs 10%) and more associated/overflow movements during motor tasks. Standardized scores for visuospatial and sensorimotor function differed from those of classmates by 1.6 and 1.1 SDs of the classmates' scores, respectively. These differences attenuated but remained significant after controlling for overall cognitive scores. Cognitive, visuospatial, and motor scores explained 54% of the variance in teachers' ratings of performance in the whole group; in the extremely preterm group, additional variance was explained by attention-executive tasks and gender. Impairment of motor, visuospatial, and sensorimotor function, including planning, self-regulation, inhibition, and motor persistence, contributes excess morbidity over cognitive impairment in extremely preterm children and contributes independently to poor classroom performance at 6 years of age.
16 CFR § 1207.4 - Recommended standards for materials of manufacture.
Code of Federal Regulations, 2013 CFR
2013-01-01
... exposure to rain, snow, ice, sunlight, local, normal temperature extremes, local normal wind variations... be toxic to man or harmful to the environment under intended use and reasonably foreseeable abuse or...
NASA Astrophysics Data System (ADS)
Zhou, Ting; Jia, Xiaorong; Liao, Huixuan; Peng, Shijia; Peng, Shaolin
2016-12-01
Conventional models for predicting species distribution under global warming scenarios often treat a species as a homogeneous whole. In the present study, we selected Cunninghamia lanceolata (C. lanceolata), a widely distributed species in China, to investigate the physio-ecological responses of five populations under different temperature regimes. The results demonstrate that increased mean temperatures induce increased growth performance among northern populations, which exhibited the greatest germination capacity and the largest increase in the overlap between the growth curve and the monthly average temperature. However, tolerance of the southern population to extremely high temperatures was stronger than that of the population from the northern region, as shown by the southern population's better growth and more stable photosynthetic system under extremely high temperature. This result indicates that the growth advantage of northern populations under increased mean temperatures may be weakened by their lower tolerance to extremely high temperatures, contrary to the predicted results. The theoretical coupling model constructed here illustrates that the difference in growth between populations at high and low latitudes and altitudes under global warming will decrease because of the frequent occurrence of extremely high temperatures.
NASA Astrophysics Data System (ADS)
Freychet, N.; Duchez, A.; Wu, C.-H.; Chen, C.-A.; Hsu, H.-H.; Hirschi, J.; Forryan, A.; Sinha, B.; New, A. L.; Graham, T.; Andrews, M. B.; Tu, C.-Y.; Lin, S.-J.
2017-02-01
This work investigates the variability of extreme weather events (drought spells, DS15, and daily heavy rainfall, PR99) over East Asia. It particularly focuses on the large-scale atmospheric circulation associated with a high occurrence of these extreme events. Two observational datasets (APHRODITE and PERSIANN) are compared with two high-resolution global climate models (HiRAM and HadGEM3-GC2) and an ensemble of lower-resolution climate models from CMIP5. We first evaluate the performance of the high-resolution models. Both exhibit good skill in reproducing extreme events, especially when compared with CMIP5 results. Significant differences exist between the two observational datasets, highlighting the difficulty of obtaining a clear estimate of extreme events. The link between the variability of the extremes and the large-scale circulation is investigated, on monthly and interannual timescales, using composite and correlation analyses. Both extreme indices, DS15 and PR99, are significantly linked to the low-level wind intensity over East Asia, i.e. the monsoon circulation. It is also found that DS15 events are strongly linked to the surface temperature over the Siberian region and to the land-sea pressure contrast, while PR99 events are linked to sea surface temperature anomalies over the West North Pacific. These results illustrate the importance of the monsoon circulation for extremes over East Asia. The dependence on surface temperature over the continent and on sea surface temperature raises the question of the extent to which they could affect the occurrence of extremes over tropical regions in future projections.
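A composite analysis of the kind used here contrasts a circulation index during the months with many extreme events against the remaining months; a minimal sketch on synthetic monthly series (variable names are illustrative):

```python
import numpy as np

def composite_difference(index, counts, frac=0.2):
    """Mean of a circulation index over the top `frac` of months ranked by
    extreme-event count, minus its mean over all other months. A large
    difference suggests the circulation pattern accompanies the extremes."""
    k = max(1, int(frac * len(counts)))
    order = np.argsort(counts)
    return index[order[-k:]].mean() - index[order[:-k]].mean()
```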
X-Aerogels for Structural Components and High Temperature Applications
NASA Technical Reports Server (NTRS)
2005-01-01
Future NASA missions and space explorations rely on the use of materials that are strong, ultra-lightweight and able to withstand extreme temperatures. Aerogels are low-density (0.01-0.5 g/cu cm), high-porosity materials that contain a glass-like structure formed through standard sol-gel chemistry. As a result of these structural properties, aerogels are excellent thermal insulators and are able to withstand temperatures in excess of 1,000 C. The open structure of aerogels, however, renders these materials extremely fragile (fracturing at stress forces less than 0.5 N/sq cm). The goal of NASA Glenn Research Center is to increase the strength of these materials by templating polymers and metals onto the surface of an aerogel network, facilitating the use of this material for practical applications such as structural components of space vehicles used in exploration. The work this past year focused on two areas: (1) the research and development of new templated aerogel materials and (2) process development for future manufacturing of structural components. Research and development occurred on the production and characterization of new templating materials on the standard silica aerogel. Materials examined included polymers such as polyimides, fluorinated isocyanates and epoxies, and metals such as silver, gold and platinum. The final properties indicated that the density of the material formed using an isocyanate is around 0.50 g/cc, with a strength greater than that of steel and low thermal conductivity. The process used to construct these materials is extremely time consuming and labor intensive. One aspect of the project involved investigating the feasibility of shortening the process time by preparing the aerogels in the templating solvent. Traditionally the polymerization used THF as the solvent, and after several washes to remove any residual monomers and water, the solvent around the aerogels was changed to acetonitrile for the templating step.
This process took a couple of days. It was experimentally determined that the polymerization reaction could be done in acetonitrile instead of THF with no detrimental effects to the properties of the resulting aerogel. Other changes in the time needed to crosslink the gels in the isocyanate solution as well as changes to the subsequent washing procedure could also shorten the processing time with no effect on the properties. Processing methods were also developed that allowed a variety of shapes as well as sizes of these materials to be formed.
NASA Astrophysics Data System (ADS)
Buzan, J. R.; Huber, M.
2016-12-01
Heat stress is of global concern because it threatens human and animal health and productivity. Here we use the HumanIndexMod to calculate 3 moist thermodynamic quantities and 9 commonly and operationally used heat stress metrics (Buzan et al., 2015). We drive the HumanIndexMod with output from CMIP5 and the Community Earth System Model Large Ensemble (LENS) using the greenhouse-gas forcing of representative concentration pathway 8.5 (RCP8.5). We limit our analysis to models that provide 4x daily output of surface pressure, reference-height temperature and moisture, and use lowest-model-level winds where available: 18 CMIP5 and 40 LENS simulations. We show three novel results. First, comparing time slices (2081-2100 and 2026-2045 for CMIP5, and 2071-2080 and 2026-2035 for LENS), each individual heat stress metric extreme, within the multi-model mean, has spatial patterns that are highly correlated (>0.99). Second, moist thermodynamic and heat stress extremes are intrinsically linked to the thermodynamics of the climate and scale simply with global mean surface temperature (GMT) changes. For example, large swaths of land surface area from 30°N to 30°S, excluding the Sahel, the Arabian Peninsula, and the Himalayan Plateau, show the response of wet bulb temperature to be 0.85°C/°C GMT (standard deviation <0.25) for CMIP5 and 0.85°C/°C GMT (standard deviation <0.2) for LENS, in agreement with prior work by Sherwood and Huber (2010). Third, many heat stress metrics, after being normalized by global mean surface temperature changes, are highly spatially correlated with each other, which may reduce the need for numerous metrics to properly quantify total heat stress. These three results establish that different climate models, with various underlying assumptions (CMIP5) and ranges of internal variability (LENS), show similar responses in heat stress with respect to global mean temperature changes.
Thus, we find the uncertainty of heat stress extremes, even changes at the fine scale, is largely subsumed within the main uncertainties encompassed in transient climate sensitivity. These results are consistent with the hypothesis that outdoor worker productivity will drop significantly with substantial climate change.
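Wet-bulb temperature, the quantity behind the 0.85°C/°C scaling above, can be approximated from air temperature and relative humidity with Stull's (2011) empirical fit (valid roughly for RH between 5% and 99% and T between -20°C and 50°C; the HumanIndexMod itself uses a more exact calculation):

```python
import numpy as np

def wet_bulb_stull(T, RH):
    """Stull (2011) empirical wet-bulb temperature (deg C) from air
    temperature T (deg C) and relative humidity RH (percent)."""
    return (T * np.arctan(0.151977 * np.sqrt(RH + 8.313659))
            + np.arctan(T + RH) - np.arctan(RH - 1.676331)
            + 0.00391838 * RH ** 1.5 * np.arctan(0.023101 * RH)
            - 4.686035)
```

Stull's worked example gives about 13.7°C at T = 20°C and RH = 50%; wet-bulb values approaching human skin temperature (~35°C) mark the physiological limit discussed by Sherwood and Huber (2010).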
Prenatal stress alters amygdala functional connectivity in preterm neonates.
Scheinost, Dustin; Kwon, Soo Hyun; Lacadie, Cheryl; Sze, Gordon; Sinha, Rajita; Constable, R Todd; Ment, Laura R
2016-01-01
Exposure to prenatal and early-life stress results in alterations in neural connectivity and an increased risk for neuropsychiatric disorders. In particular, alterations in amygdala connectivity have emerged as a common effect across several recent studies. However, the impact of prenatal stress exposure on the functional organization of the amygdala has yet to be explored in the prematurely-born, a population at high risk for neuropsychiatric disorders. We test the hypothesis that preterm birth and prenatal exposure to maternal stress alter functional connectivity of the amygdala using two independent cohorts. The first cohort is used to establish the effects of preterm birth and consists of 12 very preterm neonates and 25 term controls, all without prenatal stress exposure. The second is analyzed to establish the effects of prenatal stress exposure and consists of 16 extremely preterm neonates with prenatal stress exposure and 10 extremely preterm neonates with no known prenatal stress exposure. Standard resting-state functional magnetic resonance imaging and seed connectivity methods are used. When compared to term controls, very preterm neonates show significantly reduced connectivity between the amygdala and the thalamus, the hypothalamus, the brainstem, and the insula (p < 0.05). Similarly, when compared to extremely preterm neonates without exposure to prenatal stress, extremely preterm neonates with exposure to prenatal stress show significantly less connectivity between the left amygdala and the thalamus, the hypothalamus, and the peristriate cortex (p < 0.05). Exploratory analysis of the combined cohorts suggests additive effects of prenatal stress on alterations in amygdala connectivity associated with preterm birth. Functional connectivity from the amygdala to other subcortical regions is decreased in preterm neonates compared to term controls. In addition, these data, for the first time, suggest that prenatal stress exposure amplifies these decreases.
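Seed-based connectivity as used here reduces to the Pearson correlation between the seed region's time series and every other voxel's time series; a minimal sketch (array shapes are illustrative):

```python
import numpy as np

def seed_connectivity(seed_ts, voxel_ts):
    """Seed-based functional connectivity map: Pearson correlation between a
    seed time series (n_timepoints,) and each column of voxel_ts
    (n_timepoints, n_voxels)."""
    s = (seed_ts - seed_ts.mean()) / seed_ts.std()
    v = (voxel_ts - voxel_ts.mean(axis=0)) / voxel_ts.std(axis=0)
    return (s[:, None] * v).mean(axis=0)  # mean of z-score products = Pearson r
```

Group comparisons like the preterm-vs-term contrast above are then run voxelwise on these correlation maps (after Fisher z-transformation in standard pipelines).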
Compound Extremes and Bunched Black (or Grouped Grey) Swans.
NASA Astrophysics Data System (ADS)
Watkins, Nicholas
2013-04-01
Observed "wild" natural fluctuations may differ substantially in their character. Some events may be genuinely unforeseen (and unforeseeable), as with Taleb's "black swans". These may occur singly, or may have their impact further magnified by being "bunched" in time. Some of the others may, however, be the rare extreme events from a light-tailed underlying distribution. Studying their occurrence may then be tractable with the methods of extreme value theory [e.g. Coles, 2001], suitably adapted to allow correlation if that is observed to be present. Yet others may belong to a third broad class, described in today's presentation [reviewed in Watkins, GRL Frontiers, 2013, doi: 10.1002/grl.50103]. Such "bursty" time series may show comparatively frequent high-amplitude events, and/or long-range correlations between successive values. The frequent large values due to the first of these effects, modelled in economics by Mandelbrot in 1963 using heavy-tailed probability distributions, can give rise to an "IPCC type I" burst composed of successive wild events. Conversely, long-range dependence, even in a light-tailed Gaussian model like Mandelbrot and van Ness' fractional Brownian motion, can integrate "mild" events into an extreme "IPCC type III" burst. I will show how a standard statistical time series model, linear fractional stable motion (LFSM), which descends from the two special cases advocated by Mandelbrot, allows these two effects to be varied independently, and will present results from a preliminary study of such bursts in LFSM. The consequences for burst scaling when low-frequency effects due to dissipation (FARIMA models) and multiplicative cascades (such as multifractals) are included will also be discussed, together with the physical assumptions and constraints associated with making a given choice of model.
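The contrast between "mild" light-tailed and "wild" heavy-tailed fluctuations can be illustrated by sampling both; the standard Cauchy below is the alpha = 1 member of the stable family Mandelbrot advocated, used here as a simple stand-in for general heavy-tailed LFSM increments:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000
gauss = rng.standard_normal(n)    # light-tailed: "mild" fluctuations
cauchy = rng.standard_cauchy(n)   # heavy-tailed (alpha-stable, alpha = 1): "wild"

# count events more than ten "typical" units from zero in each sample
print((np.abs(gauss) > 10).sum(), (np.abs(cauchy) > 10).sum())
```

The Gaussian sample yields essentially no such exceedances, while the Cauchy sample yields thousands, which is what makes type I bursts of successive wild events possible.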
Drake, John E; Tjoelker, Mark G; Vårhammar, Angelica; Medlyn, Belinda E; Reich, Peter B; Leigh, Andrea; Pfautsch, Sebastian; Blackman, Chris J; López, Rosana; Aspinwall, Michael J; Crous, Kristine Y; Duursma, Remko A; Kumarathunge, Dushan; De Kauwe, Martin G; Jiang, Mingkai; Nicotra, Adrienne B; Tissue, David T; Choat, Brendan; Atkin, Owen K; Barton, Craig V M
2018-06-01
Heatwaves are likely to increase in frequency and intensity with climate change, which may impair tree function and forest C uptake. However, we have little information regarding the impact of extreme heatwaves on the physiological performance of large trees in the field. Here, we grew Eucalyptus parramattensis trees for 1 year with experimental warming (+3°C) in a field setting, until they were greater than 6 m tall. We withheld irrigation for 1 month to dry the surface soils and then implemented an extreme heatwave treatment of 4 consecutive days with air temperatures exceeding 43°C, while monitoring whole-canopy exchange of CO2 and H2O, leaf temperatures, leaf thermal tolerance, and leaf and branch hydraulic status. The heatwave reduced midday canopy photosynthesis to near zero but transpiration persisted, maintaining canopy cooling. A standard photosynthetic model was unable to capture the observed decoupling between photosynthesis and transpiration at high temperatures, suggesting that climate models may underestimate a moderating feedback of vegetation on heatwave intensity. The heatwave also triggered a rapid increase in leaf thermal tolerance, such that leaf temperatures observed during the heatwave were maintained within the thermal limits of leaf function. All responses were equivalent for trees with a prior history of ambient and warmed (+3°C) temperatures, indicating that climate warming conferred no added tolerance of heatwaves expected in the future. This coordinated physiological response utilizing latent cooling and adjustment of thermal thresholds has implications for tree tolerance of future climate extremes as well as model predictions of future heatwave intensity at landscape and global scales. © 2018 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Mortazavi-Naeini, M.; Bussi, G.; Hall, J. W.; Whitehead, P. G.
2016-12-01
The main aim of water companies is to have a reliable and safe water supply system. To fulfil this duty, water companies have to consider both water quality and water quantity issues and challenges. Climate change and population growth will have an impact on water resources, both in terms of available water and river water quality. Traditionally, a distinct separation between water quality and abstraction has existed. However, water quality can be a bottleneck in a system, since water treatment works can only treat water if it meets certain standards. For instance, high turbidity and large phytoplankton content can sharply increase the cost of treatment or even make river water unfit for human consumption. It is vital for water companies to be able to characterise the quantity and quality of water under extreme weather events and to consider the occurrence of eventual periods when water abstraction has to cease due to water quality constraints. This will give them the opportunity to make decisions on water resource planning and potential changes to reduce the risk of system failure. We present a risk-based approach for incorporating extreme events, based on future climate change scenarios from a large ensemble of climate model realisations, into an integrated water resources model through the combined use of water allocation (WATHNET) and water quality (INCA) models. The annual frequency of imposed restrictions on demand is taken as the measure of reliability. We tested our approach on the Thames region, in the UK, with 100 extreme events. The results show an increase in the frequency of imposed restrictions when water quality constraints are considered, indicating the importance of considering water quality issues in drought management plans.
Non-Hodgkin's lymphomas: clinical governance issues.
Fields, P A; Goldstone, A H
2002-09-01
Every patient in every part of the world has the right to expect the best possible quality of care from health care providers. Non-Hodgkin's lymphomas (NHL) are an extremely heterogeneous group of conditions which require important decisions to be taken at many points along the treatment pathway. To get this right every time requires that high-quality standards are instituted and adhered to, so that the best possible outcome is achieved. In the past this has not always been the case because of the failure of clinicians sometimes to adhere to an optimal management plan. In 1995, the UK government commissioned an inquiry into the running of cancer services in the United Kingdom, which culminated in a series of recommendations to improve them. Subsequently, these recommendations were implemented as objectives of the NHS Cancer Plan which is the framework by which the UK government wishes to improve cancer services. Concurrently another general concept has emerged which is designed to ensure that the highest quality standards may be achieved for all patients across the whole National Health Service (NHS). This concept, termed 'clinical governance', brings together a corporate responsibility of all health care workers to deliver high quality standards, in the hope that this will translate into better long-term survival of patients with malignant disease. This chapter focuses on the issues surrounding clinical governance and how the principles of this concept relate to non-Hodgkin's lymphomas.
Hood River Passive House, Hood River, Oregon (Fact Sheet)
DOE Office of Scientific and Technical Information (OSTI.GOV)
None, None
2014-02-01
The Hood River Passive Project was developed by Root Design Build of Hood River, Oregon, using the Passive House Planning Package (PHPP) to meet all of the requirements for certification under the European Passive House standards. The Passive House design approach has been gaining momentum among residential designers for custom homes, and BEopt modeling indicates that these designs may actually exceed the goal of the U.S. Department of Energy's (DOE) Building America program to "reduce home energy use by 30%-50%" (compared to 2009 energy codes for new homes). This report documents the short-term test results of the Shift House and compares the results of PHPP and BEopt modeling of the project. The design includes high R-value assemblies, extremely tight construction, high performance doors and windows, solar thermal DHW, heat recovery ventilation, moveable external shutters and a high performance ductless mini-split heat pump. Cost analysis indicates that many of the measures implemented in this project did not meet the BA standard for cost neutrality. The ductless mini-split heat pump, lighting and advanced air leakage control were the most cost-effective measures. The future challenge will be to value engineer the performance levels indicated here in modeling using production-based practices at a significantly lower cost.
Kamieniecki, G W
2001-06-01
To review the prevalence literature on psychological distress and psychiatric disorders among homeless youth in Australia, and to compare these rates with those of Australian youth as a whole. Computerized databases were used to access all published Australian studies on psychological distress (as measured by standardized symptom scales and suicidal behaviour) and psychiatric disorders among homeless youth; in addition, unpublished Australian studies were used whenever accessible. A total of 14 separate studies were located, only three of which included non-homeless control groups. In the current review, prevalence data from uncontrolled youth homelessness studies are compared with data from Australian community and student surveys. Homeless youth have usually scored significantly higher on standardized measures of psychological distress than all domiciled control groups. Youth homelessness studies have also reported very high rates of suicidal behaviour, but methodological limitations in these studies make comparisons with community surveys difficult. Furthermore, rates of various psychiatric disorders are usually at least twice as high among homeless youth as among youth from community surveys. Homeless youth in Australia have extremely high rates of psychological distress and psychiatric disorders. As homeless youth are at risk of developing psychiatric disorders and possibly self-injurious behaviour the longer they are homeless, early intervention in relevant health facilities is required.
New Whole-House Solutions Case Study: Hood River Passive House
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2014-02-01
The Hood River Passive Project was developed by Root Design Build of Hood River, Oregon, using the Passive House Planning Package (PHPP) to meet all of the requirements for certification under the European Passive House standards. The Passive House design approach has been gaining momentum among residential designers for custom homes, and BEopt modeling indicates that these designs may actually exceed the goal of the U.S. Department of Energy's (DOE) Building America program to "reduce home energy use by 30%-50%" (compared to 2009 energy codes for new homes). This report documents the short-term test results of the Shift House and compares the results of PHPP and BEopt modeling of the project. The design includes high R-value assemblies, extremely tight construction, high performance doors and windows, solar thermal DHW, heat recovery ventilation, moveable external shutters and a high performance ductless mini-split heat pump. Cost analysis indicates that many of the measures implemented in this project did not meet the BA standard for cost neutrality. The ductless mini-split heat pump, lighting and advanced air leakage control were the most cost-effective measures. The future challenge will be to value engineer the performance levels indicated here in modeling using production-based practices at a significantly lower cost.
NASA Astrophysics Data System (ADS)
Pántano, V. C.; Penalba, O. C.
2013-05-01
Extreme events of temperature and rainfall have a socio-economic impact on the rainfed agricultural production region in Argentina. The magnitude of the impact can be analyzed through the water balance, which integrates the characteristics of the soil and climate conditions. Changes observed in climate variables during recent decades have affected the components of the water balance. As a result, the agricultural border was displaced towards the west, improving the agricultural production of the region. The objective of this work is to analyze how the variability of rainfall and temperature drives the hydric condition of the soil, with special focus on extreme events. The hydric condition of the soil (HC = Excess - Deficit) was estimated from the monthly water balance (Thornthwaite and Mather method, 1957), using monthly potential evapotranspiration (PET) and monthly accumulated rainfall (R) for 33 stations (period 1970-2006). Temperature and rainfall data were provided by the National Weather Service, and the effective soil water capacity was taken from Forte Lay and Spescha (2001). An agricultural extreme condition occurs when soil moisture and rainfall are inadequate or excessive for the development of the crops. In this study, we define an extreme event when the variable is less (greater) than its 20th and 10th (80th and 90th) percentiles. In order to evaluate how sensitive the HC is to water and heat stress in the region, different conditional probabilities were evaluated. There is a weaker response of HC to extremely low PET, while extremely low R leads to high values of HC. However, this behavior is not always observed, especially in the western region, where extremely high and low PET show a stronger influence over the HC. Finally, to analyze the temporal variability of extreme PET and R leading the hydric condition of the soil, the number of stations presenting extreme conditions was computed for each month.
As an example, interesting results were observed for April. During this month, the water recharge of the soil is crucial to let the winter crops cope with the scarce rainfall of the following months. In 1970, 1974, 1977, 1978 and 1997 more than 50% of the stations were under extremely high PET, while 1970, 1974, 1978 and 1988 presented more than 40% under extremely low R. Thus, the 1970s was the most threatened decade of the period. Since the 1980s (except for 1997), extreme dry events due to one variable or the other have mostly occurred separately, over smaller areas. The response of the spatial distribution of HC is stronger when both variables present extreme conditions. In particular, during 1997 the region presented extremely low values of HC as a consequence of extremely low R and high PET. Communities dependent on agriculture are highly sensitive to climate variability and its extremes. In the studied region, it was shown that water scarcity and heat stress contribute to the resulting hydric condition, producing strong impacts on different productive activities. Extreme temperature seems to have a stronger influence over extreme unfavorable hydric conditions.
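The percentile-based definition of extremes used above (a variable is extreme when it falls below its 20th/10th percentile or above its 80th/90th percentile) can be sketched as follows; the data and names are illustrative only:

```python
import numpy as np

def classify_extremes(series, low_pct=20, high_pct=80):
    """Flag extreme-low and extreme-high values relative to the series' own
    percentiles, mirroring the 20%/80% (or 10%/90%) thresholds above."""
    series = np.asarray(series, dtype=float)
    lo, hi = np.percentile(series, [low_pct, high_pct])
    return series < lo, series > hi

# Hypothetical monthly rainfall R (mm); PET or HC would be treated the same way.
rain = np.array([10.0, 35.0, 50.0, 60.0, 80.0, 120.0, 5.0, 45.0, 55.0, 200.0])
dry, wet = classify_extremes(rain)
print(rain[dry], rain[wet])  # extreme-low: 10 and 5; extreme-high: 120 and 200
```

Conditional probabilities of the kind evaluated in the study (e.g. extreme HC given extreme R) then follow by combining such boolean masks.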
NASA Astrophysics Data System (ADS)
Schoof, J. T.
2017-12-01
Extreme temperatures affect society in multiple ways, but the impacts often differ depending on the concurrent humidity. For example, the greatest impacts on human morbidity and mortality result when temperature and humidity are both elevated. Conversely, high temperatures coupled with low humidity often lead to agricultural impacts resulting in lower yields. Despite the importance of humidity in determining heat wave impacts, relatively few studies of future temperature extremes have also considered possible changes in humidity. In a recent study, we investigated recent historical changes in the frequency and intensity of low-humidity and high-humidity extreme temperature events using a framework based on isobaric equivalent temperature. Here, we extend this approach to climate projections from CMIP5 models to explore possible regional changes in extreme heat characteristics. After using quantile mapping to bias correct and downscale the CMIP5 model outputs, we analyze results from two future periods (2031-2055 and 2061-2085) and two representative concentration pathways, RCP 4.5 and RCP 8.5, corresponding to moderate and high levels of radiative forcing from greenhouse gases. For each of seven US regions, we consider changes in extreme temperature frequency, changes in the proportion of extreme temperature days characterized by high humidity, and changes in the magnitude of temperature and humidity on extreme temperature days.
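Empirical quantile mapping of the kind mentioned above for bias correcting model output can be sketched as follows. This is a generic illustration of the technique, not the authors' exact implementation; the toy data simply give the model a constant warm bias:

```python
import numpy as np

def quantile_map(model_hist, obs_hist, model_future):
    """Empirical quantile mapping: map each future model value to the observed
    value at the same quantile of the historical model distribution."""
    model_hist = np.sort(np.asarray(model_hist, dtype=float))
    obs_hist = np.sort(np.asarray(obs_hist, dtype=float))
    # Quantile of each future value within the historical model distribution.
    q = np.clip(np.searchsorted(model_hist, model_future) / len(model_hist), 0.0, 1.0)
    # Look up the same quantile in the observed distribution.
    idx = np.clip((q * (len(obs_hist) - 1)).astype(int), 0, len(obs_hist) - 1)
    return obs_hist[idx]

# Toy example: the model runs 2 degrees too warm; mapping removes the offset.
rng = np.random.default_rng(0)
obs = rng.normal(25.0, 3.0, 1000)   # "observed" daily maximum temperature
model = obs + 2.0                   # biased model over the same period
future = np.array([30.0, 35.0])
print(quantile_map(model, obs, future))  # roughly [28., 33.]
```

Production implementations typically interpolate between empirical quantiles and handle values beyond the calibration range; this sketch uses nearest-rank lookup for brevity.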
The critical role of uncertainty in projections of hydrological extremes
NASA Astrophysics Data System (ADS)
Meresa, Hadush K.; Romanowicz, Renata J.
2017-08-01
This paper aims to quantify the uncertainty in projections of future hydrological extremes in the Biala Tarnowska River at Koszyce gauging station, south Poland. The approach followed is based on several climate projections obtained from the EURO-CORDEX initiative, raw and bias-corrected realizations of catchment precipitation, and flow simulations derived using multiple hydrological model parameter sets. The projections cover the 21st century. Three sources of uncertainty are considered: one related to climate projection ensemble spread, the second related to the uncertainty in hydrological model parameters and the third related to the error in fitting theoretical distribution models to annual extreme flow series. The uncertainty of projected extreme indices related to hydrological model parameters was conditioned on flow observations from the reference period using the generalized likelihood uncertainty estimation (GLUE) approach, with separate criteria for high- and low-flow extremes. Extreme (low and high) flow quantiles were estimated using the generalized extreme value (GEV) distribution at different return periods and were based on two different lengths of the flow time series. A sensitivity analysis based on the analysis of variance (ANOVA) shows that the uncertainty introduced by the hydrological model parameters can be larger than the climate model variability and the distribution fit uncertainty for the low-flow extremes, whilst for the high-flow extremes higher uncertainty is observed from climate models than from hydrological parameter and distribution fit uncertainties. This implies that ignoring one of the three uncertainty sources may pose a great risk to future adaptation to hydrological extremes and to water resource planning and management.
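The GEV-based return-level estimation described above can be sketched as follows. The synthetic annual-maximum series is illustrative; in the study these series come from the hydrological simulations, and the distribution-fit uncertainty would be assessed around each fit:

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual-maximum flow series (m^3/s):
rng = np.random.default_rng(1)
annual_max = genextreme.rvs(c=-0.1, loc=100.0, scale=20.0, size=60,
                            random_state=rng)

# Fit a GEV and read off the return level for a return period T:
# the level exceeded on average once in T years is the (1 - 1/T) quantile.
shape, loc, scale = genextreme.fit(annual_max)
for T in (10, 50, 100):
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T}-year return level: {level:.1f} m^3/s")
```

Low-flow extremes are handled analogously with annual minima (e.g. by fitting to the negated series), and the return level grows monotonically with the return period.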
Numerical investigation of freak waves
NASA Astrophysics Data System (ADS)
Chalikov, D.
2009-04-01
This paper describes the results of more than 4,000 long-term (up to thousands of peak-wave periods) numerical simulations of nonlinear gravity surface waves, performed to investigate the properties and estimate the statistics of extreme ('freak') waves. A method for solving the 2-D potential wave equations based on conformal mapping is applied to simulate wave behavior under different initial conditions defined by JONSWAP and Pierson-Moskowitz spectra. It is shown that nonlinear wave evolution sometimes results in the appearance of very big waves. The shape of freak waves varies within a wide range: some are sharp-crested, others are asymmetric, with a strong forward inclination. Some can be very big but not steep enough to create dangerous conditions for vessels (though they may for fixed objects). Initial generation of extreme waves can occur merely as a result of group effects, but in some cases the largest wave suddenly starts to grow. The growth is sometimes followed by a strong concentration of wave energy around a peak vertical, taking place over the course of a few peak-wave periods. The process starts with an individual wave in physical space, without significant exchange of energy with surrounding waves. Sometimes a crest-to-trough wave height can be as large as nearly three significant wave heights. On average, only one third of all freak waves come to breaking, creating extreme conditions; however, if a wave height approaches three significant wave heights, all freak waves break. The most surprising result was the discovery that the probability of non-dimensional freak waves (normalized by significant wave height) is actually independent of the density of wave energy. This does not mean that the statistics of extreme waves do not depend on wave energy.
It just proves that normalization of wave heights by significant wave height is so effective that the statistics of non-dimensional extreme waves tend to be independent of wave energy. It is naive to expect that high-order moments such as skewness and kurtosis can serve as predictors or even indicators of freak waves. First, these characteristics cannot be calculated from a spectrum that is usually determined with low accuracy; such calculations are definitely unstable to slight perturbations of the spectrum. Second, even if the spectrum is determined with high accuracy (for example, calculated with an exact model), the high-order moments cannot serve as predictors, since they change synchronously with variations of extreme wave heights. The appearance of freak waves occurs simultaneously with an increase in local kurtosis; hence, kurtosis is simply a passive indicator of the same local geometrical properties of a wave field. This effect disappears completely if the spectrum is calculated over a very wide ensemble of waves; in that case the existence of a freak wave is simply disguised by the other, non-freak waves. Third, all high-order moments depend on the spectral representation: they increase with increasing spectral resolution and cut-off frequency. The statistics of non-dimensional waves, as well as the emergence of extreme waves, is an innate property of a nonlinear wave field. A probability function for steep waves has been constructed; such a function can be used to develop an operational forecast of freak waves based on a standard forecast provided by a third-generation wave prediction model (WAVEWATCH or WAM).
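A minimal zero-upcrossing sketch of the freak-wave criterion discussed above (a crest-to-trough height exceeding roughly twice, and up to three times, the significant wave height). The synthetic record and the Hs = 4·std convention are illustrative, not the paper's conformal-mapping simulation:

```python
import numpy as np

def freak_wave_heights(eta, threshold=2.0):
    """Zero-upcrossing wave analysis: return the crest-to-trough heights that
    exceed `threshold` times the significant wave height Hs (taken as 4*std)."""
    eta = np.asarray(eta, dtype=float)
    hs = 4.0 * eta.std()
    up = np.where((eta[:-1] < 0.0) & (eta[1:] >= 0.0))[0]  # upcrossing indices
    heights = np.array([eta[a:b].max() - eta[a:b].min()
                        for a, b in zip(up[:-1], up[1:])])
    return heights[heights > threshold * hs], hs

# Synthetic record: regular background waves plus one anomalously large crest.
t = np.linspace(0.0, 500.0, 25000)
eta = np.sin(2 * np.pi * t / 10.0)
eta += 6.0 * np.exp(-((t - 250.0) ** 2) / 0.5)
freaks, hs = freak_wave_heights(eta)
print(len(freaks), round(hs, 1))  # one freak wave detected; Hs close to 3
```

Note that the large event itself inflates Hs slightly, which is one reason freak-wave statistics are usually computed over long records.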
NASA Technical Reports Server (NTRS)
Pulkkinen, Antti; Bernabeu, Emanuel; Eichner, Jan; Viljanen, Ari; Ngwira, Chigomezyo
2015-01-01
Motivated by the needs of the high-voltage power transmission industry, we use data from the high-latitude IMAGE magnetometer array to study characteristics of extreme geoelectric fields at regional scales. We use 10-s resolution data for the years 1993-2013, and the fields are characterized using average horizontal geoelectric field amplitudes taken over station groups that span about 500 km. We show that geoelectric field structures associated with localized extremes at single stations can be greatly different from structures associated with regionally uniform geoelectric fields, which are well represented by spatial averages over single stations. Visual extrapolation and rigorous extreme value analysis of spatially averaged fields indicate that the expected ranges for 1-in-100-year extreme events are 3-8 V/km and 3.4-7.1 V/km, respectively. The Quebec reference ground model is used in the calculations.
Wagner, Roland; Helin, Tapio; Obereder, Andreas; Ramlau, Ronny
2016-02-20
The imaging quality of modern ground-based telescopes such as the planned European Extremely Large Telescope is affected by atmospheric turbulence. In consequence, they heavily depend on stable and high-performance adaptive optics (AO) systems. Using measurements of incoming light from guide stars, an AO system compensates for the effects of turbulence by adjusting so-called deformable mirror(s) (DMs) in real time. In this paper, we introduce a novel reconstruction method for ground layer adaptive optics. In the literature, a common approach to this problem is to use Bayesian inference in order to model the specific noise structure appearing due to spot elongation. This approach leads to large coupled systems with high computational effort. Recently, fast solvers of linear order, i.e., with computational complexity O(n), where n is the number of DM actuators, have emerged. However, the quality of such methods typically degrades in low flux conditions. Our key contribution is to achieve the high quality of the standard Bayesian approach while at the same time maintaining the linear order speed of the recent solvers. Our method is based on performing a separate preprocessing step before applying the cumulative reconstructor (CuReD). The efficiency and performance of the new reconstructor are demonstrated using the OCTOPUS, the official end-to-end simulation environment of the ESO for extremely large telescopes. For more specific simulations we also use the MOST toolbox.
Water-quality conditions in the New River, Imperial County, California
Setmire, James G.
1979-01-01
The New River, where it enters the United States at Calexico, Calif., often contains materials that have the appearance of industrial and domestic wastes. Passage of some of these materials is recognized by a sudden increase in turbidity over background levels and the presence of white particulate matter. Water samples taken during these events are usually extremely high in organic content. During a 4-day reconnaissance of water quality in May 1977, white-to-brown, extremely turbid water crossed the border on three occasions. On one of these occasions, the water was intensively sampled. The total organic-carbon concentration ranged from 80 to 161 milligrams per liter (mg/L); dissolved organic carbon ranged from 34 to 42 mg/L, and the chemical oxygen demand was as high as 510 mg/L. River profiles showed a dissolved-oxygen sag, with the length of the zone of depressed dissolved-oxygen concentrations varying seasonally. During the summer months, dissolved-oxygen concentrations in the river were lower and the zone of depressed concentrations was longer. The largest increases in dissolved-oxygen concentration from reaeration occurred at the three drop structures and the rock weir near Seeley. The effects of oxygen-demanding materials crossing the border extended as far as Highway 80, 19.5 miles downstream from the international boundary at Calexico. Fish kills and anaerobic conditions were also detected as far as Highway 80. Standard bacteria indicator tests for fecal contamination showed a very high health-hazard potential near the border. (Woodard-USGS)
Fabrication of wafer-scale nanopatterned sapphire substrate through phase separation lithography
NASA Astrophysics Data System (ADS)
Guo, Xu; Ni, Mengyang; Zhuang, Zhe; Dai, Jiangping; Wu, Feixiang; Cui, Yushuang; Yuan, Changsheng; Ge, Haixiong; Chen, Yanfeng
2016-04-01
A phase separation lithography (PSL) approach based on a polymer blend provides an extremely simple, low-cost, and high-throughput way to fabricate wafer-scale disordered nanopatterns. This method was introduced to fabricate nanopatterned sapphire substrates (NPSSs) for GaN-based light-emitting diodes (LEDs). The PSL process involved only spin-coating a polystyrene (PS)/polyethylene glycol (PEG) polymer blend on a sapphire substrate, followed by development with deionized water to remove the PEG moiety. The PS nanoporous network was facilely obtained, and its structural parameters could be effectively tuned by controlling the PS/PEG weight ratio of the spin-coating solution. 2-in. wafer-scale NPSSs were conveniently achieved through the PS nanoporous network in combination with traditional nanofabrication methods, such as O2 reactive ion etching (RIE), e-beam evaporation deposition, liftoff, and chlorine-based RIE. In order to investigate the performance of such NPSSs, typical blue LEDs with emission wavelengths of ~450 nm were grown on an NPSS and a flat sapphire substrate (FSS) by metal-organic chemical vapor deposition, respectively. The integral photoluminescence (PL) intensity of the NPSS LED was enhanced by 32.3% compared to that of the FSS LED. The low relative standard deviation of 4.7% for PL mappings of the NPSS LED indicated the high uniformity of PL data across the whole 2-in. wafer. The extreme simplicity, low cost, and high throughput of the process, together with the ability to fabricate at the wafer scale, make PSL a potential method for the production of nanopatterned sapphire substrates.
Toward Gas Chemistry in Low Metallicity Starburst Galaxies
NASA Astrophysics Data System (ADS)
Meier, David S.; Anderson, Crystal N.; Turner, Jean; Ott, Juergen; Beck, Sara C.
2017-01-01
Dense gas is intimately connected with star formation and is therefore key to understanding it. Though challenging to study, dense gas in low metallicity starbursts is important given these systems' often extreme star formation and their potential implications for high-redshift analogs. High spatial resolution (~50 pc) ALMA observations of several key probes of gas chemistry, including HCN(1-0), HCO+(1-0), CS(2-1), CCH(1-0;3/2-1/2) and SiO(2-1), towards the nearby super star cluster (SSC) forming, sub-solar metallicity galaxy NGC 5253 are discussed. Dense gas is observed to be extended well beyond the current compact starburst, reaching into the apparently infalling molecular streamer. The faintness of HCN, the standard dense gas tracer, is extreme both in an absolute sense, relative to high metallicity starbursts of similar intensity, and in a relative sense, with the HCO+/HCN ratio being one of the most elevated observed. UV-irradiated molecular gas, traced by CCH, is also extended over the mapped region and is not strongly correlated with the SSC. Despite the accretion of molecular gas from the halo and the intense burst of star formation, chemical signatures of shocked gas, traced by SiO (and HNCO), are not obvious. By placing NGC 5253 in context with other local starbursts, like 30 Doradus in the Large Magellanic Cloud and the high metallicity prototypical starburst NGC 253, it is suggested that a combination of gas excitation and abundance changes associated with the sub-solar metallicity may explain these anomalous dense gas properties.
Advanced components for spaceborne infrared astronomy
NASA Technical Reports Server (NTRS)
Davidson, A. W.
1984-01-01
The need for improved cryogenic components for use in future spaceborne infrared astronomy missions was identified: improved low-noise cryogenic amplifiers operated with infrared detectors, and better cryogenic actuators and motors with extremely low power dissipation. The feasibility of achieving technological breakthroughs in both of these areas was studied. An improved silicon junction field effect transistor (JFET) could be developed if (1) high-purity silicon, (2) optimum dopants, and (3) very high doping levels are used. The feasibility of a simple stepper motor equipped with superconducting coils is demonstrated by the construction of such a device based on a standard commercial motor; useful levels of torque were achieved at immeasurably low power levels. It is concluded that with modest development and optimization efforts, significant performance gains are possible for both cryogenic preamplifiers and superconducting motors and actuators.
NASA Astrophysics Data System (ADS)
Kusangaya, Samuel; Warburton Toucher, Michele L.; van Garderen, Emma Archer
2018-02-01
Downscaled General Circulation Model (GCM) outputs are used to forecast climate change and provide information used as input for hydrological modelling. Given that our understanding of climate change points towards changes in the frequency, timing and intensity of extreme hydrological events, there is a need to assess the ability of downscaled GCMs to capture these extreme events. Extreme hydrological events play a significant role in regulating the structure and function of rivers and associated ecosystems. In this study, the Indicators of Hydrologic Alteration (IHA) method was adapted to assess the ability of streamflow simulated using downscaled GCMs (dGCMs) to capture extreme river dynamics (high and low flows), as compared to streamflow simulated using historical climate data from 1960 to 2000. The ACRU hydrological model was used to simulate streamflow for the 13 water management units of the uMngeni Catchment, South Africa. Statistically downscaled climate models obtained from the Climate System Analysis Group at the University of Cape Town were used as input for the ACRU model. Results indicated that high flows and extreme high flows (one-in-ten-year high flows/large flood events) were poorly represented in terms of timing, frequency and magnitude. Streamflow simulated using dGCM data also captured more low flows and extreme low flows (one-in-ten-year lowest flows) than streamflow simulated using historical climate data. The overall conclusion was that although dGCM output can reasonably be used to simulate overall streamflow, it performs poorly when simulating extreme high and low flows. Streamflow simulations from dGCMs must thus be used with caution in hydrological applications, particularly in design hydrology, as extreme high and low flows are still poorly represented.
This arguably calls for further improvement of downscaling techniques in order to generate climate data that are more relevant and useful for hydrological applications such as design hydrology. Nevertheless, the availability of downscaled climate output offers the potential to explore climate model uncertainties in different hydro-climatic regions at local scales, where forcing data are often less accessible, with greater accuracy at finer spatial scales and with adequate spatial detail.
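The one-in-ten-year high- and low-flow thresholds used in the IHA-style comparison above can be illustrated empirically from annual extremes. This is a schematic with synthetic data, not the IHA software itself:

```python
import numpy as np

def one_in_n_year_flows(daily_flow, years, n=10):
    """Empirical 1-in-n-year high and low flows: percentiles of the annual
    maxima and minima corresponding to exceedance once every n years."""
    daily_flow = np.asarray(daily_flow, dtype=float)
    years = np.asarray(years)
    uniq = np.unique(years)
    ann_max = np.array([daily_flow[years == y].max() for y in uniq])
    ann_min = np.array([daily_flow[years == y].min() for y in uniq])
    high = np.percentile(ann_max, 100.0 * (1 - 1.0 / n))  # 1-in-n-year high flow
    low = np.percentile(ann_min, 100.0 / n)               # 1-in-n-year low flow
    return high, low

# Hypothetical 40-year daily flow record (lognormal, in m^3/s):
rng = np.random.default_rng(2)
years = np.repeat(np.arange(1961, 2001), 365)
flow = rng.lognormal(mean=3.0, sigma=0.5, size=years.size)
high, low = one_in_n_year_flows(flow, years, n=10)
print(round(high, 1), round(low, 1))
```

Comparing these thresholds between dGCM-driven and observation-driven simulations is one way to quantify how well the extremes are reproduced.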
Counteracting Rotor Imbalance in a Bearingless Motor System with Feedforward Control
NASA Technical Reports Server (NTRS)
Kascak, Peter Eugene; Jansen, Ralph H.; Dever, Timothy; Nagorny, Aleksandr; Loparo, Kenneth
2012-01-01
In standard motor applications, traditional mechanical bearings represent the most economical approach to rotor suspension. However, in certain high performance applications, rotor suspension without bearing contact is either required or highly beneficial, for example at very high speeds, in extreme environments, or where maintenance access is limited. This paper extends a novel bearingless motor concept in which full five-axis levitation and rotation of the rotor is achieved using two motors with opposing conical air-gaps. Leaving the motors' pole-pairs unconnected allows a different d-axis flux in each pole-pair, generating a flux imbalance which creates lateral force. Note that this approach differs from that used in previous bearingless motors, which use separate windings for levitation and rotation. This paper examines the use of feedforward control to counteract the synchronous whirl caused by rotor imbalance. Experimental results are presented showing the performance of a prototype bearingless system, sized for a high-speed flywheel energy storage application, with and without feedforward control.
New Clues to the Mysterious Origin of Wide-Separation Planetary-Mass Companions
NASA Astrophysics Data System (ADS)
Bryan, Marta
2018-01-01
Over the past decade, direct imaging searches for young gas giant planets have revealed a new population of young planetary-mass companions with extremely wide orbital separations (>50 AU) and masses near or at the deuterium-burning limit. These companions pose significant challenges to standard formation models, including core accretion, disk instability, and turbulent fragmentation. In my talk I will discuss new results from high-contrast imaging and high-resolution infrared spectroscopy of a sample of directly imaged wide-separation companions that can be used to directly test these three competing formation mechanisms. First, I use high-contrast imaging to strongly discount scattering as a hypothesis for the origin of wide-separation companions. Second, I measure rotation rates of a subset of these companions using their near-IR spectra, and place the first constraints on the angular momentum evolution of young planetary-mass objects. Finally, I explore the ability of high-resolution spectroscopy to constrain the atmospheric C/O ratios of these companions, providing a complementary test of competing formation scenarios.
Response of Polyurethane to Shock Waves: An Experimental Investigation
NASA Astrophysics Data System (ADS)
Jayaram, V.; Rao, Keshava Subba; Thanganayaki, N.; Kumara, H. K. T.; Reddy, K. P. J.
Formation of polyurethane (PU) in a vacuum environment and controlling the density of polyurethane foams are present-day challenges. Polyurethane exists in numerous forms, ranging from flexible to rigid and from lightweight foams to tough, stiff elastomers [1]. PU can be used to produce lightweight foams for insulation or hard rubber wheels used to transport heavy loads, and it can be used in high-pressure applications. The largest volumes of commercial PU elastomers are made from toluene diisocyanate (TDI) or diphenylmethane-4,4'-diisocyanate (MDI) [2]. Linear polyurethanes can be processed into final products by any of the standard thermoplastic processes (injection molding, extrusion, thermoforming) as well as by low-pressure cast processes in the presence of catalysts. Tin, tetrabutyl titanate and zirconium chelates are a few effective catalysts used to produce polyurethane for particular applications [3]. Thermoset elastomers are formed by irreversible cross-links when polymers are chemically cured. Highly porous biodegradable PU has been synthesized by the thermally induced phase separation technique used in tissue engineering and also in biodegradable-based fluids [4]. Properties of PU such as hardness, stress/strain modulus, tear strength, etc., were determined using ASTM (American Society for Testing and Materials) standard methods. PU possesses extremely high mechanical properties and excellent abrasion, tear and extrusion resistance. It has an outstanding low-temperature limit (-60 °C) and a high-temperature limit of up to 150 °C.
Xu, Kui; Ma, Chao; Lian, Jijian; Bin, Lingling
2014-01-01
Catastrophic flooding resulting from extreme meteorological events has occurred more frequently and drawn great attention in recent years in China. In coastal areas, extreme precipitation and storm tide are both inducing factors of flooding, and therefore their joint probability is critical to determining the flooding risk. The impact of storm tide or a changing environment on flooding is ignored or underestimated in today's design of drainage systems in coastal areas of China. This paper investigates the joint probability of extreme precipitation and storm tide and its change using copula-based models in Fuzhou City. The change point in the year 1984, detected by the Mann-Kendall and Pettitt's tests, divides the extreme precipitation series into two subsequences. For each subsequence, the probability of the joint behavior of extreme precipitation and storm tide is estimated by the optimal copula. Results show that the joint probability has increased by more than 300% on average after 1984 (α = 0.05). The design joint return period (RP) of extreme precipitation and storm tide is estimated to propose a design standard for future flooding preparedness. For a given combination of extreme precipitation and storm tide, the design joint RP has become smaller than before, implying that flooding would happen more often after 1984, which corresponds with observation. The study facilitates understanding of the change in flood risk and the proposal of adaptation measures for coastal areas under a changing environment. PMID:25310006
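The joint "AND" return period underlying the copula analysis above can be sketched with a Gumbel copula. The copula family and the theta values here are illustrative; the study selects the optimal copula by fitting each subsequence:

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v); theta >= 1, with theta = 1 giving independence."""
    s = (-np.log(u)) ** theta + (-np.log(v)) ** theta
    return np.exp(-s ** (1.0 / theta))

def joint_and_return_period(u, v, theta):
    """Return period (years) of precipitation AND storm tide jointly exceeding
    their marginal non-exceedance probabilities u and v (one event per year):
    T = 1 / P(U > u, V > v) = 1 / (1 - u - v + C(u, v))."""
    return 1.0 / (1.0 - u - v + gumbel_copula(u, v, theta))

# Both variables at their 10-year marginal level (u = v = 0.9):
for theta in (1.0, 2.0, 5.0):
    print(theta, round(joint_and_return_period(0.9, 0.9, theta), 1))
# Under independence (theta = 1) the joint 'AND' return period is 100 years;
# stronger dependence makes joint extremes considerably more frequent.
```

This is why a shift toward stronger dependence after a change point shortens the design joint return period, i.e. makes compound flooding more likely.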
The association between preceding drought occurrence and heat waves in the Mediterranean
NASA Astrophysics Data System (ADS)
Russo, Ana; Gouveia, Célia M.; Ramos, Alexandre M.; Páscoa, Patricia; Trigo, Ricardo M.
2017-04-01
A large number of weather-driven extreme events have occurred worldwide in the last decade, notably in Europe, which has been struck by record-breaking extreme events with unprecedented socio-economic impacts, including the mega-heatwaves of 2003 in Europe and 2010 in Russia and the large droughts in southwestern Europe in 2005 and 2012. The last IPCC report on extreme events points out that a changing climate can lead to changes in the frequency, intensity, spatial extent, duration, and timing of weather and climate extremes. These, combined with larger exposure, can result in unprecedented risk to humans and ecosystems. In this context it is becoming increasingly relevant to improve the early identification and predictability of such events, as they negatively affect several socio-economic activities. Moreover, recent diagnostic and modelling experiments have confirmed that hot extremes are often preceded by surface moisture deficits in some regions throughout the world. In this study we analyze whether the occurrence of hot extreme months is enhanced by the occurrence of preceding drought events throughout the Mediterranean area. To this end, the number of hot days in each region's hottest month is associated with a drought indicator. The evolution and characterization of drought were analyzed using both the Standardized Precipitation Evapotranspiration Index (SPEI) and the Standardized Precipitation Index (SPI), as obtained from the CRU TS3.23 database for the period 1950-2014. We used both SPI and SPEI for time scales between 3 and 9 months with a spatial resolution of 0.5°. The number of hot days and nights per month (NHD and NHN) was determined using the ECAD-EOBS daily dataset for the same period and spatial resolution (dataset v14). The NHD and NHN were computed, respectively, as the number of days with a maximum or minimum temperature exceeding the 90th percentile.
Results show that the most frequent hottest months for the Mediterranean region occur in July and August. Moreover, the correlations between detrended NHD/NHN and the preceding 6- and 9-month SPEI/SPI are usually weaker than for the 3-month time scale. Most regions exhibit significantly negative correlations, i.e., high (low) NHD/NHN following negative (positive) SPEI/SPI values, and thus a potential for NHD/NHN early warning. Finally, the correlations of NHD/NHN with SPI and with SPEI differ, with SPEI characterized by slightly higher values, mainly for the 3-month time scale. Acknowledgments: This work was partially supported by national funds through FCT (Fundação para a Ciência e a Tecnologia, Portugal) under project IMDROFLOOD (WaterJPI/0004/2014). Ana Russo thanks FCT for granted support (SFRH/BPD/99757/2014). A. M. Ramos was also supported by an FCT postdoctoral grant (FCT/DFRH/SFRH/BPD/84328/2012).
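The NHD definition used above (days whose maximum temperature exceeds the 90th percentile) is straightforward to compute. A minimal numpy sketch with synthetic temperatures; the data and threshold are illustrative, not the CRU/E-OBS values:

```python
import numpy as np

def hot_days_per_month(tmax_daily, q=90):
    """Count days whose maximum temperature exceeds the climatological
    q-th percentile (the NHD definition in the abstract).
    tmax_daily: 1-D array of daily Tmax for one calendar month, all years."""
    threshold = np.percentile(tmax_daily, q)
    return int(np.sum(tmax_daily > threshold))

# Synthetic illustration: 30 days x 65 years of July Tmax values (deg C).
rng = np.random.default_rng(0)
tmax = rng.normal(loc=30.0, scale=3.0, size=30 * 65)
nhd = hot_days_per_month(tmax)
```

By construction, about 10% of the synthetic days exceed the 90th-percentile threshold; the NHN computation is identical with daily minimum temperatures.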
Nenezić, Dragoslav; Pandaitan, Simon; Ilijevski, Nenad; Matić, Predrag; Gajin, Predag; Radak, Dorde
2005-01-01
Although the incidence of prosthetic infection is low (1%-6%), the consequences (limb loss or death) are dramatic for the patient, with a high mortality rate (25%-75%) and limb loss in 40%-75% of cases. In cases of Szilagyi's grade III infection, the standard procedure consists of excision of the prosthesis and wound debridement. An alternative is medical treatment. This is a case report of a patient with prosthetic infection of a Silver-ring graft, used for femoropopliteal reconstruction, in whom extreme skin necrosis developed in the early postoperative period. This complication was successfully treated medically. After repeated debridement and wound-packing, the wound was covered using a Thiersch skin graft.
Extreme ultraviolet performance of a multilayer coated high density toroidal grating
NASA Technical Reports Server (NTRS)
Thomas, Roger J.; Keski-Kuha, Ritva A. M.; Neupert, Werner M.; Condor, Charles E.; Gum, Jeffrey S.
1991-01-01
The performance of a multilayer-coated diffraction grating has been evaluated at EUV wavelengths in terms of both absolute efficiency and spectral resolution. The application of a ten-layer Ir/Si multilayer coating to a 3600-lines/mm blazed toroidal replica grating produced a factor-of-9 enhancement in peak first-order efficiency near the design wavelength of about 30 nm, without degrading its excellent quasistigmatic spectral resolution. The measured EUV efficiency peaked at 3.3 percent and was improved over the full spectral range between 25 and 35 nm compared with the premultilayer replica, which had a standard gold coating. In addition, the grating's spectral resolution of greater than 5000 was maintained.
Electroweak bubble wall speed limit
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bödeker, Dietrich; Moore, Guy D., E-mail: bodeker@physik.uni-bielefeld.de, E-mail: guymoore@ikp.physik.tu-darmstadt.de
In extensions of the Standard Model with extra scalars, the electroweak phase transition can be very strong, and the bubble walls can be highly relativistic. We revisit our previous argument that electroweak bubble walls can 'run away,' that is, achieve extreme ultrarelativistic velocities γ ∼ 10¹⁴. We show that, when particles cross the bubble wall, they can emit transition radiation. Wall-frame soft processes, though suppressed by a power of the coupling α, have a significance enhanced by the γ-factor of the wall, limiting wall velocities to γ ∼ 1/α. Though the bubble walls can move at almost the speed of light, they carry an infinitesimal share of the plasma's energy.
NASA Technical Reports Server (NTRS)
1992-01-01
The IMAX camera system is used to record on-orbit activities of interest to the public. Because of the extremely high resolution of the IMAX camera, projector, and audio systems, the audience is afforded a motion picture experience unlike any other. IMAX and OMNIMAX motion picture systems were designed to create motion picture images of superior quality and audience impact. The IMAX camera is a 65 mm, single lens, reflex viewing design with a 15 perforation per frame horizontal pull across. The frame size is 2.06 x 2.77 inches. Film travels through the camera at a rate of 336 feet per minute when the camera is running at the standard 24 frames/sec.
Effects of diurnal temperature range on mortality in Hefei city, China
NASA Astrophysics Data System (ADS)
Tang, Jing; Xiao, Chang-chun; Li, Yu-rong; Zhang, Jun-qing; Zhai, Hao-yuan; Geng, Xi-ya; Ding, Rui; Zhai, Jin-xia
2017-12-01
Although several studies have indicated an association between diurnal temperature range (DTR) and mortality, the results regarding effect modifiers are inconsistent, and few studies have been conducted in developing inland regions. This study aims to evaluate the effects of DTR on cause-specific mortality, and whether season, gender, or age might modify any association, in Hefei city, China, during 2007-2016. Quasi-Poisson generalized linear regression models combined with a distributed lag non-linear model (DLNM) were applied to evaluate the relationships between DTR and non-accidental, cardiovascular, and respiratory mortality. We observed a J-shaped relationship between DTR and cause-specific mortality. With a DTR of 8.3 °C as the reference, the cumulative effects of extremely high DTR were significantly higher for all types of mortality than the effects of lower or moderate DTR over the full year. When stratified by season, extremely high DTR in spring had a greater impact on all cause-specific mortality than in the other three seasons. Males and the elderly (≥ 65 years) were consistently more susceptible to the extremely high DTR effect than females and the young (< 65 years) for non-accidental and cardiovascular mortality. On the contrary, females and the young were more susceptible to the extremely high DTR effect than males and the elderly for respiratory mortality. The study suggests that extremely high DTR is a potential trigger for non-accidental mortality in Hefei city, China. Our findings also highlight the importance of protecting susceptible groups from extremely high DTR, especially in spring.
Changes in Extreme Events: from GCM Output to Social, Economic and Ecological Impacts
NASA Astrophysics Data System (ADS)
Tebaldi, C.; Meehl, G. A.
2006-12-01
Extreme events can deeply affect social and natural systems. The current generation of global climate models produces information that can be used directly to characterize future changes in extreme events and, with a further step, their impacts, despite the models' still relatively coarse resolution. It is important to define extreme indicators consistently with what we expect GCMs to be able to represent reliably. We use two examples from our work, heat waves and frost days, that illustrate different aspects of the analysis of extremes from GCM output. Frost days are "mild extremes" whose definition and computation are straightforward. GCMs can represent them accurately and display a strong, consistent signal of change. The impacts of these changes will be extremely relevant for ecosystems and agriculture. Heat waves do not have a standard definition. On the basis of historical episodes we isolate characteristics that were responsible for the worst effects on human health, for example, and analyze these characteristics in model simulations, validating the models' historical simulations. The changes in these characteristics can then be easily translated into expected differential impacts on public health. Work in progress aims at a better characterization of heat waves by jointly considering a set of variables such as maximum and minimum temperatures and humidity, better addressing the biological vulnerabilities of the populations at risk.
Khan, Shujaat; Naseem, Imran; Togneri, Roberto; Bennamoun, Mohammed
2018-01-01
In extreme cold weather, living organisms produce antifreeze proteins (AFPs) to counter the otherwise lethal intracellular formation of ice. The structures and sequences of various AFPs exhibit a high degree of heterogeneity; consequently, the prediction of AFPs is considered a challenging task. In this research, we propose to handle this arduous manifold-learning task using the notion of localized processing. In particular, an AFP sequence is segmented into two sub-segments, each of which is analyzed for amino acid and di-peptide compositions. We propose to use only the most significant features, selected by information gain (IG), followed by a random forest classification approach. The proposed RAFP-Pred achieved excellent performance on a number of standard datasets. We report a high Youden's index (sensitivity + specificity - 1) of 0.75 on the standard independent test data set, outperforming AFP-PseAAC, AFP_PSSM, AFP-Pred, and iAFP by margins of 0.05, 0.06, 0.14, and 0.68, respectively. The verification rate on the UniProtKB dataset is found to be 83.19 percent, which is substantially superior to the 57.18 percent reported for the iAFP method.
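The information-gain filter at the core of this kind of pipeline can be sketched with pure numpy. The toy feature matrix below is hypothetical, and the random-forest classification step that follows in the paper is omitted:

```python
import numpy as np

def entropy(y):
    """Shannon entropy (bits) of a label vector."""
    _, counts = np.unique(y, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(x, y):
    """IG of a discrete feature x for labels y:
    H(y) - sum_v P(x = v) * H(y | x = v)."""
    gain = entropy(y)
    for v, n in zip(*np.unique(x, return_counts=True)):
        gain -= (n / len(x)) * entropy(y[x == v])
    return gain

def select_top_k(X, y, k):
    """Indices of the k features with the highest information gain."""
    gains = np.array([information_gain(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(gains)[::-1][:k]

# Hypothetical toy data: feature 0 perfectly predicts the label, feature 1 is noise.
X = np.array([[1, 0], [1, 1], [0, 0], [0, 1]])
y = np.array([1, 1, 0, 0])
top = select_top_k(X, y, k=1)
```

The selected feature indices would then be fed to a classifier such as a random forest; only the filtering step is shown here.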
Development of a Standard Platinum Resistance Thermometer for Use up to the Copper Point
NASA Astrophysics Data System (ADS)
Tavener, J. P.
2015-08-01
The international temperature scale of 1990 (ITS-90) defines temperatures in the range from 13.8 K to 1234.93 K (the freezing point of silver) using a standard platinum resistance thermometer (SPRT) as an interpolating instrument. For temperatures approaching the silver point, current SPRT designs require extreme care to avoid contamination, especially by metallic impurities, which can cause rapid and irreversible drift. This study investigates the performance of a new design of high-temperature SPRT with the aim of improving the stability of SPRTs and extending their temperature range. The prototype SPRTs have an alumina sheath and a sapphire support for the sensing element; they are aspirated with dry air and operated with a dc bias voltage to suppress the diffusion of metal-ion contaminants. Three prototype thermometers were exposed to temperatures near or above the copper freezing point (1084.62 °C) for total exposure times in excess of 500 h and exhibited drifts in the triple-point resistance of less than 10 mK. The new design eliminates some of the problems associated with fused-silica sheaths and sensor-support structures and is a viable option for a high-accuracy thermometer at temperatures approaching the copper point.
Absolute measurement of the Hugoniot and sound velocity of liquid copper at multimegabar pressures
McCoy, Chad August; Knudson, Marcus David; Root, Seth
2017-11-13
Measurement of the Hugoniot and sound velocity provides information on the bulk modulus and Grüneisen parameter of a material at extreme conditions. The capability to launch multilayered (copper/aluminum) flyer plates at velocities in excess of 20 km/s with the Sandia Z accelerator has enabled high-precision sound-velocity measurements at previously inaccessible pressures. For these experiments, the sound velocity of the copper flyer must be accurately known in the multi-Mbar regime. Here we describe the development of copper as an absolutely calibrated sound-velocity standard for high-precision measurements at pressures in excess of 400 GPa. Using multilayered flyer plates, we performed absolute measurements of the Hugoniot and sound velocity of copper for pressures from 500 to 1200 GPa. These measurements enabled the determination of the Grüneisen parameter for dense liquid copper, clearly showing a density dependence above the melt transition. As a result, combined with earlier data at lower pressures, these results constrain the sound velocity as a function of pressure, enabling the use of copper as a Hugoniot and sound-velocity standard for pressures up to 1200 GPa.
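The link drawn above between Hugoniot and sound-velocity data and the bulk modulus and Grüneisen parameter rests on the standard Rankine-Hugoniot jump conditions. As a reminder (with $U_s$ the shock velocity, $u_p$ the particle velocity, and $V = 1/\rho$ the specific volume), the shocked state satisfies

```latex
P_H = \rho_0\, U_s\, u_p, \qquad
\frac{\rho}{\rho_0} = \frac{U_s}{U_s - u_p}, \qquad
E_H - E_0 = \tfrac{1}{2}\, P_H \,(V_0 - V),
```

and the Mie-Grüneisen parameter $\Gamma = V \left( \partial P / \partial E \right)_V$ relates states off the Hugoniot, which is why pairing Hugoniot data with sound-velocity measurements constrains $\Gamma$ for the dense liquid.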
NASA Astrophysics Data System (ADS)
Beminiwattha, Rakitha; Moller Collaboration
2017-09-01
Parity Violating Electron Scattering (PVES) is an extremely successful precision-frontier tool that has been used for testing the Standard Model (SM) and understanding nucleon structure. Several generations of highly successful PVES programs at SLAC, MIT-Bates, MAMI-Mainz, and Jefferson Lab have contributed to the understanding of nucleon structure and to testing the SM. But unexplained phenomena such as the matter-antimatter asymmetry, neutrino flavor oscillations, and dark matter and dark energy suggest that the SM is only a 'low energy' effective theory. The MOLLER experiment at Jefferson Lab will measure the weak charge of the electron, Q_W^e = 1 - 4 sin²θ_W, with a precision of 2.4% by measuring the parity-violating asymmetry in electron-electron (Møller) scattering, and will be sensitive to subtle but measurable deviations from precisely calculable SM predictions. The MOLLER experiment will provide the best contact-interaction search for leptons at low or high energy, making it a probe of physics beyond the Standard Model with sensitivity to mass scales of new PV physics up to 7.5 TeV. An overview of the experiment and recent pre-R&D progress will be reported.
SOHO EIT Carrington maps from synoptic full-disk data
NASA Technical Reports Server (NTRS)
Thompson, B. J.; Newmark, J. S.; Gurman, J. B.; Delaboudiniere, J. P.; Clette, F.; Gibson, S. E.
1997-01-01
Solar synoptic maps, obtained from observations carried out since May 1996 by the extreme-ultraviolet imaging telescope (EIT) onboard the Solar and Heliospheric Observatory (SOHO), are presented. The maps were constructed for each Carrington rotation from the calibrated data. Off-limb maps at 1.05 and 1.10 solar radii were generated for three coronal lines using the standard technique applied to coronagraph synoptic maps. The maps reveal several aspects of solar structure over an entire rotation and are used in the Whole Sun Month modeling campaign.
2014-10-01
group, Pig 22227, was due to a gastrointestinal bleed, related to either infectious gastroenteritis/colitis or stress ulcer formation. The third... upper extremity transplantation. Delays in progress and incomplete groups will be discussed in detail in Section 5 – Changes/Problems. Table 1... Implemented successfully the first clinical protocol for upper extremity transplantation using donor bone marrow cell therapies and tacrolimus
Synoptic Conditions and Moisture Sources Actuating Extreme Precipitation in Nepal
NASA Astrophysics Data System (ADS)
Bohlinger, Patrik; Sorteberg, Asgeir; Sodemann, Harald
2017-12-01
Despite the vast literature on heavy-precipitation events in South Asia, the synoptic conditions and moisture sources related to extreme precipitation in Nepal have not been addressed systematically. We investigate two types of synoptic conditions (low-pressure systems and midlevel troughs) and the moisture sources related to extreme precipitation events. To account for the high spatial variability of rainfall, we cluster station-based daily precipitation measurements, resulting in three well-separated geographic regions: west, central, and east Nepal. For each region, composite analysis of extreme events shows that the atmospheric circulation is directed against the Himalayas during an extreme event. The direction of the flow is regulated by midtropospheric troughs and low-pressure systems traveling toward the respective region. Extreme precipitation events feature an anomalously high abundance of total column moisture. A quantitative Lagrangian moisture source diagnostic reveals that the largest direct contribution stems from land (approximately 75%), with moisture uptake increased in particular over the Indo-Gangetic Plain. Precipitation events occurring in this region before the extreme event likely provided additional moisture.
NASA Astrophysics Data System (ADS)
Lu, D.; Ricciuto, D. M.; Evans, K. J.
2017-12-01
Data-worth analysis plays an essential role in improving the understanding of the subsurface system, in developing and refining subsurface models, and in supporting rational water resources management. However, data-worth analysis is computationally expensive, as it requires quantifying parameter uncertainty, prediction uncertainty, and both current and potential data uncertainties. Assessment of these uncertainties in large-scale stochastic subsurface simulations using standard Monte Carlo (MC) sampling or advanced surrogate modeling is extremely computationally intensive, sometimes even infeasible. In this work, we propose an efficient Bayesian analysis of data-worth using a multilevel Monte Carlo (MLMC) method. Compared to standard MC, which requires a significantly large number of high-fidelity model executions to achieve a prescribed accuracy in estimating expectations, MLMC can substantially reduce the computational cost through the use of multifidelity approximations. As data-worth analysis involves a great deal of expectation estimation, the cost savings from MLMC can be substantial. While the proposed MLMC-based data-worth analysis is broadly applicable, we apply it to a highly heterogeneous oil reservoir simulation to select the optimal candidate data set, namely the one giving the largest uncertainty reduction in predicting mass flow rates at four production wells. The choices made by the MLMC estimation are validated by the actual measurements of the potential data and are consistent with the estimates obtained from standard MC. Compared to standard MC, however, MLMC greatly reduces the computational cost of the uncertainty-reduction estimation, with savings of up to 600 days of compute time on a single processor.
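The variance-reduction idea behind MLMC can be illustrated with a toy two-level estimator. The "simulator" here is just Monte Carlo integration of exp(u) over [0, 1] (target E[exp(U)] = e - 1 ≈ 1.718), standing in for the expensive reservoir model; all sample counts are illustrative:

```python
import numpy as np

def coarse_sample(rng, n=10):
    """One cheap low-fidelity sample of E[exp(U)], U ~ Uniform(0, 1)."""
    return np.mean(np.exp(rng.random(n)))

def coupled_pair(rng, n_coarse=10, n_fine=1000):
    """A coupled coarse/fine pair built from the SAME draws, so the
    difference (fine - coarse) has much smaller variance than fine alone."""
    u = rng.random(n_fine)
    return np.mean(np.exp(u[:n_coarse])), np.mean(np.exp(u))

def mlmc_estimate(rng, n0=2000, n1=50):
    """Two-level estimator: E[fine] = E[coarse] + E[fine - coarse].
    Many cheap coarse samples pin down the first term; only a few
    expensive coupled pairs are needed for the small correction."""
    level0 = np.mean([coarse_sample(rng) for _ in range(n0)])
    correction = np.mean([f - c for c, f in
                          (coupled_pair(rng) for _ in range(n1))])
    return level0 + correction

rng = np.random.default_rng(42)
est = mlmc_estimate(rng)
```

In a real data-worth assessment the coarse level would carry a discretization bias that the coupled correction removes; the structure of the estimator is the same.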
Passmore, Brandon; Cole, Zach; Whitaker, Bret; Barkley, Adam; McNutt, Ty; Lostetter, Alexander
2016-08-02
A multichip power module directly connecting the busboard to a printed-circuit board attached to the power substrate, enabling extremely low loop inductance in extreme environments such as high-temperature operation. Wire-bond interconnections are taught from the power die directly to the busboard, further enabling low-parasitic interconnections. Integration of on-board high-frequency bus capacitors provides extremely low loop inductance. An extreme-environment gate-driver board allows close physical proximity of the gate driver and power stage, reducing overall volume and impedance in the control circuit. Parallel spring-loaded pin connections to the gate-driver PCB allow a reliable and reworkable power-module-to-gate-driver interconnection.
Communities that thrive in extreme conditions captured from a freshwater lake.
Low-Décarie, Etienne; Fussmann, Gregor F; Dumbrell, Alex J; Bell, Graham
2016-09-01
Organisms that can grow in extreme conditions would be expected to be confined to extreme environments. However, we were able to capture highly productive communities of algae and bacteria capable of growing in acidic (pH 2), basic (pH 12) and saline (40 ppt) conditions from an ordinary freshwater lake. Microbial communities may thus include taxa that are highly productive in conditions that are far outside the range of conditions experienced in their host ecosystem. The organisms we captured were not obligate extremophiles, but were capable of growing in both extreme and benign conditions. The ability to grow in extreme conditions may thus be a common functional attribute in microbial communities. © 2016 The Author(s).
Imaging of upper extremity stress fractures in the athlete.
Anderson, Mark W
2006-07-01
Although it is much less common than injuries in the lower extremities, an upper extremity stress injury can have a significant impact on an athlete. If an accurate and timely diagnosis is to be made, the clinician must have a high index of suspicion of a stress fracture in any athlete who is involved in a throwing, weightlifting, or upper extremity weight-bearing sport and presents with chronic pain in the upper extremity. Imaging should play an integral role in the work-up of these patients; if initial radiographs are unrevealing, further cross-sectional imaging should be strongly considered. Although a three-phase bone scan is highly sensitive in this regard, MRI has become the study of choice at most centers.
NASA Astrophysics Data System (ADS)
Nelson, David A.; Curran, Allen R.; Nyberg, Hans A.; Marttila, Eric A.; Mason, Patrick A.; Ziriax, John M.
2013-03-01
Human exposure to radio frequency (RF) electromagnetic energy is known to result in tissue heating and can raise temperatures substantially in some situations. Standards for safe exposure to RF do not, however, reflect bio-heat transfer considerations. Thermoregulatory function (vasodilation, sweating) may mitigate RF heating effects in some environments and exposure scenarios. Conversely, a combination of an extreme environment (high temperature, high humidity), high activity levels and thermally insulating garments may exacerbate RF exposure and pose a risk of unsafe temperature elevation, even for power densities which might be acceptable in a normothermic environment. A high-resolution thermophysiological model, incorporating a heterogeneous tissue model of a seated adult, has been developed and used to replicate a series of whole-body exposures at a frequency (100 MHz) which approximates that of human whole-body resonance. Exposures were simulated at three power densities (4, 6 and 8 mW cm⁻²) plus a sham exposure and at three different ambient temperatures (24, 28 and 31 °C). The maximum hypothalamic temperature increase over the course of a 45 min exposure was 0.28 °C and occurred in the most extreme conditions (Tamb = 31 °C, PD = 8 mW cm⁻²). Skin temperature increases attributable to RF exposure were modest, with the exception of a ‘hot spot’ in the vicinity of the ankle where skin temperatures exceeded 39 °C. Temperature increases in internal organs and tissues were small, except for connective tissue and bone in the lower leg and foot. Temperature elevation also was noted in the spinal cord, consistent with a hot spot previously identified in the literature.
Water resources of the Myakka River basin area, southwest Florida
Joyner, Boyd F.; Sutcliffe, Horace
1976-01-01
Ground water in the Myakka River basin area of southwest Florida is obtained from a water-table aquifer and from five zones in an artesian aquifer. Wells in the water-table aquifer generally yield less than 50 gpm, and the dissolved-solids concentration is less than 500 mg/liter except in coastal areas and the peninsula southwest of the Myakka River estuary. Wells in the Venice area that tap zone 1 usually yield less than 30 gpm. The quality of water is good except in the peninsula area. Zone 2 is the most highly developed aquifer in the heavily populated coastal areas. Wells yield as much as 200 gpm. In most areas, the water is of acceptable quality. Wells that tap zone 3 yield as much as 500 gpm. Fluoride concentration ranges from 1 to 3.5 mg/liter. Zone 4 yields as much as 1,500 gpm to large-diameter wells. Except in the extreme northeastern part of the area, water from zone 4 usually contains high concentrations of fluoride and sulfate. Zone 5 is the most productive aquifer in the area, but dissolved-solids concentrations usually are too high for public supply except in the extreme northeast. Surface water derived from natural drainage is of good quality except for occasional high color in summer. Most of the streams in the Myakka River basin area have small drainage basins and short channel lengths and do not yield high volumes of flow. During the dry season, streamflow is maintained by ground-water discharge, and, as a result, the chloride, sulfate, and dissolved-solids concentrations and the hardness of the water are above drinking-water standards for some streams. (Woodard-USGS)
Symons, Frank J; Tervo, Raymond C; Barney, Chantel C; Damerow, John; Selim, Mona; McAdams, Brian; Foster, Shawn; Wendelschafer Crabb, Gwen; Kennedy, William
2015-11-01
The relation between somatosensory mechanisms and self-injury among children with neurologic impairments associated with developmental delay is not well understood. We evaluated the feasibility of procuring skin biopsies to examine epidermal nerve fiber density and reported self-injury. Following informed parental consent, epidermal skin biopsies were obtained from a distal leg site with no pre-existing skin damage in 11 children with global developmental delay (55% male; mean age = 36.8 months, range 17-63 months). Visual microscopic examination and quantitative analyses showed extremely high epidermal nerve fiber density values for some children. Children with reported self-injury (5/11) had significantly (P < .02) greater density values (138.8, standard deviation = 45.5) than children without self-injury (80.5, standard deviation = 17.5). Results from this novel immunohistologic analysis of skin in very young children with neurodevelopmental delays suggest it may be a useful tool for studying peripheral innervation as a possible sensory risk factor for self-injury. © The Author(s) 2015.
NASA Astrophysics Data System (ADS)
da Silva Oliveira, C. I.; Martinez-Martinez, D.; Al-Rjoub, A.; Rebouta, L.; Menezes, R.; Cunha, L.
2018-04-01
In this paper, we present a statistical method for evaluating the degree of transparency of a thin film. To do so, the color coordinates are measured on different substrates, and their standard deviation is evaluated. For low values, the color depends on the film and not on the substrate, and intrinsic colors are obtained. In contrast, transparent films lead to high values of the standard deviation, since the color coordinates depend on the substrate. Between these extremes, colored films with a certain degree of transparency can be found. This method allows an objective and simple evaluation of the transparency of any film, improving on subjective visual inspection and avoiding the thickness-related problems of optical spectroscopy. Zirconium oxynitride films deposited on three different substrates (Si, steel and glass) are used to test the validity of this method, whose results have been validated with optical spectroscopy and agree with the visual impression of the samples.
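The classification idea described above — low scatter of color coordinates across substrates means an intrinsic film color, high scatter means transparency — can be sketched in a few lines. This is an illustrative reconstruction, not the authors' implementation; the CIELAB values and the two thresholds are hypothetical.

```python
import statistics

def transparency_class(lab_by_substrate, low_sd=2.0, high_sd=10.0):
    """Classify a film as intrinsically colored, partially transparent,
    or transparent from CIELAB (L*, a*, b*) coordinates measured on
    several substrates. Thresholds are illustrative, not from the paper."""
    # Standard deviation of each color channel across substrates;
    # the largest one summarizes how much the color follows the substrate.
    channels = zip(*lab_by_substrate)
    sd = max(statistics.stdev(c) for c in channels)
    if sd < low_sd:
        return "intrinsic color"       # color set by the film
    if sd > high_sd:
        return "transparent"           # color set by the substrate
    return "partially transparent"

# Nearly identical color on Si, steel and glass -> intrinsic film color
print(transparency_class([(52.1, 3.0, 18.2), (51.8, 3.2, 18.5), (52.4, 2.9, 18.0)]))
```

The same call with widely scattered coordinates would return "transparent", mirroring the paper's two limiting cases.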
Kelly, Rebecca E; Mansell, Warren; Wood, Alex M; Alatiq, Yousra; Dodd, Alyson; Searson, Ruth
2011-11-01
This research aimed to test whether positive, negative, or conflicting appraisals about activated mood states (e.g., energetic and high states) predicted bipolar disorder. A sample of individuals from clinical and control groups (171 with bipolar disorder, 42 with unipolar depression, and 64 controls) completed a measure of appraisals of internal states. High negative appraisals related to a higher likelihood of bipolar disorder irrespective of positive appraisals. High positive appraisals related to a higher likelihood of bipolar disorder only when negative appraisals were also high. Individuals were most likely to have bipolar disorder, as opposed to unipolar depression or no diagnosis, when they endorsed both extremely positive and extremely negative appraisals of the same, activated states. Appraisals of internal states were based on self-report. The results indicate that individuals with bipolar disorder tend to appraise activated, energetic internal states in opposing or conflicting ways, interpreting these states as both extremely positive and extremely negative. This may lead to contradictory attempts to regulate these states, which may in turn contribute to mood swing symptoms. Psychological therapy for mood swings and bipolar disorder should address extreme and conflicting appraisals of mood states. Copyright © 2011 Elsevier B.V. All rights reserved.
Drought evolution, severity and trends in mainland China over 1961-2013.
Yao, Ning; Li, Yi; Lei, Tianjie; Peng, Lingling
2018-03-01
Droughts have destructive impacts on crop yields and water supplies, and researching droughts is vital for societal stability and human life. This work aimed to assess the spatiotemporal evolution of droughts in mainland China over 1961-2013 using four drought indices: the percentage of precipitation anomaly (Pa), the standardized precipitation index (SPI), the standardized precipitation evapotranspiration index (SPEI) and the evaporative demand drought index (EDDI), at multiple timescales ranging from 1 week to 24 months. The variations of the SPI, SPEI and EDDI were compared against historical severe or extreme droughts. The general increases of the Pa, SPI and SPEI, and the general decrease of the EDDI, consistently implied an overall relief of drought conditions over 1961-2013. The different drought indices revealed historical drought conditions, including the national extreme droughts in 1961, 1965, 1972, 1978, 1986, 1988, 1992, 1994, 1997, 1999 and 2000, but various severity levels were assigned to each drought event because the classification standards differed. Although the SPI and SPEI performed better than the EDDI, and the SPI and SPEI were more highly correlated with each other, all the indices were regional- or station-specific and identified historical severe or extreme drought events. At shorter timescales, the EDDI revealed earlier onsets and ends of flash droughts than the SPI and SPEI. The comparison of the different indices against historical drought events confirmed the use of the Pa, SPI and SPEI for determining continuous droughts and that of the EDDI for identifying flash droughts. Copyright © 2017 Elsevier B.V. All rights reserved.
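The SPI-style standardization underlying these indices can be illustrated with a minimal sketch. The operational SPI fits accumulated precipitation to a gamma distribution and maps it to the standard normal; the plain z-score used here is a simplified stand-in, and the station totals are hypothetical. The severity thresholds follow common SPI usage.

```python
import statistics

def spi_zscore(series):
    """Simplified SPI: z-score of accumulated precipitation against its
    own climatology. (Operational SPI uses a gamma fit, not a z-score.)"""
    mu = statistics.fmean(series)
    sd = statistics.stdev(series)
    return [(x - mu) / sd for x in series]

def drought_class(z):
    """Illustrative severity classes, following common SPI thresholds."""
    if z <= -2.0:
        return "extreme drought"
    if z <= -1.5:
        return "severe drought"
    if z <= -1.0:
        return "moderate drought"
    return "near normal or wet"

# Hypothetical annual precipitation totals (mm) for one station
totals = [620, 580, 410, 700, 655, 300, 640, 610, 590, 680]
for year, z in zip(range(2004, 2014), spi_zscore(totals)):
    if z <= -1.0:
        print(year, drought_class(z))
```

Negative index values flag dry years; the more negative the standardized value, the more severe the classified drought, which is how the national extreme-drought years above are identified from station series.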
Robust Engineering Designs for Infrastructure Adaptation to a Changing Climate
NASA Astrophysics Data System (ADS)
Samaras, C.; Cook, L.
2015-12-01
Infrastructure systems are expected to be functional, durable and safe over long service lives - 50 to over 100 years. Observations and models of climate science show that greenhouse gas emissions resulting from human activities have changed climate, weather and extreme events. Projections of future changes (albeit with uncertainties caused by inadequacies of current climate/weather models) can be made based on scenarios for future emissions, but actual future emissions are themselves uncertain. Most current engineering standards and practices for infrastructure assume that the probabilities of future extreme climate and weather events will match those of the past. Climate science shows that this assumption is invalid, but is unable, at present, to define these probabilities over the service lives of existing and new infrastructure systems. Engineering designs, plans, institutions and regulations will need to be adaptable to a range of future conditions (conditions of climate, weather and extreme events, as well as changing societal demands for infrastructure services). For their current and future projects, engineers should: involve all stakeholders (owners, financers, insurers, regulators, the affected public, climate/weather scientists, etc.) in key decisions; use low-regret, adaptive strategies, such as robust decision making and the observational method, complying with relevant standards and regulations and exceeding their requirements where appropriate; and publish design studies and performance/failure investigations to extend the body of knowledge for the advancement of practice. The engineering community should conduct observational and modeling research with climate/weather/social scientists and the concerned communities, and account rationally for climate change in revised engineering standards and codes. This presentation presents initial research on decision-making under uncertainty for climate-resilient infrastructure design.
Resilience and Suicidality among Homeless Youth
ERIC Educational Resources Information Center
Cleverley, Kristin; Kidd, Sean A.
2011-01-01
Homeless and street-involved youth are considered an extremely high risk group, with many studies highlighting trajectories characterized by abusive, neglectful, and unstable family histories, victimization and criminal involvement while on the streets, high rates of physical and mental illness, and extremely high rates of mortality. While there…
Fan, Zheng; Kocis, Keith; Valley, Robert; Howard, James F; Chopra, Manisha; Chen, Yasheng; An, Hongyu; Lin, Weili; Muenzer, Joseph; Powers, William
2015-09-01
We evaluated the safety and feasibility of high-pressure transvenous limb perfusion in an upper extremity of adult patients with muscular dystrophy, after completing a similar study in a lower extremity. A dose escalation study of single-limb perfusion with 0.9% saline was carried out in nine adults with muscular dystrophies under intravenous analgesia. Our study demonstrates that it is feasible and safe to perform high-pressure transvenous perfusion with 0.9% saline up to 35% of limb volume in the upper extremities of young adults with muscular dystrophy. Perfusion at 40% limb volume was associated, in one subject, with short-lived physiological changes in peripheral nerves without clinical correlates. This study provides the basis for a phase 1/2 clinical trial using pressurized transvenous delivery into upper limbs of nonambulatory patients with Duchenne muscular dystrophy. Furthermore, our results are applicable to other conditions, such as limb girdle muscular dystrophy, as a method for delivering regional macromolecular therapeutics in high dose to skeletal muscles of the upper extremity.
Modelling Precipitation and Temperature Extremes: The Importance of Horizontal Resolution
NASA Astrophysics Data System (ADS)
Shields, C. A.; Kiehl, J. T.; Meehl, G. A.
2013-12-01
Understanding Earth's water cycle on a warming planet is of critical importance in society's ability to adapt to climate change. Extreme weather events, such as floods, heat waves, and drought will likely change with the water cycle as greenhouse gases continue to rise. Location, duration, and intensity of extreme events can be studied using complex earth system models. Here, we employ the fully coupled Community Earth System Model (CESM1.0) to evaluate extreme event impacts for different possible future forcing scenarios. Simulations applying the Representative Concentration Pathway (RCP) scenarios 2.6 and 8.5 were chosen to bracket the range of model responses. Because extreme weather events happen on a regional scale, there is a tendency to favor using higher resolution models, i.e. models that can represent regional features with greater accuracy. Within the CESM1.0 framework, we evaluate both the standard 1 degree resolution (1 degree atmosphere/land coupled to 1 degree ocean/sea ice), and the higher 0.5 degree resolution version (0.5 degree atmosphere/land coupled to 1 degree ocean/sea ice), focusing on extreme precipitation events, heat waves, and droughts. We analyze a variety of geographical regions, but generally find that benefits from increased horizontal resolution are most significant on the regional scale.
Are “extreme consumption games” drinking games? Sometimes it's a matter of perspective
Zamboanga, Byron L.; Pearce, Marc W.; Kenney, Shannon R.; Ham, Lindsay S.; Woods, Olivia E.; Borsari, Brian
2013-01-01
Drinking games are widespread on college campuses and pose health risks to their players. Although there has been considerable research progress in the college drinking games literature, there does not appear to be a standard definition of the term “drinking games.” Researchers, however, have attempted to classify and categorize drinking games in a systematic manner. For example, one category of drinking games (e.g., chugging, keg stands) is often referred to as consumption or extreme consumption games. Questions remain as to whether or how these types of games align with researchers' definitions of drinking games or the categorization systems advanced by researchers in the field. Potential challenges regarding the definition and categorization of drinking games, particularly with respect to extreme consumption types of games, are discussed. PMID:23968169
Highly flexible and all-solid-state paperlike polymer supercapacitors.
Meng, Chuizhou; Liu, Changhong; Chen, Luzhuo; Hu, Chunhua; Fan, Shoushan
2010-10-13
In recent years, much effort has been dedicated to achieving thin, lightweight and even flexible energy-storage devices for wearable electronics. Here we demonstrate a novel ultrathin all-solid-state supercapacitor configuration, made by an extremely simple process, using two slightly separated polyaniline-based electrodes well solidified in an H₂SO₄-polyvinyl alcohol gel electrolyte. The thickness of the entire device is comparable to that of a piece of standard commercial A4 printer paper. In its highly flexible (twisted) state, the integrated device shows a high specific capacitance of 350 F/g for the electrode materials, good cycle stability after 1000 cycles, and a leakage current as small as 17.2 μA. Furthermore, owing to its polymer-based structure, it has a specific capacitance as high as 31.4 F/g for the entire device, more than 6 times that of current high-end commercial supercapacitor products. These highly flexible and all-solid-state paperlike polymer supercapacitors may bring new design opportunities for device configurations in future wearable electronics.
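Gravimetric figures like the 350 F/g quoted above are conventionally obtained from galvanostatic charge-discharge curves via the standard relation C = I·Δt/(m·ΔV). The sketch below applies that textbook formula; the discharge current, time, mass and voltage window are illustrative numbers, not measurements from this work.

```python
def specific_capacitance(current_a, discharge_time_s, mass_g, voltage_window_v):
    """Gravimetric capacitance (F/g) from a galvanostatic discharge:
    C = I * dt / (m * dV). Inputs: current in A, time in s, active
    electrode mass in g, voltage window in V."""
    return current_a * discharge_time_s / (mass_g * voltage_window_v)

# Illustrative numbers: a 2 mA discharge lasting 175 s across a 1 V
# window for 1 mg of electrode material gives about 350 F/g.
print(specific_capacitance(2e-3, 175, 1e-3, 1.0))
```

Evaluating the whole device instead of the electrode material alone simply means substituting the total device mass for m, which is why the device-level value (31.4 F/g) is far below the electrode-level one.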
Boros, Emil; Katalin, V-Balogh; Vörös, Lajos; Horváth, Zsófia
2017-01-01
Soda lakes and pans represent saline ecosystems with a unique chemical composition, occurring on all continents. The purpose of this study was to identify and characterise the main environmental gradients and the trophic state that prevail in the soda pans (n = 84) of the Carpathian Basin in Central Europe. Underwater light conditions, dissolved organic matter, phosphorus and chlorophyll a were investigated in 84 pans during 2009-2010. In addition, water temperature was measured hourly with an automatic sensor throughout one year in a selected pan. The pans were very shallow (median depth: 15 cm), and their extremely high turbidity (Secchi depth median: 3 cm, min: 0.5 cm) was caused by high concentrations of inorganic suspended solids (median: 0.4 g L⁻¹, max: 16 g L⁻¹), which were the dominant (>50%) contributor to the vertical attenuation coefficient in 67 pans (80%). All pans were polyhumic (median DOC: 47 mg L⁻¹), and total phosphorus concentration was also extremely high (median: 2 mg L⁻¹, max: 32 mg L⁻¹). The daily water temperature maximum (44 °C) and the maximum daily fluctuation (28 °C) were extremely high during summertime. The combination of environmental boundaries (shallowness, daily water temperature fluctuation, intermittent hydroperiod, high turbidity, polyhumic organic carbon concentration, high alkalinity and hypertrophy) makes these pans a unique extreme aquatic ecosystem.
Polygenic determinants in extremes of high-density lipoprotein cholesterol
Dron, Jacqueline S.; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A.; Robinson, John F.; McIntyre, Adam D.; Ban, Matthew R.; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J.; Lettre, Guillaume; Tardif, Jean-Claude
2017-01-01
HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. PMID:28870971
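A polygenic trait score of the kind bundled into this analysis is, at its core, an effect-size-weighted sum of risk-allele counts, with "extreme" scores defined against a reference cohort's distribution. The sketch below shows that idea; the SNP names, weights and percentile cutoff are hypothetical, not the panel or thresholds used in the study.

```python
def polygenic_score(allele_counts, effect_sizes):
    """Polygenic trait score: effect-size-weighted sum of allele counts.
    allele_counts maps SNP id -> copies of the effect allele (0, 1 or 2);
    effect_sizes maps SNP id -> per-allele effect. Names are illustrative."""
    return sum(effect_sizes[snp] * n for snp, n in allele_counts.items())

def is_extreme(score, reference_scores, pct=10):
    """Flag a score in the bottom or top pct% of a reference cohort."""
    ranked = sorted(reference_scores)
    k = max(1, len(ranked) * pct // 100)
    return score <= ranked[k - 1] or score >= ranked[-k]

# Illustrative 3-SNP panel with arbitrary per-allele effect sizes
weights = {"rsA": 0.40, "rsB": -0.25, "rsC": 0.15}
patient = {"rsA": 2, "rsB": 0, "rsC": 1}   # allele counts per SNP
print(polygenic_score(patient, weights))   # 2*0.40 + 0*(-0.25) + 1*0.15
```

Patients whose score lands in a distribution tail would be the "extreme polygenic trait score" cases described above, as distinct from carriers of rare large-effect variants.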