The Effects of Tools of the Mind on Math and Reading Scores in Kindergarten
ERIC Educational Resources Information Center
Mackay, Patricia E.
2013-01-01
Although a small body of research supports a positive impact of the Tools of the Mind curriculum on the development of self-regulation, evidence for a direct relationship between Tools and academic achievement is extremely limited. The purpose of this study is to evaluate the effectiveness of the Tools of the Mind curriculum…
"Extreme Programming" in a Bioinformatics Class
ERIC Educational Resources Information Center
Kelley, Scott; Alger, Christianna; Deutschman, Douglas
2009-01-01
The importance of Bioinformatics tools and methodology in modern biological research underscores the need for robust and effective courses at the college level. This paper describes such a course designed on the principles of cooperative learning based on a computer software industry production model called "Extreme Programming" (EP).…
NASA Astrophysics Data System (ADS)
Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes powerful new methods, including time-dependent statistics of extremes, quantile regression, and a copula approach, for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach makes it possible to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
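As a hedged illustration of the copula idea mentioned above (not code from the CLIMATE package), the sketch below estimates the empirical upper tail dependence between two variables after a rank transformation; the data and the 0.95 probability level are purely illustrative assumptions.

```python
# Hedged sketch: one way a copula-style analysis links the extremes of two variables
# is through their empirical upper tail dependence, estimated here from
# rank-transformed data with plain NumPy.
import numpy as np

def upper_tail_dependence(x, y, u=0.95):
    """Empirical estimate of P(Y extreme | X extreme) at probability level u."""
    n = len(x)
    # pseudo-observations (empirical copula margins)
    ux = np.argsort(np.argsort(x)) / (n + 1.0)
    uy = np.argsort(np.argsort(y)) / (n + 1.0)
    joint = np.mean((ux > u) & (uy > u))
    return joint / (1.0 - u)

# synthetic example: positively dependent "temperature" and "humidity" series
rng = np.random.default_rng(1)
z = rng.normal(size=5000)
temp = z + 0.5 * rng.normal(size=5000)
hum = z + 0.5 * rng.normal(size=5000)
print(upper_tail_dependence(temp, hum))
```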
NASA Astrophysics Data System (ADS)
Rice, J.; Joyce, L. A.; Armel, B.; Bevenger, G.; Zubic, R.
2011-12-01
Climate change introduces a significant challenge for land managers and decision makers managing the natural resources that provide many benefits from forests. These benefits include water for urban and agricultural uses, wildlife habitat, erosion and climate control, aquifer recharge, stream flow regulation, water temperature regulation, and cultural services such as outdoor recreation and aesthetic enjoyment. The Forest Service has responded to this challenge by developing a national strategy for responding to climate change (the National Roadmap for Responding to Climate Change, July 2010). In concert with this national strategy, the Forest Service's Westwide Climate Initiative has conducted four case studies on individual Forests in the western U.S. to develop climate adaptation tools. Western National Forests are particularly vulnerable to climate change because they have high-mountain topography, diverse climate and vegetation, large areas of water-limited ecosystems, and increasing urbanization. Information about the vulnerability and capacity of resources to adapt to climate change and extremes is lacking. There is an urgent need to provide customized tools and synthesized local-scale information about the impacts to resources from future climate change and extremes, as well as to develop science-based adaptation options and strategies in National Forest management and planning. The case study on the Shoshone National Forest has aligned its objectives with management needs by developing a climate extreme vulnerability tool that guides adaptation options development. The vulnerability tool determines the likely degree to which native Yellowstone cutthroat trout and water availability are susceptible to, or unable to cope with, the adverse effects of climate change extremes. We spatially categorize vulnerability for water and native trout resources using exposure, sensitivity, and adaptive capacity indicators that use minimum and maximum climate and GIS data. Results show that the vulnerability of water availability may increase in areas that have less storage and become more dominated by rain instead of snow. Native trout habitat was found to improve in some areas from warmer temperatures, suggesting that future refugia habitat may need to be a focus of conservation efforts. The climate extreme vulnerability tool provides Forest Service resource managers with science-based information that guides adaptation strategy development, prioritizes conservation projects, guides monitoring efforts, and helps promote more resilient ecosystems undergoing the effects of climate change.
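The abstract does not specify how the exposure, sensitivity, and adaptive capacity indicators are combined, so the sketch below is only a hypothetical illustration of a simple composite vulnerability score over gridded indicators, not the Shoshone tool itself.

```python
# Hypothetical sketch of a spatial vulnerability score built from exposure,
# sensitivity, and adaptive-capacity indicator grids; the actual weighting scheme
# of the Shoshone tool is not described in the abstract.
import numpy as np

def vulnerability(exposure, sensitivity, adaptive_capacity):
    """Combine normalized (0-1) indicator grids into a relative vulnerability grid."""
    # higher exposure/sensitivity raise vulnerability; adaptive capacity lowers it
    score = (exposure + sensitivity) / 2.0 - adaptive_capacity / 2.0
    return np.clip(score, 0.0, 1.0)

# toy 3 x 3 "watershed" grids (values assumed already normalized to 0-1)
exposure = np.array([[0.9, 0.7, 0.4], [0.8, 0.5, 0.3], [0.6, 0.4, 0.2]])
sensitivity = np.array([[0.8, 0.6, 0.5], [0.7, 0.5, 0.4], [0.5, 0.3, 0.2]])
adaptive = np.array([[0.2, 0.3, 0.6], [0.3, 0.5, 0.7], [0.4, 0.6, 0.8]])
print(vulnerability(exposure, sensitivity, adaptive))
```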
High resolution modelling of extreme precipitation events in urban areas
NASA Astrophysics Data System (ADS)
Siemerink, Martijn; Volp, Nicolette; Schuurmans, Wytze; Deckers, Dave
2015-04-01
Present-day society needs to adjust to the effects of climate change. More extreme weather conditions are expected, which can lead to longer periods of drought, but also to more extreme precipitation events. Urban water systems are not designed for such extreme events. Most sewer systems are not able to drain the excessive storm water, causing urban flooding. This leads to high economic damage. In order to take appropriate measures against extreme urban storms, detailed knowledge about the behaviour of the urban water system above and below the streets is required. To investigate the behaviour of urban water systems during extreme precipitation events new assessment tools are necessary. These tools should provide a detailed and integral description of the flow in the full domain of overland runoff, sewer flow, surface water flow and groundwater flow. We developed a new assessment tool, called 3Di, which provides detailed insight in the urban water system. This tool is based on a new numerical methodology that can accurately deal with the interaction between overland runoff, sewer flow and surface water flow. A one-dimensional model for the sewer system and open channel flow is fully coupled to a two-dimensional depth-averaged model that simulates the overland flow. The tool uses a subgrid-based approach in order to take high resolution information of the sewer system and of the terrain into account [1, 2]. The combination of using the high resolution information and the subgrid-based approach results in an accurate and efficient modelling tool. It is now possible to simulate entire urban water systems using extremely high resolution (0.5 m x 0.5 m) terrain data in combination with a detailed sewer and surface water network representation. The new tool has been tested in several Dutch cities, such as Rotterdam, Amsterdam and The Hague. We will present the results of an extreme precipitation event in the city of Schiedam (The Netherlands). This city deals with significant soil consolidation and the low-lying areas are prone to urban flooding. The simulation results are compared with measurements in the sewer network. References: [1] Stelling, G.S., 2012. Quadtree flood simulations with subgrid digital elevation models. Water Management 165 (WM1): 1329-1354. [2] Casulli, V. and Stelling, G.S., 2013. A semi-implicit numerical model for urban drainage systems. International Journal for Numerical Methods in Fluids, Vol. 73: 600-614. DOI: 10.1002/fld.3817
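A minimal sketch of the subgrid idea referenced in [1, 2], assuming 0.5 m DEM pixels inside a coarser computational cell: storage volume and wetted area become nonlinear functions of the cell water level, evaluated from the high-resolution terrain. This illustrates the concept only and is not 3Di code.

```python
# Illustrative sketch (not the 3Di code): in a subgrid approach each coarse
# computational cell stores high-resolution bed levels, so wet area and storage
# volume become nonlinear functions of the water level in that cell.
import numpy as np

def subgrid_volume_and_wet_area(bed_levels, water_level, pixel_area=0.5 * 0.5):
    """Volume [m^3] and wetted area [m^2] of one coarse cell.

    bed_levels : 2D array of fine-resolution bed elevations [m] inside the cell
                 (e.g. 0.5 m x 0.5 m DEM pixels).
    water_level : single water level [m] assumed uniform over the coarse cell.
    """
    depth = np.maximum(water_level - bed_levels, 0.0)  # depth per subgrid pixel
    volume = depth.sum() * pixel_area
    wet_area = np.count_nonzero(depth) * pixel_area
    return volume, wet_area

# Example: a 20 x 20 block of 0.5 m pixels (one 10 m x 10 m computational cell)
rng = np.random.default_rng(0)
bed = rng.normal(loc=1.0, scale=0.3, size=(20, 20))   # hypothetical terrain [m]
print(subgrid_volume_and_wet_area(bed, water_level=1.1))
```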
Behavior Prediction Tools Strengthen Nanoelectronics
NASA Technical Reports Server (NTRS)
2013-01-01
Several years ago, NASA started making plans to send robots to explore the deep, dark craters on the Moon. As part of these plans, NASA needed modeling tools to help engineer unique electronics to withstand extremely cold temperatures. According to Jonathan Pellish, a flight systems test engineer at Goddard Space Flight Center, "An instrument sitting in a shadowed crater on one of the Moon's poles would hover around 43 K", that is, 43 kelvin, equivalent to -382 °F. Such frigid temperatures are one of the main factors that make the extreme space environments encountered on the Moon and elsewhere so extreme. Radiation is another main concern. "Radiation is always present in the space environment," says Pellish. "Small to moderate solar energetic particle events happen regularly, and extreme events happen less than a handful of times throughout the 7 active years of the 11-year solar cycle." Radiation can corrupt data, propagate to other systems, require component power cycling, and cause a host of other harmful effects. In order to explore places like the Moon, Jupiter, Saturn, Venus, and Mars, NASA must use electronic communication devices like transmitters and receivers and data collection devices like infrared cameras that can resist the effects of extreme temperature and radiation; otherwise, the electronics would not be reliable for the duration of the mission.
Spatial extremes modeling applied to extreme precipitation data in the state of Paraná
NASA Astrophysics Data System (ADS)
Olinda, R. A.; Blanchet, J.; dos Santos, C. A. C.; Ozaki, V. A.; Ribeiro, P. J., Jr.
2014-11-01
Most of the mathematical models developed for rare events are based on probabilistic models for extremes. Although the tools for statistical modeling of univariate and multivariate extremes are well developed, their extension to spatial extremes is an area of very active research. A natural approach to such modeling is the theory of spatial extremes and max-stable processes, the infinite-dimensional extension of multivariate extreme value theory, which makes it possible to incorporate the correlation functions used in geostatistics and thus to verify extremal dependence by means of the extremal coefficient and the madogram. This work describes the application of such processes in modeling the spatial dependence of maximum monthly rainfall in the state of Paraná, based on historical series observed at weather stations. The proposed models consider both the Euclidean space and a transformed (climate) space, which may explain the presence of directional effects resulting from synoptic weather patterns. This method is based on the theorem proposed by de Haan and on the models of Smith and Schlather. The isotropic and anisotropic behavior of these models is also verified via Monte Carlo simulation. Estimation is carried out by maximum pairwise likelihood, and the models are compared using the Takeuchi Information Criterion. By modeling the dependence of spatial maxima, applied to maximum monthly rainfall data from the state of Paraná, it was possible to identify directional effects resulting from meteorological phenomena, which, in turn, are important for proper management of risks and environmental disasters in a state whose economy is heavily dependent on agribusiness.
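For readers unfamiliar with the madogram, the sketch below computes an empirical F-madogram and the corresponding pairwise extremal coefficient (values near 1 indicate strong dependence of the maxima, values near 2 indicate independence); the rainfall series here are synthetic stand-ins, not the Paraná data.

```python
# Sketch of the F-madogram estimator used to check extremal dependence between two
# sites; assumes two aligned series of block maxima (synthetic in this example).
import numpy as np

def extremal_coefficient(z1, z2):
    """Pairwise extremal coefficient from the empirical F-madogram.

    theta ~ 1 means complete dependence of the maxima; theta ~ 2 means independence.
    """
    n = len(z1)
    f1 = np.argsort(np.argsort(z1)) / (n + 1.0)   # empirical CDF values F(z1)
    f2 = np.argsort(np.argsort(z2)) / (n + 1.0)
    nu = 0.5 * np.mean(np.abs(f1 - f2))           # F-madogram
    return (1.0 + 2.0 * nu) / (1.0 - 2.0 * nu)

# toy example: monthly rainfall maxima at two nearby, dependent stations
rng = np.random.default_rng(2)
common = rng.gumbel(size=300)
print(extremal_coefficient(common + 0.3 * rng.gumbel(size=300),
                           common + 0.3 * rng.gumbel(size=300)))
```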
Strategic thinking for radiology.
Schilling, R B
1997-08-01
We have now analyzed the use and benefits of four Strategic Thinking Tools for Radiology: the Vision Statement, the High Five, the Two-by-Two, and Real-Win-Worth. Additional tools will be provided during the tutorial. The tools provided above should be considered as examples. They all contain the 10 benefits outlined earlier to varying degrees. It is extremely important that the tools be used in a manner consistent with the Vision Statement of the organization. The specific situation, the effectiveness of the team, and the experience developed with the tools over time will determine the true benefits of the process. It has also been shown that with active use of the types of tools provided above, teams have learned to modify the tools for increased effectiveness and have created additional tools for specific purposes. Once individuals in the organization become committed to improving communication and to using tools/frameworks for solving problems as a team, effectiveness becomes boundless.
Easy-To-Use Connector-Assembly Tool
NASA Technical Reports Server (NTRS)
Redmon, John W., Jr.; Jankowski, Fred
1988-01-01
Tool compensates for user's loss of dexterity under awkward conditions. Has jaws that swivel over 180 degrees so angle adjusts with respect to handles. Oriented and held in position most comfortable and effective for user in given situation. Jaws lined with rubber pads so they conform to irregularly shaped parts and grip firmly but gently. Once tool engages part, it locks on it so user can release handles without losing part. Ratchet mechanism in tool allows user to work handles back and forth in confined space to connect or disconnect part. Quickly positioned, locked, and released. Gives user feel of its grip on part. Frees grasping muscles from work during part of task, giving user greater freedom to move hand. Operates with only one hand, leaving user's other hand free to manipulate wiring or other parts. Also adapts to handling and positioning extremely hot or extremely cold fluid lines, contaminated objects, abrasive or sharp objects, fragile items, and soft objects.
NASA Astrophysics Data System (ADS)
Cooley, D. S.; Castillo, F.; Thibaud, E.
2017-12-01
A 2015 heatwave in Pakistan is blamed for over a thousand deaths. This event consisted of several days of very high temperatures and unusually high humidity for this region. However, none of these days exceeded the threshold for "extreme danger" in terms of the heat index. The heat index is a univariate function of both temperature and humidity which is universally applied at all locations regardless of local climate. Understanding extremes which arise from multiple factors is challenging. In this paper we will present a tool for examining bivariate extreme behavior. The tool, developed in the statistical software R, draws isolines of equal exceedance probability. These isolines can be understood as bivariate "return levels". The tool is based on a dependence framework specific for extremes, is semiparametric, and is able to extrapolate isolines beyond the range of the data. We illustrate this tool using the Pakistan heat wave data and other bivariate data.
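A simplified illustration of the idea of isolines of equal joint exceedance probability, using a purely empirical estimate on synthetic temperature-humidity data; the authors' R tool is semiparametric and can extrapolate beyond the range of the data, which this sketch does not attempt.

```python
# Illustrative sketch (not the authors' R tool): empirical isolines of equal joint
# exceedance probability for two variables, here hypothetical daily temperature and
# humidity, drawn with matplotlib's contour routine.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)
z = rng.normal(size=3000)
temp = 35 + 4 * (z + 0.6 * rng.normal(size=3000))     # synthetic temperature [C]
hum = 50 + 15 * (z + 0.6 * rng.normal(size=3000))     # synthetic relative humidity [%]

# grid of candidate (temperature, humidity) pairs
tg = np.linspace(temp.min(), temp.max(), 100)
hg = np.linspace(hum.min(), hum.max(), 100)
T, H = np.meshgrid(tg, hg)

# empirical joint exceedance probability P(temp >= t, hum >= h) on the grid
prob = np.array([[np.mean((temp >= t) & (hum >= h)) for t in tg] for h in hg])

cs = plt.contour(T, H, prob, levels=[0.05, 0.01, 0.002])   # bivariate "return level" isolines
plt.clabel(cs)
plt.xlabel("temperature")
plt.ylabel("relative humidity")
plt.show()
```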
The parser generator as a general purpose tool
NASA Technical Reports Server (NTRS)
Noonan, R. E.; Collins, W. R.
1985-01-01
The parser generator has proven to be an extremely useful, general purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas for which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
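As a small illustration of grammar-first parsing (hand-coded recursive descent rather than the table-driven parser generator described in the paper), the sketch below parses a toy interactive query language; the grammar and field names are invented for the example.

```python
# Small illustration of starting from a grammar rather than parsing theory: a
# recursive-descent parser for a toy query language. (The paper describes a
# table-driven parser generator; this hand-coded sketch only shows the idea.)
#
# Grammar:
#   query -> "show" FIELD "where" FIELD OP NUMBER
#   OP    -> ">" | "<" | "="
import re

TOKEN = re.compile(r"\s*(show|where|[<>=]|\w+)")

def tokenize(text):
    return TOKEN.findall(text)

def parse_query(text):
    toks = tokenize(text)
    def expect(pred, what):
        if not toks or not pred(toks[0]):
            raise SyntaxError(f"expected {what}, got {toks[:1]}")
        return toks.pop(0)
    expect(lambda t: t == "show", "'show'")
    field = expect(str.isidentifier, "field name")
    expect(lambda t: t == "where", "'where'")
    cond_field = expect(str.isidentifier, "field name")
    op = expect(lambda t: t in "<>=", "comparison operator")
    value = expect(str.isdigit, "number")
    return {"select": field, "filter": (cond_field, op, int(value))}

print(parse_query("show name where age > 30"))
```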
HISTOPATHOLOGICAL BIOMARKERS AS INTEGRATORS OF CHEMICAL CONTAMINANT EXPOSURE AND EFFECTS IN FISH
Histopathology can be an extremely useful tool for assessing effects of chemical exposure in fish at the level of the individual. Although somewhat qualitative, the histopathological approach is especially valuable because observed lesions represent an integration of cumulative e...
NASA Astrophysics Data System (ADS)
Wu, Yanling
2018-05-01
In this paper, extreme waves were generated using the open-source computational fluid dynamics (CFD) tools OpenFOAM and Waves2FOAM, with linear and nonlinear NewWave input, and were used to conduct numerical simulation of the wave impact process. Numerical tools based on first-order (with and without stretching) and second-order NewWave input are investigated. A simulation to predict force loading on an offshore platform under extreme weather conditions is implemented and the results are compared.
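A hedged sketch of the first-order (linear) NewWave input mentioned above: component amplitudes proportional to the discretized spectrum, phased to focus at a chosen time. The spectral shape and crest amplitude below are placeholders, not the sea state used in the paper.

```python
# Hedged sketch of a first-order (linear) NewWave focused wave group: the free-surface
# elevation is a sum of spectral components whose amplitudes are proportional to the
# discretized spectrum, all in phase at the focus time t0. The spectrum values here
# are placeholders.
import numpy as np

def newwave_elevation(t, freqs, spectrum, crest_amplitude, t0=0.0):
    """Linear NewWave surface elevation at the focus location."""
    d_f = freqs[1] - freqs[0]                      # frequency spacing [Hz]
    m0 = np.sum(spectrum * d_f)                    # variance of the sea state
    amps = crest_amplitude * spectrum * d_f / m0   # component amplitudes
    omega = 2.0 * np.pi * freqs
    return np.sum(amps[:, None] * np.cos(omega[:, None] * (t[None, :] - t0)), axis=0)

freqs = np.linspace(0.05, 0.30, 50)                       # [Hz]
spectrum = np.exp(-0.5 * ((freqs - 0.1) / 0.02) ** 2)     # placeholder spectral shape
t = np.linspace(-60.0, 60.0, 1201)                        # [s]
eta = newwave_elevation(t, freqs, spectrum, crest_amplitude=8.0)
print(eta.max())   # close to the specified crest amplitude at t = t0
```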
2017-10-01
USER GUIDE: Screening-Level Feasibility Assessment and Design Tool in Support of 1,4-Dioxane Remediation by Extreme Soil Vapor Extraction (XSVE). ESTCP Project ER-201326, October 2017. Rob Hinchee, Integrated Science... Technology, Inc., 1509 Coastal Highway, Panacea, FL 32346. Project period: 8/8/2013 - 8/8/2018.
Chhetri, Bimal K; Takaro, Tim K; Balshaw, Robert; Otterstatter, Michael; Mak, Sunny; Lem, Marcus; Zubel, Marc; Lysyshyn, Mark; Clarkson, Len; Edwards, Joanne; Fleury, Manon D; Henderson, Sarah B; Galanis, Eleni
2017-10-01
Drinking water related infections are expected to increase in the future due to climate change. Understanding the current links between these infections and environmental factors is vital to understand and reduce the future burden of illness. We investigated the relationship between weekly reported cryptosporidiosis and giardiasis (n = 7,422), extreme precipitation (>90th percentile), drinking water turbidity, and preceding dry periods in a drinking water system located in greater Vancouver, British Columbia, Canada (1997-2009) using distributed lag non-linear Poisson regression models adjusted for seasonality, secular trend, and the effect of holidays on reporting. We found a significant increase in cryptosporidiosis and giardiasis 4-6 weeks after extreme precipitation. The effect was greater following a dry period. Similarly, extreme precipitation led to significantly increased turbidity only after prolonged dry periods. Our results suggest that the risk of cryptosporidiosis and giardiasis increases with extreme precipitation, and that the effects are more pronounced after a prolonged dry period. Given that extreme precipitation events are expected to increase with climate change, it is important to further understand the risks from these events, develop planning tools, and build resilience to these future risks.
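The core statistical idea, stripped of the distributed lag non-linear machinery and the seasonality and trend adjustments used in the study, can be sketched as a Poisson regression of weekly counts on lagged extreme-precipitation indicators; all data below are synthetic.

```python
# Simplified sketch of relating weekly case counts to lagged extreme-precipitation
# indicators with a Poisson GLM (statsmodels). The study used distributed lag
# non-linear models adjusted for seasonality and trend; this shows only the core idea.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(4)
n_weeks = 500
precip = rng.gamma(shape=2.0, scale=10.0, size=n_weeks)
extreme = (precip > np.quantile(precip, 0.90)).astype(float)   # >90th percentile weeks

df = pd.DataFrame({"cases": np.nan, "extreme": extreme})
for lag in (4, 5, 6):                                          # lags of interest (weeks)
    df[f"extreme_lag{lag}"] = df["extreme"].shift(lag).fillna(0.0)

# synthetic outcome: baseline rate plus an effect of extreme rain 4-6 weeks earlier
rate = np.exp(1.0 + 0.4 * df[["extreme_lag4", "extreme_lag5", "extreme_lag6"]].sum(axis=1))
df["cases"] = rng.poisson(rate)

X = sm.add_constant(df[["extreme_lag4", "extreme_lag5", "extreme_lag6"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson()).fit()
print(fit.summary())
```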
YouTube Video as Health Literacy Tool: A Test of Body Image Campaign Effectiveness.
Meng, Juan; Bissell, Kim L; Pan, Po-Lin
2015-01-01
This study examined the effectiveness of four media campaigns about disordered eating behaviors. It investigated possible factors that affected females' perceived effectiveness of the four campaign videos. Results indicated that the health campaign about a celebrity's struggle with extreme thinness proved to be the least effective of the four campaign videos, whereas the video presenting solid facts about the dangers of extreme dieting was perceived as the most effective campaign. Self-discrepancy was not a significant predictor of females' perceived effectiveness of the campaign videos. Similarly, the frequency of Internet usage proved to be a weak predictor of their perceived effectiveness. These findings and the possible rationale for the lack of support with regard to the correlates of campaign effectiveness are also discussed.
Takacs, Judit; Leiter, Jeff R S; Peeler, Jason D
2011-06-01
Lower extremity fractures, if not treated appropriately, can increase the risk of morbidity. Partial weight-bearing after surgical repair is recommended; however, current methods of partial weight-bearing may cause excessive loads through the lower extremity. A new rehabilitation tool that uses lower body positive pressure is described; it may allow partial weight-bearing while preventing excessive loads, thereby improving functional outcomes. A patient with multiple lower extremity fractures underwent a 6-month rehabilitation programme using bodyweight support technology 3 times per week, post-surgery. The patient experienced a reduction in pain and an improvement in ankle range of motion (p=0.002), walking speed (p>0.05) and physical function (p=0.004), as assessed by the Foot and Ankle Module of the American Academy of Orthopaedic Surgeons Lower Limb Outcomes Assessment Instrument. Training did not appear to affect fracture healing, as was evident on radiograph. The effect of lower body positive pressure on effusion, which has not previously been reported in the literature, was also investigated. No significant difference in effusion of the foot and ankle when using lower body positive pressure was found. Initial results suggest that this new technology may be a useful rehabilitation tool that allows partial weight-bearing during the treatment of lower extremity injuries.
Yasuda, Tomohiro; Fukumura, Kazuya; Nakajima, Toshiaki
2017-04-01
[Purpose] To examine whether Short Physical Performance Battery (SPPB) scores are higher in healthy subjects than in outpatients, and higher in outpatients than in inpatients, and whether the SPPB can be a valid assessment tool relative to strength tests and lower extremity morphological evaluation in cardiovascular disease patients. [Subjects and Methods] Twenty-four middle-aged and older adults with cardiovascular disease were recruited from inpatient and outpatient facilities and assigned to separate experimental groups. Twelve age-matched healthy volunteers were assigned to a control group. The SPPB test was used to assess balance and functional mobility. The test outcomes were compared with level of care (inpatient vs. outpatient), physical characteristics, strength, and lower extremity morphology. [Results] Total SPPB scores, strength tests (knee extensor muscle strength), and lower extremity morphological evaluation (muscle thickness of the anterior and posterior mid-thigh and posterior lower leg) were greater in the healthy subject and outpatient groups than in inpatients. To predict total SPPB scores, predicted knee extension and anterior mid-thigh muscle thickness were calculated. [Conclusion] The SPPB is an effective surrogate for strength tests and lower extremity morphological evaluation in middle-aged and older adult cardiovascular disease patients. Notably, high knee extensor muscle strength and quadriceps femoris muscle thickness are positively associated with high SPPB scores.
2013-10-01
study will recruit wounded warriors with severe extremity trauma, which places them at high risk for heterotopic ossification (HO); bone formation at...involved in HO; 2) to define accurate and practical methods to predict where HO will develop; and 3) to define potential therapies for prevention or...elicit HO. These tools also need to provide effective methods for early diagnosis or risk assessment (prediction) so that therapies for prevention or
1979-11-30
the detection and analysis of this wear is extremely important. In this study, it was determined that ferrography is an effective tool for this...dealt with the practical applications of ferrography to fluid power systems. The first two phases were investigations of the life improvements of...damning evidence that ferrography is not the beneficial tool it was originally thought to be. However, a further analysis of the entire program and the
Air pollution as it affects orchids at the New York Botanical Garden
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adderley, L.
A general discussion of the effects of air pollution on orchids is presented, along with ameliorative measures. One orchid, Dendrobium Phalaenopsis, is suggested as an air pollution bioassay tool, in that it is extremely sensitive to air pollution.
HISTOPATHOLOGICAL BIOMARKERS AS INTEGRATORS OF ANTHROPOGENIC AND ENVIRONMENTAL STRESSORS
Histopathology is an extremely useful tool for assessing effects of exposure to stressors at the level of the individual. Even though the histopathological approach is somewhat qualitative, it is very valuable because the observed lesions represent an integration of cumulative e...
Avila, M L; Brandão, L R; Williams, S; Ward, L C; Montoya, M I; Stinson, J; Kiss, A; Lara-Corrales, I; Feldman, B M
2016-08-01
Our goal was to conduct the item generation and piloting phases of a new discriminative and evaluative tool for pediatric post-thrombotic syndrome. We followed a formative model for the development of the tool, focusing on the signs/symptoms (items) that define post-thrombotic syndrome. For item generation, pediatric thrombosis experts and subjects diagnosed with extremity post-thrombotic syndrome during childhood nominated items. In the piloting phase, items were cross-sectionally measured in children with limb deep vein thrombosis to examine item performance. Twenty-three experts and 16 subjects listed 34 items, which were then measured in 140 subjects with previous diagnosis of limb deep vein thrombosis (70 upper extremity and 70 lower extremity). The items with strongest correlation with post-thrombotic syndrome severity and largest area under the curve were pain (in older children), paresthesia, and swollen limb for the upper extremity group, and pain (in older children), tired limb, heaviness, tightness and paresthesia for the lower extremity group. The diagnostic properties of the items and their correlations with post-thrombotic syndrome severity varied according to the assessed venous territory. The information gathered in this study will help experts decide which item should be considered for inclusion in the new tool. Copyright © 2016 Elsevier Ltd. All rights reserved.
Cheng, Hsin-Yi Kathy; Lien, Yueh-Ju; Yu, Yu-Chun; Ju, Yan-Ying; Pei, Yu-Cheng; Cheng, Chih-Hsiu; Wu, David Bin-Chia
2013-04-01
A high percentage of children with cerebral palsy (CP) have difficulty keeping up with the handwriting demands at school. Previous studies have addressed the effects of proper sitting and writing tools on writing performance, but less so on body biomechanics. The aim of this study was to investigate the influence of lower body stabilization and pencil design on body biomechanics in children with CP. Fourteen children (12.31±4.13 years old) with CP were recruited for this study. A crossover repeated measures design was employed, with two independent variables: lower body stabilization (with/without) and pencil (regular/assigned grip height/biaxial). The writing task was to trace Archimedean spiral mazes. Electromyography (EMG) of the upper extremity, wrist flexion/extension movements, and whole-body photography were recorded to quantify the changes in posture and upper extremity biomechanics. Two-way repeated measures ANOVA was used for statistical analysis. No significant main effects were revealed in the EMG and wrist kinematics. The lower body stabilization significantly decreased the trunk lateral and forward deviations, and the visual focus-vertical angle. The biaxial pencil and the assigned grip height design significantly decreased the head, shoulder, trunk, and pelvic deviations compared with the regular design. The results indicated that the lower body positioning was effective in improving the trunk posture. A pencil with an assigned grip height or with a biaxial design could improve head, shoulder, trunk and pelvic alignment, but did not influence the muscle exertion of the upper extremity. This study could provide guidelines for parents, teachers and clinicians regarding the selection of writing tools and the knowledge of proper positioning for children with handwriting difficulties. Further analyses can focus on the design, modification and clinical application of assistive sitting and writing devices for use in children with handwriting difficulties. Copyright © 2013 Elsevier Ltd. All rights reserved.
A European Solution to Islamic Extremism in Western Europe
2006-04-14
Physically destroying terrorist organizations (“direct action”) is an effective tool. Where freedom of action and freedom of movement exist, there... effectiveness and sometimes duplicate effort. This paper will explain the growing Islamic extremist threat in Western Europe and present a case for why that...native-born youth franchise al-Qa’ida and execute a terrorist attack that effects a change in government. Terrorists executed a planned and deliberate
Demonstration of a Catalytic Converter Using a Lawn Mower Engine
ERIC Educational Resources Information Center
Young, Mark A.
2010-01-01
Catalytic conversion is an important tool in environmental-remediation strategies and source removal of pollutants. Because a catalyst is regenerated, the chemistry can be extremely effective for conversion of undesirable pollutant species to less harmful products in situations where the pollutants have accumulated or are being continuously…
Correlation between cervical vertebral and dental maturity in Iranian subjects.
Heravi, Farzin; Imanimoghaddam, Mahrokh; Rahimi, Hoda
2011-12-01
Determination of the skeletal maturation is extremely important in clinical orthodontics. Cervical vertebral maturation is an effective diagnostic tool for determining the adolescent growth spurt. The aim of this study was to investigate the correlation between the stages of calcification of teeth and the cervical vertebral maturity stages.
An individual-based modeling approach to simulating recreation use in wilderness settings
Randy Gimblett; Terry Daniel; Michael J. Meitner
2000-01-01
Landscapes protect biological diversity and provide unique opportunities for human-nature interactions. Too often, these desirable settings suffer from extremely high visitation. Given the complexity of social, environmental and economic interactions, resource managers need tools that provide insights into the cause and effect relationships between management actions...
NASA Astrophysics Data System (ADS)
Scoccimarro, Enrico; Fogli, Pier Giuseppe; Gualdi, Silvio
2017-04-01
It is well known that an increase of temperature over Europe, both in terms of averages and extremes, is expected within the current century. In order to consider health impacts under warm conditions, it is important to take into account the combined effect of temperature and humidity on the human body. To this aim, a basic index representative of the perceived temperature, the humidex, has been investigated in this study under different scenarios and periods. A very low concomitance of extreme temperature events and extreme humidex events is found in the present climate, reinforcing the importance of investigating not only extreme temperature and relative humidity future projections but also the combination of the two parameters. A set of 10-km resolution regional climate simulations provided within the EUR-11 EURO-CORDEX multi-model effort demonstrates the ability to represent the intense and extreme events of the humidex over the present climate and is eligible as a tool to quantify future changes in the geographical patterns of exposed areas over Europe. An enlargement of the domain subject to dangerous conditions is found from the middle of the current century onward, reaching 60 degrees North when considering the most extreme events. The most significant increase in humidex extreme events is found when comparing the 2066-2095 projections under the RCP8.5 scenario to the 1966-2005 period: bearing in mind that changes in relative humidity may either amplify or offset the health effects of temperature extremes, a less pronounced projected reduction of relative humidity intensity in the northern part of the European domain, associated with extreme temperature and humidex, makes Northern Europe the region most prone to a local increase of humidex extremes.
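For reference, one common formulation of the humidex (from Environment and Climate Change Canada) combines air temperature with a vapour-pressure term derived from the dew point; the paper may use a slightly different formulation, so treat this as an illustrative sketch.

```python
# One common formulation of the humidex: humidex = T_air + 0.5555 * (e - 10), with the
# vapour pressure e [hPa] computed from the dew-point temperature. Shown as a hedged
# reference implementation only.
import math

def humidex(t_air_c, t_dew_c):
    """Humidex from air temperature and dew point, both in degrees Celsius."""
    e = 6.11 * math.exp(5417.7530 * (1.0 / 273.16 - 1.0 / (273.15 + t_dew_c)))
    return t_air_c + 0.5555 * (e - 10.0)

print(round(humidex(30.0, 25.0), 1))   # a hot, humid day: humidex around 42
```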
Building Flexible User Interfaces for Solving PDEs
NASA Astrophysics Data System (ADS)
Logg, Anders; Wells, Garth N.
2010-09-01
FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.
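To make concrete what is being automated, the sketch below applies Newton's method to a small finite-difference discretization of a nonlinear Poisson-type problem in plain NumPy; it illustrates the algorithmic work that FEniCS performs automatically and deliberately does not use the FEniCS API.

```python
# Conceptual sketch of the kind of nonlinear solve FEniCS automates: Newton's method
# for a finite-difference discretization of -u'' + u^3 = f on (0, 1) with u(0)=u(1)=0.
# Plain NumPy, not the FEniCS API.
import numpy as np

n = 99                       # interior grid points
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
f = 10.0 * np.sin(np.pi * x)

# second-difference operator (homogeneous Dirichlet boundary conditions)
A = (np.diag(np.full(n, 2.0)) - np.diag(np.ones(n - 1), 1) - np.diag(np.ones(n - 1), -1)) / h**2

u = np.zeros(n)
for it in range(20):
    residual = A @ u + u**3 - f
    jacobian = A + np.diag(3.0 * u**2)      # derivative of the nonlinear term
    du = np.linalg.solve(jacobian, -residual)
    u += du
    if np.linalg.norm(du) < 1e-10:
        break

print(it, np.linalg.norm(A @ u + u**3 - f))   # converges in a handful of iterations
```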
Trukhmanov, I M; Suslova, G A; Ponomarenko, G N
This paper is devoted to characterizing the informative value of the functional step test with the application of heel cushions in children for the purpose of differential diagnostics of anatomic and functional differences in the length of the lower extremities. A total of 85 schoolchildren with different lengths of the lower extremities were examined. A comparative evaluation of the results of clinical and instrumental examinations was undertaken. The data obtained with the help of the functional step test give evidence of its very high sensitivity, specificity, and clinical significance as a tool for the examination of children with different lengths of the lower extremities. It is concluded that the test is one of the most informative predictors of the effectiveness of rehabilitation in children with different lengths of the lower extremities.
Telescience - Concepts And Contributions To The Extreme Ultraviolet Explorer Mission
NASA Astrophysics Data System (ADS)
Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.
1987-10-01
A goal of the telescience concept is to allow scientists to use remotely located instruments as they would in their laboratory. Another goal is to increase the reliability and scientific return of these instruments. In this paper we discuss the role of transparent software tools in development, integration, and postlaunch environments to achieve hands-on access to the instrument. The use of transparent tools helps to reduce the parallel development of capability and to assure that valuable pre-launch experience is not lost in the operations phase. We also discuss the use of simulation as a rapid prototyping technique. Rapid prototyping provides a cost-effective means of using an iterative approach to instrument design. By allowing inexpensive production of testbeds, scientists can quickly tune the instrument to produce the desired scientific data. Using portions of the Extreme Ultraviolet Explorer (EUVE) system, we examine some of the results of preliminary tests in the use of simulation and transparent tools. Additionally, we discuss our efforts to upgrade our software "EUVE electronics" simulator to emulate a full instrument, and give the pros and cons of the simulation facilities we have developed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Chen; Wang, Jianhui; Ton, Dan
Recent severe power outages caused by extreme weather hazards have highlighted the importance and urgency of improving the resilience of the electric power grid. As the distribution grids still remain vulnerable to natural disasters, the power industry has focused on methods of restoring distribution systems after disasters in an effective and quick manner. The current distribution system restoration practice for utilities is mainly based on predetermined priorities and tends to be inefficient and suboptimal, and the lack of situational awareness after the hazard significantly delays the restoration process. As a result, customers may experience an extended blackout, which causes large economic loss. On the other hand, the emerging advanced devices and technologies enabled through grid modernization efforts have the potential to improve the distribution system restoration strategy. However, utilizing these resources to aid the utilities in better distribution system restoration decision-making in response to extreme weather events is a challenging task. Therefore, this paper proposes an integrated solution: a distribution system restoration decision support tool designed by leveraging resources developed for grid modernization. We first review the current distribution restoration practice and discuss why it is inadequate in response to extreme weather events. Then we describe how the grid modernization efforts could benefit distribution system restoration, and we propose an integrated solution in the form of a decision support tool to achieve the goal. The advantages of the solution include improving situational awareness of the system damage status and facilitating survivability for customers. The paper provides a comprehensive review of how the existing methodologies in the literature could be leveraged to achieve the key advantages. The benefits of the developed system restoration decision support tool include the optimal and efficient allocation of repair crews and resources, the expediting of the restoration process, and the reduction of outage durations for customers, in response to severe blackouts due to extreme weather hazards.
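As a purely hypothetical illustration of one ingredient of such a tool (not the method proposed in the paper), the sketch below ranks damaged components by restored load per hour of crew time, a simple greedy proxy for the much richer optimization a real restoration decision support tool would solve.

```python
# Hypothetical illustration (not the paper's algorithm): a greedy rule for sequencing
# repairs after an outage, ranking damaged components by restored load per hour of
# crew time. All component names and numbers are invented for the example.
from dataclasses import dataclass

@dataclass
class Damage:
    component: str
    load_restored_kw: float   # customer load that comes back once repaired
    repair_hours: float       # estimated crew time

def greedy_schedule(damages):
    """Order repairs by load restored per hour of repair effort (highest first)."""
    return sorted(damages, key=lambda d: d.load_restored_kw / d.repair_hours, reverse=True)

damaged = [
    Damage("feeder_12", load_restored_kw=4000, repair_hours=8),
    Damage("lateral_7", load_restored_kw=600, repair_hours=1),
    Damage("substation_tx", load_restored_kw=9000, repair_hours=36),
]
for d in greedy_schedule(damaged):
    print(d.component, round(d.load_restored_kw / d.repair_hours, 1), "kW/h")
```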
Evaluation of extreme temperature events in northern Spain based on process control charts
NASA Astrophysics Data System (ADS)
Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.
2018-02-01
Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
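A minimal sketch of an attribute (p) control chart for the annual fraction of extreme days, using the standard binomial limits; the paper's charts additionally model the autocorrelation of extreme days with a Markov-extended binomial process, which this simplified version omits, and the counts below are synthetic.

```python
# Sketch of an attribute (p) control chart for the annual fraction of extreme days,
# with limits from the binomial approximation p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n).
import numpy as np

rng = np.random.default_rng(5)
years = np.arange(1931, 2010)
n_days = 365
# synthetic count of "extreme" days per year, with a weak upward trend
counts = rng.binomial(n_days, 0.05 + 0.0003 * (years - years[0]))
p = counts / n_days

p_bar = p.mean()
sigma = np.sqrt(p_bar * (1 - p_bar) / n_days)
ucl, lcl = p_bar + 3 * sigma, max(p_bar - 3 * sigma, 0.0)

out_of_control = years[(p > ucl) | (p < lcl)]
print(f"centre={p_bar:.3f}, UCL={ucl:.3f}, LCL={lcl:.3f}, signal years: {out_of_control}")
```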
Faust, Kyle; Faust, David
2015-08-12
Problematic or addictive digital gaming (including all types of electronic devices) can have, and has had, extremely adverse impacts on the lives of many individuals across the world. The understanding of this phenomenon, and the effectiveness of treatment design and monitoring, can be improved considerably by continuing refinement of assessment tools. The present article briefly overviews tools designed to measure problematic or addictive use of digital gaming, the vast majority of which are founded on the Diagnostic and Statistical Manual of Mental Disorders (DSM) criteria for other addictive disorders, such as pathological gambling. Although adapting DSM content and strategies for measuring problematic digital gaming has proven valuable, there are some potential issues with this approach. We discuss the strengths and limitations of current methods for measuring problematic or addictive gaming and provide various recommendations that might help in enhancing or supplementing existing tools, or in developing new and even more effective tools.
2013-06-01
realistically representing the world in a simulation environment. A screenshot of the combat model used for this research is shown below. There are six...changes in use of technology (Ryan & Jons, 1992). Cost effectiveness and operational effectiveness are important, and it is extremely hard to achieve...effectiveness of ships using simulation and analytical models, to create a ship synthesis model, and most importantly, to develop decision making tools
Physical Exam Risk Factors for Lower Extremity Injury in High School Athletes: A Systematic Review
Onate, James A.; Everhart, Joshua S.; Clifton, Daniel R.; Best, Thomas M.; Borchers, James R.; Chaudhari, Ajit M.W.
2016-01-01
Objective: A stated goal of the preparticipation physical evaluation (PPE) is to reduce musculoskeletal injury, yet the musculoskeletal portion of the PPE is reportedly of questionable use in assessing lower extremity injury risk in high school-aged athletes. The objectives of this study are: (1) identify clinical assessment tools demonstrated to effectively determine lower extremity injury risk in a prospective setting, and (2) critically assess the methodological quality of prospective lower extremity risk assessment studies that use these tools. Data Sources: A systematic search was performed in PubMed, CINAHL, UptoDate, Google Scholar, Cochrane Reviews, and SportDiscus. Inclusion criteria were prospective injury risk assessment studies involving athletes primarily ages 13 to 19 that used screening methods that did not require highly specialized equipment. Methodological quality was evaluated with a modified physiotherapy evidence database (PEDro) scale. Main Results: Nine studies were included. The mean modified PEDro score was 6.0/10 (SD, 1.5). Multidirectional balance (odds ratio [OR], 3.0; CI, 1.5–6.1; P < 0.05) and physical maturation status (P < 0.05) were predictive of overall injury risk, knee hyperextension was predictive of anterior cruciate ligament injury (OR, 5.0; CI, 1.2–18.4; P < 0.05), hip external:internal rotator strength ratio of patellofemoral pain syndrome (P = 0.02), and foot posture index of ankle sprain (r = −0.339, P = 0.008). Conclusions: Minimal prospective evidence supports or refutes the use of the functional musculoskeletal exam portion of the current PPE to assess lower extremity injury risk in high school athletes. Limited evidence does support inclusion of multidirectional balance assessment and physical maturation status in a musculoskeletal exam as both are generalizable risk factors for lower extremity injury. PMID:26978166
NASA Astrophysics Data System (ADS)
Abdel-Aal, H. A.; Mansori, M. El
2012-12-01
Cutting tools are subject to extreme thermal and mechanical loads during operation. The state of loading is intensified in a dry cutting environment, especially when cutting so-called hard-to-cut materials. Although the effect of mechanical loads on tool failure has been extensively studied, detailed studies on the effect of thermal dissipation on the deterioration of the cutting tool are rather scarce. In this paper we study the failure of coated carbide tools due to thermal loading. The study emphasizes the role assumed by the thermo-physical properties of the tool material in enhancing or preventing mass attrition of the cutting elements within the tool. It is shown that, within a comprehensive view of the nature of conduction in the tool zone, thermal conduction is not solely affected by temperature. Rather, it is a function of the so-called thermodynamic forces: the stress, the strain, the strain rate, the rate of temperature rise, and the temperature gradient. Although the resulting description of thermal conduction is non-linear, it is beneficial to employ such a form because it facilitates a full mechanistic understanding of the thermal activation of tool wear.
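A compact way to write this view of conduction (a sketch of the verbal description above, not the paper's exact formulation) is a Fourier-type law whose conductivity depends on the full set of thermodynamic forces:

```latex
\mathbf{q} = -\,k\!\left(\sigma,\ \varepsilon,\ \dot{\varepsilon},\ \frac{\partial T}{\partial t},\ \nabla T\right)\nabla T
```

Heat flux remains proportional to the temperature gradient, but the proportionality is no longer a material property depending on temperature alone.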
Mingliang Liu; Michael E. Barber; Keith A. Cherkauer; Pete Robichaud; Jennifer C. Adam
2016-01-01
Increases in wildfire occurrence and severity under an altered climate can substantially impact terrestrial ecosystems through enhancing runoff erosion. Improved prediction tools that provide high resolution spatial information are necessary for location-specific soil conservation and watershed management. However, quantifying the magnitude of soil erosion and...
Pest Control For Container-Grown Longleaf Pine
Scott Enebak; Bill Carey
2002-01-01
Several insect, weed, and disease pests are discussed that have been observed affecting container-grown longleaf pine (Pinus palustris Mill.) seedlings. The available tools to minimize the effects of these pests are limited to a few select insecticides, herbicides, and fungicides. Extreme care should be taken to ensure that the chemical chosen is...
Siaw-Sakyi, Vincent
2017-12-01
Wound infection is proving to be a challenge for health care professionals. The associated complications and cost of wound infection are immense and can lead to death in extreme cases. Current management of wound infection is largely subjective and relies on the knowledge of the health care professional to identify and initiate treatment. In response, we have developed an infection prediction and assessment tool. The Wound Infection Risk-Assessment and Evaluation tool (WIRE) and its management strategy aim to bring objectivity to infection prediction, assessment and management. A local audit carried out indicated a high infection prediction rate. More work is being done to improve its effectiveness.
Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package
NASA Astrophysics Data System (ADS)
Cheng, L.; AghaKouchak, A.; Gilleland, E.
2013-12-01
Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and non-stationary assumptions. The Nonstationary Extreme Value Analysis (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and which has the advantages of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also offers confidence intervals and uncertainty bounds of estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate the analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users from across different fields. Both stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
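The modelling idea behind a nonstationary analysis can be sketched with a GEV distribution whose location parameter drifts linearly in time, fitted here by straightforward maximum likelihood with SciPy; NEVA itself uses Bayesian DE-MC sampling rather than this simple fit, and the annual maxima below are synthetic.

```python
# Hedged sketch of nonstationary extreme value modelling: a GEV whose location
# parameter changes linearly in time, fitted to synthetic annual maxima by maximum
# likelihood (NEVA uses Bayesian DE-MC inference instead).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

rng = np.random.default_rng(6)
years = np.arange(1950, 2020)
t = (years - years[0]) / len(years)
# synthetic annual maxima with an upward trend in the location parameter
maxima = genextreme.rvs(c=-0.1, loc=30 + 5 * t, scale=3.0, random_state=rng)

def neg_log_lik(params):
    mu0, mu1, log_scale, shape = params
    return -np.sum(genextreme.logpdf(maxima, c=shape, loc=mu0 + mu1 * t,
                                     scale=np.exp(log_scale)))

fit = minimize(neg_log_lik, x0=[np.mean(maxima), 0.0, np.log(np.std(maxima)), -0.1],
               method="Nelder-Mead")
mu0, mu1, log_scale, shape = fit.x
print(f"location trend = {mu1:.2f} over the record, shape (SciPy c) = {shape:.2f}")
```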
Analysis of real-time vibration data
Safak, E.
2005-01-01
In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.
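A minimal sketch of the running statistics mentioned above (tracking mean and mean-square values over a sliding window), applied to a synthetic low-amplitude ambient signal containing a short transient.

```python
# Simple sketch of running statistics for real-time vibration data: mean and RMS of a
# signal over a sliding window, computed with a NumPy convolution.
import numpy as np

def running_mean_and_rms(signal, window):
    """Running mean and root-mean-square over a sliding window of `window` samples."""
    kernel = np.ones(window) / window
    mean = np.convolve(signal, kernel, mode="valid")
    mean_sq = np.convolve(signal**2, kernel, mode="valid")
    return mean, np.sqrt(mean_sq)

# synthetic ambient vibration: low-amplitude noise plus a brief transient
rng = np.random.default_rng(7)
fs = 100.0                                   # sampling rate [Hz]
sig = 0.01 * rng.normal(size=6000)
sig[3000:3100] += 0.2 * np.sin(2 * np.pi * 5 * np.arange(100) / fs)

mean, rms = running_mean_and_rms(sig, window=200)   # 2-second window
print(rms.max() / np.median(rms))            # the transient stands out in the RMS track
```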
Gold, J E; Punnett, L; Cherniack, M; Wegman, D H
2005-01-01
Upper extremity musculoskeletal disorders (UEMSDs) comprise a large proportion of work-related illnesses in the USA. Physical risk factors including manual force and segmental vibration have been associated with UEMSDs. Reduced sensitivity to vibration in the fingertips (a function of nerve integrity) has been found in those exposed to segmental vibration, to hand force, and in office workers. The objective of this study was to determine whether an association exists between digital vibration thresholds (VTs) and exposure to ergonomic stressors in automobile manufacturing. Interviews and physical examinations were conducted in a cross-sectional survey of workers (n = 1174). In multivariable robust regression modelling, associations with workers' estimates of ergonomic stressors stratified on tool use were determined. VTs were separately associated with hand force, vibration as felt through the floor (whole body vibration), and with an index of multiple exposures in both tool users and non-tool users. Additional associations with contact stress and awkward upper extremity postures were found in tool users. Segmental vibration was not associated with VTs. Further epidemiologic and laboratory studies are needed to confirm the associations found. The association with self-reported whole body vibration exposure suggests a possible sympathetic nervous system effect, which remains to be explored.
Gyehee Lee; Liping A. Cai; Everette Mills; Joseph T. O' Leary
2002-01-01
Internet plays a significant role in generating new business and facilitating customers' need for a better way to plan and book their trips. From a marketers' perspective, one of the seemingly "fatal attractions" of the Internet for DMOs is that it can be an extremely effective tool in terms of both cost effectiveness and market penetration compared...
Overview of the biology of extreme events
NASA Astrophysics Data System (ADS)
Gutschick, V. P.; Bassirirad, H.
2008-12-01
Extreme events have, variously, meteorological origins as in heat waves or precipitation extremes, or biological origins as in pest and disease eruptions (or tectonic, earth-orbital, or impact-body origins). Despite growing recognition that these events are changing in frequency and intensity, a universal model of ecological responses to these events is slow to emerge. Extreme events, negative and positive, contrast with normal events in terms of their effects on the physiology, ecology, and evolution of organisms, hence also on water, carbon, and nutrient cycles. They structure biogeographic ranges and biomes, almost surely more than the mean values often used to define biogeography. They are challenging to study for obvious reasons of field-readiness but also because they are defined by sequences of driving variables such as temperature, not point events. As sequences, their statistics (return times, for example) are challenging to develop, a difficulty compounded by the involvement of multiple environmental variables. These statistics are not captured well by climate models. They are expected to change with climate and land-use change, but our predictive capacity is currently limited. A number of tools for description and analysis of extreme events are available, if not widely applied to date. Extremes for organisms are defined by their fitness effects on those organisms, and are specific to genotypes, making them major agents of natural selection. There is evidence that effects of extreme events may be concentrated in an extended recovery phase. We review selected events covering ranges of time and magnitude, from Snowball Earth to leaf functional loss in weather events. A number of events, such as the 2003 European heat wave, show effects on water and carbon cycles over large regions. Rising CO2 is the recent extreme of note, for its climatic effects and consequences for growing seasons, transpiration, etc., but also directly in its action as a substrate of photosynthesis. Effects on water and N cycles are already marked. Adaptive responses of plants are very irregularly distributed among species and genotypes, most adaptive responses having been lost over 20 My of minimal or virtually accidental genetic selection for correlated traits. Offsets of plant activity from that of pollinators and pests may amplify direct physiological effects on plants. Another extreme of interest is the insect-mediated mass die-off of conifers across western North America tied to a rare combination of drought and year-long high temperatures.
Alotaibi, Naser M; Aljadi, Sameera H; Alrowayeh, Hesham N
2016-12-01
To investigate the psychometric properties (reliability, validity and responsiveness) of the DASH-Arabic in a cohort of Arabic patients presenting with various upper extremity conditions. Participants were 139 patients with various upper extremity conditions, who completed the DASH-Arabic at the baseline, 2-5 days later and 30-36 days later. Participants completed demographic data forms, the SF-36 and VAS at baseline, and a Global Rating of Change scale at first and second follow-ups. Cronbach's alpha of the DASH-Arabic was 0.94. Test-retest reliability was excellent with an ICC of 0.97. The SEM was 3.50 and the MDC95 was 9.28. Construct validity of the DASH-Arabic with the SF-36 subscales and VAS scores ranged from r -0.32 to -0.57, all statistically significant (p < 0.001). The effect size (ES) for the DASH-Arabic was 1.39 and its standard response mean was 1.51. The area under the curve was 0.82 (95% CI = 0.72-0.92, p < 0.001). The optimally efficient cutoff for an improvement was found to be a difference of 15 DASH points. The DASH-Arabic is a reliable, valid and responsive upper extremity outcome measure for patients whose primary language is Arabic; it can be used to document patient status and outcomes and support evidence-based practice. Implications for Rehabilitation The DASH-Arabic demonstrated sound psychometric properties of reliability, validity and responsiveness. It is an effective patient status and outcome tool that will support evidence-based practice. This tool is recommended for evaluating upper extremity work-related injuries and tracking therapeutic outcomes.
A Fiducial Approach to Extremes and Multiple Comparisons
ERIC Educational Resources Information Center
Wandler, Damian V.
2010-01-01
Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem is dealing with the generalized Pareto distribution. The generalized Pareto…
A Tool for Rating the Resilience of Critical Infrastructures in Extreme Fires
2014-05-01
provide a tool for NRC to help the Canadian industry to develop extreme fire protection materials and technologies for critical infrastructures. Future...supported by the Canadian Safety and Security Program (CSSP) which is led by Defence Research and Development Canada’s Centre for Security Science, in...in oil refinery and chemical industry facilities. The only available standard in North America that addresses the transportation infrastructure is
Telescience - Concepts and contributions to the Extreme Ultraviolet Explorer mission
NASA Technical Reports Server (NTRS)
Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.
1987-01-01
It is shown how the contradictory goals of low-cost and fast data turnaround characterizing the Extreme Ultraviolet Explorer (EUVE) mission can be achieved via the early use of telescience style transparent tools and simulations. The use of transparent tools reduces the parallel development of capability while ensuring that valuable prelaunch experience is not lost in the operations phase. Efforts made to upgrade the 'EUVE electronics' simulator are described.
A study of the impacts of climate change scenarios on the plant hardiness zones of Albania
USDA-ARS?s Scientific Manuscript database
Maps of plant hardiness zones are useful tools for determining the extreme limits for the survival of plants. Exploration of projected climate change effects on hardiness zones can help identify areas most affected by climate change. Such studies are important in areas with high risks related to cli...
ERIC Educational Resources Information Center
Chung, Gregory K. W. K.; Delacruz, Girlie C.; Dionne, Gary B.; Baker, Eva L.; Lee, John J.; Osmundson, Ellen
2016-01-01
This report addresses a renewed interest in individualized instruction, driven in part by advances in technology and assessment as well as a persistent desire to increase the access, efficiency, and cost effectiveness of training and education. Using computer-based instruction we delivered extremely efficient instruction targeted to low knowledge…
Tools in Support of Planning for Weather and Climate Extremes
NASA Astrophysics Data System (ADS)
Done, J.; Bruyere, C. L.; Hauser, R.; Holland, G. J.; Tye, M. R.
2016-12-01
A major limitation to planning for weather and climate extremes is the lack of maintained and readily available tools that can provide robust and well-communicated predictions and advice on their impacts. The National Center for Atmospheric Research is facilitating a collaborative international program to develop and support such tools within its Capacity Center for Climate and Weather Extremes, aimed at improving community resilience planning and reducing weather and climate impacts. A Global Risk, Resilience and Impacts Toolbox is in development and will provide: a portable web-based interface to process work requests from a variety of users and locations; a sophisticated framework that enables specialized community tools to access a comprehensive database (public and private) of geo-located hazard, vulnerability, exposure, and loss data; a community development toolkit that enables and encourages community tool developments geared towards specific user management and planning needs; and comprehensive community support facilitated by NCAR through tutorials and a help desk. A number of applications are in development, built on the latest climate science and in collaboration with private industry and local and state governments. Example applications will be described, including a hurricane damage tool developed with the reinsurance sector and a weather management tool for the construction industry. These examples will serve as starting points to discuss the broader potential of the toolbox.
Bolt, H. L.; Williams, C. E. J.; Brooks, R. V.; ...
2017-01-13
Hydrophobicity has proven to be an extremely useful parameter in small molecule drug discovery programmes given that it can be used as a predictive tool to enable rational design. For larger molecules, including peptoids, where folding is possible, the situation is more complicated and the average hydrophobicity (as determined by RP-HPLC retention time) may not always provide an effective predictive tool for rational design. Herein, we report the first ever application of partitioning experiments to determine the log D values for a series of peptoids. By comparing log D and average hydrophobicities we highlight the potential advantage of employing the former as a predictive tool in the rational design of biologically active peptoids.
Steiner, Jean L; Engle, David M; Xiao, Xiangming; Saleh, Ali; Tomlinson, Peter; Rice, Charles W; Cole, N Andy; Coleman, Samuel W; Osei, Edward; Basara, Jeffrey; Middendorf, Gerad; Gowda, Prasanna; Todd, Richard; Moffet, Corey; Anandhi, Aavudai; Starks, Patrick J; Ocshner, Tyson; Reuter, Ryan; Devlin, Daniel
2014-11-01
Ruminant livestock provides meat and dairy products that sustain health and livelihood for much of the world's population. Grazing lands that support ruminant livestock provide numerous ecosystem services, including provision of food, water, and genetic resources; climate and water regulation; support of soil formation; nutrient cycling; and cultural services. In the U.S. southern Great Plains, beef production on pastures, rangelands, and hay is a major economic activity. The region's climate is characterized by extremes of heat and cold and extremes of drought and flooding. Grazing lands occupy a large portion of the region's land, significantly affecting carbon, nitrogen, and water budgets. To understand vulnerabilities and enhance resilience of beef production, a multi-institutional Coordinated Agricultural Project (CAP), the "grazing CAP," was established. Integrative research and extension spanning biophysical, socioeconomic, and agricultural disciplines address management effects on productivity and environmental footprints of production systems. Knowledge and tools being developed will allow farmers and ranchers to evaluate risks and increase resilience to dynamic conditions. The knowledge and tools developed will also have relevance to grazing lands in semiarid and subhumid regions of the world. © 2014 New York Academy of Sciences.
The role of humidity in determining scenarios of perceived temperature extremes in Europe
NASA Astrophysics Data System (ADS)
Scoccimarro, Enrico; Fogli, Pier Giuseppe; Gualdi, Silvio
2017-11-01
An increase of the 2 m temperature over Europe is expected within the current century. In order to consider health impacts, it is important to evaluate the combined effect of temperature and humidity on the human body. To achieve this, projections of a basic index—the humidex—representative of the perceived temperature, under different scenarios and periods, have been investigated. The simultaneous occurrence of observed extreme temperature events and perceived extreme temperature events is seldom found within the present climate, reinforcing the importance of investigating the combination of the two fields. A set of 10 km resolution regional climate simulations, provided within the EURO-CORDEX multi-model effort, demonstrates an ability to represent moderate to extreme events of perceived temperature over the present climate, and proves useful as a tool for quantifying future changes in geographical patterns of exposed areas over Europe. Following the RCP8.5 emission scenario, an expansion of the area subject to dangerous conditions is suggested from the middle of the current century, reaching 60 °N. The most significant increase of perceived extreme temperature conditions is found comparing the 2066-2095 projections to the 1976-2005 period; bearing in mind that changes in relative humidity may either amplify or offset the health effects of temperature, a less pronounced projected reduction of relative humidity in the north-eastern part of Europe, associated with extreme humidex events, makes northern Europe the region most prone to an increase of moderate to extreme values of perceived temperature. This is in agreement with a pronounced projected specific humidity increase.
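For reference, a minimal sketch of the humidex calculation is shown below, using the common Masterton-Richardson formulation from air temperature and dew point. The abstract above does not specify which variant of the index the study used, so this is an illustrative definition rather than the authors' exact implementation.

```python
import math

def humidex(t_air_c: float, t_dew_c: float) -> float:
    """Humidex from air temperature and dew point (both in deg C).

    Uses the common Masterton-Richardson form; vapour pressure e is in hPa.
    """
    t_dew_k = t_dew_c + 273.16
    e = 6.11 * math.exp(5417.7530 * (1.0 / 273.16 - 1.0 / t_dew_k))
    return t_air_c + 0.5555 * (e - 10.0)

# Example: 30 degC air with a 24 degC dew point gives a humidex just over 41.
print(round(humidex(30.0, 24.0), 1))
```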
Extreme learning machine for reduced order modeling of turbulent geophysical flows.
San, Omer; Maulik, Romit
2018-04-01
We investigate the application of artificial neural networks to stabilize proper orthogonal decomposition-based reduced order models for quasistationary geophysical turbulent flows. An extreme learning machine concept is introduced for computing an eddy-viscosity closure dynamically to incorporate the effects of the truncated modes. We consider a four-gyre wind-driven ocean circulation problem as our prototype setting to assess the performance of the proposed data-driven approach. Our framework provides a significant reduction in computational time and effectively retains the dynamics of the full-order model during the forward simulation period beyond the training data set. Furthermore, we show that the method is robust for larger choices of time steps and can be used as an efficient and reliable tool for long time integration of general circulation models.
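The extreme learning machine idea referred to above is a single-hidden-layer network whose input weights are drawn at random and frozen, so that only the output weights need to be solved by (regularised) least squares. The sketch below illustrates that training procedure on a toy regression problem; it is not the authors' eddy-viscosity closure, and the data, layer size, and ridge parameter are assumptions for illustration only.

```python
import numpy as np

def elm_fit(X, y, n_hidden=50, ridge=1e-6, seed=0):
    """Train a minimal extreme learning machine: random fixed hidden layer,
    output weights obtained by regularised least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))    # random input weights (fixed)
    b = rng.normal(size=n_hidden)                   # random biases (fixed)
    H = np.tanh(X @ W + b)                          # hidden-layer activations
    beta = np.linalg.solve(H.T @ H + ridge * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression: learn a nonlinear map (a stand-in for a dynamic closure term).
X = np.random.default_rng(1).uniform(-1, 1, size=(500, 3))
y = np.sin(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2]
W, b, beta = elm_fit(X, y)
print(np.max(np.abs(elm_predict(X, W, b, beta) - y)))  # small training error
```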
NASA Astrophysics Data System (ADS)
Renschler, C.; Sheridan, M. F.; Patra, A. K.
2008-05-01
The impact and consequences of extreme geophysical events (hurricanes, floods, wildfires, volcanic flows, mudflows, etc.) on properties and processes should be continuously assessed by a well-coordinated interdisciplinary research and outreach approach addressing risk assessment and resilience. Communication between the various involved disciplines and stakeholders is the key to successful implementation of an integrated risk management plan. These issues become apparent at the level of decision support tools for extreme event and disaster management in natural and managed environments. The Geospatial Project Management Tool (GeoProMT) is a collaborative platform for research and training to document and communicate the fundamental steps in transforming information for extreme events at various scales for analysis and management. GeoProMT is an internet-based interface for the management of shared geo-spatial and multi-temporal information such as measurements, remotely sensed images, and other GIS data. This tool enhances collaborative research activities and the ability to assimilate data from diverse sources by integrating information management. This facilitates a better understanding of natural processes and enhances the integrated assessment of resilience against both the slow and fast onset of hazard risks. Fundamental to understanding and communicating complex natural processes are: (a) representation of spatiotemporal variability, extremes, and uncertainty of environmental properties and processes in the digital domain; (b) transformation of their spatiotemporal representation across scales (e.g., interpolation, aggregation, disaggregation) during data processing and modeling in the digital domain; and design and development of tools for (c) geo-spatial data management, (d) geo-spatial process modeling and effective implementation, and (e) support of decision- and policy-making in natural resources and hazard management at various spatial and temporal scales of interest. GeoProMT is useful for researchers, practitioners, and decision-makers because it provides an integrated environmental system assessment and data management approach that considers the spatial and temporal scales and variability in natural processes. Particularly during the occurrence or onset of extreme events, it can utilize the latest data sources available at variable scales, combine them with existing information, and update assessment products such as risk and vulnerability maps. Because integrated geo-spatial assessment requires careful consideration of all the steps in utilizing data, modeling, and decision-making formats, each step in the sequence must be assessed in terms of how information is being scaled. At the process scale, various geophysical models (e.g., TITAN or LAHARZ, among many other examples) are appropriate for incorporation in the tool. Some examples that illustrate our approach include: 1) coastal parishes impacted by Hurricane Rita (Southwestern Louisiana), 2) a watershed affected by extreme rainfall-induced debris flows (Madison County, Virginia; Panabaj, Guatemala; Casita, Nicaragua), and 3) the potential for pyroclastic flows to threaten a city (Tungurahua, Ecuador). This research was supported by the National Science Foundation.
Kim, Dae Wook; Kim, Sug-Whan
2005-02-07
We present a novel simulation technique that offers efficient mass fabrication strategies for 2 m class hexagonal mirror segments of extremely large telescopes. As the first of two studies in a series, we establish the theoretical basis of the tool influence function (TIF) for precessing-tool polishing simulation of non-rotating workpieces. These theoretical TIFs were then used to confirm the reproducibility of the material removal footprints (measured TIFs) of the bulged precessing tooling reported elsewhere. This is followed by a reverse-computation technique that traces the real polishing pressure from the empirical TIF, employing the simplex search method. The technical details, together with the results and implications described here, provide the theoretical tool for material removal essential to a successful polishing simulation, which will be reported in the second study.
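The reverse-computation step described above amounts to adjusting pressure parameters of a removal model until the simulated footprint matches the measured one, using a simplex (Nelder-Mead) search. The sketch below does this for a hypothetical Gaussian-shaped footprint; the study's actual TIF model derives from precessing-tool kinematics and is not reproduced here, so both the model form and the parameter names are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical Gaussian-shaped removal footprint; the real TIF in the study
# comes from precessing-tool kinematics and Preston-type removal physics.
def model_tif(r, peak_pressure, width):
    return peak_pressure * np.exp(-(r / width) ** 2)

def fit_pressure(r, measured_tif):
    """Recover effective pressure parameters from a measured TIF with the
    simplex (Nelder-Mead) search, mimicking the reverse-computation step."""
    cost = lambda p: np.sum((model_tif(r, *p) - measured_tif) ** 2)
    return minimize(cost, x0=[1.0, 1.0], method="Nelder-Mead").x

r = np.linspace(0.0, 3.0, 50)
synthetic = model_tif(r, peak_pressure=2.3, width=0.8)
print(fit_pressure(r, synthetic))   # recovers roughly [2.3, 0.8]
```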
Developing a passive load reduction blade for the DTU 10 MW reference turbine
NASA Astrophysics Data System (ADS)
de Vaal, J. B.; Nygaard, T. A.; Stenbro, R.
2016-09-01
This paper presents the development of a passive load reduction blade for the DTU 10 MW reference wind turbine, using the aero-hydro-servo-elastic analysis tool 3DFloat. Passive load reduction is achieved by introducing sweep to the path of the blade elastic axis, so that out-of-plane bending deflections result in load alleviating torsional deformations of the blade. Swept blades are designed to yield similar annual energy production as a rotor with a reference straight blade. This is achieved by modifying the aerodynamic twist distribution for swept blades based on non-linear blade deflection under steady state loads. The passive load reduction capability of a blade design is evaluated by running a selection of fatigue- and extreme load cases with the analysis tool 3DFloat and determining equivalent fatigue loads, fatigue damage and extreme loads at the blade root and tower base. The influence of sweep on the flutter speed of a blade design is also investigated. A large number of blade designs are evaluated by varying the parameters defining the sweep path of a blade's elastic axis. Results show that a moderate amount of sweep can effectively reduce equivalent fatigue damage and extreme loads, without significantly reducing the flutter speed, or compromising annual energy production.
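As a rough illustration of the equivalent-fatigue-load metric used above to compare blade designs, the sketch below applies the conventional damage-equivalent-load formula to rainflow cycle counts. The Wöhler exponent, reference cycle count, and cycle data are assumptions for illustration; they are not taken from the 3DFloat post-processing described in the abstract.

```python
import numpy as np

def damage_equivalent_load(cycle_ranges, cycle_counts, woehler_m=10.0, n_ref=1e7):
    """Damage equivalent load from rainflow cycle ranges and counts:
    DEL = (sum(n_i * S_i^m) / N_ref) ** (1/m).
    woehler_m around 10-12 is typical for composite blade material (assumed)."""
    s = np.asarray(cycle_ranges, dtype=float)
    n = np.asarray(cycle_counts, dtype=float)
    return (np.sum(n * s ** woehler_m) / n_ref) ** (1.0 / woehler_m)

# Toy comparison: a swept design that trims the largest load ranges lowers the DEL.
baseline = damage_equivalent_load([10.0, 6.0, 2.0], [1e3, 1e5, 1e6])
swept    = damage_equivalent_load([ 8.5, 5.5, 2.0], [1e3, 1e5, 1e6])
print(baseline, swept)
```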
Durrieu, Gilles; Pham, Quang-Khoai; Foltête, Anne-Sophie; Maxime, Valérie; Grama, Ion; Tilly, Véronique Le; Duval, Hélène; Tricot, Jean-Marie; Naceur, Chiraz Ben; Sire, Olivier
2016-07-01
Water quality can be evaluated using biomarkers such as tissular enzymatic activities of endemic species. Measurement of bivalve molluscs' activity at high frequency (e.g., valvometry) over a long time period is another way to record animal behavior and to evaluate perturbations of water quality in real time. As pollution affects the activity of oysters, we consider the valve opening and closing velocities to monitor water quality. We propose to model the huge volume of velocity data collected in the framework of valvometry using a new nonparametric extreme value statistical model. The objective is to estimate the tail probabilities and the extreme quantiles of the distribution of valve closing velocity. The tail of the distribution function of valve closing velocity is modeled by a Pareto distribution, beyond a threshold τ, with a parameter that depends on the time t of the experiment. Our modeling approach reveals the dependence between the specific activity of two enzymatic biomarkers (glutathione-S-transferase and acetylcholinesterase) and the continuous recording of oyster valve velocity, proving the suitability of this tool for water quality assessment. Thus, valvometry allows real-time in situ analysis of bivalve behavior and appears to be an effective early warning tool in ecological risk assessment and marine environment monitoring.
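A minimal, stationary version of the tail modeling described above is sketched below: exceedances of the velocity data over a threshold are fitted with a generalized Pareto distribution and an extreme quantile is read off by inverting the fitted tail. The study's model lets the Pareto parameter vary with experiment time, which this sketch does not attempt; the threshold, probability level, and synthetic data are assumptions.

```python
import numpy as np
from scipy.stats import genpareto

def pot_quantile(data, threshold, tail_prob):
    """Peaks-over-threshold estimate of the quantile exceeded with probability
    `tail_prob`: fit a generalized Pareto to exceedances above `threshold`,
    then invert the tail (assumes a non-zero shape parameter)."""
    exceedances = data[data > threshold] - threshold
    xi, _, sigma = genpareto.fit(exceedances, floc=0.0)
    zeta = exceedances.size / data.size               # empirical exceedance rate
    return threshold + (sigma / xi) * ((tail_prob / zeta) ** (-xi) - 1.0)

rng = np.random.default_rng(0)
velocities = rng.pareto(3.0, size=20_000) + 1.0       # synthetic heavy-tailed data
print(pot_quantile(velocities, threshold=2.0, tail_prob=1e-4))
```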
Frequency Domain Modeling of SAW Devices
NASA Technical Reports Server (NTRS)
Wilson, W. C.; Atkinson, G. M.
2007-01-01
New SAW sensors for integrated vehicle health monitoring of aerospace vehicles are being investigated. SAW technology is low cost, rugged, lightweight, and extremely low power. However, the lack of design tools for MEMS devices in general, and for Surface Acoustic Wave (SAW) devices specifically, has led to the development of tools that will enable integrated design, modeling, simulation, analysis and automatic layout generation of SAW devices. A frequency domain model has been created. The model is mainly first order, but it includes second order effects from triple transit echoes. This paper presents the model and results from the model for a SAW delay line device.
Lymphoscintigraphic findings in chylous reflux in a lower extremity.
Berenji, Gholam R; Iker, Emily; Glass, Edwin C
2007-09-01
Lymphoscintigraphy is a useful and safe tool for the diagnostic evaluation of a swollen extremity. Unilateral leg swelling with cutaneous chylous vesicles is a common manifestation of chylous reflux. The authors present a case of chylous reflux in an 11-year-old boy who presented with swelling and skin lesions of the left lower extremity.
Tempest: Tools for Addressing the Needs of Next-Generation Climate Models
NASA Astrophysics Data System (ADS)
Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.
2015-12-01
Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
Extreme Events and Energy Providers: Science and Innovation
NASA Astrophysics Data System (ADS)
Yiou, P.; Vautard, R.
2012-04-01
Most socio-economic regulations related to the resilience to climate extremes, from infrastructure or network design to insurance premiums, are based on a present-day climate with an assumption of stationarity. Climate extremes (heat waves, cold spells, droughts, storms and wind stilling) affect in particular energy production, supply, demand and security in several ways. While national, European or international projects have generated vast amounts of climate projections for the 21st century, their practical use in long-term planning remains limited. Estimating probabilistic diagnostics of energy user relevant variables from those multi-model projections will help the energy sector to elaborate medium to long-term plans, and will allow the assessment of climate risks associated to those plans. The project "Extreme Events for Energy Providers" (E3P) aims at filling a gap between climate science and its practical use in the energy sector and creating in turn favourable conditions for new business opportunities. The value chain ranges from addressing research questions directly related to energy-significant climate extremes to providing innovative tools of information and decision making (including methodologies, best practices and software) and climate science training for the energy sector, with a focus on extreme events. Those tools will integrate the scientific knowledge that is developed by scientific communities, and translate it into a usable probabilistic framework. The project will deliver projection tools assessing the probabilities of future energy-relevant climate extremes at a range of spatial scales varying from pan-European to local scales. The E3P project is funded by the Knowledge and Innovation Community (KIC Climate). We will present the mechanisms of interactions between academic partners, SMEs and industrial partners for this project. Those mechanisms are elementary bricks of a climate service.
Translational informatics: an industry perspective.
Cantor, Michael N
2012-01-01
Translational informatics (TI) is extremely important for the pharmaceutical industry, especially as the bar for regulatory approval of new medications is set higher and higher. This paper will explore three specific areas in the drug development lifecycle, from tools developed by precompetitive consortia to standardized clinical data collection to the effective delivery of medications using clinical decision support, in which TI has a major role to play. Advancing TI will require investment in new tools and algorithms, as well as ensuring that translational issues are addressed early in the design process of informatics projects, and also given higher weight in funding or publication decisions. Ultimately, the source of translational tools and differences between academia and industry are secondary, as long as they move towards the shared goal of improving health.
Using Weather Types to Understand and Communicate Weather and Climate Impacts
NASA Astrophysics Data System (ADS)
Prein, A. F.; Hale, B.; Holland, G. J.; Bruyere, C. L.; Done, J.; Mearns, L.
2017-12-01
A common challenge in atmospheric research is the translation of scientific advancements and breakthroughs to decision relevant and actionable information. This challenge is central to the mission of NCAR's Capacity Center for Climate and Weather Extremes (C3WE, www.c3we.ucar.edu). C3WE advances our understanding of weather and climate impacts and integrates these advances with distributed information technology to create tools that promote a global culture of resilience to weather and climate extremes. Here we will present an interactive web-based tool that connects historic U.S. losses and fatalities from extreme weather and climate events to 12 large-scale weather types. Weather types are dominant weather situations such as winter high-pressure systems over the U.S. leading to very cold temperatures or summertime moist humid air masses over the central U.S. leading to severe thunderstorms. Each weather type has a specific fingerprint of economic losses and fatalities in a region that is quantified. Therefore, weather types enable a direct connection of observed or forecasted weather situation to loss of life and property. The presented tool allows the user to explore these connections, raise awareness of existing vulnerabilities, and build resilience to weather and climate extremes.
NASA Astrophysics Data System (ADS)
Foufoula-Georgiou, E.
1989-05-01
A storm transposition approach is investigated as a possible tool for assessing the frequency of extreme precipitation depths, that is, depths of return period much greater than 100 years. This paper focuses on estimation of the annual exceedance probability of extreme average precipitation depths over a catchment. The probabilistic storm transposition methodology is presented, and the several conceptual and methodological difficulties arising in this approach are identified. The method is implemented and partially evaluated by means of a semihypothetical example involving extreme midwestern storms and two hypothetical catchments (of 100 and 1000 mi² (~260 and 2600 km²)) located in central Iowa. The results point out the need for further research to fully explore the potential of this approach as a tool for assessing the probabilities of rare storms, and eventually floods, a necessary element of risk-based analysis and design of large hydraulic structures.
The use of copula functions for predictive analysis of correlations between extreme storm tides
NASA Astrophysics Data System (ADS)
Domino, Krzysztof; Błachowicz, Tomasz; Ciupak, Maurycy
2014-11-01
In this paper we present a method for the quantitative description of weakly predictable extreme hydrological events in an inland sea. Correlations between variations at individual measuring points were investigated using combined statistical methods. As the main tool for this analysis we used a two-dimensional copula function sensitive to correlated extreme effects. Additionally, a newly proposed methodology, based on Detrended Fluctuation Analysis (DFA) and Anomalous Diffusion (AD), was used for the prediction of negative and positive auto-correlations and the associated optimum choice of copula functions. As a practical example we analysed maximum storm tide data recorded at five spatially separated places on the Baltic Sea. For the analysis we used the Gumbel, Clayton, and Frank copula functions and introduced the reversed Clayton copula. The application of our research model is associated with modelling the risk of high storm tides and possible storm flooding.
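To make the copula idea concrete, the sketch below estimates a Clayton copula parameter from Kendall's tau (theta = 2·tau/(1−tau)) and samples dependent uniform pairs by conditional inversion. This is a generic textbook construction, not the study's fitted model; the parameter value and sample size are illustrative.

```python
import numpy as np
from scipy.stats import kendalltau

def clayton_theta_from_tau(u, v):
    """Moment-type estimate of the Clayton parameter from Kendall's tau:
    theta = 2*tau / (1 - tau), valid for positive dependence (tau > 0)."""
    tau, _ = kendalltau(u, v)
    return 2.0 * tau / (1.0 - tau)

def clayton_sample(theta, n, seed=0):
    """Draw (u, v) pairs from a Clayton copula by conditional inversion.
    Clayton concentrates dependence in the lower tail; the 'reversed' Clayton
    used in the study (upper-tail dependence) corresponds to (1 - u, 1 - v)."""
    rng = np.random.default_rng(seed)
    u, w = rng.uniform(size=(2, n))
    v = (u ** (-theta) * (w ** (-theta / (1.0 + theta)) - 1.0) + 1.0) ** (-1.0 / theta)
    return u, v

u, v = clayton_sample(theta=2.0, n=5_000)
print(round(clayton_theta_from_tau(u, v), 2))   # should recover roughly 2.0
```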
NASA Astrophysics Data System (ADS)
Jamaludin, A. S.; Hosokawa, A.; Furumoto, T.; Koyano, T.; Hashimoto, Y.
2018-03-01
The cutting of difficult-to-cut materials such as stainless steel generates excessive heat, one of the major causes of shortened tool life and lower quality of surface finish. Application of cutting fluid during the cutting of difficult-to-cut materials is proven to improve cutting performance, but excessive application of cutting fluid leads to other problems, such as increased processing cost and hazardous pollution of the workplace. In this study, an Extreme Cold Mist system is designed and tested along with various Minimum Quantity Lubrication (MQL) systems in the turning of stainless steel AISI 316. The results show that the Extreme Cold Mist system is able to reduce cutting force by up to 60 N and significantly improve the surface roughness of the machined surface.
A regressive storm model for extreme space weather
NASA Astrophysics Data System (ADS)
Terkildsen, Michael; Steward, Graham; Neudegg, Dave; Marshall, Richard
2012-07-01
Extreme space weather events, while rare, pose significant risk to society in the form of impacts on critical infrastructure such as power grids, and the disruption of high end technological systems such as satellites and precision navigation and timing systems. There has been an increased focus on modelling the effects of extreme space weather, as well as improving the ability of space weather forecast centres to identify, with sufficient lead time, solar activity with the potential to produce extreme events. This paper describes the development of a data-based model for predicting the occurrence of extreme space weather events from solar observation. The motivation for this work was to develop a tool to assist space weather forecasters in early identification of solar activity conditions with the potential to produce extreme space weather, and with sufficient lead time to notify relevant customer groups. Data-based modelling techniques were used to construct the model, and an extensive archive of solar observation data used to train, optimise and test the model. The optimisation of the base model aimed to eliminate false negatives (missed events) at the expense of a tolerable increase in false positives, under the assumption of an iterative improvement in forecast accuracy during progression of the solar disturbance, as subsequent data becomes available.
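The optimisation goal stated above, eliminating missed events while tolerating some extra false alarms, can be illustrated by choosing the lowest alert threshold on a model score that still flags every observed extreme event in a training archive. The sketch below shows that idea on synthetic scores; it is not the authors' data-based model, and the score distributions are assumed.

```python
import numpy as np

def zero_miss_threshold(scores, is_extreme):
    """Pick the largest alert threshold that still flags every observed extreme
    event (no false negatives), and report the false-alarm rate it implies."""
    scores = np.asarray(scores, dtype=float)
    is_extreme = np.asarray(is_extreme, dtype=bool)
    threshold = scores[is_extreme].min()           # lowest score among true events
    alerts = scores >= threshold
    false_alarm_rate = alerts[~is_extreme].mean()  # fraction of quiet cases alerted
    return threshold, false_alarm_rate

rng = np.random.default_rng(2)
scores = np.concatenate([rng.normal(0.3, 0.15, 980), rng.normal(0.8, 0.10, 20)])
labels = np.array([False] * 980 + [True] * 20)
print(zero_miss_threshold(scores, labels))
```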
Stover, Bert; Silverstein, Barbara; Wickizer, Thomas; Martin, Diane P; Kaufman, Joel
2007-06-01
Work related upper extremity musculoskeletal disorders (MSD) result in substantial disability, and expense. Identifying workers or jobs with high risk can trigger intervention before workers are injured or the condition worsens. We investigated a disability instrument, the QuickDASH, as a workplace screening tool to identify workers at high risk of developing upper extremity MSDs. Subjects included workers reporting recurring upper extremity MSD symptoms in the past 7 days (n = 559). The QuickDASH was reasonably accurate at baseline with sensitivity of 73% for MSD diagnosis, and 96% for symptom severity. Specificity was 56% for diagnosis, and 53% for symptom severity. At 1-year follow-up sensitivity and specificity for MSD diagnosis was 72% and 54%, respectively, as predicted by the baseline QuickDASH score. For symptom severity, sensitivity and specificity were 86% and 52%. An a priori target sensitivity of 70% and specificity of 50% was met by symptom severity, work pace and quality, and MSD diagnosis. The QuickDASH may be useful for identifying jobs or workers with increased risk for upper extremity MSDs. It may provide an efficient health surveillance screening tool useful for targeting early workplace intervention for prevention of upper extremity MSD problems.
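For readers unfamiliar with the screening metrics quoted above, sensitivity and specificity follow directly from a 2x2 classification table, as in the short sketch below. The counts used are illustrative only; the abstract reports rates rather than the raw table.

```python
def sensitivity_specificity(tp: int, fn: int, tn: int, fp: int):
    """Screening accuracy from a 2x2 table:
    sensitivity = TP / (TP + FN), specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts chosen to match the baseline diagnosis rates (73% / 56%).
sens, spec = sensitivity_specificity(tp=73, fn=27, tn=56, fp=44)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```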
NASA Astrophysics Data System (ADS)
Solecki, W. D.; Friedman, E. S.; Breitzer, R.
2016-12-01
Increasingly frequent extreme weather events are becoming an immediate priority for urban coastal practitioners and stakeholders, adding complexity to decisions concerning risk management for the short-term actions and long-term needs of city climate stakeholders. The conflict between the prioritization of short- versus long-term events by decision-makers creates a disconnect between climate science and its applications. The Consortium for Climate Risk in the Urban Northeast (CCRUN), a NOAA RISA team, is developing a set of mechanisms to help bridge this gap. The mechanisms are designed to promote the application of climate science to extreme weather events and their aftermath. It is in the post-event policy window where significant opportunities for science-policy linkages exist. In particular, CCRUN is interested in producing actionable and useful information for city managers to use in decision-making processes surrounding extreme weather events and climate change. These processes include a sector-specific needs assessment survey instrument and two tools for urban coastal practitioners and stakeholders. The tools focus on post-event learning and connections between resilience and transformative adaptation. Elements of the two tools are presented. Post extreme event learning supports urban coastal practitioners and decision-makers concerned about maximizing opportunities for knowledge transfer and assimilation, and policy initiation and development, following an extreme weather event. For the urban U.S. Northeast, post-event learning helps coastal stakeholders build the capacity to adapt to extreme weather events, and inform and develop their planning capacity through analysis of past actions and steps taken in response to Hurricane Sandy. Connecting resilience with transformative adaptation is intended to promote resilience in urban Northeast coastal settings to the long-term negative consequences of extreme weather events. This is done through a knowledge co-production engagement process that links innovative and flexible adaptation pathways that can address requirements for short-term action and long-term needs.
The Use of Mastery Learning with Competency-Based Grading in an Organic Chemistry Course
ERIC Educational Resources Information Center
Diegelman-Parente, Amy
2011-01-01
Mastery learning is an instructional method based on the idea that students learn best if they fully understand, or master, one concept before moving on to the next and has been shown to be extremely effective in math and science curricula. Competency-based grading is an evaluative tool that allows the faculty member to determine the level of…
[Ultrasound examination for lower extremity deep vein thrombosis].
Toyota, Kosaku
2014-09-01
Surgery is known to be a major risk factor of vein thrombosis. Progression from lower extremity deep vein thrombosis (DVT) to pulmonary embolism can lead to catastrophic outcome, although the incidence ratio is low. The ability to rule in or rule out DVT is becoming essential for anesthesiologists. Non-invasive technique of ultrasonography is a sensitive and specific tool for the assessment of lower extremity DVT. This article introduces the basics and practical methods of ultrasound examination for lower extremity DVT.
A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J
Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there are urgent needs for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells, then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (that contains spatial data, socio-economic and environmental data, and analytic data), a middle layer (that handles data processing, model management, and GIS operation), and an application layer (that provides climate impacts forecast, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.
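The per-cell metric idea described above can be illustrated with a simple weighted indicator score: higher exposure and sensitivity raise a cell's vulnerability, while adaptive capacity lowers it. The actual Urban-CAT metrics, normalisation, and weighting are not specified in the abstract, so everything in the sketch below is an assumption for illustration.

```python
import numpy as np

def vulnerability_scores(exposure, sensitivity, adaptive_capacity,
                         weights=(1.0, 1.0, 1.0)):
    """Per-cell vulnerability as a weighted combination of min-max normalised
    indicators; not the Urban-CAT formulation, just a generic sketch."""
    def norm(x):
        x = np.asarray(x, dtype=float)
        return (x - x.min()) / (x.max() - x.min() + 1e-12)
    we, ws, wa = weights
    return we * norm(exposure) + ws * norm(sensitivity) - wa * norm(adaptive_capacity)

# Toy grid of 4 cells (e.g. flood depth, impervious fraction, per-capita income).
print(vulnerability_scores([0.1, 0.4, 0.9, 0.2],
                           [0.3, 0.5, 0.8, 0.2],
                           [0.9, 0.4, 0.2, 0.6]))
```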
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vyakaranam, Bharat GNVSR; Vallem, Mallikarjuna R.; Nguyen, Tony B.
The vulnerability of large power systems to cascading failures and major blackouts has become evident since the Northeast blackout in 1965. Based on analyses of the series of cascading blackouts in the past decade, the research community realized the urgent need to develop better methods, tools, and practices for performing cascading-outage analysis and for evaluating mitigations that are easily accessible by utility planning engineers. PNNL has developed the Dynamic Contingency Analysis Tool (DCAT) as an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. DCAT analysis will help identify potential vulnerabilities and allow study of mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. Using the DCAT capability, we examined the impacts of various load conditions to identify situations in which the power grid may encounter cascading outages that could lead to potential blackouts. This paper describes the usefulness of the DCAT tool and how it helps to understand potential impacts of load demand on cascading failures on the power system.
NASA Astrophysics Data System (ADS)
Vicari, Rosa; Gires, Auguste; Tchiguirinskaia, Ioulia; Schertzer, Daniel
2014-05-01
The combined effects of climate change and increasing urbanisation call for new solutions to achieve urban resilience to extreme weather. The research projects carried out by the HM&Co team (LEESU & Chair 'Hydrology for Resilient Cities', sponsored by Veolia) need to be supported by communication activities aimed at building community capacity and cooperation between scientists and their partners and stakeholders. While outreach activities are becoming an integral part of many research projects on climate adaptation, their evaluation remains scarce, often optional, and very limited. This work aims to develop quantitative and qualitative evaluation of science communication and to design corresponding assessment tools. It will be examined how evaluation can improve the quality, efficiency, and impact of communication activities in enhancing collaboration between scientists, professionals (e.g., water managers, urban planners), and beneficiaries (e.g., concerned citizens, policy makers). The research builds on several case studies of projects and programmes aiming to increase the resilience of cities to extreme weather: French projects and programmes such as RadX@IdF and the Chair "Hydrology for a resilient city", European projects such as Climate KIC Blue Green Dream and Interreg NWE IVB RainGain, and worldwide collaborations (e.g., TOMACS). The evaluation techniques and tools developed in the framework of this work are intended to become a useful support for engineers and researchers involved in projects on urban hydrology, where resilience to extreme weather events relies also on effective communication processes between the above-mentioned social actors. In particular, one purpose of this work is to highlight how self-evaluation can improve ongoing communication activities and create a virtuous circle of planning, implementation, and evaluation. This research has links with work on the development of exploration techniques for unstructured social big data, with a particular focus on digital communications.
Adami, Silvano; Bertoldo, Francesco; Gatti, Davide; Minisola, Giovanni; Rossini, Maurizio; Sinigaglia, Luigi; Varenna, Massimo
2013-09-01
The definition of osteoporosis was based for several years on bone mineral density values, which were used by most guidelines for defining treatment thresholds. The availability of tools for the estimation of fracture risk, such as FRAX™ or its adapted Italian version, DeFRA, is providing a way to grade osteoporosis severity. By applying these new tools, the criteria identified in Italy for treatment reimbursability (e.g., "Nota 79") are confirmed as extremely conservative. The new fracture risk-assessment tools provide continuous risk values that can be used by health authorities (or "payers") for identifying treatment thresholds. FRAX estimates the risk for "major osteoporotic fractures," which are not counted in registered fracture trials. Here, we elaborate an algorithm to convert vertebral and nonvertebral fractures to the "major fractures" of FRAX, and this allows a cost-effectiveness assessment for each drug.
NASA Astrophysics Data System (ADS)
Matonse, A. H.; Porter, J. H.; Frei, A.
2015-12-01
Providing on average 1.1 billion gallons (~4.2 × 10⁶ cubic meters) of drinking water per day to approximately nine million people in New York City (NYC) and four upstate counties, the NYC water supply is among the world's largest unfiltered systems. In addition to providing a reliable water supply in terms of water quantity and quality, the city has to fulfill other flow objectives to serve downstream communities. At times, such as during extreme hydrological events, water quality issues may restrict water usage for parts of the system. To support a risk-based water supply decision-making process, NYC has developed the Operations Support Tool (OST). OST combines a water supply systems model with reservoir water quality models, near-real-time data ingestion, database management, and an ensemble hydrological forecast. A number of reports have addressed the frequency and intensity of extreme hydrological events across the continental US. In the northeastern US, studies have indicated an increase in the frequency of extremely large precipitation and streamflow events during the most recent decades. During this presentation we describe OST and, using case studies, demonstrate how this tool has been useful in supporting operational decisions. We also want to motivate a discussion about how ongoing changes in patterns of extreme hydrological events elevate the challenge faced by water supply managers, and about the role of the scientific community in integrating nonstationarity approaches into hydrologic forecasting and modeling.
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Peter, Thomas; Ribatet, Mathieu; Davison, Anthony C.; Stübi, Rene; Weihs, Philipp; Holawe, Franz
2010-05-01
In this study tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not address the internal data structure concerning extremes adequately. The study illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) (Rieder et al., 2010a). A daily moving threshold was implemented for consideration of the seasonal cycle in total ozone. The frequency of days with extreme low (termed ELOs) and extreme high (termed EHOs) total ozone and the influence of those on mean values and trends is analyzed for Arosa total ozone time series. The results show (a) an increase in ELOs and (b) a decrease in EHOs during the last decades and (c) that the overall trend during the 1970s and 1980s in total ozone is strongly dominated by changes in these extreme events. After removing the extremes, the time series shows a strongly reduced trend (reduction by a factor of 2.5 for trend in annual mean). Furthermore, it is shown that the fitted model represents the tails of the total ozone data set with very high accuracy over the entire range (including absolute monthly minima and maxima). Also the frequency distribution of ozone mini-holes (using constant thresholds) can be calculated with high accuracy. Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight in time series properties. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading in ozone depleting substances lead to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. Especially, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the presented new extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder ,H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. 
Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.
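The daily moving threshold used in the Arosa analysis above can be illustrated with a simple calendar-day quantile scheme: for each day of year, low and high thresholds are computed from all years within a centred window of calendar days, and days falling below or above them are counted as extreme-low (ELO) or extreme-high (EHO) ozone days. The window length, quantile levels, and synthetic data below are assumptions for illustration; the study's actual threshold definition and GPD-based tail fitting are not reproduced here.

```python
import numpy as np
import pandas as pd

def count_extreme_days(series: pd.Series, q_low=0.05, q_high=0.95, window=11):
    """Count extreme-low (ELO) and extreme-high (EHO) days relative to a daily
    moving threshold, so the seasonal cycle is removed before flagging extremes."""
    doy = np.asarray(series.index.dayofyear)
    values = series.to_numpy()
    lo = np.full(values.size, np.nan)
    hi = np.full(values.size, np.nan)
    for d in np.unique(doy):
        # circular window of calendar days centred on day-of-year d
        dist = np.minimum(np.abs(doy - d), 366 - np.abs(doy - d))
        in_win = dist <= window // 2
        lo[doy == d] = np.nanquantile(values[in_win], q_low)
        hi[doy == d] = np.nanquantile(values[in_win], q_high)
    return int((values < lo).sum()), int((values > hi).sum())

# Synthetic daily "total ozone" record with a seasonal cycle plus noise.
idx = pd.date_range("1970-01-01", "1999-12-31", freq="D")
ozone = pd.Series(330 + 40 * np.sin(2 * np.pi * idx.dayofyear / 365.25)
                  + np.random.default_rng(0).normal(0, 15, idx.size), index=idx)
print(count_extreme_days(ozone))   # roughly 5% of days in each tail
```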
Marchese, Victoria G; Rai, Shesh N; Carlson, Claire A; Hinds, Pamela S; Spearing, Elena M; Zhang, Lijun; Callaway, Lulie; Neel, Michael D; Rao, Bhaskar N; Ginsberg, Jill P
2007-08-01
Reliability and validity of a new tool, Functional Mobility Assessment (FMA), were examined in patients with lower-extremity sarcoma. FMA requires the patients to physically perform the functional mobility measures, unlike patient self-report or clinician administered measures. A sample of 114 subjects participated, 20 healthy volunteers and 94 patients with lower-extremity sarcoma after amputation, limb-sparing, or rotationplasty surgery. Reliability of the FMA was examined by three raters testing 20 healthy volunteers and 23 subjects with lower-extremity sarcoma. Concurrent validity was examined using data from 94 subjects with lower-extremity sarcoma who completed the FMA, Musculoskeletal Tumor Society (MSTS), Short-Form 36 (SF-36v2), and Toronto Extremity Salvage Scale (TESS) scores. Construct validity was measured by the ability of the FMA to discriminate between subjects with and without functional mobility deficits. FMA demonstrated excellent reliability (ICC [2,1] ≥ 0.97). Moderate correlations were found between FMA and SF-36v2 (r = 0.60, P < 0.01), FMA and MSTS (r = 0.68, P < 0.01), and FMA and TESS (r = 0.62, P < 0.01). The patients with lower-extremity sarcoma scored lower on the FMA as compared to healthy controls (P < 0.01). The FMA is a reliable and valid functional outcome measure for patients with lower-extremity sarcoma. This study supports the ability of the FMA to discriminate between patients with varying functional abilities and supports the need to include measures of objective functional mobility in examination of patients with lower-extremity sarcoma.
ERIC Educational Resources Information Center
Martin, Magy; Martin, Don
2015-01-01
The critical success of online instructors is their ability to engage students in the learning process. With this expertise, the online experience is extremely effective. The goal of this book is to help faculty understand the processes of teaching online and learning to be student-centered, which are the first steps toward becoming a successful…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Luce, F. P.; Azevedo, G. de M.; Baptista, D. L.
The formation and time-resolved behavior of individual Pb nanoparticles embedded in silica have been studied by in-situ transmission electron microscopy observations at high temperatures (400–1100 °C) and under 200 keV electron irradiation. It is shown that under such extreme conditions, nanoparticles can migrate over long distances, presenting a Brownian-like behavior, and eventually coalesce. The particle migration phenomenon is discussed considering the influence of thermal energy and electron irradiation effects on the atomic diffusion process, which is shown to control particle migration. These results, together with comparison with ex-situ experiments, address the stability and microstructure evolution of nanoparticle systems under extreme conditions. They elucidate the effects of energetic particle irradiation-annealing treatments, either as a tool or as a detrimental issue that could hamper long-term applications in radiation-harsh environments such as the space or nuclear sectors.
Science & Technology Review September/October 2008
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bearinger, J P
2008-07-21
This issue has the following articles: (1) Answering Scientists' Most Audacious Questions--Commentary by Dona Crawford; (2) Testing the Accuracy of the Supernova Yardstick--High-resolution simulations are advancing understanding of Type Ia supernovae to help uncover the mysteries of dark energy; (3) Developing New Drugs and Personalized Medical Treatment--Accelerator mass spectrometry is emerging as an essential tool for assessing the effects of drugs in humans; (4) Triage in a Patch--A painless skin patch and accompanying detector can quickly indicate human exposure to biological pathogens, chemicals, explosives, or radiation; and (5) Smoothing Out Defects for Extreme Ultraviolet Lithography--A process for smoothing mask defects helps move extreme ultraviolet lithography one step closer to creating smaller, more powerful computer chips.
Wilbanks, Thomas J.; Fernandez, Steven J.; Allen, Melissa R.
2015-06-23
The President's Climate Change Action Plan calls for the development of better science, data, and tools for climate preparedness. Many of the current questions about preparedness for extreme weather events in coming decades are, however, difficult to answer with assets that have been developed by climate science to answer longer-term questions about climate change. Capacities for projecting exposures to climate-related extreme events, along with their implications for interconnected infrastructures, are now emerging.
NASA Astrophysics Data System (ADS)
Duffy, P. B.; Colohan, P.; Driggers, R.; Herring, D.; Laurier, F.; Petes, L.; Ruffo, S.; Tilmes, C.; Venkataraman, B.; Weaver, C. P.
2014-12-01
Effective adaptation to impacts of climate change requires best-available information. To be most useful, this information should be easily found, well-documented, and translated into tools that decision-makers use and trust. To meet these needs, the President's Climate Action Plan includes efforts to develop "actionable climate science". The Climate Data Initiative (CDI) leverages the Federal Government's extensive, open data resources to stimulate innovation and private-sector entrepreneurship in support of actions to prepare for climate change. The Initiative forges commitments and partnerships from the private, NGO, academic, and public sectors to create data-driven tools. Open data from Federal agencies to support this innovation is available on Climate.Data.gov, initially focusing on coastal flooding but soon to expand to topics including food, energy, water, transportation, and health. The Climate Resilience Toolkit (CRT) will facilitate access to data-driven resilience tools, services, and best practices, including those accessible through the CDI. The CRT will also include access to training and tutorials, case studies, engagement forums, and other information sources. The Climate Action Plan also calls for a public-private partnership on extreme weather risk, with the goal of generating improved assessments of risk from different types of extreme weather events, using methods and data that are transparent and accessible. Finally, the U.S. Global Change Research Program and associated agencies work to advance the science necessary to inform decisions and sustain assessments. Collectively, these efforts represent increased emphasis across the Federal Government on the importance of information to support climate resilience.
Cudia, Paola; Weis, Luca; Baba, Alfonc; Kiper, Pawel; Marcante, Andrea; Rossi, Simonetta; Angelini, Corrado; Piccione, Francesco
2016-11-01
Functional electrical stimulation (FES) is a new rehabilitative approach that combines electrical stimulation with a functional task. This pilot study evaluated the safety and effectiveness of FES lower extremity training in myotonic dystrophy type 1. This is a controlled pilot study that enrolled 20 patients with myotonic dystrophy type 1 over 2 years. Eight patients (age, 39-67 years) fulfilled the inclusion criteria. Four participants performed FES cycling training for 15 days (one daily session of 30 minutes for 5 days a week). A control group, matched for clinical and genetic variables, who had contraindications to electrical stimulation, performed 6 weeks of conventional resistance and aerobic training. The modified Medical Research Council Scale and functional assessments were performed before and after treatment. Cohen's d effect size was used for statistical analysis. FES-induced lower extremity training was well tolerated and resulted in a greater improvement of tibialis anterior muscle strength (d = 1.583), overall muscle strength (d = 1.723), and endurance (d = 0.626) than conventional training. FES might be considered a safe and valid tool to improve muscle function, also in severely compromised muscles for which no other restorative options are available. Confirmation of FES efficacy through further clinical trials is strongly advised.
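The Cohen's d values quoted above follow from the standard definition of a standardised mean difference, as in the brief sketch below. The means and pooled standard deviation used are illustrative only; the abstract reports the effect sizes, not the underlying raw statistics.

```python
def cohens_d(mean_post: float, mean_pre: float, sd_pooled: float) -> float:
    """Cohen's d effect size: (post - pre) / pooled SD.
    Values around 0.8 or above are conventionally regarded as large effects."""
    return (mean_post - mean_pre) / sd_pooled

# Illustrative numbers only, chosen to give an effect size of similar magnitude.
print(round(cohens_d(mean_post=4.2, mean_pre=3.4, sd_pooled=0.5), 2))   # 1.6
```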
Techniques that Link Extreme Events to the Large Scale, Applied to California Heat Waves
NASA Astrophysics Data System (ADS)
Grotjahn, R.
2015-12-01
Understanding the mechanisms by which California Central Valley (CCV) summer extreme hot spells develop is very important, since these events have major impacts on the economy and human safety. Results from a series of CCV heat wave studies will be presented, emphasizing the techniques used. Key larger-scale elements are identified statistically that are also consistent with synoptic and dynamic understanding of what must be present during extreme heat. Beyond providing a clear synoptic explanation, these key elements have high predictability, in part because soil moisture has little annual variation in the heavily irrigated CCV. In turn, the predictability naturally leads to an effective tool to assess climate model simulation of these heat waves in historical and future climate scenarios. (Does the model develop extreme heat for the correct reasons?) Further work identified that these large-scale elements arise in two quite different ways: one from southwestward expansion of a pre-existing heat wave in southwest Canada, the other formed in place from parcels traversing the North Pacific. The pre-existing heat wave explains an early result showing correlation between heat waves in Sacramento, California, and other locations along the US west coast, including distant Seattle, Washington. CCV heat waves can be preceded by unusually strong tropical Indian Ocean and Indonesian convection; this partial link may occur through an Asian subtropical jet waveguide. Another link revealed by diagnostics is a middle- and higher-latitude source of wave activity in Siberia and East Asia that also leads to the development of the CCV heat wave. This talk will address as many of these results and the tools used to obtain them as is reasonable within the available time.
Climate Change in the US: Potential Consequences for Human Health
NASA Technical Reports Server (NTRS)
Maynard, Nancy G.
2001-01-01
The U.S. National Assessment identified five major areas of consequences of climate change in the United States: temperature-related illnesses and deaths, health effects related to extreme weather events, air pollution-related health effects, water- and food-borne diseases, and insect-, tick-, and rodent-borne diseases. The U.S. National Assessment final conclusions about these potential health effects will be described. In addition, a summary of some of the new tools for studying human health aspects of climate change as well as environment-health linkages through remotely sensed data and observations will be provided.
NASA Astrophysics Data System (ADS)
Wartenburger, Richard; Hirschi, Martin; Donat, Markus G.; Greve, Peter; Pitman, Andy J.; Seneviratne, Sonia I.
2017-09-01
This article extends a previous study (Seneviratne et al., 2016) to provide regional analyses of changes in climate extremes as a function of projected changes in global mean temperature. We introduce the DROUGHT-HEAT Regional Climate Atlas, an interactive tool to analyse and display a range of well-established climate extremes and water-cycle indices and their changes as a function of global warming. These projections are based on simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5). A selection of example results is presented here, but users can visualize specific indices of interest using the online tool. This implementation enables a direct assessment of regional climate changes associated with global mean temperature targets, such as the 2 and 1.5 °C limits agreed upon in the 2015 Paris Agreement.
On the response of halophilic archaea to space conditions.
Leuko, Stefan; Rettberg, Petra; Pontifex, Ashleigh L; Burns, Brendan P
2014-02-21
Microorganisms are ubiquitous and can be found in almost every habitat and ecological niche on Earth. They thrive and survive in a broad spectrum of environments and adapt to rapidly changing external conditions. It is of great interest to investigate how microbes adapt to different extreme environments and with modern human space travel, we added a new extreme environment: outer space. Within the last 50 years, technology has provided tools for transporting microbial life beyond Earth's protective shield in order to study in situ responses to selected conditions of space. This review will focus on halophilic archaea, as, due to their ability to survive in extremes, they are often considered a model group of organisms to study responses to the harsh conditions associated with space. We discuss ground-based simulations, as well as space experiments, utilizing archaea, examining responses and/or resistance to the effects of microgravity and UV in particular. Several halophilic archaea (e.g., Halorubrum chaoviator) have been exposed to simulated and actual space conditions and their survival has been determined as well as the protective effects of halite shown. Finally, the intriguing potential of archaea to survive on other planets or embedded in a meteorite is postulated.
Sea Extremes: Integrated impact assessment in coastal climate adaptation
NASA Astrophysics Data System (ADS)
Sorensen, Carlo; Knudsen, Per; Broge, Niels; Molgaard, Mads; Andersen, Ole
2016-04-01
We investigate effects of sea level rise and a change in precipitation pattern on coastal flooding hazards. Historic and present in situ and satellite data of water and groundwater levels, precipitation, vertical ground motion, geology, and geotechnical soil properties are combined with flood protection measures, topography, and infrastructure to provide a more complete picture of the water-related impact from climate change at an exposed coastal location. Results show that future sea extremes evaluated from extreme value statistics may, indeed, have a large impact. However, the integrated effects from future storm surges and other geo- and hydro-parameters need to be considered in order to provide the best protection and mitigation efforts. Based on the results, we present and discuss a simple conceptual model setup that can be used, for example, to 'translate' regional sea level rise evidence and projections into concrete impact measures. This may be used by potentially affected stakeholders, who often work in different sectors and across levels of governance, in a common appraisal of the challenges ahead. The model may also feed into dynamic tools to evaluate local impact as sea level research advances and projections for the future are updated.
Virtual Reality for Traumatic Brain Injury.
Zanier, Elisa R; Zoerle, Tommaso; Di Lernia, Daniele; Riva, Giuseppe
2018-01-01
In this perspective, we discuss the potential of virtual reality (VR) in the assessment and rehabilitation of traumatic brain injury, a silent epidemic with an extremely high burden and no available pharmacological therapy. VR, endorsed by the mobile and gaming industries, is now available in more usable and cheaper tools, allowing therapeutic engagement both at the bedside and during daily life at chronic stages after injury, with considerable potential for a longitudinal disease-modifying effect.
Development of a Wafer Positioning System for the Sandia Extreme Ultraviolet Lithography Tool
NASA Technical Reports Server (NTRS)
Wronosky, John B.; Smith, Tony G.; Darnold, Joel R.
1996-01-01
A wafer positioning system was recently developed by Sandia National Laboratories for an Extreme Ultraviolet Lithography (EUVL) tool. The system, which utilizes a magnetically levitated fine stage to provide ultra-precise positioning in all six degrees of freedom, incorporates technological improvements resulting from four years of prototype development. This paper describes the design, implementation, and functional capability of the system. Specifics regarding control system electronics, including software and control algorithm structure, as well as performance design goals and test results are presented. Potential system enhancements, some of which are in process, are also discussed.
Material Behavior At The Extreme Cutting Edge In Bandsawing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sarwar, Mohammed; Haider, Julfikar; Persson, Martin
2011-01-17
In recent years, bandsawing has been widely accepted as a favourite option for metal cut-off operations where accuracy of cut, good surface finish, low kerf loss, long tool life and high material removal rate are required. Material removal by multipoint cutting tools such as bandsaws is a complex mechanism owing to the geometry of the bandsaw tooth (e.g., limited gullet size, tooth setting) and the layer of material removed (the undeformed chip thickness or depth of cut, 5 µm-50 µm) being smaller than or equal to the cutting edge radius (5 µm-15 µm). This situation can lead to inefficient material removal in bandsawing. Most research work is concentrated on the mechanics of material removal by single-point cutting tools such as the lathe tool; such efforts are very limited for multipoint cutting tools such as the bandsaw. This paper presents a fundamental understanding of the material behaviour at the extreme cutting edge of the bandsaw tooth, which would help in the design and manufacture of blades with higher cutting performance and life. High-speed photography has been carried out to analyse the material removal process at the extreme cutting edge of the bandsaw tooth. A geometric model of chip formation mechanisms, based on the evidence found during high-speed photography and quick-stop tests, is presented. Wear modes and mechanisms in bimetal and carbide-tipped bandsaw teeth are also presented.
Water Power Data and Tools | Water Power | NREL
This NREL web page gathers water power data and computer modeling tools with state-of-the-art design and analysis resources, including the National Wind Technology Center's Information Portal and a WEC-Sim fact sheet. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools.
Forensic Analysis of Windows Hosts Using UNIX-based Tools
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cory Altheide
2004-07-19
Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.
Evaluation of Process Performance for Sustainable Hard Machining
NASA Astrophysics Data System (ADS)
Rotella, Giovanna; Umbrello, Domenico; , Oscar W. Dillon, Jr.; Jawahir, I. S.
This paper aims to evaluate the sustainability performance of machining operation of through-hardening steel, AISI 52100, taking into account the impact of the material removal process in its various aspects. Experiments were performed for dry and cryogenic cutting conditions using chamfered cubic boron nitride (CBN) tool inserts at varying cutting conditions (cutting speed and feed rate). Cutting forces, mechanical power, tool wear, white layer thickness, surface roughness and residual stresses were investigated in order to evaluate the effects of extreme in-process cooling on the machined surface. The results indicate that cryogenic cooling has the potential to be used for surface integrity enhancement for improved product life and more sustainable functional performance.
The magnitude and effects of extreme solar particle events
NASA Astrophysics Data System (ADS)
Jiggens, Piers; Chavy-Macdonald, Marc-Andre; Santin, Giovanni; Menicucci, Alessandra; Evans, Hugh; Hilgers, Alain
2014-06-01
The solar energetic particle (SEP) radiation environment is an important consideration for spacecraft design, spacecraft mission planning and human spaceflight. Herein is presented an investigation into the likely severity of effects of a very large Solar Particle Event (SPE) on technology and humans in space. Fluences for SPEs derived using statistical models are compared to historical SPEs to verify their appropriateness for use in the analysis which follows. By combining environment tools with tools to model effects behind varying layers of spacecraft shielding it is possible to predict what impact a large SPE would be likely to have on a spacecraft in Near-Earth interplanetary space or geostationary Earth orbit. Also presented is a comparison of results generated using the traditional method of inputting the environment spectra, determined using a statistical model, into effects tools and a new method developed as part of the ESA SEPEM Project allowing for the creation of an effect time series on which statistics, previously applied to the flux data, can be run directly. The SPE environment spectrum is determined and presented as energy-integrated proton fluence (cm^-2) as a function of particle energy (in MeV). This is input into the SHIELDOSE-2, MULASSIS, NIEL, GRAS and SEU effects tools to provide the output results. In the case of the new method for analysis, the flux time series is fed directly into the MULASSIS and GEMAT tools integrated into the SEPEM system. The output effect quantities include total ionising dose (in rads), non-ionising energy loss (MeV g^-1), single event upsets (upsets/bit) and the dose in humans, compared to established limits for stochastic (or cancer-causing) effects and tissue reactions (such as acute radiation sickness), given in gray-equivalent and sieverts, respectively.
Spatiotemporal Detection of Unusual Human Population Behavior Using Mobile Phone Data
Dobra, Adrian; Williams, Nathalie E.; Eagle, Nathan
2015-01-01
With the aim to contribute to humanitarian response to disasters and violent events, scientists have proposed the development of analytical tools that could identify emergency events in real-time, using mobile phone data. The assumption is that dramatic and discrete changes in behavior, measured with mobile phone data, will indicate extreme events. In this study, we propose an efficient system for spatiotemporal detection of behavioral anomalies from mobile phone data and compare sites with behavioral anomalies to an extensive database of emergency and non-emergency events in Rwanda. Our methodology successfully captures anomalous behavioral patterns associated with a broad range of events, from religious and official holidays to earthquakes, floods, violence against civilians and protests. Our results suggest that human behavioral responses to extreme events are complex and multi-dimensional, including extreme increases and decreases in both calling and movement behaviors. We also find significant temporal and spatial variance in responses to extreme events. Our behavioral anomaly detection system and extensive discussion of results are a significant contribution to the long-term project of creating an effective real-time event detection system with mobile phone data and we discuss the implications of our findings for future research to this end. PMID:25806954
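The abstract does not give the authors' algorithm, but the general idea of flagging spatiotemporal behavioral anomalies can be illustrated with a minimal sketch: compare each site-day's call volume against that site's day-of-week baseline and flag large standardized deviations. Everything below (data, sites, thresholds) is synthetic and illustrative only, not the Rwanda dataset or the published method.

import numpy as np

rng = np.random.default_rng(0)
n_sites, n_days = 5, 140                      # synthetic: 5 cell sites, 20 weeks of daily call counts
baseline = rng.integers(800, 1200, size=(n_sites, 1))
calls = rng.poisson(baseline, size=(n_sites, n_days)).astype(float)
calls[2, 100] *= 3.0                          # inject an anomalous spike at site 2, day 100

weekday = np.arange(n_days) % 7
z = np.zeros_like(calls)
for d in range(7):                            # per-site, per-day-of-week standardization
    cols = weekday == d
    mu = calls[:, cols].mean(axis=1, keepdims=True)
    sd = calls[:, cols].std(axis=1, keepdims=True) + 1e-9
    z[:, cols] = (calls[:, cols] - mu) / sd

site_idx, day_idx = np.where(np.abs(z) > 4)   # flag extreme standardized deviations
print(list(zip(site_idx, day_idx)))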
NASA Astrophysics Data System (ADS)
Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.
2017-12-01
Extreme weather conditions during 2009-2011, in combination with changes in regional water management, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused major problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Using the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes into account sustainable and environmentally sound solutions based mainly on passive measures.
ESH assessment of advanced lithography materials and processes
NASA Astrophysics Data System (ADS)
Worth, Walter F.; Mallela, Ram
2004-05-01
The ESH Technology group at International SEMATECH is conducting environment, safety, and health (ESH) assessments in collaboration with the lithography technologists evaluating the performance of an increasing number of new materials and technologies being considered for advanced lithography, such as 157 nm photoresist and extreme ultraviolet (EUV). By performing data searches for 75 critical data types, emissions characterizations, and industrial hygiene (IH) monitoring during the use of the resist candidates, it has been shown that the best-performing resist formulations, so far, appear to be free of potential ESH concerns. The ESH assessment of the EUV lithography tool that is being developed for SEMATECH has identified several features of the tool that are of ESH concern: high energy consumption, poor energy conversion efficiency, tool complexity, potential ergonomic and safety interlock issues, use of high-powered laser(s), generation of ionizing radiation (soft X-rays), need for adequate shielding, and characterization of the debris formed by the extreme temperature of the plasma. By bringing these ESH challenges to the attention of the technologists and tool designers, it is hoped that the processes and tools can be made more ESH friendly.
Bartnicka, Joanna; Zietkiewicz, Agnieszka A; Kowalski, Grzegorz J
2018-03-19
With reference to four different minimally invasive surgery (MIS) cholecystectomy techniques, the aims were: to recognize the factors influencing dominant wrist postures manifested by the surgeon; to detect risk factors involved in maintaining deviated wrist postures; and to compare the wrist postures of surgeons while using laparoscopic tools. Video films were recorded during live surgeries. The films were synchronized with wrist joint angles obtained from wireless electrogoniometers placed on the surgeon's hand. The analysis was conducted for five different laparoscopic tools used during all surgical techniques. The most common wrist posture was extension. In the case of one laparoscopic tool, the mean values defining extended wrist posture were distinct in all four surgical techniques. For one type of surgical technique, considered to be the most beneficial for patients, more extreme postures were observed for all laparoscopic tools. We recognized a new factor, apart from the tool's handle design, that influences extreme and deviated wrist postures. It involves three areas of task specification including the type of action, type of motion patterns and motion dynamism. The outcomes proved that the surgical technique which is most beneficial for the patient imposes the greatest strain on the surgeon's wrist.
The National Extreme Events Data and Research Center (NEED)
NASA Astrophysics Data System (ADS)
Gulledge, J.; Kaiser, D. P.; Wilbanks, T. J.; Boden, T.; Devarakonda, R.
2014-12-01
The Climate Change Science Institute at Oak Ridge National Laboratory (ORNL) is establishing the National Extreme Events Data and Research Center (NEED), with the goal of transforming how the United States studies and prepares for extreme weather events in the context of a changing climate. NEED will encourage the myriad, distributed extreme events research communities to move toward the adoption of common practices and will develop a new database compiling global historical data on weather- and climate-related extreme events (e.g., heat waves, droughts, hurricanes, etc.) and related information about impacts, costs, recovery, and available research. Currently, extreme event information is not easy to access and is largely incompatible and inconsistent across web sites. NEED's database development will take into account differences in time frames, spatial scales, treatments of uncertainty, and other parameters and variables, and leverage informatics tools developed at ORNL (i.e., the Metadata Editor [1] and Mercury [2]) to generate standardized, robust documentation for each database along with a web-searchable catalog. In addition, NEED will facilitate convergence on commonly accepted definitions and standards for extreme events data and will enable integrated analyses of coupled threats, such as hurricanes/sea-level rise/flooding and droughts/wildfires. Our goal and vision is that NEED will become the premiere integrated resource for the general study of extreme events. References: [1] Devarakonda, Ranjeet, et al. "OME: Tool for generating and managing metadata to handle BigData." Big Data (Big Data), 2014 IEEE International Conference on. IEEE, 2014. [2] Devarakonda, Ranjeet, et al. "Mercury: reusable metadata management, data discovery and access system." Earth Science Informatics 3.1-2 (2010): 87-94.
Laser Resurfacing: Full Field and Fractional.
Pozner, Jason N; DiBernardo, Barry E
2016-07-01
Laser resurfacing is a very popular procedure worldwide. Full field and fractional lasers are used in many aesthetic practices. There have been significant advances in laser resurfacing in the past few years, which make patient treatments more efficacious and with less downtime. Erbium and carbon dioxide and ablative, nonablative, and hybrid fractional lasers are all extremely effective and popular tools that have a place in plastic surgery and dermatology offices. Copyright © 2016 Elsevier Inc. All rights reserved.
2015-10-30
predictors of ACL injury. Several studies investigate the effects of faulty movement and injury prediction for the lower extremity. In 2006 ... at 40% and 39% of the total injuries, respectively. In 2012, 83 NCAA Division I football players participated in a survey to assess low back ... recent study, firefighters performed the FMS™ and firefighter-specific testing. Two of the musculoskeletal movement variables were predictive of
Armijo-Olivo, Susan; Woodhouse, Linda J; Steenstra, Ivan A; Gross, Douglas P
2016-12-01
To determine whether the Disabilities of the Arm, Shoulder, and Hand (DASH) tool added to the predictive ability of established prognostic factors, including patient demographic and clinical outcomes, to predict return to work (RTW) in injured workers with musculoskeletal (MSK) disorders of the upper extremity. A retrospective cohort study using a population-based database from the Workers' Compensation Board of Alberta (WCB-Alberta) that focused on claimants with upper extremity injuries was used. Besides the DASH, potential predictors included demographic, occupational, clinical and health usage variables. Outcome was receipt of compensation benefits after 3 months. To identify RTW predictors, a purposeful logistic modelling strategy was used. A series of receiver operating curve analyses were performed to determine which model provided the best discriminative ability. The sample included 3036 claimants with upper extremity injuries. The final model for predicting RTW included the total DASH score in addition to other established predictors. The area under the curve for this model was 0.77, which is interpreted as fair discrimination. This model was statistically significantly different than the model of established predictors alone (p<0.001). When comparing the DASH total score versus DASH item 23, a non-significant difference was obtained between the models (p=0.34). The DASH tool together with other established predictors significantly helped predict RTW after 3 months in participants with upper extremity MSK disorders. An appealing result for clinicians and busy researchers is that DASH item 23 has equal predictive ability to the total DASH score. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
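A minimal sketch of the model-comparison idea described above: fit a logistic regression with established predictors, fit a second model that adds a DASH-style score, and compare discrimination via the area under the ROC curve. The data and variable names below are synthetic placeholders, not the WCB-Alberta fields or the study's fitted model.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
n = 3000
age = rng.normal(40, 10, n)
days_off_before = rng.poisson(5, n)
dash = rng.uniform(0, 100, n)
logit = -4 + 0.02 * age + 0.10 * days_off_before + 0.03 * dash
on_benefits = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 1 = still receiving benefits at 3 months

established = np.column_stack([age, days_off_before])
with_dash = np.column_stack([age, days_off_before, dash])

def auc(X, y):
    model = LogisticRegression(max_iter=1000).fit(X, y)
    return roc_auc_score(y, model.predict_proba(X)[:, 1])

print(f"AUC, established predictors only: {auc(established, on_benefits):.2f}")
print(f"AUC, established predictors + DASH: {auc(with_dash, on_benefits):.2f}")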
Kourgialas, Nektarios N; Dokou, Zoi; Karatzas, George P
2015-05-01
The purpose of this study was to create a modeling management tool for the simulation of extreme flow events under current and future climatic conditions. This tool is a combination of different components and can be applied in complex hydrogeological river basins, where frequent flood and drought phenomena occur. The first component is the statistical analysis of the available hydro-meteorological data. Specifically, principal components analysis was performed in order to quantify the importance of the hydro-meteorological parameters that affect the generation of extreme events. The second component is a prediction-forecasting artificial neural network (ANN) model that simulates, accurately and efficiently, river flow on an hourly basis. This model is based on a methodology that attempts to resolve a very difficult problem related to the accurate estimation of extreme flows. For this purpose, the available measurements (5 years of hourly data) were divided in two subsets: one for the dry and one for the wet periods of the hydrological year. This way, two ANNs were created, trained, tested and validated for a complex Mediterranean river basin in Crete, Greece. As part of the second management component a statistical downscaling tool was used for the creation of meteorological data according to the higher and lower emission climate change scenarios A2 and B1. These data are used as input in the ANN for the forecasting of river flow for the next two decades. The final component is the application of a meteorological index on the measured and forecasted precipitation and flow data, in order to assess the severity and duration of extreme events. Copyright © 2015 Elsevier Ltd. All rights reserved.
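The abstract's central modelling choice, training separate networks for the dry and wet parts of the hydrological year, can be sketched as follows. The example uses synthetic hourly rainfall-flow data and a generic multilayer perceptron; it is not the authors' network architecture, input set, or the Crete dataset.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(2)
hours = 5 * 365 * 24                                       # five years of hourly records
month = (np.arange(hours) // (30 * 24)) % 12 + 1
rain = rng.gamma(0.3, 4.0, hours) * np.where(np.isin(month, [11, 12, 1, 2, 3]), 2.0, 0.3)
flow = 5 + 0.6 * rain + 0.3 * np.roll(rain, 1) + rng.normal(0, 0.5, hours)   # toy rainfall-runoff

X = np.column_stack([rain, np.roll(rain, 1), np.roll(flow, 1)])[24:]
y = flow[24:]
wet = np.isin(month[24:], [11, 12, 1, 2, 3])               # illustrative wet-season months

models = {}
for name, mask in [("wet", wet), ("dry", ~wet)]:           # one ANN per seasonal subset
    models[name] = MLPRegressor(hidden_layer_sizes=(16,), max_iter=1000,
                                random_state=0).fit(X[mask], y[mask])
    print(name, "R^2:", round(models[name].score(X[mask], y[mask]), 3))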
Near Real Time Analytics of Human Sensor Networks in the Realm of Big Data
NASA Astrophysics Data System (ADS)
Aulov, O.; Halem, M.
2012-12-01
With the prolific development of social media, emergency responders have an increasing interest in harvesting social media from outlets such as Flickr, Twitter, and Facebook, in order to assess the scale and specifics of extreme events including wild fires, earthquakes, terrorist attacks, oil spills, etc. A number of experimental platforms have successfully been implemented to demonstrate the utilization of social media data in extreme events, including Twitter Earthquake Detector, which relied on tweets for earthquake monitoring; AirTwitter, which used tweets for air quality reporting; and our previous work, using Flickr data as boundary value forcings to improve the forecast of oil beaching in the aftermath of the Deepwater Horizon oil spill. The majority of these platforms addressed a narrow, specific type of emergency and harvested data from a particular outlet. We demonstrate an interactive framework for monitoring, mining and analyzing a plethora of heterogeneous social media sources for a diverse range of extreme events. Our framework consists of three major parts: a real time social media aggregator, a data processing and analysis engine, and a web-based visualization and reporting tool. The aggregator gathers tweets, Facebook comments from fan pages, Google+ posts, forum discussions, blog posts (such as LiveJournal and Blogger.com), images from photo-sharing platforms (such as Flickr, Picasa), videos from video-sharing platforms (youtube, Vimeo), and so forth. The data processing and analysis engine pre-processes the aggregated information and annotates it with geolocation and sentiment information. In many cases, the metadata of the social media posts does not contain geolocation information—-however, a human reader can easily guess from the body of the text what location is discussed. We are automating this task by use of Named Entity Recognition (NER) algorithms and a gazetteer service. The visualization and reporting tool provides a web-based, user-friendly interface that provides time-series analysis and plotting tools, geo-spacial visualization tools with interactive maps, and cause-effect inference tools. We demonstrate how we address big data challenges of monitoring, aggregating and analyzing vast amounts of social media data at a near realtime. As a result, our framework not only allows emergency responders to augment their situational awareness with social media information, but can also allow them to extract geophysical data and incorporate it into their analysis models.
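As a toy illustration of the gazetteer step described above (not the authors' NER pipeline or gazetteer service), the sketch below scans post text for place names found in a small hand-made gazetteer and attaches their coordinates; the place list and coordinates are illustrative placeholders.

import re

# Tiny hand-made gazetteer: place name -> (lat, lon). Illustrative values only.
GAZETTEER = {
    "kigali": (-1.95, 30.06),
    "butare": (-2.60, 29.74),
    "gisenyi": (-1.70, 29.26),
}

def geotag(text):
    """Return (place, lat, lon) for every gazetteer entry mentioned in the text."""
    hits = []
    for place, (lat, lon) in GAZETTEER.items():
        if re.search(r"\b" + re.escape(place) + r"\b", text, flags=re.IGNORECASE):
            hits.append((place, lat, lon))
    return hits

print(geotag("Heavy flooding reported near Gisenyi and on the road to Kigali"))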
Multi-agent electricity market modeling with EMCAS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
North, M.; Macal, C.; Conzelmann, G.
2002-09-05
Electricity systems are a central component of modern economies. Many electricity markets are transitioning from centrally regulated systems to decentralized markets. Furthermore, several electricity markets that have recently undergone this transition have exhibited extremely unsatisfactory results, most notably in California. These high-stakes transformations require the introduction of largely untested regulatory structures. Suitable tools that can be used to test these regulatory structures before they are applied to real systems are required. Multi-agent models can provide such tools. To better understand the requirements for such a tool, a live electricity market simulation was created. This experience helped to shape the development of the multi-agent Electricity Market Complex Adaptive Systems (EMCAS) model. To explore EMCAS' potential, several variations of the live simulation were created. These variations probed the possible effects of changing power plant outages and price-setting rules on electricity market prices.
The utility of the KJOC score in professional baseball in the United States.
Franz, Justin O; McCulloch, Patrick C; Kneip, Chris J; Noble, Philip C; Lintner, David M
2013-09-01
The Kerlan-Jobe Orthopaedic Clinic (KJOC) Shoulder and Elbow questionnaire has been shown by previous studies to be more sensitive than other validated subjective measurement tools in the detection of upper extremity dysfunction in overhead-throwing athletes. The primary objective was to establish normative data for KJOC scores in professional baseball players in the United States. The secondary objectives were to evaluate the effect of player age, playing position, professional competition level, history of injury, history of surgery, and time point of administration on the KJOC score. Cross-sectional study; Level of evidence, 3. From 2011 to 2012, a total of 203 major league and minor league baseball players within the Houston Astros professional baseball organization completed the KJOC questionnaire. The questionnaire was administered at 3 time points: spring training 2011, end of season 2011, and spring training 2012. The KJOC scores were analyzed for significant differences based on player age, injury history, surgery history, fielding position, competition level, self-reported playing status, and time point of KJOC administration. The average KJOC score among healthy players with no history of injury was 97.1 for major league players and 96.8 for minor league players. The time point of administration did not significantly affect the final KJOC score (P = .224), and KJOC outcomes did not vary with player age (r = -0.012; P = .867). Significantly lower average KJOC scores were reported by players with a history of upper extremity injury (86.7; P < .001) and upper extremity surgery (75.4; P < .0001). The KJOC results did vary with playing position (P = .0313), with the lowest average scores being reported by pitchers (90.9) and infielders (91.3). This study establishes a quantitative baseline for the future evaluation of professional baseball players with the KJOC score. Age and time of administration had no significant effect on the outcome of the KJOC score. Missed practices or games within the previous year because of injury were the most significant demographic predictors of lower KJOC scores. The KJOC score was shown to be a sensitive measurement tool for detecting subtle changes in the upper extremity performance of the professional baseball population studied.
ATHLETE: Lunar Cargo Unloading from a High Deck
NASA Technical Reports Server (NTRS)
Wilcox, Brian H.
2010-01-01
As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. Each vehicle is based on six wheels at the ends of six multi-degree-of freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through or at least out of extreme terrain, the wheels and wheel actuators can be sized for nominal terrain. There are substantial mass savings in the wheel and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are at least comparable-to or larger-than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be lighter than a conventional all-terrain mobility chassis. A side benefit of this approach is that each limb has sufficient degrees-of freedom to be used as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A power-take-off from the wheel actuates the tools, so that they can take advantage of the 1+ horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.
ATHLETE: a Cargo and Habitat Transporter for the Moon
NASA Technical Reports Server (NTRS)
Wilcox, Brian H.
2009-01-01
As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. The vehicle concept is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through (or at least out of) extreme terrain, the wheels and wheel actuators can be sized only for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are comparable-to or larger-than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be about 25 percent lighter than a conventional mobility chassis for planetary exploration. A side benefit of this approach is that each limb has sufficient degrees-of-freedom for use as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A rotating power-take-off from the wheel actuates the tools, so that they can take advantage of the 1-plus-horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.
An "Extreme Makeover" of a Course in Special Education
ERIC Educational Resources Information Center
Nicoll-Senft, Joan M.
2009-01-01
Just as the popular television show "Extreme Makeover: Home Edition" targets the demolition and reconstruction of a home so that it better meets the needs of its owners, Fink's approach to integrated course design (ICD; 2003) provides higher education faculty with the tools to deconstruct and do a major remodel of their college courses. Teaching…
Kim, Jung Hee; Lee, Byoung-Hee
2015-06-01
The objective of this study was to evaluate the effects of mirror therapy in combination with biofeedback functional electrical stimulation (BF-FES) on motor recovery of the upper extremities after stroke. Twenty-nine patients who suffered a stroke > 6 months prior participated in this study and were randomly allocated to three groups. The BF-FES + mirror therapy and FES + mirror therapy groups practiced training for 5 × 30 min sessions over a 4-week period. The control group received a conventional physical therapy program. The following clinical tools were used to assess motor recovery of the upper extremities: electrical muscle tester, electrogoniometer, dual-inclinometer, electrodynamometer, the Box and Block Test (BBT) and Jabsen Taylor Hand Function Test (JHFT), the Functional Independence Measure, the Modified Ashworth Scale, and the Stroke Specific Quality of Life (SSQOL) assessment. The BF-FES + mirror therapy group showed significant improvement in wrist extension as revealed by the Manual Muscle Test and Range of Motion (p < 0.05). The BF-FES + mirror therapy group showed significant improvement in the BBT, JTHT, and SSQOL compared with the FES + mirror therapy group and control group (p < 0.05). We found that BF-FES + mirror therapy induced motor recovery and improved quality of life. These results suggest that mirror therapy, in combination with BF-FES, is feasible and effective for motor recovery of the upper extremities after stroke. Copyright © 2014 John Wiley & Sons, Ltd.
Generalized extreme gust wind speeds distributions
Cheng, E.; Yeung, C.
2002-01-01
Since summer 1996, US wind engineers have been using the extreme gust (or 3-s gust) as the basic wind speed to quantify the destructiveness of extreme winds. In order to better understand these destructive wind forces, it is important to know the appropriate representations of these extreme gust wind speeds. Therefore, the purpose of this study is to determine the most suitable extreme value distributions for the annual extreme gust wind speeds recorded in large selected areas. To achieve this objective, we use the generalized Pareto distribution as the diagnostic tool for determining the types of extreme gust wind speed distributions. The three-parameter generalized extreme value distribution function is thus reduced to either the Type I Gumbel, Type II Frechet or Type III reverse Weibull distribution function for the annual extreme gust wind speeds recorded at a specific site. With consideration of the quality and homogeneity of gust wind data collected at more than 750 weather stations throughout the United States, annual extreme gust wind speeds at 143 selected stations in the contiguous United States were used in the study. © 2002 Elsevier Science Ltd. All rights reserved.
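A minimal sketch of the diagnostic step described above, using synthetic annual extreme gust speeds rather than the 143-station record: fit a generalized extreme value distribution and read the fitted shape parameter to decide between the Gumbel, Frechet, and reverse Weibull forms.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
annual_gusts = stats.gumbel_r.rvs(loc=35.0, scale=5.0, size=60, random_state=rng)  # m/s, synthetic

c, loc, scale = stats.genextreme.fit(annual_gusts)
# scipy parameterizes the GEV with c = -xi: c ~ 0 -> Gumbel (Type I),
# c < 0 -> Frechet (Type II), c > 0 -> reverse Weibull (Type III).
print(f"shape c = {c:.3f}, location = {loc:.1f} m/s, scale = {scale:.1f} m/s")
print("50-year gust estimate:", round(stats.genextreme.ppf(1 - 1 / 50, c, loc, scale), 1), "m/s")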
Mobile Phones Democratize and Cultivate Next-Generation Imaging, Diagnostics and Measurement Tools
Ozcan, Aydogan
2014-01-01
In this article, I discuss some of the emerging applications and the future opportunities and challenges created by the use of mobile phones and their embedded components for the development of next-generation imaging, sensing, diagnostics and measurement tools. The massive volume of mobile phone users, which has now reached ~7 billion, drives the rapid improvements of the hardware, software and high-end imaging and sensing technologies embedded in our phones, transforming the mobile phone into a cost-effective and yet extremely powerful platform to run e.g., biomedical tests and perform scientific measurements that would normally require advanced laboratory instruments. This rapidly evolving and continuing trend will help us transform how medicine, engineering and sciences are practiced and taught globally. PMID:24647550
NONMEMory: a run management tool for NONMEM.
Wilkins, Justin J
2005-06-01
NONMEM is an extremely powerful tool for nonlinear mixed-effect modelling and simulation of pharmacokinetic and pharmacodynamic data. However, it is a console-based application whose output does not lend itself to rapid interpretation or efficient management. NONMEMory has been created to be a comprehensive project manager for NONMEM, providing detailed summary, comparison and overview of the runs comprising a given project, including the display of output data, simple post-run processing, fast diagnostic plots and run output management, complementary to other available modelling aids. Analysis time ought not to be spent on trivial tasks, and NONMEMory's role is to eliminate these as far as possible by increasing the efficiency of the modelling process. NONMEMory is freely available from http://www.uct.ac.za/depts/pha/nonmemory.php.
Carbon contamination topography analysis of EUV masks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, Y.-J.; Yankulin, L.; Thomas, P.
2010-03-12
The impact of carbon contamination on extreme ultraviolet (EUV) masks is significant due to throughput loss and potential effects on imaging performance. Current carbon contamination research primarily focuses on the lifetime of the multilayer surfaces, determined by reflectivity loss and reduced throughput in EUV exposure tools. However, contamination on patterned EUV masks can cause additional effects on absorbing features and the printed images, as well as impacting the efficiency of cleaning process. In this work, several different techniques were used to determine possible contamination topography. Lithographic simulations were also performed and the results compared with the experimental data.
First-Order SPICE Modeling of Extreme-Temperature 4H-SiC JFET Integrated Circuits
NASA Technical Reports Server (NTRS)
Neudeck, Philip G.; Spry, David J.; Chen, Liang-Yu
2016-01-01
A separate submission to this conference reports that 4H-SiC Junction Field Effect Transistor (JFET) digital and analog Integrated Circuits (ICs) with two levels of metal interconnect have reproducibly demonstrated electrical operation at 500 °C in excess of 1000 hours. While this progress expands the complexity and durability envelope of high-temperature ICs, one important area for further technology maturation is the development of reasonably accurate and accessible computer-aided modeling and simulation tools for circuit design of these ICs. Towards this end, we report on development and verification of 25 °C to 500 °C SPICE simulation models of first-order accuracy for this extreme-temperature durable 4H-SiC JFET IC technology. For maximum availability, the JFET IC modeling is implemented using the baseline-version SPICE NMOS LEVEL 1 model that is common to other variations of SPICE software and importantly includes the body-bias effect. The first-order accuracy of these device models is verified by direct comparison with measured experimental device characteristics.
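The SPICE NMOS LEVEL 1 (Shichman-Hodges) equations referenced above are simple enough to evaluate directly. The sketch below codes the generic Level 1 drain-current model, including the body-bias (threshold-shift) term; the parameter values are arbitrary placeholders, not the paper's extracted 4H-SiC JFET parameters.

import math

def level1_id(vgs, vds, vbs=0.0, vto=-2.0, kp=2e-5, w_over_l=10.0,
              lam=0.01, gamma=0.5, phi=0.7):
    """Generic SPICE LEVEL 1 drain current (A). Placeholder parameters, not fitted values."""
    # Body-bias effect on threshold voltage (expression valid for vbs <= 0).
    vth = vto + gamma * (math.sqrt(phi - vbs) - math.sqrt(phi))
    vov = vgs - vth                      # overdrive voltage
    if vov <= 0:
        return 0.0                       # cutoff
    if vds < vov:                        # linear (triode) region
        return kp * w_over_l * (vov - vds / 2) * vds * (1 + lam * vds)
    return 0.5 * kp * w_over_l * vov**2 * (1 + lam * vds)   # saturation

for vgs in (-2.0, -1.0, 0.0):
    print(vgs, f"{level1_id(vgs, vds=5.0, vbs=-1.0):.3e} A")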
Contamination Effects on EUV Optics
NASA Technical Reports Server (NTRS)
Tveekrem, J.
1999-01-01
During ground-based assembly and upon exposure to the space environment, optical surfaces accumulate both particles and molecular condensibles, inevitably resulting in degradation of optical instrument performance. Currently, this performance degradation (and the resulting end-of-life instrument performance) cannot be predicted with sufficient accuracy using existing software tools. Optical design codes exist to calculate instrument performance, but these codes generally assume uncontaminated optical surfaces. Contamination models exist which predict approximate end-of-life contamination levels, but the optical effects of these contamination levels cannot be quantified without detailed information about the optical constants and scattering properties of the contaminant. The problem is particularly pronounced in the extreme ultraviolet (EUV, 300-1,200 Å) and far ultraviolet (FUV, 1,200-2,000 Å) regimes due to a lack of data and a lack of knowledge of the detailed physical and chemical processes involved. Yet it is in precisely these wavelength regimes that accurate predictions are most important, because EUV/FUV instruments are extremely sensitive to contamination.
Extreme events in total ozone: Spatio-temporal analysis from local to global scale
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; di Rocco, Stefania; Jancso, Leonhardt M.; Peter, Thomas; Davison, Anthony C.
2010-05-01
Recently tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) have been applied for the first time in the field of stratospheric ozone research, as statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not address the internal data structure concerning extremes adequately (Rieder et al., 2010a,b). A case study the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al., 1998a,b) illustrates that tools based on extreme value theory are appropriate to identify ozone extremes and to describe the tails of the total ozone record. Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading in ozone depleting substances led to a continuous modification of column ozone in the northern hemisphere also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis of annual and seasonal mean values. Especially, the extremal analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b). Overall the extremes concept provides new information on time series properties, variability, trends and the influence of dynamics and chemistry, complementing earlier analyses focusing only on monthly (or annual) mean values. Findings described above could be proven also for the total ozone records of 5 other long-term series (Belsk, Hohenpeissenberg, Hradec Kralove, Potsdam, Uccle) showing that strong influence of atmospheric dynamics (NAO, ENSO) on total ozone is a global feature in the northern mid-latitudes (Rieder et al., 2010c). In a next step frequency distributions of extreme events are analyzed on global scale (northern and southern mid-latitudes). A specific focus here is whether findings gained through analysis of long-term European ground based stations can be clearly identified as a global phenomenon. By showing results from these three types of studies an overview of extreme events in total ozone (and the dynamical and chemical features leading to those) will be presented from local to global scales. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and A.D., Davison (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD. 
Rieder, H.E., Jancso, L., Staehelin, J., Maeder, J.A., Ribatet, Peter, T., and A.D., Davison (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998a. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998b.
EPA’s Office of Research and Development (ORD) has been developing tools and illustrative case studies for decision makers in local and regional authorities who are facing challenges of establishing resilience to extreme weather events, aging built environment and infrastru...
Disability in the upper extremity and quality of life in hand-arm vibration syndrome.
Poole, Kerry; Mason, Howard
2005-11-30
To investigate whether hand-arm vibration syndrome (HAVS) leads to disability in the upper extremity or deficit in quality of life (QoL) using validated questionnaire tools, and to establish whether these effects are related to the Stockholm Workshop Staging (SWS). This was a postal cross-sectional questionnaire study with a 50% response rate. Four hundred and forty-four males, who had been diagnosed and staged according to the SWS were sent the Disability in the Arm, Shoulder and Hand (DASH) and the SF-36v2 QoL questionnaires. HAVS cases had significantly greater DASH disability scores and reduced QoL physical and mental component scores compared to published normal values. Those HAVS cases with a presumptive diagnosis of Carpal Tunnel Syndrome(CTS) had even higher disability scores. There was a clear, linear relationship between both the DASH disability score and the physical component of the QoL and sensorineural SWS, but not with the vascular SWS. HAVS has a significant effect on an individual's perceived ability to perform everyday tasks involving the upper extremity, and their quality of life. Physical capability may be further compromised in those individuals who have a presumptive diagnosis of CTS. These findings may have important implications regarding management of the affected worker.
Dependence and risk assessment for oil prices and exchange rate portfolios: A wavelet based approach
NASA Astrophysics Data System (ADS)
Aloui, Chaker; Jammazi, Rania
2015-10-01
In this article, we propose a wavelet-based approach to accommodate the stylized facts and complex structure of financial data, caused by frequent and abrupt changes of markets and noises. Specifically, we show how the combination of both continuous and discrete wavelet transforms with traditional financial models helps improve portfolio's market risk assessment. In the empirical stage, three wavelet-based models (wavelet-EGARCH with dynamic conditional correlations, wavelet-copula, and wavelet-extreme value) are considered and applied to crude oil price and US dollar exchange rate data. Our findings show that the wavelet-based approach provides an effective and powerful tool for detecting extreme moments and improving the accuracy of VaR and Expected Shortfall estimates of oil-exchange rate portfolios after noise is removed from the original data.
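A minimal sketch, using synthetic return data, of the general denoise-then-estimate idea described above: wavelet-decompose a return series, soft-threshold the detail coefficients, reconstruct, and then compute Value-at-Risk (VaR) and Expected Shortfall (ES). This is not the authors' wavelet-EGARCH, wavelet-copula, or wavelet-extreme-value pipeline, only an illustration of combining a discrete wavelet transform with simple risk measures.

import numpy as np
import pywt

rng = np.random.default_rng(2)
returns = rng.standard_t(df=4, size=1024) * 0.01            # synthetic heavy-tailed daily returns

coeffs = pywt.wavedec(returns, "db4", level=4)               # discrete wavelet decomposition
sigma = np.median(np.abs(coeffs[-1])) / 0.6745               # noise scale from finest details
thr = sigma * np.sqrt(2 * np.log(len(returns)))              # universal threshold
denoised_coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
denoised = pywt.waverec(denoised_coeffs, "db4")[: len(returns)]

alpha = 0.05
var_95 = -np.quantile(denoised, alpha)                       # 95% VaR (loss convention)
es_95 = -denoised[denoised <= np.quantile(denoised, alpha)].mean()   # 95% Expected Shortfall
print(f"VaR(95%) = {var_95:.4f}, ES(95%) = {es_95:.4f}")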
NASA Technical Reports Server (NTRS)
Pandya, Abhilash; Maida, James; Hasson, Scott; Greenisen, Michael; Woolford, Barbara
1993-01-01
As manned exploration of space continues, analytical evaluation of human strength characteristics is critical. These extraterrestrial environments will spawn issues of human performance which will impact the designs of tools, work spaces, and space vehicles. Computer modeling is an effective method of correlating human biomechanical and anthropometric data with models of space structures and human work spaces. The aim of this study is to provide biomechanical data from isolated joints to be utilized in a computer modeling system for calculating torque resulting from any upper extremity motions: in this study, the ratchet wrench push-pull operation (a typical extravehicular activity task). Established here are mathematical relationships used to calculate maximum torque production of isolated upper extremity joints. These relationships are a function of joint angle and joint velocity.
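The abstract does not give the functional form of those torque relationships; as an illustration only, the sketch below fits a generic quadratic surface torque = f(joint angle, joint velocity) to synthetic isolated-joint measurements by ordinary least squares, which is one common way such torque-angle-velocity relationships can be encoded for use in a modeling system.

import numpy as np

rng = np.random.default_rng(3)
angle = rng.uniform(0, 120, 200)          # deg, synthetic elbow-flexion angles
velocity = rng.uniform(-180, 180, 200)    # deg/s, synthetic joint velocities
torque = 40 + 0.3 * angle - 0.002 * angle**2 - 0.05 * velocity + rng.normal(0, 2, 200)

# Quadratic design matrix: [1, a, v, a^2, v^2, a*v]
A = np.column_stack([np.ones_like(angle), angle, velocity,
                     angle**2, velocity**2, angle * velocity])
coef, *_ = np.linalg.lstsq(A, torque, rcond=None)

def predicted_torque(a, v):
    return coef @ np.array([1.0, a, v, a**2, v**2, a * v])

print(f"Predicted torque at 90 deg, -60 deg/s: {predicted_torque(90.0, -60.0):.1f} N*m")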
An application programming interface for extreme precipitation and hazard products
NASA Astrophysics Data System (ADS)
Kirschbaum, D.; Stanley, T.; Cappelaere, P. G.; Reed, J.; Lammers, M.
2016-12-01
Remote sensing data provides situational awareness of extreme events and hazards over large areas in a way that is impossible to achieve with in situ data. However, more valuable than raw data is actionable information based on user needs. This information can take the form of derived products, extraction of a subset of variables in a larger data matrix, or data processing for a specific goal. These products can then stream to the end users, who can use these data to improve local to global decision making. This presentation will outline both the science and methodology of two new data products and tools that can provide relevant climate and hazard data for response and support. The Global Precipitation Measurement (GPM) mission provides near real-time information on rain and snow around the world every thirty minutes. Through a new application programming interface (API), this data can be freely accessed by consumers to visualize, analyze, and communicate where, when and how much rain is falling worldwide. The second tool is a global landslide model that provides situational awareness of potential landslide activity in near real-time, utilizing several remotely sensed data products. This hazard information is also provided through an API and is being ingested by the emergency response community, international aid organizations, and others around the world. This presentation will highlight lessons learned through the development, implementation, and communication of these products and tools with the goal of enabling better and more effective decision making.
Kawamura, Kunio
2017-01-01
Although studies about the origin of life are a frontier in science and a number of effective approaches have been developed, drawbacks still exist. Examples include: (1) simulation of chemical evolution experiments (which were demonstrated for the first time by Stanley Miller); (2) approaches tracing back the most primitive life-like systems (on the basis of investigations of present organisms); and (3) constructive approaches for making life-like systems (on the basis of molecular biology), such as in vitro construction of the RNA world. Naturally, simulation experiments of chemical evolution under plausible ancient Earth environments have been recognized as a potentially fruitful approach. Nevertheless, simulation experiments seem not to be sufficient for identifying the scenario from molecules to life. This is because primitive Earth environments are still not clearly defined and a number of possibilities should be taken into account. In addition, such environments frequently comprise extreme conditions when compared to the environments of present organisms. Therefore, we need to realize the importance of accurate and convenient experimental approaches that use practical research tools, which are resistant to high temperature and pressure, to facilitate chemical evolution studies. This review summarizes improvements made in such experimental approaches over the last two decades, focusing primarily on our hydrothermal microflow reactor technology. Microflow reactor systems are a powerful tool for performing simulation experiments in diverse simulated hydrothermal Earth conditions in order to measure the kinetics of formation and degradation and the interactions of biopolymers. PMID:28974048
NASA Astrophysics Data System (ADS)
Mascaro, Giuseppe
2018-04-01
This study uses daily rainfall records of a dense network of 240 gauges in central Arizona to gain insights on (i) the variability of the seasonal distributions of rainfall extremes; (ii) how the seasonal distributions affect the shape of the annual distribution; and (iii) the presence of spatial patterns and orographic control for these distributions. For this aim, recent methodological advancements in peak-over-threshold analysis and application of the Generalized Pareto Distribution (GPD) were used to assess the suitability of the GPD hypothesis and improve the estimation of its parameters, while limiting the effect of short sample sizes. The distribution of daily rainfall extremes was found to be heavy-tailed (i.e., GPD shape parameter ξ > 0) during the summer season, dominated by convective monsoonal thunderstorms. The exponential distribution (a special case of GPD with ξ = 0) was instead showed to be appropriate for modeling wintertime daily rainfall extremes, mainly caused by cold fronts transported by westerly flow. The annual distribution exhibited a mixed behavior, with lighter upper tails than those found in summer. A hybrid model mixing the two seasonal distributions was demonstrated capable of reproducing the annual distribution. Organized spatial patterns, mainly controlled by elevation, were observed for the GPD scale parameter, while ξ did not show any clear control of location or orography. The quantiles returned by the GPD were found to be very similar to those provided by the National Oceanic and Atmospheric Administration (NOAA) Atlas 14, which used the Generalized Extreme Value (GEV) distribution. Results of this work are useful to improve statistical modeling of daily rainfall extremes at high spatial resolution and provide diagnostic tools for assessing the ability of climate models to simulate extreme events.
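A minimal peak-over-threshold sketch of the analysis described above, using synthetic data rather than the Arizona gauge network: fit a Generalized Pareto Distribution to daily-rainfall excesses separately for a "summer" and a "winter" season and compare the fitted shape parameters (xi > 0 indicates a heavy tail, xi near 0 an exponential tail).

import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
summer = stats.genpareto.rvs(c=0.2, scale=8.0, size=3000, random_state=rng)   # heavy-tailed, mm
winter = stats.expon.rvs(scale=6.0, size=3000, random_state=rng)              # light-tailed, mm

def fit_excesses(daily, quantile=0.95):
    u = np.quantile(daily, quantile)                 # threshold
    excesses = daily[daily > u] - u
    xi, _, scale = stats.genpareto.fit(excesses, floc=0.0)
    return u, xi, scale

for name, series in [("summer", summer), ("winter", winter)]:
    u, xi, scale = fit_excesses(series)
    print(f"{name}: threshold = {u:.1f} mm, shape xi = {xi:.2f}, scale = {scale:.1f} mm")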
Lateral position detection and control for friction stir systems
Fleming, Paul; Lammlein, David H.; Cook, George E.; Wilkes, Don Mitchell; Strauss, Alvin M.; Delapp, David R.; Hartman, Daniel A.
2012-06-05
An apparatus and computer program are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.
Lateral position detection and control for friction stir systems
Fleming, Paul [Boulder, CO; Lammlein, David H [Houston, TX; Cook, George E [Brentwood, TN; Wilkes, Don Mitchell [Nashville, TN; Strauss, Alvin M [Nashville, TN; Delapp, David R [Ashland City, TN; Hartman, Daniel A [Fairhope, AL
2011-11-08
Friction stir methods are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.
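A hedged sketch of the force-based lateral tracking idea described in the two patent abstracts above (not the patented algorithm itself): force samples taken at the two lateral extremes of the oscillation are compared, and their average imbalance is mapped through an assumed proportional gain to a lateral offset used to correct the commanded path.

```python
# Illustrative sketch only: estimate a lateral offset for a friction stir tool by comparing
# force samples at the two extremes of the lateral oscillation, then correct the path.
from dataclasses import dataclass

@dataclass
class OscillationSample:
    force_left_extreme: float    # force signal at the leftmost point of the oscillation
    force_right_extreme: float   # force signal at the rightmost point

def lateral_offset(samples, gain=0.01):
    """Return an offset estimate (mm): positive means the path sits right of the target."""
    diffs = [s.force_right_extreme - s.force_left_extreme for s in samples]
    mean_diff = sum(diffs) / len(diffs)
    return gain * mean_diff      # assumed proportional mapping from force imbalance to offset

def corrected_path_position(current_position, samples, gain=0.01):
    """Shift the commanded lateral position to reduce the estimated offset."""
    return current_position - lateral_offset(samples, gain)

samples = [OscillationSample(4800.0, 5150.0), OscillationSample(4750.0, 5100.0)]
print(f"estimated offset: {lateral_offset(samples):+.2f} mm")
print(f"corrected position: {corrected_path_position(0.0, samples):+.2f} mm")
```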
An Ensemble-Based Forecasting Framework to Optimize Reservoir Releases
NASA Astrophysics Data System (ADS)
Ramaswamy, V.; Saleh, F.
2017-12-01
The increasing frequency of extreme precipitation events is stressing the need to manage water resources on shorter timescales. Short-term management of water resources becomes proactive when inflow forecasts are available and this information can be effectively used in the control strategy. This work investigates the utility of short-term hydrological ensemble forecasts for operational decision making during extreme weather events. An advanced automated hydrologic prediction framework integrating a regional-scale hydrologic model, GIS datasets and the meteorological ensemble predictions from the European Centre for Medium-Range Weather Forecasts (ECMWF) was coupled to an implicit multi-objective dynamic programming model to optimize releases from a water supply reservoir. The proposed methodology was evaluated by retrospectively forecasting the inflows to the Oradell reservoir in the Hackensack River basin in New Jersey during the extreme hydrologic event, Hurricane Irene. Additionally, the flexibility of the forecasting framework was investigated by forecasting the inflows from a moderate rainfall event to provide important perspectives on using the framework to assist reservoir operations during moderate events. The proposed forecasting framework seeks to provide a flexible, assistive tool to alleviate the complexity of operational decision-making.
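A minimal stand-in for the decision problem described above, under stated assumptions: instead of the paper's implicit multi-objective dynamic programming, the sketch evaluates a few candidate constant releases against an ensemble of synthetic inflow forecasts with a simple storage mass balance, showing how ensemble information can rank release decisions.

```python
# Simplified reservoir-release screening against an inflow forecast ensemble (illustrative).
import numpy as np

def simulate_storage(inflows, release, s0, s_max):
    """Daily mass balance; returns (storage trace, total spill), all in Mm3."""
    storage, spill, trace = s0, 0.0, []
    for q in inflows:
        storage = max(storage + q - release, 0.0)
        if storage > s_max:
            spill += storage - s_max
            storage = s_max
        trace.append(storage)
    return np.array(trace), spill

rng = np.random.default_rng(1)
ensemble = rng.gamma(2.0, 1.5, size=(51, 7))     # 51 members x 7-day inflow forecast, Mm3/day
candidates = [0.5, 1.0, 2.0, 4.0]                # candidate constant releases, Mm3/day

for r in candidates:
    runs = [simulate_storage(member, r, s0=20.0, s_max=25.0) for member in ensemble]
    mean_spill = np.mean([spill for _, spill in runs])
    mean_end = np.mean([trace[-1] for trace, _ in runs])
    print(f"release {r:3.1f} Mm3/day: mean spill {mean_spill:5.2f}, mean end storage {mean_end:5.2f}")
```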
Stone, Brian; Hess, Jeremy J.; Frumkin, Howard
2010-01-01
Background Extreme heat events (EHEs) are increasing in frequency in large U.S. cities and are responsible for a greater annual number of climate-related fatalities, on average, than any other form of extreme weather. In addition, low-density, sprawling patterns of urban development have been associated with enhanced surface temperatures in urbanized areas. Objectives In this study, we examined the association between urban form at the level of the metropolitan region and the frequency of EHEs over a five-decade period. Methods We employed a widely published sprawl index to measure the association between urban form in 2000 and the mean annual rate of change in EHEs between 1956 and 2005. Results We found that the rate of increase in the annual number of EHEs between 1956 and 2005 in the most sprawling metropolitan regions was more than double the rate of increase observed in the most compact metropolitan regions. Conclusions The design and management of land use in metropolitan regions may offer an important tool for adapting to the heat-related health effects associated with ongoing climate change. PMID:21114000
NASA Astrophysics Data System (ADS)
Reinstorf, F.
2016-12-01
Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive water logging, with problems especially in urban areas near rivers, where water logging caused huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations, with consideration of sustainable and environmentally sound solutions based mainly on passive measures.
NASA Astrophysics Data System (ADS)
Reinstorf, Frido; Kramer, Stefanie; Koch, Thomas; Seifert, Sven; Monninkhoff, Bertram; Pfützner, Bernd
2017-04-01
Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive water logging, with problems especially in urban areas near rivers, where water logging caused huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations, with consideration of sustainable and environmentally sound solutions based mainly on passive measures.
A Short-term ESPERTA-based Forecast Tool for Moderate-to-extreme Solar Proton Events
NASA Astrophysics Data System (ADS)
Laurenza, M.; Alberti, T.; Cliver, E. W.
2018-04-01
The ESPERTA (Empirical model for Solar Proton Event Real Time Alert) forecast tool has a Probability of Detection (POD) of 63% for all >10 MeV events with proton peak intensity ≥10 pfu (i.e., ≥S1 events, S1 referring to minor storms on the NOAA Solar Radiation Storms scale), from 1995 to 2014 with a false alarm rate (FAR) of 38% and a median (minimum) warning time (WT) of ∼4.8 (0.4) hr. The NOAA space weather scale includes four additional categories: moderate (S2), strong (S3), severe (S4), and extreme (S5). As S1 events have only minor impacts on HF radio propagation in the polar regions, the effective threshold for significant space radiation effects appears to be the S2 level (100 pfu), above which both biological and space operation impacts are observed along with increased effects on HF propagation in the polar regions. We modified the ESPERTA model to predict ≥S2 events and obtained a POD of 75% (41/55) and an FAR of 24% (13/54) for the 1995–2014 interval with a median (minimum) WT of ∼1.7 (0.2) hr based on predictions made at the time of the S1 threshold crossing. The improved performance of ESPERTA for ≥S2 events is a reflection of the big flare syndrome, which postulates that the measures of the various manifestations of eruptive solar flares increase as one considers increasingly larger events.
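For readers unfamiliar with the verification scores quoted above, this tiny sketch computes POD and FAR from contingency-table counts and reproduces the >=S2 figures given in the abstract; the hit/miss/false-alarm split is taken directly from the quoted numbers.

```python
# Forecast verification scores as used in the abstract: POD and FAR from a 2x2 table.
def pod(hits, misses):
    """Probability of Detection: fraction of observed events that were predicted."""
    return hits / (hits + misses)

def far(hits, false_alarms):
    """False alarm rate as quoted above: fraction of warnings that verified as false."""
    return false_alarms / (hits + false_alarms)

# >=S2 events, 1995-2014: 41 of 55 events predicted, 13 false alarms out of 54 warnings.
hits, misses, false_alarms = 41, 55 - 41, 13
print(f"POD = {pod(hits, misses):.0%}, FAR = {far(hits, false_alarms):.0%}")  # ~75%, ~24%
```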
NASA Astrophysics Data System (ADS)
von Trentini, F.; Willkofer, F.; Wood, R. R.; Schmid, F. J.; Ludwig, R.
2017-12-01
The ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec) focuses on the effects of climate change on hydro-meteorological extreme events and their implications for water management in Bavaria and Québec. Therefore, a hydro-meteorological model chain is applied. It employs high performance computing capacity of the Leibniz Supercomputing Centre facility SuperMUC to dynamically downscale 50 members of the Global Circulation Model CanESM2 over European and Eastern North American domains using the Canadian Regional Climate Model (RCM) CRCM5. Over Europe, the unique single model ensemble is conjointly analyzed with the latest information provided through the CORDEX-initiative, to better assess the influence of natural climate variability and climatic change in the dynamics of extreme events. Furthermore, these 50 members of a single RCM will enhance extreme value statistics (extreme return periods) by exploiting the available 1500 model years for the reference period from 1981 to 2010. Hence, the RCM output is applied to drive the process based, fully distributed, and deterministic hydrological model WaSiM in high temporal (3h) and spatial (500m) resolution. WaSiM and the large ensemble are further used to derive a variety of hydro-meteorological patterns leading to severe flood events. A tool for virtual perfect prediction shall provide a combination of optimal lead time and management strategy to mitigate certain flood events following these patterns.
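A minimal sketch of the statistical benefit described above, namely pooling a 50-member single-model ensemble into roughly 1500 model years for empirical return-period estimates; the annual-maxima values and the Weibull plotting-position formula are illustrative assumptions, not ClimEx output.

```python
# Empirical return periods from a pooled large ensemble (synthetic placeholder data).
import numpy as np

rng = np.random.default_rng(42)
annual_maxima = rng.gumbel(loc=60.0, scale=15.0, size=(50, 30))  # 50 members x 30 years

pooled = np.sort(annual_maxima.ravel())[::-1]        # 1500 pooled annual maxima, descending
n = pooled.size
ranks = np.arange(1, n + 1)
return_periods = (n + 1) / ranks                     # Weibull plotting positions, in years

for T in (10, 100, 1000):
    idx = np.argmin(np.abs(return_periods - T))
    print(f"~{T:>4}-yr event: {pooled[idx]:.1f} (empirical, from {n} pooled model years)")
```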
Crop insurance evaluation in response to extreme events
NASA Astrophysics Data System (ADS)
Moriondo, Marco; Ferrise, Roberto; Bindi, Marco
2013-04-01
Crop yield insurance has been indicated as a tool to manage the uncertainties of crop yields (Sherrick et al., 2004), but the changes in crop yield variability expected in the near future should be carefully considered for a better quantitative assessment of farmers' revenue risk and insurance values under a climate change regime (Moriondo et al., 2011). From this point of view, mechanistic crop growth models coupled to the output of General/Regional Circulation Models (GCMs, RCMs) offer a valuable tool to evaluate crop responses to climatic change, and this approach has been extensively used to describe crop yield distributions in response to climatic change considering changes in both mean climate and variability. In this work, we studied the effect of a warmer climate on the crop yield distribution of durum wheat (Triticum turgidum L. subsp. durum) in order to assess the economic significance of climatic change in a risk decision context. Specifically, the outputs of 6 RCMs (Tmin, Tmax, rainfall, global radiation) (van der Linden and Mitchell, 2009) were statistically downscaled by a stochastic weather generator over eight sites across the Mediterranean basin and used to feed the crop growth model Sirius Quality. Three time slices were considered: i) the present period PP (average of the period 1975-1990, [CO2]=350 ppm), 2020 (average of the period 2010-2030, SRES scenario A1b, [CO2]=415 ppm) and 2040 (average of the period 2030-2050, SRES scenario A1b, [CO2]=480 ppm). The effect of extreme climate events (i.e. heat stress at anthesis stage) was also considered. The outputs of these simulations were used to estimate the expected payout per hectare from insurance triggered when yields fall below a specific threshold defined as "the insured yield". For each site, the threshold was calculated as a fraction (70%) of the median of the yield distribution under PP, which represents the percentage of median yield below which indemnity payments are triggered. The results indicated that when the effect of extreme events was not considered, climate change had a low or no impact on the crop yield distribution in 2020 and 2040. This resulted in an expected payout close to that observed in the present period. Conversely, simulation of the effect of extreme events strongly affected the PDFs by reducing the expected yield. This highlights that the insured yield in future projections may be overestimated when the impact of extremes is not considered, leading to distortions in the risk management of crop insurance companies. References Moriondo M, Giannakopoulos C, Bindi M (2011) Climate change impact assessment: the role of climate extremes in crop yield simulation. Clim Change 104:679-701. Sherrick BJ, Zanini FC, Schnitkey GD, Irwin SH (2004) Crop Insurance Valuation under Alternative Yield Distributions. American Journal of Agricultural Economics, 86:406-419. van der Linden P, Mitchell JFB (eds) (2009) ENSEMBLES: climate change and its impacts: summary of research and results from the ENSEMBLES project. Met Office Hadley Centre, FitzRoy Road, Exeter EX1 3PB, UK. 160 pp
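The payout logic described above can be sketched as follows; the price, the synthetic yield distributions, and the normality assumption are placeholders, while the 70%-of-median trigger mirrors the threshold definition in the abstract.

```python
# Expected indemnity payout sketch: trigger at 70% of the present-period median yield.
import numpy as np

def expected_payout(yields, insured_yield, price_per_t=200.0):
    """Mean shortfall below the insured yield (t/ha), scaled by an assumed price."""
    shortfall = np.clip(insured_yield - np.asarray(yields), 0.0, None)
    return price_per_t * shortfall.mean()                    # currency units per ha

rng = np.random.default_rng(7)
yield_pp = rng.normal(4.0, 0.8, 5000)                        # present-period yields, t/ha
yield_2040_extremes = rng.normal(3.7, 1.2, 5000)             # future yields with heat stress

insured = 0.70 * np.median(yield_pp)                         # trigger fixed on the present period
print(f"insured yield        : {insured:.2f} t/ha")
print(f"payout, present      : {expected_payout(yield_pp, insured):.1f} per ha")
print(f"payout, 2040+extremes: {expected_payout(yield_2040_extremes, insured):.1f} per ha")
```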
Synthesis and characterization of attosecond light vortices in the extreme ultraviolet
Géneaux, R.; Camper, A.; Auguste, T.; Gobert, O.; Caillat, J.; Taïeb, R.; Ruchon, T.
2016-01-01
Infrared and visible light beams carrying orbital angular momentum (OAM) are currently thoroughly studied for their extremely broad applicative prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study for the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. These breakthroughs pave the route for the study of a series of fundamental phenomena and the development of new ultrafast diagnosis tools using either photonic or electronic vortices. PMID:27573787
Possible alternatives to critical elements in coatings for extreme applications
NASA Astrophysics Data System (ADS)
Grilli, Maria Luisa; Valerini, Daniele; Piticescu, Radu Robert; Bellezze, Tiziano; Yilmaz, Mehmet; Rinaldi, Antonio; Cuesta-López, Santiago; Rizzo, Antonella
2018-03-01
Surface functionalisation and protection have long been used to improve specific properties of materials such as lubrication, water repellence and brightness, and to increase the durability of objects and tools. Among the different kinds of surface treatments used to achieve the required properties, the use of coatings is fundamental to guarantee substrate durability in harsh environments. Extreme working conditions of temperature, pressure, irradiation, wear and corrosion occur in several applications, thus very often requiring bulk material protection by means of coatings. In this study, three main classes of coatings used in extreme conditions are considered: i) hard and superhard coatings for application in machining tools, ii) coatings for high temperatures (thermal barrier coatings), and iii) coatings against corrosion. The presence of critical elements in such coatings (Cr, Y, W, Co, etc.) is analysed and the possibility of using CRM-free substitutes is reviewed. The role of multilayers and nanocomposites in tailoring coating performance is also discussed for thermal barrier and superhard coatings.
Synthesis and characterization of attosecond light vortices in the extreme ultraviolet
Géneaux, R.; Camper, A.; Auguste, T.; ...
2016-08-30
Infrared and visible light beams carrying orbital angular momentum (OAM) are currently thoroughly studied for their extremely broad applicative prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study for the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. Furthermore, these breakthroughs pave the route for the study of a series of fundamental phenomena and the development of new ultrafast diagnosis tools using either photonic or electronic vortices.
Estimation of resist sensitivity for extreme ultraviolet lithography using an electron beam
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oyama, Tomoko Gowa, E-mail: ohyama.tomoko@qst.go.jp; Oshima, Akihiro; Tagawa, Seiichi, E-mail: tagawa@sanken.osaka-u.ac.jp
2016-08-15
It is a challenge to obtain sufficient extreme ultraviolet (EUV) exposure time for fundamental research on developing a new class of high-sensitivity resists for extreme ultraviolet lithography (EUVL), because EUV exposure tools are few and very expensive. In this paper, we introduce an easy method for predicting EUV resist sensitivity by using conventional electron beam (EB) sources. If the chemical reactions induced by the two ionizing sources (EB and EUV) are the same, the required absorbed energies corresponding to each required exposure dose (sensitivity) for EB and EUV would be almost equivalent. Based on this theory, we calculated the resist sensitivities for the EUV/soft X-ray region. The estimated sensitivities were found to be comparable to the experimentally obtained sensitivities. It was concluded that EB is a very useful exposure tool that accelerates the development of new resists and sensitivity enhancement processes for 13.5 nm EUVL and 6.x nm beyond-EUVL (BEUVL).
Decision-support tools for Extreme Weather and Climate Events in the Northeast United States
NASA Astrophysics Data System (ADS)
Kumar, S.; Lowery, M.; Whelchel, A.
2013-12-01
Decision-support tools were assessed for the 2013 National Climate Assessment technical input document, "Climate Change in the Northeast, A Sourcebook". The assessment included tools designed to generate and deliver actionable information to assist states and highly populated urban and other communities in the assessment of climate change vulnerability and risk, quantification of effects, and identification of adaptive strategies in the context of adaptation planning across inter-annual, seasonal and multi-decadal time scales. State-level adaptation planning in the Northeast has generally relied on qualitative vulnerability assessments by expert panels and stakeholders, although some states have undertaken initiatives to develop statewide databases to support vulnerability assessments by urban and local governments and state agencies. The devastation caused by Superstorm Sandy in October 2012 raised awareness of the potential for extreme weather events to unprecedented levels and created urgency for action, especially in coastal urban and suburban communities that experienced pronounced impacts, particularly in New Jersey, New York and Connecticut. Planning approaches vary, but any adaptation and resiliency planning process must include the following: knowledge of the probable change in a climate variable (e.g., precipitation, temperature, sea-level rise) over time, or that the climate variable will attain a certain threshold deemed to be significant; knowledge of the intensity and frequency of climate hazards (past, current or future events or conditions with the potential to cause harm) and their relationship with climate variables; assessment of climate vulnerabilities (sensitive resources, infrastructure or populations exposed to climate-related hazards); assessment of relative risks to vulnerable resources; and identification and prioritization of adaptive strategies to address risks. Many organizations are developing decision-support tools to assist in the urban planning process by addressing some of these needs. In this paper we highlight the decision tools available today, discuss their application in selected case studies, and present a gap analysis with opportunities for innovation and future work.
ATHLETE: A Limbed Vehicle for Solar System Exploration
NASA Technical Reports Server (NTRS)
Wilcox, Brian H.
2012-01-01
As part of the Human-Robot Systems project funded by NASA, the Jet Propulsion Laboratory has developed a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. Each vehicle is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through, or at least out of, extreme terrain, the wheels and wheel actuators can be sized for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are comparable to or larger than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be about 25% lighter than a conventional mobility chassis. A side benefit of this approach is that each limb has sufficient degrees of freedom to use as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb.
Effects of visualization on algorithm comprehension
NASA Astrophysics Data System (ADS)
Mulvey, Matthew
Computer science students are expected to learn and apply a variety of core algorithms that are an essential part of the field. Any one of these algorithms by itself is not necessarily extremely complex, but remembering the large variety of algorithms and the differences between them is challenging. To address this challenge, we present a novel algorithm visualization tool designed to enhance students' understanding of Dijkstra's algorithm by allowing them to discover the rules of the algorithm for themselves. It is hoped that a deeper understanding of the algorithm will help students correctly select, adapt and apply the appropriate algorithm when presented with a problem to solve, and that what is learned here will be applicable to the design of other visualization tools designed to teach different algorithms. Our visualization tool is currently in the prototype stage, and this thesis will discuss the pedagogical approach that informs its design, as well as the results of some initial usability testing. Finally, to clarify the direction for further development of the tool, four different variations of the prototype were implemented, and the instructional effectiveness of each was assessed by having a small sample of participants use the different versions of the prototype and then take a quiz to assess their comprehension of the algorithm.
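Since the tool is built around Dijkstra's algorithm, a compact reference implementation may help anchor what the visualization is teaching; the graph, node names, and edge weights below are illustrative.

```python
# Heap-based Dijkstra's shortest-path algorithm on a small example graph.
import heapq

def dijkstra(graph, source):
    """graph: {node: [(neighbor, weight), ...]}; returns shortest distances from source."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry, already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

graph = {"A": [("B", 4), ("C", 1)], "C": [("B", 2), ("D", 5)], "B": [("D", 1)], "D": []}
print(dijkstra(graph, "A"))  # {'A': 0, 'B': 3, 'C': 1, 'D': 4}
```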
Karmakar, Sougata; Pal, Madhu Sudan; Majumdar, Deepti; Majumdar, Dhurjati
2012-01-01
Ergonomic evaluation of visual demands becomes crucial for operators/users when rapid decision making is needed under extreme time constraints, such as the navigation task of a jet aircraft. The research reported here comprises an ergonomic evaluation of a pilot's vision in a jet aircraft in a virtual environment to demonstrate how the vision analysis tools of digital human modeling software can be used effectively for such studies. Three (03) dynamic digital pilot models, representative of the smallest, average and largest Indian pilot population, were generated from an anthropometric database and interfaced with a digital prototype of the cockpit in Jack software for analysis of vision within and outside the cockpit. Vision analysis tools such as view cones, eye view windows, blind spot area, obscuration zone and reflection zone were employed during evaluation of the visual fields. The vision analysis tool was also used to study kinematic changes of the pilot's body joints during a simulated gazing activity. From the present study, it can be concluded that the vision analysis tool of digital human modeling software is very effective for evaluating the position and alignment of different displays and controls in the workstation, based upon their priorities within the visual fields and the anthropometry of the targeted users, long before the development of a physical prototype.
Parmar, Sanjay; Gandhi, Dorcas BC; Rempel, Gina Ruth; Restall, Gayle; Sharma, Monika; Narayan, Amitesh; Pandian, Jeyaraj; Naik, Nilashri; Savadatti, Ravi R; Kamate, Mahesh Appasaheb
2017-01-01
Background It is difficult to engage young children with cerebral palsy (CP) in repetitive, tedious therapy. As such, there is a need for innovative approaches and tools to motivate these children. We developed the low-cost, computer game-based rehabilitation platform CGR, which combines fine manipulation and gross movement exercises with attention and planning game activities appropriate for young children with CP. Objective The objective of this study is to provide evidence of the therapeutic value of CGR for improving upper extremity (UE) motor function in children with CP. Methods This randomized controlled, single-blind clinical trial with an active control arm will be conducted at 4 sites. Children diagnosed with CP between the ages of 4 and 10 years old with moderate UE impairments and fine motor control abnormalities will be recruited. Results We will test the difference between experimental and control groups using the Quality of Upper Extremity Skills Test (QUEST) and Peabody Developmental Motor Scales, Second Edition (PDMS-2) outcome measures. The experiences of the children's parents and the therapists with the interventions and tools will be explored through semi-structured interviews using a qualitative description approach. Conclusions This research protocol, if effective, will provide evidence for the therapeutic value and feasibility of CGR in the pediatric rehabilitation of UE function. Trial Registration Clinicaltrials.gov NCT02728375; https://clinicaltrials.gov/ct2/show/NCT02728375 (Archived by WebCite at http://www.webcitation.org/6qDjvszvh) PMID:28526673
NASA Astrophysics Data System (ADS)
Fraisse, C.; Pequeno, D.; Staub, C. G.; Perry, C.
2016-12-01
Climate variability, particularly the occurrence of extreme weather conditions such as dry spells and heat stress during sensitive crop developmental phases, can substantially increase the prospect of reduced crop yields. Yield losses or crop failure risk due to stressful weather conditions vary mainly with stress severity, exposure timing and duration. The magnitude of stress effects is also crop specific, differing in terms of thresholds and adaptation to environmental conditions. To help producers in the Southeast USA mitigate and monitor the risk of crop losses due to extreme weather events, we developed a web-based tool that evaluates the risk of extreme weather events during the season, taking into account the crop development stages. Producers can enter their plans for the upcoming season in a given field (e.g., crop, variety, planting date, acreage), optionally select a specific El Niño-Southern Oscillation (ENSO) phase, and are then presented with the probabilities (ranging from 0 to 100%) of extreme weather events occurring during sensitive phases of the growing season for the selected conditions. The phenology components of the DSSAT models CERES-Maize, CROPGRO-Soybean, CROPGRO-Cotton, and N-Wheat have been translated from FORTRAN into standalone versions in the R language. These models have been tested in collaboration with Extension faculty and producers during the 2016 season, and their usefulness for risk mitigation and monitoring evaluated. A companion AgroClimate app was also developed to help producers track and monitor phenology development during the cropping season.
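A hedged sketch of the kind of probability such a tool could report: the fraction of historical years with at least one heat-stress day inside a phenology window. The 35 °C threshold, the flowering window, and the synthetic temperature record are assumptions for illustration, not the tool's internal logic or thresholds.

```python
# Probability of an extreme (heat-stress) day during a sensitive crop stage, from history.
import numpy as np

def prob_extreme_in_window(tmax_by_year, window, threshold_c=35.0):
    """tmax_by_year: (years, 365) daily maximum temperature; window: (start_doy, end_doy)."""
    start, end = window
    in_window = tmax_by_year[:, start - 1:end]           # slice the sensitive phase
    hit = (in_window >= threshold_c).any(axis=1)         # any heat-stress day that year?
    return hit.mean()

rng = np.random.default_rng(3)
tmax = 20 + 15 * np.sin(np.linspace(0, np.pi, 365)) + rng.normal(0, 3, (30, 365))
flowering_window = (150, 170)                            # assumed DOY range for anthesis
print(f"P(heat stress during flowering) = {prob_extreme_in_window(tmax, flowering_window):.0%}")
```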
Kong, Y K; Lee, S J; Lee, K S; Kim, G R; Kim, D M
2015-10-01
Researchers have been using various ergonomic tools to study occupational musculoskeletal diseases in industrial contexts. However, in agricultural work, where the work environment is poorer and the socio-psychological stress is high due to the high labor intensities of the industry, current research efforts have been scarce, and the number of available tools is small. In our preliminary studies, which focused on a limited number of body parts and other working elements, we developed separate evaluation tools for the upper and lower extremities. The current study was conducted to develop a whole-body ergonomic assessment tool for agricultural work that integrates the existing assessment tools for lower and upper extremities developed in the preliminary studies and to verify the relevance of the integrated assessment tool. To verify the relevance of the Agricultural Whole-Body Assessment (AWBA) tool, we selected 50 different postures that occur frequently in agricultural work. Our results showed that the AWBA-determined risk levels were similar to the subjective risk levels determined by experts. In addition, as the risk level increased, the average risk level increased to a similar extent. Moreover, the differences in risk levels between the AWBA and expert assessments were mostly smaller than the differences in risk levels between other assessment tools and the expert assessments in this study. In conclusion, the AWBA tool developed in this study was demonstrated to be appropriate for use as a tool for assessing various postures commonly assumed in agricultural work. Moreover, we believe that our verification of the assessment tools will contribute to the enhancement of the quality of activities designed to prevent and control work-related musculoskeletal diseases in other industries.
Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xiu, Dongbin
2017-03-03
The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Quock, D. E. R.; Cianciarulo, M. B.; APS Engineering Support Division
2007-01-01
The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
Dowling, N Maritza; Bolt, Daniel M; Deng, Sien; Li, Chenxi
2016-05-26
Patient-reported outcome (PRO) measures play a key role in the advancement of patient-centered care research. The accuracy of inferences, relevance of predictions, and the true nature of the associations made with PRO data depend on the validity of these measures. Errors inherent to self-report measures can seriously bias the estimation of constructs assessed by the scale. A well-documented disadvantage of self-report measures is their sensitivity to response style (RS) effects such as the respondent's tendency to select the extremes of a rating scale. Although the biasing effect of extreme responding on constructs measured by self-reported tools has been widely acknowledged and studied across disciplines, little attention has been given to the development and systematic application of methodologies to assess and control for this effect in PRO measures. We review the methodological approaches that have been proposed to study extreme RS effects (ERS). We applied a multidimensional item response theory model to simultaneously estimate and correct for the impact of ERS on trait estimation in a PRO instrument. Model estimates were used to study the biasing effects of ERS on sum scores for individuals with the same amount of the targeted trait but different levels of ERS. We evaluated the effect of joint estimation of multiple scales and ERS on trait estimates and demonstrated the biasing effects of ERS on these trait estimates when used as explanatory variables. A four-dimensional model accounting for ERS bias provided a better fit to the response data. Increasing levels of ERS showed bias in total scores as a function of trait estimates. The effect of ERS was greater when the pattern of extreme responding was the same across multiple scales modeled jointly. The estimated item category intercepts provided evidence of content independent category selection. Uncorrected trait estimates used as explanatory variables in prediction models showed downward bias. A comprehensive evaluation of the psychometric quality and soundness of PRO assessment measures should incorporate the study of ERS as a potential nuisance dimension affecting the accuracy and validity of scores and the impact of PRO data in clinical research and decision making.
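To illustrate the bias mechanism discussed above without reproducing the paper's multidimensional IRT model, the toy simulation below gives two respondents the same latent trait but different extreme-response tendencies and compares their 5-point sum scores; the response rule and all numbers are simplifications made for illustration.

```python
# Toy demonstration of extreme response style (ERS) biasing sum scores on a 5-point scale.
import numpy as np

def respond(trait, ers, item_difficulties, rng):
    """Map trait minus difficulty to a 1-5 rating; with ers=1, ratings snap to the scale ends."""
    ratings = []
    for b in item_difficulties:
        base = 3 + (trait - b) + rng.normal(0, 0.5)        # graded tendency around the midpoint
        if ers and abs(base - 3) > 0.5:
            base = 5 if base > 3 else 1                    # extreme responder picks an endpoint
        ratings.append(int(np.clip(round(base), 1, 5)))
    return ratings

rng = np.random.default_rng(13)
items = np.linspace(-1, 1, 20)                             # 20 items of varying difficulty
moderate = respond(trait=0.3, ers=0, item_difficulties=items, rng=rng)
extreme = respond(trait=0.3, ers=1, item_difficulties=items, rng=rng)
print("sum score, moderate responder:", sum(moderate))
print("sum score, extreme responder :", sum(extreme))      # same trait, different sum score
```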
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Davison, A. C.
2009-04-01
Various generations of satellites (e.g. TOMS, GOME, OMI) have made spatial datasets of column ozone available to the scientific community. This study has a special focus on column ozone over the northern mid-latitudes. Tools from geostatistics and extreme value theory are applied to analyze variability, long-term trends and the frequency distributions of extreme events in total ozone. In a recent case study (Rieder et al., 2009), new tools from extreme value theory (Coles, 2001; Ribatet, 2007) were applied to the world's longest total ozone record, from Arosa, Switzerland (e.g. Staehelin 1998a,b), in order to describe extreme events in low and high total ozone. Within the current study this analysis is extended to satellite datasets for the northern mid-latitudes. Further, special emphasis is given to patterns and spatial correlations and to the influence of changes in atmospheric dynamics (e.g. tropospheric and lower stratospheric pressure systems) on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
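One simple extreme-value diagnostic in the spirit of the analysis above is a GEV fit to annual minima of total ozone (the mini-hole side), sketched below on a synthetic record; the record length, distributional assumptions, and the 20-year return period are illustrative, and the cited work additionally uses peak-over-threshold modeling.

```python
# GEV fit to annual minima of a synthetic total-ozone series (low-ozone extremes).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
daily_ozone = 330 + 40 * rng.standard_normal((60, 365))    # Dobson units, synthetic 60-year record

annual_min = daily_ozone.min(axis=1)                       # one low-ozone extreme per year
c, loc, scale = stats.genextreme.fit(-annual_min)          # GEV fitted to the negated minima

# 20-year return level for low total ozone: the level fallen below once in 20 years on average.
low_20yr = -stats.genextreme.ppf(1 - 1 / 20, c, loc=loc, scale=scale)
print(f"estimated 20-yr low-ozone level: {low_20yr:.0f} DU")
```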
Test drilling in basalts, Lalamilo area, South Kohala District, Hawaii
Teasdale, Warren E.
1980-01-01
Test drilling has determined that a downhole-percussion airhammer can be used effectively to drill basalts in Hawaii. When used in conjunction with a foam-type drilling fluid, the hammer-bit penetration rate was rapid. Continuous drill cuttings from the materials penetrated were obtained throughout the borehole except from extremely fractured or weathered basalt zones where circulation was lost or limited. Cementing of these zones as soon as encountered reduced problems of stuck tools, washouts, and loss of drill-cuttings. Supplies and logistics on the Hawaiian Islands, always a major concern, require that all anticipated drilling supplies, spare rig and tool parts, drilling muds and additives, foam, and miscellaneous hardware be on hand before starting to drill. If not, the resulting rig downtime is costly in both time and money. (USGS)
SoFAST: Automated Flare Detection with the PROBA2/SWAP EUV Imager
NASA Astrophysics Data System (ADS)
Bonte, K.; Berghmans, D.; De Groof, A.; Steed, K.; Poedts, S.
2013-08-01
The Sun Watcher with Active Pixels and Image Processing (SWAP) EUV imager onboard PROBA2 provides a non-stop stream of coronal extreme-ultraviolet (EUV) images at a cadence of typically 130 seconds. These images show the solar drivers of space weather, such as flares and erupting filaments. We have developed a software tool that automatically processes the images and localises and identifies flares. On one hand, the output of this software tool is intended as a service to the Space Weather Segment of ESA's Space Situational Awareness (SSA) program. On the other hand, we consider the PROBA2/SWAP images as a model for the data from the Extreme Ultraviolet Imager (EUI) instrument prepared for the future Solar Orbiter mission, where onboard intelligence is required for prioritising data within the challenging telemetry quota. In this article we present the concept of the software, the first statistics on its effectiveness, and the real-time online display of its results. Our results indicate that it is not only possible to detect EUV flares automatically in an acquired dataset, but that quantifying a range of EUV dynamics is also possible. The method is based on thresholding of macropixelled image sequences. The robustness and simplicity of the algorithm is a clear advantage for future onboard use.
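The thresholding-of-macropixelled-image-sequences idea can be sketched as follows; the macropixel size, threshold factor, baseline window, and synthetic frames are assumptions for illustration and are not SoFAST's actual parameters.

```python
# Macropixel thresholding sketch: rebin frames and flag sudden localized brightenings.
import numpy as np

def macropixel(frame, bin_size=32):
    """Rebin a 2-D frame into bin_size x bin_size macropixels (mean intensity)."""
    h, w = frame.shape
    f = frame[:h - h % bin_size, :w - w % bin_size]
    return f.reshape(f.shape[0] // bin_size, bin_size,
                     f.shape[1] // bin_size, bin_size).mean(axis=(1, 3))

def detect_flares(frames, bin_size=32, factor=3.0, baseline_len=10):
    """Flag frame indices whose brightest macropixel exceeds factor x the recent median peak."""
    peaks = np.array([macropixel(f, bin_size).max() for f in frames])
    flagged = []
    for i in range(baseline_len, len(peaks)):
        baseline = np.median(peaks[i - baseline_len:i])
        if peaks[i] > factor * baseline:
            flagged.append(i)
    return flagged

rng = np.random.default_rng(9)
frames = [rng.poisson(50, (256, 256)).astype(float) for _ in range(20)]
frames[15][96:136, 160:200] += 4000.0          # inject a localized EUV brightening
print("flare candidates at frame indices:", detect_flares(frames))  # expect [15]
```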
NASA Astrophysics Data System (ADS)
Hirano, Ryoichi; Iida, Susumu; Amano, Tsuyoshi; Watanabe, Hidehiro; Hatakeyama, Masahiro; Murakami, Takeshi; Suematsu, Kenichi; Terao, Kenji
2016-03-01
Novel projection electron microscope optics have been developed and integrated into a new inspection system named EBEYE-V30 ("Model EBEYE" is an EBARA model code), and the resulting system shows promise for application to half-pitch (hp) 16-nm node extreme ultraviolet lithography (EUVL) patterned mask inspection. To improve the system's inspection throughput for 11-nm hp generation defect detection, a new electron-sensitive area image sensor with a high-speed data processing unit, a bright and stable electron source, and an image capture area deflector that operates simultaneously with the mask scanning motion have been developed. A learning system has been used with the mask inspection tool to meet the requirements of hp 11-nm node EUV patterned mask inspection. Defects are identified by the projection electron microscope system using the "defectivity" derived from the characteristics of the acquired image. The learning system has been developed to reduce the labor and costs associated with adjusting the detection capability to cope with newly defined mask defects. We describe the integration of the developed elements into the inspection tool and the verification of the designed specification. We have also verified the effectiveness of the learning system, which shows enhanced detection capability for the hp 11-nm node.
Eltoukhy, Moataz; Kelly, Adam; Kim, Chang-Young; Jun, Hyung-Pil; Campbell, Richard; Kuenze, Christopher
2016-01-01
Cost-effective, quantifiable assessment of lower extremity movement represents a potential improvement over standard tools for evaluating injury risk. Ten healthy participants completed three trials of a drop jump, overhead squat, and single leg squat task. Peak hip and knee kinematics were assessed using an 8-camera BTS Smart 7000DX motion analysis system and the Microsoft Kinect® camera system. The agreement and consistency between both uncorrected and corrected Kinect kinematic variables and the BTS camera system were assessed using intraclass correlation coefficients. Peak sagittal plane kinematics measured using the Microsoft Kinect® camera system explained a significant amount of variance [Range(hip) = 43.5-62.8%; Range(knee) = 67.5-89.6%] in peak kinematics measured using the BTS camera system. Across tasks, peak knee flexion angle and peak hip flexion were found to be consistent and in agreement when the Microsoft Kinect® camera system was directly compared to the BTS camera system, but these values were improved following application of a corrective factor. The Microsoft Kinect® may not be an appropriate surrogate for traditional motion analysis technology, but it may have potential applications as a real-time feedback tool in pathological or high injury risk populations.
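A minimal sketch of what a corrective factor can look like in practice: a linear mapping from Kinect-measured peak angles to the laboratory system fitted by least squares, with variance explained and RMSE before and after correction. The data are synthetic and the linear form is an assumption, not the paper's published correction.

```python
# Linear correction of Kinect peak knee flexion toward a reference camera system (synthetic).
import numpy as np

rng = np.random.default_rng(11)
bts = rng.normal(90, 15, 30)                       # reference peak knee flexion, degrees
kinect = 0.8 * bts + 5 + rng.normal(0, 6, 30)      # Kinect underestimates, with noise

slope, intercept = np.polyfit(kinect, bts, 1)      # fitted corrective factor
kinect_corrected = slope * kinect + intercept

r2 = np.corrcoef(kinect, bts)[0, 1] ** 2
rmse_before = np.sqrt(np.mean((kinect - bts) ** 2))
rmse_after = np.sqrt(np.mean((kinect_corrected - bts) ** 2))
print(f"R^2 = {r2:.2f}, RMSE before = {rmse_before:.1f} deg, after = {rmse_after:.1f} deg")
```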
Theoretical model for plasmonic photothermal response of gold nanostructures solutions
NASA Astrophysics Data System (ADS)
Phan, Anh D.; Nga, Do T.; Viet, Nguyen A.
2018-03-01
Photothermal effects of gold core-shell nanoparticles and nanorods dispersed in water are theoretically investigated using the transient bioheat equation and the extended Mie theory. Properly calculating the absorption cross section is a crucial step in determining the elevation of the solution temperature. The nanostructures are assumed to be randomly and uniformly distributed in the solution. Compared with previous experiments, our calculated temperature increase during laser illumination shows reasonable qualitative and quantitative agreement across various systems. This approach can be a highly reliable tool to predict photothermal effects in experimentally unexplored structures. We also validate our approach and discuss its limitations.
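A lumped (zero-dimensional) stand-in for the calculation chain above, under stated assumptions: the absorbed power follows from an assumed absorption cross section, particle concentration, and laser intensity, and a simple heat-loss coefficient replaces the full transient bioheat equation. All parameter values are illustrative.

```python
# Lumped photothermal heating: rho*c*V dT/dt = P_abs - h (T - T_amb), integrated explicitly.
def temperature_rise(c_abs_m2, n_per_m3, intensity_w_m2, volume_m3,
                     t_end_s=600.0, dt=0.1, h_w_per_k=2e-3, t_amb=25.0):
    rho_c = 4.18e6                                             # water, J m^-3 K^-1 (volumetric)
    p_abs = n_per_m3 * volume_m3 * c_abs_m2 * intensity_w_m2   # absorbed laser power, W
    T = t_amb
    for _ in range(int(t_end_s / dt)):
        dT = (p_abs - h_w_per_k * (T - t_amb)) / (rho_c * volume_m3)
        T += dT * dt
    return T

# Assumed values: cross section 1e-14 m^2, 1e15 particles/m^3, 2 W/cm^2 laser, 1 mL cuvette.
T_final = temperature_rise(c_abs_m2=1e-14, n_per_m3=1e15, intensity_w_m2=2e4, volume_m3=1e-6)
print(f"solution temperature after 10 min of illumination: {T_final:.1f} C")
```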
Integration of UAV photogrammetry and SPH modelling of fluids to study runoff on real terrains.
Barreiro, Anxo; Domínguez, Jose M; C Crespo, Alejandro J; González-Jorge, Higinio; Roca, David; Gómez-Gesteira, Moncho
2014-01-01
Roads can experience runoff problems due to the intense rain discharge associated with severe storms. Two advanced tools are combined to analyse the interaction of complex water flows with real terrains. UAV (Unmanned Aerial Vehicle) photogrammetry is employed to obtain accurate topographic information on small areas, typically on the order of a few hectares. The Smoothed Particle Hydrodynamics (SPH) technique is applied by means of the DualSPHysics model to compute the trajectory of the water flow during extreme rain events. The use of engineering solutions to mitigate flood events is also analysed. The case study simulates how the collected water can flow onto a nearby road and how precautionary measures can be effective in draining water under extreme conditions. The amount of water arriving at the road is calculated under different protection scenarios, and the efficiency of a ditch is observed to decrease when sedimentation reduces its depth.
Integration of UAV Photogrammetry and SPH Modelling of Fluids to Study Runoff on Real Terrains
Barreiro, Anxo; Domínguez, Jose M.; C. Crespo, Alejandro J.; González-Jorge, Higinio; Roca, David; Gómez-Gesteira, Moncho
2014-01-01
Roads can experience runoff problems due to the intense rain discharge associated with severe storms. Two advanced tools are combined to analyse the interaction of complex water flows with real terrains. UAV (Unmanned Aerial Vehicle) photogrammetry is employed to obtain accurate topographic information on small areas, typically on the order of a few hectares. The Smoothed Particle Hydrodynamics (SPH) technique is applied by means of the DualSPHysics model to compute the trajectory of the water flow during extreme rain events. The use of engineering solutions to mitigate flood events is also analysed. The case study simulates how the collected water can flow onto a nearby road and how precautionary measures can be effective in draining water under extreme conditions. The amount of water arriving at the road is calculated under different protection scenarios, and the efficiency of a ditch is observed to decrease when sedimentation reduces its depth. PMID:25372035
Ruggedized downhole tool for real-time measurements and uses thereof
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hess, Ryan Falcone; Lindblom, Scott C.; Yelton, William G.
The present invention relates to ruggedized downhole tools and sensors, as well as uses thereof. In particular, these tools can operate under extreme conditions and, therefore, allow for real-time measurements in geothermal reservoirs or other potentially harsh environments. One exemplary sensor includes a ruggedized ion selective electrode (ISE) for detecting tracer concentrations in real-time. In one embodiment, the ISE includes a solid, non-conductive potting material and an ion selective material, which are disposed in a temperature-resistant electrode body. Other electrode configurations, tools, and methods are also described.
NASA Technical Reports Server (NTRS)
Norris, Jeffrey S.; Powell, Mark W.; Fox, Jason M.; Crockett, Thomas M.; Joswig, Joseph C.
2009-01-01
Cliffbot Maestro permits teleoperation of remote rovers for field testing in extreme environments. The application user interface provides two sets of tools for operations: stereo image browsing and command generation.
Machinability of Stellite 6 hardfacing
NASA Astrophysics Data System (ADS)
Benghersallah, M.; Boulanouar, L.; Le Coz, G.; Devillez, A.; Dudzinski, D.
2010-06-01
This paper reports some experimental findings concerning the machinability at high cutting speed of nickel-base weld-deposited hardfacings for the manufacture of hot tooling. The forging work involves extreme impacts, forces, stresses and temperatures; thus, mould dies must be extremely resistant. The aim of the project is to create a rapid prototyping process suited to forging conditions, integrating a Stellite 6 hardfacing deposited by the PTA process. This study addresses the dry machining of the hardfacing, using a machining tool with two tips and a high-speed milling machine equipped with a Wattpilote power consumption recorder. The aim is to assess the machinability of the hardfacing, measuring the power consumption and the tip wear by optical microscopy and white-light interferometry, using different strategies and cutting conditions.
Mucci, Viviana
2018-01-01
Chest ultrasonography (CU) is a noninvasive imaging technique able to provide an immediate diagnosis of the underlying aetiology of acute respiratory failure and traumatic chest injuries. Given recent technological advances, it is now possible to perform accurate CU in remote and adverse environments, including the combat field, extreme sport settings, and environmental disasters, as well as during space missions. Today, the use of CU in the extreme emergency setting is increasingly likely, as this technique has proved to be a fast diagnostic tool to assist resuscitation manoeuvres and interventional procedures in many cases. A scientific literature review is presented here, based on a systematic search of the published literature in the following online databases: PubMed and Scopus. The following words were used: “chest sonography,” “thoracic ultrasound,” and “lung sonography,” in different combinations with “extreme sport,” “extreme environment,” “wilderness,” “catastrophe,” and “extreme conditions.” This manuscript reports the most relevant uses of CU in the extreme setting, as well as technological improvements and current limitations. CU application in the extreme setting is further encouraged here. PMID:29736195
Transportation Resilience Tools from the U.S. Department of Transportation
NASA Astrophysics Data System (ADS)
Snow, C.; Rodehorst, B.; Miller, R.; Choate, A.; Hyman, R.; Kafalenos, R.; Beucler, B.
2014-12-01
The U.S. Department of Transportation (U.S. DOT) and ICF International have been working to develop tools and resources to help state departments of transportation (DOTs) and metropolitan planning organizations (MPOs) prepare for the impacts of climate change. U.S. DOT recently released a set of climate change and extreme weather tools for state DOTs and MPOs that address key challenges they have faced in increasing their climate change resilience. The tools were developed under the U.S. DOT Gulf Coast Study, Phase 2. The CMIP Climate Data Processing Tool provides an easy way for users to gather and process downscaled climate model data at the local level, and "translates" that data into information relevant to transportation engineers and planners. The Vulnerability Assessment Scoring Tool (VAST), provides a step-by-step approach for users to assess their vulnerability to climate change in a transparent, cost-effective way. The Transportation Climate Change Sensitivity Matrix provides detailed information on how 11 different climate stressors may affect transportation infrastructure and operations. These tools significantly advance the state of the practice for transportation agencies to respond to climate change impacts, and beta-versions have been used successfully by several state DOTs and MPOs. This presentation will focus on these tools, examples of how they can be applied within transportation agencies, and opportunities to apply the lessons learned from the tools—or even the tools themselves—beyond the transportation sector, including as part of the national Climate Resilience Toolkit.
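As a hedged illustration of indicator-based screening in the spirit of tools like VAST (not its actual scoring rules), the sketch below combines exposure, sensitivity, and adaptive-capacity ratings into a composite score for ranking assets; the weights, rating scale, and asset names are assumptions.

```python
# Indicator-based vulnerability screening sketch for transportation assets (illustrative).
def vulnerability_score(exposure, sensitivity, adaptive_capacity, weights=(0.4, 0.4, 0.2)):
    """Ratings on a 1-4 scale; higher exposure/sensitivity raise the score, capacity lowers it."""
    w_e, w_s, w_a = weights
    return w_e * exposure + w_s * sensitivity + w_a * (5 - adaptive_capacity)

assets = {
    "coastal highway segment": (4, 3, 2),   # (exposure, sensitivity, adaptive capacity)
    "inland bus depot":        (2, 2, 3),
}
ranked = sorted(assets.items(), key=lambda kv: -vulnerability_score(*kv[1]))
for name, (e, s, a) in ranked:
    print(f"{name:<24} score = {vulnerability_score(e, s, a):.1f}")
```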
Computational Failure Modeling of Lower Extremities
2012-01-01
Attacks on vehicles with improvised explosives cause severe injuries to the lower extremities, including bone fracture, ligament tear, and muscle rupture. While these injuries may seem well defined through medical imaging, the injury process itself is difficult to observe directly. Computational modeling offers a powerful tool to explore the insult-to-injury process with high resolution, particularly when studying a complex dynamic process such as this.
NASA Astrophysics Data System (ADS)
Vogt, S.; Neumayer, F. F.; Serkyov, I.; Jesner, G.; Kelsch, R.; Geile, M.; Sommer, A.; Golle, R.; Volk, W.
2017-09-01
Steel is the most common material used in vehicle chassis, which makes its research an important topic for the automotive industry. Recently developed ultra-high-strength steels (UHSS) provide extreme tensile strengths of up to 1,500 MPa and combine great crashworthiness with good weight reduction potential. However, in order to reach the final shape of sheet metal parts, additional cutting steps such as trimming and piercing are often required. The final trimming of quenched metal sheets presents a huge challenge to a conventional process, mainly because of the required extreme cutting force. The high cutting impact, due to the material's brittleness, causes excessive tool wear or even sudden tool failure. Therefore, a laser is commonly used for the cutting process, which is time- and energy-consuming. The purpose of this paper is to demonstrate the capability of a conventional blanking tool design in a continuous-stroke piercing process using boron steel 22MnB5 sheets. Two different types of tool steel were tested for their suitability as active cutting elements: the electro-slag remelted (ESR) cold work tool steel Bohler K340 ISODUR and the powder-metallurgical (PM) high-speed steel Bohler S390 MICROCLEAN. An FEM study provided information about an optimized punch design that withstands buckling under high cutting forces. The wear behaviour of the process was assessed by the tool wear of the active cutting elements as well as the quality of the cut surfaces.
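A back-of-the-envelope sketch of why the cutting force is extreme for press-hardened 22MnB5: blanking force can be estimated as cut-line length times sheet thickness times shear strength, with shear strength approximated as about 0.8 times tensile strength. The 0.8 factor, hole diameter, and thickness are illustrative assumptions, not values from the paper.

```python
# Rough blanking/piercing force estimate: F = cut length x thickness x (0.8 x UTS).
import math

def blanking_force_kn(cut_length_mm, thickness_mm, uts_mpa, shear_factor=0.8):
    """Returns the estimated cutting force in kN (MPa x mm^2 gives N)."""
    return cut_length_mm * thickness_mm * shear_factor * uts_mpa / 1000.0

# Piercing a 30 mm diameter hole in 1.5 mm sheet: quenched 22MnB5 (~1500 MPa) vs mild steel.
perimeter = math.pi * 30.0
print(f"22MnB5    : {blanking_force_kn(perimeter, 1.5, 1500):.0f} kN")
print(f"mild steel: {blanking_force_kn(perimeter, 1.5, 350):.0f} kN")
```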
Ion beam deposition system for depositing low defect density extreme ultraviolet mask blanks
NASA Astrophysics Data System (ADS)
Jindal, V.; Kearney, P.; Sohn, J.; Harris-Jones, J.; John, A.; Godwin, M.; Antohe, A.; Teki, R.; Ma, A.; Goodwin, F.; Weaver, A.; Teora, P.
2012-03-01
Extreme ultraviolet lithography (EUVL) is the leading next-generation lithography (NGL) technology to succeed optical lithography at the 22 nm node and beyond. EUVL requires a low-defect-density reflective mask blank, which is considered to be one of the top two critical technology gaps for commercialization of the technology. At the SEMATECH Mask Blank Development Center (MBDC), research on defect reduction in EUV mask blanks is being pursued using the Veeco Nexus deposition tool. The defect performance of this tool is one of the factors limiting the availability of defect-free EUVL mask blanks. SEMATECH identified the key components in the ion beam deposition system that are currently impeding the reduction of defect density and the yield of EUV mask blanks. SEMATECH's current research is focused on in-house tool components to reduce their contributions to mask blank defects. SEMATECH is also working closely with the supplier to incorporate this learning into a next-generation deposition tool. This paper will describe requirements for the next-generation tool that are essential to realize low-defect-density EUV mask blanks. The goal of our work is to enable model-based predictions of defect performance and defect improvement for targeted process improvement and component learning to feed into the new deposition tool design. This paper will also highlight the defect reduction resulting from process improvements, and it will outline the restrictions inherent in the current tool geometry and components that are an impediment to meeting HVM-quality EUV mask blanks.
NASA Astrophysics Data System (ADS)
al Aamery, N. M. H.; Mahoney, D. T.; Fox, J.
2017-12-01
Future climate change projections suggest extreme impacts on watershed hydrologic systems in some regions of the world, including pronounced increases in surface runoff and instream flows. Yet there remains a lack of research on how future changes in hydrologic extremes, as well as changes in hydrologic means, affect sediment redistribution within a watershed and sediment flux from a watershed. The authors hypothesized that changes in hydrologic means and extremes may shift the balance between depositional and erosional dominance of sediment in a manner that may not be obvious to the watershed manager. Therefore, the objective of this study was to investigate the processes through which the combined effects of extreme climate change projections on vegetation, upland erosion, and instream processes produce changes in sediment redistribution within watersheds. To do so, the authors simulated sediment processes in forecast and hindcast periods for a lowland watershed system. Publicly available climate realizations from several climate factors and the Soil and Water Assessment Tool (SWAT) were used to predict hydrologic conditions for the South Elkhorn Watershed in central Kentucky, USA, to 2050. The simulated extreme and mean hydrological components were used to simulate upland erosion with consideration of connectivity processes, and thereafter to build and simulate instream erosion and deposition processes with consideration of the surface fine-grained lamina (SFGL) layer controlling the benthic ecosystem. The results are used to suggest the dominance of erosional or depositional redistribution of sediment under different scenarios associated with extreme and mean hydrologic forecasts. The results are discussed in reference to the benthic ecology of the stream system, providing insight into how water managers might consider sediment redistribution in a changing climate.
A Protocol for Safe Lithiation Reactions Using Organolithium Reagents
Gau, Michael R.; Zdilla, Michael J.
2016-01-01
Organolithium reagents are powerful tools in the synthetic chemist's toolbox. However, the extreme pyrophoric nature of the most reactive reagents warrants proper technique, thorough training, and proper personal protective equipment. To aid in the training of researchers using organolithium reagents, a thorough, step-by-step protocol for the safe and effective use of tert-butyllithium on an inert gas line or within a glovebox is described. As a model reaction, the preparation of lithium tert-butylamide by the reaction of tert-butylamine with one equivalent of tert-butyllithium is presented. PMID:27911386
Photonic nonlinearities via quantum Zeno blockade.
Sun, Yu-Zhu; Huang, Yu-Ping; Kumar, Prem
2013-05-31
Realizing optical-nonlinear effects at a single-photon level is a highly desirable but also extremely challenging task, because of both fundamental and practical difficulties. We present an avenue to surmounting these difficulties by exploiting quantum Zeno blockade in nonlinear optical systems. Considering specifically a lithium-niobate microresonator, we find that a deterministic phase gate can be realized between single photons with near-unity fidelity. Supported by established techniques for fabricating and operating such devices, our approach can provide an enabling tool for all-optical applications in both classical and quantum domains.
NASA Technical Reports Server (NTRS)
Naghipour, P.; Pineda, E. J.; Arnold, S.
2014-01-01
Lightning is a major cause of damage in laminated composite aerospace structures during flight. Due to the dielectric nature of carbon fiber reinforced polymers (CFRPs), the high energy induced by a lightning strike transforms into extreme, localized surface temperature accompanied by a high-pressure shockwave, resulting in extensive damage. It is crucial to develop a numerical tool capable of predicting the damage induced by a lightning strike to supplement extremely expensive lightning experiments. Delamination is one of the most significant failure modes resulting from a lightning strike. It can extend well beyond the visible damage zone and requires sophisticated techniques and equipment to detect. A popular technique used to model delamination is the cohesive zone approach. Since the loading induced by a lightning strike event is assumed to consist of extreme localized heating, the cohesive zone formulation should additionally account for temperature effects. However, the sensitivity to this dependency remains unknown. Therefore, the major focus of this work is to investigate the importance of this dependency by defining various temperature dependency profiles for the cohesive zone properties and analyzing the corresponding delamination area. Thus, a detailed numerical model consisting of multidirectional composite plies with temperature-dependent cohesive elements in between is subjected to lightning loading (an excessive amount of heat and pressure), and delamination/damage expansion is studied under specified conditions.
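The cohesive zone idea described above can be illustrated with a minimal temperature-dependent bilinear traction-separation law; the degradation profile and property values below are assumptions for illustration, not the parameters used in this work.

```python
# Minimal sketch of a bilinear cohesive traction-separation law with a
# temperature-dependent knockdown on strength and fracture energy.
# The profile f(T) and all property values are illustrative assumptions.
import numpy as np

def knockdown(T, T_ref=20.0, T_melt=300.0):
    """Linear degradation: 1 at T_ref, 0 at an assumed softening temperature."""
    return float(np.clip((T_melt - T) / (T_melt - T_ref), 0.0, 1.0))

def traction(delta, T, sigma_max0=60.0e6, G_c0=500.0):
    """Traction [Pa] vs opening delta [m] at temperature T [deg C]."""
    f = knockdown(T)
    sigma_max = f * sigma_max0            # degraded interfacial strength
    G_c = f * G_c0                        # degraded fracture energy [J/m^2]
    if sigma_max == 0.0:
        return 0.0
    delta_f = 2.0 * G_c / sigma_max       # final opening (triangle area = G_c)
    delta_0 = 0.01 * delta_f              # damage-initiation opening (assumed)
    if delta <= delta_0:
        return sigma_max * delta / delta_0                           # elastic branch
    if delta < delta_f:
        return sigma_max * (delta_f - delta) / (delta_f - delta_0)   # softening
    return 0.0                                                       # fully debonded

print(traction(1e-6, 25.0), traction(1e-6, 250.0))   # cooler vs hotter interface
```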
NASA Astrophysics Data System (ADS)
Zeng, X. H.; Xue, P.; Wang, D.; Ni, D. R.; Xiao, B. L.; Ma, Z. Y.
2018-07-01
The effect of processing parameters on material flow and defect formation during friction stir welding (FSW) was investigated on 6.0-mm-thick 2014Al-T6 rolled plates with an artificially thickened oxide layer on the butt surface as the marker material. It was found that the "S" line in the stir zone (SZ) rotated with the pin and stayed on the retreating side (RS) and advancing side (AS) at low and high heat inputs, respectively. When the tool rotation rate was extremely low, the oxide layer under the pin moved to the RS first and then to the AS perpendicular to the welding direction, rather than rotating with the pin. The material flow was driven by the shear stresses produced by the forces at the pin-workpiece interface. With increases of the rotation rate, the depth of the shoulder-affected zone (SAZ) first decreased and then increased due to the decreasing shoulder friction force and increasing heat input. Insufficient material flow appeared in the whole of the SZ at low rotation rates and in the bottom of the SZ at high rotation rates, resulting in the formation of the "S" line. The extremely inadequate material flow is the reason for the lack of penetration and the kissing bonds in the bottom of the SZ at extremely low and low rotation rates, respectively.
Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.
2005-01-01
Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.
Pasanen, Kati; Krosshaug, Tron; Vasankari, Tommi; Kannus, Pekka; Heinonen, Ari; Kujala, Urho M; Avela, Janne; Perttunen, Jarmo; Parkkari, Jari
2018-01-01
Background/aim Poor frontal plane knee control can manifest as increased dynamic knee valgus during athletic tasks. The purpose of this study was to investigate the association between frontal plane knee control and the risk of acute lower extremity injuries. In addition, we wanted to study if the single-leg squat (SLS) test can be used as a screening tool to identify athletes with an increased injury risk. Methods A total of 306 basketball and floorball players participated in the baseline SLS test and a 12-month injury registration follow-up. Acute lower extremity time-loss injuries were registered. Frontal plane knee projection angles (FPKPA) during the SLS were calculated using a two-dimensional video analysis. Results Athletes displaying a high FPKPA were 2.7 times more likely to sustain a lower extremity injury (adjusted OR 2.67, 95% CI 1.23 to 5.83) and 2.4 times more likely to sustain an ankle injury (OR 2.37, 95% CI 1.13 to 4.98). There was no statistically significant association between FPKPA and knee injury (OR 1.49, 95% CI 0.56 to 3.98). The receiver operating characteristic curve analyses indicated poor combined sensitivity and specificity when FPKPA was used as a screening test for lower extremity injuries (area under the curve of 0.59) and ankle injuries (area under the curve of 0.58). Conclusions Athletes displaying a large FPKPA in the SLS test had an elevated risk of acute lower extremity and ankle injuries. However, the SLS test is not sensitive and specific enough to be used as a screening tool for future injury risk. PMID:29387448
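For readers unfamiliar with the statistics reported here, the following sketch shows, on synthetic data, how an odds ratio from a dichotomized FPKPA and a screening AUC would be computed; it is not the study's analysis code and the numbers are illustrative.

```python
# Illustrative sketch (synthetic data): odds ratio from a 2x2 table of
# high/low knee projection angle vs injury, plus ROC AUC when the continuous
# angle is used as a screening score.
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
fpkpa = rng.normal(10, 8, 306)                        # frontal plane angles (deg)
p_injury = 1 / (1 + np.exp(-(-2.0 + 0.05 * fpkpa)))   # weak simulated association
injured = rng.binomial(1, p_injury)

high = fpkpa > np.median(fpkpa)                       # dichotomize at the median
a = np.sum(high & (injured == 1)); b = np.sum(high & (injured == 0))
c = np.sum(~high & (injured == 1)); d = np.sum(~high & (injured == 0))
odds_ratio = (a * d) / (b * c)
se_log_or = np.sqrt(1/a + 1/b + 1/c + 1/d)
ci = np.exp(np.log(odds_ratio) + np.array([-1.96, 1.96]) * se_log_or)

print(f"OR = {odds_ratio:.2f}, 95% CI {ci[0]:.2f}-{ci[1]:.2f}")
print(f"AUC = {roc_auc_score(injured, fpkpa):.2f}")   # near 0.5-0.6: poor screening
```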
NASA Astrophysics Data System (ADS)
Abaurrea, J.; Asín, J.; Cebrián, A. C.
2018-02-01
The occurrence of extreme heat events in maximum and minimum daily temperatures is modelled using a non-homogeneous common Poisson shock process. It is applied to five Spanish locations, representative of the most common climates over the Iberian Peninsula. The model is based on an excess-over-threshold approach and distinguishes three types of extreme events: only in maximum temperature, only in minimum temperature, and in both of them (simultaneous events). It takes into account the dependence between the occurrence of extreme events in both temperatures, and its parameters are expressed as functions of time and temperature-related covariates. The fitted models allow us to characterize the occurrence of extreme heat events and to compare their evolution in the different climates during the observed period. This model is also a useful tool for obtaining local projections of the occurrence rate of extreme heat events under climate change conditions, using the future downscaled temperature trajectories generated by Earth System Models. The projections for 2031-60 under scenarios RCP4.5, RCP6.0 and RCP8.5 are obtained and analysed using the trajectories from four Earth System Models which have successfully passed a preliminary control analysis. Different graphical tools and summary measures of the projected daily intensities are used to quantify the effect of climate change on a local scale. A high increase in the occurrence of extreme heat events, mainly in July and August, is projected in all the locations, all types of event and all three scenarios, although in 2051-60 the increase is higher under RCP8.5. However, relevant differences are found between the evolution in the different climates and the types of event, with an especially high increase in the simultaneous ones.
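A greatly simplified stand-in for this type of occurrence model (not the authors' common Poisson shock process) is a Poisson regression of daily exceedance counts on a time trend and seasonal harmonics, sketched below on synthetic data.

```python
# Simplified stand-in: Poisson GLM for daily counts of extreme-heat exceedances
# with a linear time trend and a seasonal harmonic as covariates (synthetic data).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
days = np.arange(365 * 30)                         # 30 years of synthetic days
doy = days % 365
trend = days / 3650.0
rate = np.exp(-4.0 + 0.3 * trend + 1.5 * np.cos(2 * np.pi * (doy - 200) / 365))
counts = rng.poisson(rate)                         # synthetic exceedance counts

X = sm.add_constant(np.column_stack([
    trend,
    np.cos(2 * np.pi * doy / 365),
    np.sin(2 * np.pi * doy / 365),
]))
model = sm.GLM(counts, X, family=sm.families.Poisson()).fit()
print(model.params)    # intercept, trend, and seasonal-harmonic coefficients
```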
The diagnostic management of upper extremity deep vein thrombosis: A review of the literature.
Kraaijpoel, Noémie; van Es, Nick; Porreca, Ettore; Büller, Harry R; Di Nisio, Marcello
2017-08-01
Upper extremity deep vein thrombosis (UEDVT) accounts for 4% to 10% of all cases of deep vein thrombosis. UEDVT may present with localized pain, erythema, and swelling of the arm, but may also be detected incidentally by diagnostic imaging tests performed for other reasons. Prompt and accurate diagnosis is crucial to prevent pulmonary embolism and long-term complications such as the post-thrombotic syndrome of the arm. Unlike the diagnostic management of deep vein thrombosis (DVT) of the lower extremities, which is well established, the work-up of patients with clinically suspected UEDVT remains uncertain, with limited evidence from studies of small size and poor methodological quality. Currently, only one prospective study has evaluated the use of an algorithm, similar to the one used for DVT of the lower extremities, for the diagnostic work-up of clinically suspected UEDVT. The algorithm combined clinical probability assessment, D-dimer testing, and ultrasonography, and appeared to safely and effectively exclude UEDVT. However, before recommending its use in routine clinical practice, external validation of this strategy and improvements in efficiency are needed, especially in high-risk subgroups in whom the performance of the algorithm appeared to be suboptimal, such as hospitalized or cancer patients. In this review, we critically assess the accuracy and efficacy of current diagnostic tools and provide clinical guidance for the diagnostic management of clinically suspected UEDVT. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Lavers, David A.; Pappenberger, Florian; Richardson, David S.; Zsoter, Ervin
2016-11-01
In winter, heavy precipitation and floods along the west coasts of midlatitude continents are largely caused by intense water vapor transport (integrated vapor transport (IVT)) within the atmospheric river of extratropical cyclones. This study builds on previous findings that showed that forecasts of IVT have higher predictability than precipitation, by applying and evaluating the European Centre for Medium-Range Weather Forecasts Extreme Forecast Index (EFI) for IVT in ensemble forecasts during three winters across Europe. We show that the IVT EFI is more able (than the precipitation EFI) to capture extreme precipitation in forecast week 2 during forecasts initialized in a positive North Atlantic Oscillation (NAO) phase; conversely, the precipitation EFI is better during the negative NAO phase and at shorter leads. An IVT EFI example for storm Desmond in December 2015 highlights its potential to identify upcoming hydrometeorological extremes, which may prove useful to the user and forecasting communities.
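The EFI is commonly written as EFI = (2/pi) * the integral over p in (0,1) of (p - Ff(p)) / sqrt(p(1-p)) dp, where Ff(p) is the fraction of ensemble members falling below the model-climate quantile q_p. A minimal numerical sketch on synthetic IVT-like data is given below; it is an illustration of that commonly quoted form, not ECMWF's operational code.

```python
# Minimal numerical sketch of the Extreme Forecast Index (EFI) on synthetic data.
import numpy as np

def efi(ensemble, climatology, n_p=99):
    p = np.linspace(0.01, 0.99, n_p)              # avoid the endpoints 0 and 1
    q_p = np.quantile(climatology, p)             # model-climate quantiles
    Ff = np.array([np.mean(ensemble < q) for q in q_p])
    integrand = (p - Ff) / np.sqrt(p * (1.0 - p))
    dp = np.diff(p)                               # trapezoidal integration
    return (2.0 / np.pi) * np.sum(0.5 * (integrand[1:] + integrand[:-1]) * dp)

rng = np.random.default_rng(2)
clim = rng.gamma(2.0, 50.0, 20 * 365)             # synthetic IVT climatology
ens = rng.gamma(2.0, 80.0, 51)                    # anomalously moist 51-member forecast
print(round(efi(ens, clim), 2))                   # positive => unusually high IVT
```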
Extreme Weather Events and Climate Change Attribution
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thomas, Katherine
A report from the National Academies of Sciences, Engineering, and Medicine concludes it is now possible to estimate the influence of climate change on some types of extreme events. The science of extreme event attribution has advanced rapidly in recent years, giving new insight into the ways that human-caused climate change can influence the magnitude or frequency of some extreme weather events. This report examines the current state of the science of extreme weather attribution, and identifies ways to move the science forward to improve attribution capabilities. Confidence is strongest in attributing types of extreme events that are influenced by climate change through a well-understood physical mechanism, such as the more frequent heat waves that are closely connected to human-caused global temperature increases, the report finds. Confidence is lower for other types of events, such as hurricanes, whose relationship to climate change is more complex and less understood at present. For any extreme event, the results of attribution studies hinge on how questions about the event's causes are posed, and on the data, modeling approaches, and statistical tools chosen for the analysis.
Routine versus Catastrophic Influences on the Developing Child
Odgers, Candice L.; Jaffee, Sara R.
2014-01-01
Exposure to toxic stress accelerates the wear and tear on children's developing bodies and leaves a lasting mark on adult health. Prior research has focused mainly on children exposed to extreme forms of adversity, such as maltreatment and extreme neglect. However, repeated exposure to less severe but often chronic stressors is likely to play as large a role, if not larger, in forecasting children's future mental and physical health. New tools from neuroscience, biology, epigenetics, and the social sciences are helping to isolate when and how the foundations for adult health are shaped by childhood experiences. We are now in a position to understand how adversity, in both extreme and more mundane forms, contributes to the adult health burden and to identify features in children's families and environments that can be strengthened to buffer the effects of toxic stressors. We are now positioned to develop and implement innovative approaches to child policy and practice that are rooted in an understanding of how exposure to toxic stressors can become biologically embedded. The stage is set for the creation of new interventions, on both grand and micro scales, to reduce previously intractable health disparities. PMID:23297656
2009-06-01
visualisation tool. These tools are currently in use at the Surveillance and Control Training Unit (SACTU) in Williamtown, New South Wales, and the School...itself by facilitating the brevity and sharpness of learning points. The playback of video and audio was considered an extremely useful method of...The task assessor’s comments were supported by wall projections and audio replays of relevant mission segments that were controlled by an AAR
Michelle F. Tacconelli; Edward F. Loewenstein
2012-01-01
Natural resource managers must often balance multiple objectives on a single property. When these objectives are seemingly conflicting, the manager's job can be extremely difficult and complex. This paper presents a decision support tool, designed to aid land managers in optimizing wildlife habitat needs while accomplishing additional objectives such as ecosystem...
Aerts, Bas R; Kuijer, P Paul; Beumer, Annechien; Eygendaal, Denise; Frings-Dresen, Monique H
2018-04-17
To test a 17-item questionnaire, the WOrk-Related Questionnaire for UPper extremity disorders (WORQ-UP), for dimensionality of the items (factor analysis) and internal consistency. Cross-sectional study. Outpatient clinic. A consecutive sample of patients (N=150) consisting of all new referral patients (either from a general physician or other hospital) who visited the orthopedic outpatient clinic because of an upper extremity musculoskeletal disorder. Not applicable. Number and dimensionality of the factors in the WORQ-UP. Four factors with eigenvalues (EVs) >1.0 were found. The factors were named exertion, dexterity, tools & equipment, and mobility. The EVs of the factors were, respectively, 5.78, 2.38, 1.81, and 1.24. The factors together explained 65.9% of the variance. The Cronbach alpha values for these factors were, respectively, .88, .74, .87, and .66. The 17 items of the WORQ-UP represent 4 factors (exertion, dexterity, tools & equipment, and mobility) with good internal consistency. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Lindsey, Rebecca; Goldman, Nir; Fried, Laurence
2017-06-01
Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first-principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions for atomic interactions by mapping forces arising from short DFT trajectories onto simple Chebyshev polynomial series. We examine the importance of including greater-than-two-body interactions and model transferability to different state points, and discuss approaches to ensure smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
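A minimal sketch of the fitting step, using synthetic pair forces in place of DFT data, is shown below; the distance domain, polynomial degree, and force model are illustrative choices, not those of the authors.

```python
# Illustrative sketch: least-squares fit of a pair force F(r), sampled here
# from a synthetic model standing in for DFT forces, onto a Chebyshev series
# over a finite distance domain [r_min, r_max].
import numpy as np
from numpy.polynomial import chebyshev as C

r_min, r_max = 1.0, 5.0                          # sampled distance domain (Angstrom)
rng = np.random.default_rng(3)
r = rng.uniform(r_min, r_max, 2000)
f_true = 12.0 / r**7 - 6.0 / r**4                # stand-in for DFT-derived pair forces
f_noisy = f_true + rng.normal(0, 0.01, r.size)

x = 2.0 * (r - r_min) / (r_max - r_min) - 1.0    # map distances onto [-1, 1]
coeffs = C.chebfit(x, f_noisy, deg=12)           # fitted Chebyshev coefficients

r_test = np.linspace(r_min, r_max, 5)
x_test = 2.0 * (r_test - r_min) / (r_max - r_min) - 1.0
print(np.round(C.chebval(x_test, coeffs), 4))    # model forces at test distances
# Outside [r_min, r_max] the polynomial is unconstrained, which is why the
# abstract stresses enforcing reasonable model shape beyond the sampled domain.
```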
Mercury monohalides: suitability for electron electric dipole moment searches.
Prasannaa, V S; Vutha, A C; Abe, M; Das, B P
2015-05-08
Heavy polar diatomic molecules are the primary tools for searching for the T-violating permanent electric dipole moment of the electron (eEDM). Valence electrons in some molecules experience extremely large effective electric fields due to relativistic interactions. These large effective electric fields are crucial to the success of polar-molecule-based eEDM search experiments. Here we report on the results of relativistic ab initio calculations of the effective electric fields in a series of molecules that are highly sensitive to an eEDM, the mercury monohalides (HgF, HgCl, HgBr, and HgI). We study the influence of the halide anions on the effective electric field E_eff, and identify HgBr and HgI as attractive candidates for future electric dipole moment search experiments.
Demography and Public Health Emergency Preparedness: Making the Connection
Katz, Rebecca
2009-01-01
The tools and techniques of population sciences are extremely relevant to the discipline of public health emergency preparedness: protecting and securing the population’s health requires information about that population. While related fields such as security studies have successfully integrated demographic tools into their research and literature, the theoretical and practical connection between the methods of demography and the practice of public health emergency preparedness is weak. This article suggests the need to further the interdisciplinary use of demography by examining the need for a systematic use of population science techniques in public health emergency preparedness. Ultimately, we demonstrate how public health emergency preparedness can incorporate demography to develop more effective preparedness plans. Important policy implications emerge: demographers and preparedness experts need to collaborate more formally in order to facilitate community resilience and mitigate the consequences of public health emergencies. PMID:20694030
Using geographical information systems and cartograms as a health service quality improvement tool.
Lovett, Derryn A; Poots, Alan J; Clements, Jake T C; Green, Stuart A; Samarasundera, Edgar; Bell, Derek
2014-07-01
Disease prevalence can be spatially analysed to provide support for service implementation and health care planning; these analyses often display geographic variation. A key challenge is to communicate these results to decision makers, with variable levels of Geographic Information Systems (GIS) knowledge, in a way that represents the data and allows for comprehension. The present research describes the combination of established GIS methods and software tools to produce a novel technique for visualising disease admissions, helping to prevent misinterpretation of data and suboptimal decision making. The aim of this paper is to provide a tool that supports the ability of decision makers and service teams within health care settings to develop services more efficiently and better cater to the population; this tool combines information on the position of populations, the size of populations and the severity of disease. A standard choropleth of the study region, London, is used to visualise total emergency admission values for Chronic Obstructive Pulmonary Disease and bronchiectasis using ESRI's ArcGIS software. Population estimates of the Lower Super Output Areas (LSOAs) are then used with the ScapeToad cartogram software tool, with the aim of visualising geography at uniform population density. An interpolation surface, in this case ArcGIS' spline tool, allows the creation of a smooth surface over the LSOA centroids for admission values on both standard and cartogram geographies. The final product of this research is the novel Cartogram Interpolation Surface (CartIS). The method provides a series of outputs culminating in the CartIS, applying an interpolation surface to a uniform population density. The cartogram effectively equalises the population density to remove visual bias from areas with a smaller population, while maintaining contiguous borders. CartIS decreases the number of extreme positive values, not present in the underlying data, that can be found in interpolation surfaces. This methodology provides a technique for combining simple GIS tools to create a novel output, CartIS, in a health service context, with the key aim of improving visualisation communication techniques that highlight variation in small-scale geographies across large regions. CartIS more faithfully represents the data than interpolation, and visually highlights areas of extreme value more than cartograms, when either is used in isolation. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
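As an open-source analogue of the interpolation step (not the authors' ArcGIS workflow), the sketch below builds a smooth admissions surface over synthetic small-area centroids with SciPy; on a cartogram, the same call would simply use population-equalized centroid coordinates.

```python
# Open-source analogue of the interpolation step: a smooth admissions surface
# over synthetic LSOA-like centroids using SciPy's griddata (cubic method).
import numpy as np
from scipy.interpolate import griddata

rng = np.random.default_rng(4)
centroids = rng.uniform(0, 100, (500, 2))          # small-area centroids (km)
admissions = rng.poisson(20, 500).astype(float)    # admissions per area

gx, gy = np.mgrid[0:100:200j, 0:100:200j]          # regular evaluation grid
surface = griddata(centroids, admissions, (gx, gy), method="cubic")
print(np.nanmin(surface), np.nanmax(surface))      # NaN outside the convex hull
# On a cartogram geography, dense urban areas are no longer visually
# under-weighted because centroid coordinates are population-equalized.
```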
Kakuda, Tsuneo; Shojo, Hideki; Tanaka, Mayumi; Nambiar, Phrabhakaran; Minaguchi, Kiyoshi; Umetsu, Kazuo; Adachi, Noboru
2016-01-01
Mitochondrial DNA (mtDNA) serves as a powerful tool for exploring matrilineal phylogeographic ancestry, as well as for analyzing highly degraded samples, because of its polymorphic nature and high copy numbers per cell. The recent advent of complete mitochondrial genome sequencing has led to improved techniques for phylogenetic analyses based on mtDNA, and many multiplex genotyping methods have been developed for the hierarchical analysis of phylogenetically important mutations. However, few high-resolution multiplex genotyping systems for analyzing East-Asian mtDNA can be applied to extremely degraded samples. Here, we present a multiplex system for analyzing mitochondrial single nucleotide polymorphisms (mtSNPs), which relies on a novel amplified product-length polymorphisms (APLP) method that uses inosine-flapped primers and is specifically designed for the detailed haplogrouping of extremely degraded East-Asian mtDNAs. We used fourteen 6-plex polymerase chain reactions (PCRs) and subsequent electrophoresis to examine 81 haplogroup-defining SNPs and 3 insertion/deletion sites, and we were able to securely assign the studied mtDNAs to relevant haplogroups. Our system requires only 1×10−13 g (100 fg) of crude DNA to obtain a full profile. Owing to its small amplicon size (<110 bp), this new APLP system was successfully applied to extremely degraded samples for which direct sequencing of hypervariable segments using mini-primer sets was unsuccessful, and proved to be more robust than conventional APLP analysis. Thus, our new APLP system is effective for retrieving reliable data from extremely degraded East-Asian mtDNAs. PMID:27355212
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; di Rocco, Stefania; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.
2010-05-01
Tools from geostatistics and extreme value theory are applied to analyze spatial correlations in total ozone for the southern mid-latitudes. The dataset used in this study is the NIWA-assimilated total ozone dataset (Bodeker et al., 2001; Müller et al., 2008). Recently, new tools from extreme value theory (Coles, 2001; Ribatet, 2007) have been applied to the world's longest total ozone record from Arosa, Switzerland (e.g. Staehelin et al., 1998a,b) and 5 other long-term ground-based stations to describe extreme events in low and high total ozone (Rieder et al., 2010a,b,c). Excursions in the frequency of extreme events reveal "fingerprints" of dynamical factors, such as ENSO or NAO, and chemical factors, such as cold Arctic vortex ozone losses, as well as major volcanic eruptions of the 20th century (e.g. Gunung Agung, El Chichón, Mt. Pinatubo). Furthermore, atmospheric loading of ozone-depleting substances leads to a continuous modification of column ozone in the northern hemisphere, also with respect to extreme values (partly again in connection with polar vortex contributions). It is shown that application of extreme value theory allows the identification of many more such fingerprints than conventional time series analysis based on annual and seasonal mean values. In particular, the analysis shows the strong influence of dynamics, revealing that even moderate ENSO and NAO events have a discernible effect on total ozone (Rieder et al., 2010b,c). In the current study, patterns in spatial correlation and frequency distributions of extreme events (extreme low and high total ozone events, ELOs and EHOs) are studied for the southern mid-latitudes. We analyze whether the "fingerprints" found in the northern hemisphere also occur in the southern mid-latitudes, and new insights into spatial patterns of total ozone for the southern mid-latitudes are presented. The influence of changes in atmospheric dynamics (e.g. tropospheric and lower stratospheric pressure systems, ENSO), as well as of major volcanic eruptions (e.g. Mt. Pinatubo) and ozone-depleting substances (ODS), on column ozone over the southern mid-latitudes is analyzed for the period 1979-2007. References: Bodeker, G.E., Scott, J.C., Kreher, K., and McKenzie, R.L.: Global ozone trends in potential vorticity coordinates using TOMS and GOME intercompared against the Dobson network: 1978-1998, J. Geophys. Res., 106 (D19), 23029-23042, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Müller, R., Grooß, J.-U., Lemmen, C., Heinze, D., Dameris, M., and Bodeker, G.: Simple measures of ozone depletion in the polar stratosphere, Atmos. Chem. Phys., 8, 251-264, 2008. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD.
Rieder, H.E., Jancso, L.M., Staehelin, J., Maeder, J.A., Ribatet, M., Peter, T., and Davison, A.D. (2010): Extreme events in total ozone over the northern mid-latitudes: A case study based on long-term data sets from 5 ground-based stations, in preparation. Staehelin, J., Kegel, R., and Harris, N.R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
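A generic peaks-over-threshold sketch in the spirit of the extreme value analysis described in the abstract above (not the authors' code) fits a generalized Pareto distribution to excesses over a high threshold and derives a return level; the data below are synthetic.

```python
# Generic peaks-over-threshold sketch: fit a GPD to total-ozone excesses over
# a high threshold and estimate a return level (synthetic daily data).
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
ozone = rng.normal(300, 30, 365 * 29)             # synthetic daily column ozone (DU)
u = np.quantile(ozone, 0.95)                      # high threshold (EHO-style definition)
excess = ozone[ozone > u] - u

shape, _, scale = stats.genpareto.fit(excess, floc=0)
rate = excess.size / ozone.size                   # exceedance probability per day

def return_level(T_years, npy=365):
    m = T_years * npy                             # return period expressed in days
    return u + (scale / shape) * ((m * rate) ** shape - 1.0)

print(f"u={u:.1f} DU, shape={shape:.3f}, 20-yr return level={return_level(20):.1f} DU")
```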
Kim, TaeHoon; Kim, SeongSik; Lee, ByoungHee
2016-03-01
The purpose of this study was to investigate whether action observational training (AOT) plus brain-computer interface-based functional electrical stimulation (BCI-FES) has a positive influence on motor recovery of the paretic upper extremity in patients with stroke. This was a hospital-based, randomized controlled trial with a blinded assessor. Thirty patients with a first-time stroke were randomly allocated to one of two groups: the BCI-FES group (n = 15) and the control group (n = 15). The BCI-FES group received AOT plus BCI-FES for the paretic upper extremity five times per week for 4 weeks, while both groups received conventional therapy. The primary outcomes were the Fugl-Meyer Assessment of the Upper Extremity, Motor Activity Log (MAL), Modified Barthel Index and range of motion of the paretic arm. A blinded assessor evaluated the outcomes at baseline and 4 weeks. Baseline outcomes did not differ significantly between the two groups. After 4 weeks, the Fugl-Meyer Assessment of the Upper Extremity sub-items (total, shoulder and wrist), MAL (Amount of Use and Quality of Movement), Modified Barthel Index and wrist flexion range of motion were significantly higher in the BCI-FES group (p < 0.05). AOT plus BCI-based FES is effective in paretic arm rehabilitation, improving upper extremity performance. The motor improvements suggest that AOT plus BCI-based FES can be used as a therapeutic tool for stroke rehabilitation. The limitations of the study are that subjects had a certain limited level of upper arm function and the sample size was comparatively small; hence, it is recommended that future large-scale trials consider stratified and larger populations according to upper arm function. Copyright © 2015 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Leta, O. T.; El-Kadi, A. I.; Dulaiova, H.
2016-12-01
Extreme events, such as flooding and drought, are expected to occur at increased frequencies worldwide due to climate change influencing the water cycle. This is particularly critical for tropical islands, where local freshwater resources are very sensitive to climate. This study examined the impact of climate change on extreme streamflow, reservoir water volume and outflow for the Nuuanu watershed, using the Soil and Water Assessment Tool (SWAT) model. Based on the sensitive parameters screened by the Latin Hypercube-One-factor-At-a-Time (LH-OAT) method, SWAT was calibrated and validated to daily streamflow using the SWAT Calibration and Uncertainty Program (SWAT-CUP) at three streamflow gauging stations. Results showed that SWAT adequately reproduced the observed daily streamflow hydrographs at all stations. This was verified with Nash-Sutcliffe Efficiency values in the acceptable range of 0.58 to 0.88, with more than 90% of observations bracketed within the 95% model prediction uncertainty interval for both calibration and validation periods, signifying the potential applicability of SWAT for future prediction. The climate change impact on extreme flows, reservoir water volume and outflow was assessed under the Representative Concentration Pathway 4.5 and 8.5 scenarios. We found wide changes in extreme peak and low flows, ranging from -44% to 20% and -50% to -2%, respectively, compared to baseline. Consequently, the amount of water stored in Nuuanu reservoir is projected to decrease by up to 27%, while the corresponding outflow rates are expected to decrease by up to 37% relative to the baseline. In addition, the stored water and extreme flows are highly sensitive to rainfall change when compared to temperature and solar radiation changes. It is concluded that the decrease in extreme low and peak flows can have serious consequences, such as flooding and drought, with detrimental effects on riparian ecological functioning. This study's results are expected to aid in reservoir operation as well as in identifying appropriate climate change adaptation strategies.
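The Nash-Sutcliffe Efficiency used to judge the daily streamflow calibration can be computed as below; the flow values are hypothetical.

```python
# Minimal sketch of the Nash-Sutcliffe Efficiency (NSE) for observed vs
# simulated streamflow; values are hypothetical.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - np.mean(obs)) ** 2)

obs = np.array([1.2, 3.4, 2.8, 5.1, 4.0, 2.2])   # observed daily flows (m3/s)
sim = np.array([1.0, 3.1, 3.0, 4.6, 4.3, 2.5])   # SWAT-simulated flows (m3/s)
print(round(nse(obs, sim), 2))   # 1 is perfect; values above ~0.5 are often acceptable
```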
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, G; Qin, A; Zhang, J
Purpose: With the implementation of Cone-Beam Computed Tomography (CBCT) in proton treatment, we introduce a quick and effective tool to verify the patient's daily setup and geometry changes based on the Water-Equivalent-Thickness Projection Image (WETPI) from individual beam angles. Methods: A bilateral head and neck cancer (HNC) patient previously treated via VMAT was used in this study. The patient received 35 daily CBCTs during the treatment course, with no significant weight change. The CT numbers of the daily CBCTs were corrected by mapping the CT numbers from the simulation CT via Deformable Image Registration (DIR). An IMPT plan was generated using 4-field IMPT robust optimization (3.5% range and 3 mm setup uncertainties) with beam angles of 60, 135, 300, and 225 degrees. WETPIs within the CTV through all beam directions were calculated. A 3%/3mm gamma index (GI) was used to provide a quantitative comparison between the initial sim-CT and the mapped daily CBCT. To simulate an extreme case involving human error, a couch bar was manually inserted in front of the 225-degree beam for one CBCT, and the WETPI was compared in this scenario. Results: The average GI passing rate for this patient across beam angles throughout the treatment course was 91.5 ± 8.6. In the cases with a low passing rate, it was found that differences in the shoulder and neck angle, as well as the head rest, often caused major deviations. This indicates that the greatest challenge in treating HNC is the setup around the neck area. In the extreme case where a couch bar was accidentally inserted in the beam line, the GI passing rate dropped from 95 to 52. Conclusion: WETPI and quantitative gamma analysis give clinicians, therapists and physicists quick feedback on the patient's setup accuracy and geometry changes. The tool could effectively avoid some human errors. Furthermore, this tool could potentially be used as an initial signal to trigger plan adaptation.
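A simplified global 2D gamma evaluation in the spirit of the 3%/3mm comparison above can be sketched as follows; this brute-force version is illustrative, not a clinical implementation.

```python
# Simplified global 2D gamma (3%/3mm) sketch for comparing two WETPI-like images.
import numpy as np

def gamma_pass_rate(ref, ev, spacing=1.0, dd=0.03, dta=3.0):
    """ref, ev: 2D arrays; spacing in mm; dd relative to the global max of ref."""
    dose_tol = dd * ref.max()
    win = int(np.ceil(dta / spacing))              # search radius in pixels
    ny, nx = ref.shape
    gamma = np.full(ref.shape, np.inf)
    for j in range(ny):
        for i in range(nx):
            j0, j1 = max(0, j - win), min(ny, j + win + 1)
            i0, i1 = max(0, i - win), min(nx, i + win + 1)
            jj, ii = np.mgrid[j0:j1, i0:i1]
            dist2 = ((jj - j) ** 2 + (ii - i) ** 2) * spacing ** 2
            dose2 = (ev[j0:j1, i0:i1] - ref[j, i]) ** 2
            gamma[j, i] = np.sqrt((dist2 / dta**2 + dose2 / dose_tol**2).min())
    return 100.0 * np.mean(gamma <= 1.0)           # percent of points passing

ref = np.tile(np.linspace(0, 1, 64), (64, 1))      # synthetic WET projection
ev = ref + np.random.default_rng(6).normal(0, 0.01, ref.shape)
print(round(gamma_pass_rate(ref, ev), 1))
```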
NASA Astrophysics Data System (ADS)
Kuleshov, Y.; Jones, D.; Spillman, C. M.
2012-04-01
Climate change and climate extremes have a major impact on Australia and Pacific Island countries. Of particular concern are tropical cyclones and extreme ocean temperatures, the former being the most destructive events for terrestrial systems, while the latter has the potential to devastate ocean ecosystems through coral bleaching. As a practical response to climate change, under the Pacific-Australia Climate Change Science and Adaptation Planning program (PACCSAP), we are developing enhanced web-based information tools for providing seasonal forecasts of climatic extremes in the Western Pacific. Tropical cyclones are the most destructive weather systems that impact coastal areas. Interannual variability in the intensity and distribution of tropical cyclones is large, and presently greater than any trends ascribable to climate change. In a warming environment, predicting tropical cyclone occurrence from historical relationships is increasingly difficult, with predictors such as sea surface temperatures (SSTs) now frequently lying outside the range of past variability, meaning that it is often not possible to find historical analogues for the seasonal conditions faced by Pacific countries. Elevated SSTs are the primary trigger for mass coral bleaching events, which can lead to widespread damage and mortality on reef systems. Degraded coral reefs present many problems, including long-term loss of tourism and potential loss or degradation of fisheries. The monitoring and prediction of thermal stress events enables the support of a range of adaptive and management activities that could improve reef resilience to extreme conditions. Using the climate model POAMA (Predictive Ocean-Atmosphere Model for Australia), we aim to improve the accuracy of seasonal forecasts of tropical cyclone activity and extreme SSTs for the regions of the Western Pacific. Improved knowledge of extreme climatic events, with the assistance of tailored forecast tools, will help enhance the resilience and adaptive capacity of Australia and Pacific Island countries under climate change. Acknowledgement: The research discussed in this paper was conducted with the support of PACCSAP, supported by AusAID and the Department of Climate Change and Energy Efficiency and delivered by the Bureau of Meteorology and CSIRO.
Using synchrotron light to accelerate EUV resist and mask materials learning
NASA Astrophysics Data System (ADS)
Naulleau, Patrick; Anderson, Christopher N.; Baclea-an, Lorie-Mae; Denham, Paul; George, Simi; Goldberg, Kenneth A.; Jones, Gideon; McClinton, Brittany; Miyakawa, Ryan; Mochi, Iacopo; Montgomery, Warren; Rekawa, Seno; Wallow, Tom
2011-03-01
As commercialization of extreme ultraviolet lithography (EUVL) progresses, direct industry activities are being focused on near term concerns. The question of long term extendibility of EUVL, however, remains crucial given the magnitude of the investments yet required to make EUVL a reality. Extendibility questions are best addressed using advanced research tools such as the SEMATECH Berkeley microfield exposure tool (MET) and actinic inspection tool (AIT). Utilizing Lawrence Berkeley National Laboratory's Advanced Light Source facility as the light source, these tools benefit from the unique properties of synchrotron light enabling research at nodes generations ahead of what is possible with commercial tools. The MET for example uses extremely bright undulator radiation to enable a lossless fully programmable coherence illuminator. Using such a system, resolution enhancing illuminations achieving k1 factors of 0.25 can readily be attained. Given the MET numerical aperture of 0.3, this translates to an ultimate resolution capability of 12 nm. Using such methods, the SEMATECH Berkeley MET has demonstrated resolution in resist to 16-nm half pitch and below in an imageable spin-on hard mask. At a half pitch of 16 nm, this material achieves a line-edge roughness of 2 nm with a correlation length of 6 nm. These new results demonstrate that the observed stall in ultimate resolution progress in chemically amplified resists is a materials issue rather than a tool limitation. With a resolution limit of 20-22 nm, the CAR champion from 2008 remains as the highest performing CAR tested to date. To enable continued advanced learning in EUV resists, SEMATECH has initiated a plan to implement a 0.5 NA microfield tool at the Advanced Light Source synchrotron facility. This tool will be capable of printing down to 8-nm half pitch.
Tool Integration Framework for Bio-Informatics
2007-04-01
Java NetBeans [11] based Integrated Development Environment (IDE) for developing modules and packaging computational tools. The framework is extremely...integrate an Eclipse front-end for Desktop Integration. Eclipse was chosen over Netbeans owing to a higher acceptance, better infrastructure...5.0. This version of Dashboard ran with NetBeans IDE 3.6 requiring Java Runtime 1.4 on a machine with Windows XP. The toolchain is executed by
WEC Design Response Toolbox v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey
2016-03-30
The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
Software for Dosage Individualization of Voriconazole for Immunocompromised Patients
VanGuilder, Michael; Donnelly, J. Peter; Blijlevens, Nicole M. A.; Brüggemann, Roger J. M.; Jelliffe, Roger W.; Neely, Michael N.
2013-01-01
The efficacy of voriconazole is potentially compromised by considerable pharmacokinetic variability. There are increasing insights into voriconazole concentrations that are safe and effective for treatment of invasive fungal infections. Therapeutic drug monitoring is increasingly advocated. Software to aid in the individualization of dosing would be an extremely useful clinical tool. We developed software to enable the individualization of voriconazole dosing to attain predefined serum concentration targets. The process of individualized voriconazole therapy was based on concepts of Bayesian stochastic adaptive control. Multiple-model dosage design with feedback control was used to calculate dosages that achieved desired concentration targets with maximum precision. The performance of the software program was assessed using the data from 10 recipients of an allogeneic hematopoietic stem cell transplant (HSCT) receiving intravenous (i.v.) voriconazole. The program was able to model the plasma concentrations with a high level of precision, despite the wide range of concentration trajectories and interindividual pharmacokinetic variability. The voriconazole concentrations predicted after the last dosages were largely concordant with those actually measured. Simulations provided an illustration of the way in which the software can be used to adjust dosages of patients falling outside desired concentration targets. This software appears to be an extremely useful tool to further optimize voriconazole therapy and aid in therapeutic drug monitoring. Further prospective studies are now required to define the utility of the controller in daily clinical practice. PMID:23380734
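The multiple-model Bayesian idea can be illustrated with a toy one-parameter sketch (not the published controller): candidate clearance values are weighted by the likelihood of an observed level, and the dose minimizing the expected squared deviation from a target average concentration is selected. All pharmacokinetic values below are assumptions for illustration only.

```python
# Toy multiple-model Bayesian dose selection sketch (illustrative, not the
# published controller; all numbers are assumptions, not clinical guidance).
import numpy as np

target = 3.0        # target average concentration (mg/L), assumed
tau = 12.0          # dosing interval (h)
cl_grid = np.linspace(2.0, 20.0, 200)         # candidate clearances (L/h)
prior = np.ones_like(cl_grid) / cl_grid.size  # flat prior over the grid

# Hypothetical observation: 200 mg q12h produced an average level of 1.8 mg/L.
dose_given, c_obs, sigma = 200.0, 1.8, 0.5
c_pred = dose_given / (cl_grid * tau)                        # C_ss,avg = D / (CL * tau)
likelihood = np.exp(-0.5 * ((c_obs - c_pred) / sigma) ** 2)
posterior = prior * likelihood
posterior /= posterior.sum()

candidate_doses = np.arange(100.0, 601.0, 50.0)
expected_loss = [np.sum(posterior * (d / (cl_grid * tau) - target) ** 2)
                 for d in candidate_doses]
print("suggested dose (mg q12h):", candidate_doses[int(np.argmin(expected_loss))])
```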
Exploring Antarctic Land Surface Temperature Extremes Using Condensed Anomaly Databases
NASA Astrophysics Data System (ADS)
Grant, Glenn Edwin
Satellite observations have revolutionized the Earth Sciences and climate studies. However, data and imagery continue to accumulate at an accelerating rate, and efficient tools for data discovery, analysis, and quality checking lag behind. In particular, studies of long-term, continental-scale processes at high spatiotemporal resolutions are especially problematic. The traditional technique of downloading an entire dataset and using customized analysis code is often impractical or consumes too many resources. The Condensate Database Project was envisioned as an alternative method for data exploration and quality checking. The project's premise was that much of the data in any satellite dataset is unneeded and can be eliminated, compacting massive datasets into more manageable sizes. Dataset sizes are further reduced by retaining only anomalous data of high interest. Hosting the resulting "condensed" datasets in high-speed databases enables immediate availability for queries and exploration. Proof of the project's success relied on demonstrating that the anomaly database methods can enhance and accelerate scientific investigations. The hypothesis of this dissertation is that the condensed datasets are effective tools for exploring many scientific questions, spurring further investigations and revealing important information that might otherwise remain undetected. This dissertation uses condensed databases containing 17 years of Antarctic land surface temperature anomalies as its primary data. The study demonstrates the utility of the condensate database methods by discovering new information. In particular, the process revealed critical quality problems in the source satellite data. The results are used as the starting point for four case studies, investigating Antarctic temperature extremes, cloud detection errors, and the teleconnections between Antarctic temperature anomalies and climate indices. The results confirm the hypothesis that the condensate databases are a highly useful tool for Earth Science analyses. Moreover, the quality checking capabilities provide an important method for independent evaluation of dataset veracity.
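A toy version of the condensation idea (not the project's actual schema or thresholds) is sketched below: per-pixel anomalies are computed, only values beyond a threshold are retained, and those are stored in a queryable SQLite table.

```python
# Toy sketch of anomaly "condensation": keep only strong land-surface-
# temperature anomalies and store them in a queryable SQLite table.
# Grid, threshold, and field names are illustrative.
import sqlite3
import numpy as np

rng = np.random.default_rng(7)
lst = rng.normal(250, 10, (180, 360))           # one day of synthetic LST (K)
climatology = np.full_like(lst, 250.0)          # per-pixel climatology (stand-in)
anom = lst - climatology

threshold = 2.5 * anom.std()                    # retain only strong anomalies
rows, cols = np.nonzero(np.abs(anom) > threshold)

con = sqlite3.connect("lst_anomalies.db")
con.execute("CREATE TABLE IF NOT EXISTS anomalies "
            "(date TEXT, row INTEGER, col INTEGER, anomaly REAL)")
con.executemany("INSERT INTO anomalies VALUES (?, ?, ?, ?)",
                [("2016-07-01", int(r), int(c), float(anom[r, c]))
                 for r, c in zip(rows, cols)])
con.commit()
print(con.execute("SELECT COUNT(*) FROM anomalies").fetchone()[0], "rows kept")
con.close()
```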
Playbook Data Analysis Tool: Collecting Interaction Data from Extremely Remote Users
NASA Technical Reports Server (NTRS)
Kanefsky, Bob; Zheng, Jimin; Deliz, Ivonne; Marquez, Jessica J.; Hillenius, Steven
2017-01-01
Typically, user tests for software tools are conducted in person. At NASA, the users may be located at the bottom of the ocean in a pressurized habitat, above the atmosphere in the International Space Station, or in an isolated capsule on a simulated asteroid mission. The Playbook Data Analysis Tool (P-DAT) is a human-computer interaction (HCI) evaluation tool that the NASA Ames HCI Group has developed to record user interactions with Playbook, the group's existing planning-and-execution software application. Once the remotely collected user interaction data makes its way back to Earth, researchers can use P-DAT for in-depth analysis. Since a critical component of the Playbook project is to understand how to develop more intuitive software tools for astronauts to plan in space, P-DAT helps guide us in the development of additional easy-to-use features for Playbook, informing the design of future crew autonomy tools. P-DAT has demonstrated the capability of discreetly capturing usability data in a manner that is transparent to Playbook's end-users. In our experience, P-DAT data has already shown its utility, revealing potential usability patterns, helping diagnose software bugs, and identifying metrics and events that are pertinent to Playbook usage as well as spaceflight operations. As we continue to develop this analysis tool, P-DAT may yet provide a method for long-duration, unobtrusive human performance collection and evaluation for mission controllers back on Earth and researchers investigating the effects and mitigations related to future human spaceflight performance.
NASA Astrophysics Data System (ADS)
Ganesh, Shruthi Vatsyayani
With the advent of microfluidic technologies for molecular diagnostics, a lot of emphasis has been placed on developing diagnostic tools for resource-poor regions in the form of extreme point-of-care devices. To ensure commercial viability of such a device, there is a need to develop an accurate sample-to-answer system that is robust, portable, isolated, yet highly sensitive and cost effective. This need has been a driving force for research involving the integration of different microsystems, such as droplet microfluidics and compact disc (CD) microfluidics, along with sample preparation and detection modules on a single platform. This work attempts to develop a proof-of-concept prototype of one such device using existing CD microfluidics tools to generate stable droplets used in point-of-care (POC) diagnostics. Apart from using a fairly new technique for droplet generation and stabilization, the work aims to develop this method with a focus on diagnostics for rural healthcare. The motivation for this work is first described, with an emphasis on the current need for diagnostic testing in rural healthcare and the general guidelines prescribed by WHO for such a sample-to-answer system. Furthermore, a background on CD and droplet microfluidics is presented to understand the merits and demerits of each system and the need for integrating the two. This phase of the thesis also includes different methods employed or demonstrated to generate droplets on a spinning platform. An overview of detection platforms is also presented to understand the challenges involved in building an extreme point-of-care device. In the third phase of the thesis, general manufacturing techniques and materials used to accomplish this work are presented. Lastly, design trials for droplet generation are presented. The shortcomings of these trials are addressed by investigating mechanisms pertaining to design modification and the use of agarose-based droplet generation to ensure a more robust sample processing method. This method is further characterized and compared with a non-agarose-based system, and the results are analyzed. In conclusion, future prospects of this work are discussed in relation to extreme POC applications.
NASA Astrophysics Data System (ADS)
Fox-Rabinovich, G. S.; Veldhuis, S. C.; Dosbaeva, G. K.; Yamamoto, K.; Kovalev, A. I.; Wainstein, D. L.; Gershman, I. S.; Shuster, L. S.; Beake, B. D.
2008-04-01
The development of effective hard coatings for high performance dry machining, which is associated with high stress/temperatures during friction, is a major challenge. Newly developed synergistically alloyed nanocrystalline adaptive Ti0.2Al0.55Cr0.2Si0.03Y0.02N plasma vapor deposited hard coatings exhibit excellent tool life under conditions of high performance dry machining of hardened steel, especially under severe and extreme cutting conditions. The coating is capable of sustaining cutting speeds as high as 600 m/min. Comprehensive investigation of the microstructure and properties of the coating was performed. The structure of the coating before and after service has been characterized by high resolution transmission electron microscopy. Micromechanical characteristics of the coating have been investigated at elevated temperatures. Oxidation resistance of the coating has been studied by using thermogravimetry within a temperature range of 25-1100 °C in air. The coefficient of friction of the coatings was studied within a temperature range of 25-1200 °C. To determine the causes of excellent tool life and improved wear behavior of the TiAlCrSiYN coatings, its surface structure characteristics after service have been investigated by using x-ray photoelectron spectroscopy and extended energy-loss fine spectroscopy. One of the major features of this coating is the dynamic formation of the protective tribo-oxide films (dissipative structures) on the surface during friction with a sapphire and mullite crystal structure. Aluminum- and silicon-rich tribofilms with dangling bonds form on the surface as well. These tribofilms act in synergy and protect the surface so efficiently that it is able to sustain extreme operating conditions. Moreover, the Ti0.2Al0.55Cr0.2Si0.03Y0.02N coating possesses some features of a complex adaptive behavior because it has a number of improved characteristics (tribological adaptability, ultrafine nanocrystalline structure, hot hardness and plasticity, and oxidation stability) that work synergistically as a whole. Due to the complex adaptive behavior, this coating represents a higher ordered system that has an ability to achieve unattainable wear resistance under strongly intensifying and extreme tribological conditions.
External Tank Program Legacy of Success
NASA Technical Reports Server (NTRS)
Welzyn, Ken; Pilet, Jeff
2010-01-01
I. Goal: a) Extensive TPS damage caused by an extreme hail storm. b) Repair plan required to restore TPS and minimize program manifest impacts. II. Challenges: a) Skeptical technical community - concerned about interactions of damage with known/unknown failure modes. b) Schedule pressure to accommodate the ISS program - next tank still at MAF. c) Limited ET resources. III. How Did We Do It?: a) Developed unique engineering requirements and tooling to minimize repairs. b) Performed a large amount of performance testing to demonstrate understanding of repairs and residual conditions. c) Effectively communicated results to the technical community and management to instill confidence in expected performance.
NASA Technical Reports Server (NTRS)
Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.
1999-01-01
Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. As such, there is great interest in having all implementations of TCP interoperate efficiently. This is particularly important for links exhibiting long bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate and long-delay links, such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.
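A back-of-the-envelope sketch of why such links stress standard TCP: the bandwidth-delay product for 622 Mbps over a geosynchronous path (assuming roughly a 550 ms round-trip time, an assumption, not a figure from the paper) far exceeds the classic 64 KB window, so window scaling is required.

```python
# Bandwidth-delay product for a 622 Mbps geosynchronous link (assumed ~550 ms RTT).
import math

RATE_BPS = 622e6
RTT_S = 0.550                                   # assumed GEO round-trip time
bdp_bytes = RATE_BPS * RTT_S / 8
scale = math.ceil(math.log2(bdp_bytes / 65535)) # RFC 1323 window-scale factor needed
print(f"BDP ~= {bdp_bytes / 1e6:.1f} MB in flight")
print(f"window scale {scale} gives a {65535 * 2**scale / 1e6:.1f} MB maximum window")
```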
Decision Support System for hydrological extremes
NASA Astrophysics Data System (ADS)
Bobée, Bernard; El Adlouni, Salaheddine
2014-05-01
The study of the tail behaviour of extreme event distributions is important in several applied statistical fields such as hydrology, finance, and telecommunications. For example, in hydrology it is important to estimate extreme quantiles adequately in order to build and manage safe and effective hydraulic structures (dams, for example). Two main classes of distributions are used in hydrological frequency analysis: the class D of sub-exponential distributions (Gamma (G2), Gumbel, Halphen type A (HA), Halphen type B (HB)…) and the class C of regularly varying distributions (Fréchet, Log-Pearson, Halphen type IB…) with a heavier tail. A Decision Support System (DSS) based on the characterization of the right tail, corresponding to a low probability of exceedance p (high return period T=1/p in hydrology), has been developed. The DSS allows discriminating between classes C and D, and in its latest version a new prior step is added in order to test Lognormality. Indeed, the right tail of the Lognormal distribution (LN) is between the tails of distributions of classes C and D; studies indicated difficulty with the discrimination between LN and distributions of the classes C and D. Other tools are useful to discriminate between distributions of the same class D (HA, HB and G2; see other communication). Numerical illustrations show that the DSS discriminates between Lognormal, regularly varying and sub-exponential distributions, and leads to coherent conclusions. Key words: Regularly varying distributions, subexponential distributions, Decision Support System, Heavy tailed distribution, Extreme value theory
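A common first diagnostic behind this kind of tail classification is the empirical survival function plotted on log-log axes: a regularly varying (class C) tail tends toward a straight line, while sub-exponential class D and Lognormal tails bend downward. The following sketch, on synthetic data, only illustrates that diagnostic and is not part of the DSS itself.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

# Synthetic annual-maximum sample (illustrative only).
rng = np.random.default_rng(0)
sample = stats.lognorm(s=0.8, scale=100.0).rvs(size=300, random_state=rng)

# Empirical survival function P(X > x) on log-log axes.
x = np.sort(sample)
survival = 1.0 - np.arange(1, x.size + 1) / (x.size + 1)

plt.loglog(x, survival, ".", label="empirical survival")
plt.xlabel("x")
plt.ylabel("P(X > x)")
plt.legend()
plt.show()
# A near-linear decay suggests a regularly varying (heavy) tail; pronounced
# downward curvature points to a sub-exponential or Lognormal tail.
```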
Identifying and characterizing key nodes among communities based on electrical-circuit networks.
Zhu, Fenghui; Wang, Wenxu; Di, Zengru; Fan, Ying
2014-01-01
Complex networks with community structures are ubiquitous in the real world. Despite many approaches developed for detecting communities, we continue to lack tools for identifying overlapping and bridging nodes that play crucial roles in the interactions and communications among communities in complex networks. Here we develop an algorithm based on local flow conservation to effectively and efficiently identify and distinguish the two types of nodes. Our method is applicable in both undirected and directed networks without a priori knowledge of the community structure. Our method bypasses the extremely challenging problem of partitioning communities in the presence of overlapping nodes that may belong to multiple communities. Because overlapping and bridging nodes are of paramount importance in maintaining the function of many social and biological networks, our tools open new avenues towards understanding and controlling real complex networks with communities and their key nodes.
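The paper's local-flow-conservation algorithm is not reproduced here, but the electrical-circuit analogy it builds on can be illustrated with an off-the-shelf current-flow centrality, which also tends to rank bridging nodes highly. The sketch below is only a loose stand-in for that idea, not the authors' method.

```python
import networkx as nx

# Current-flow (random-walk) betweenness treats the network as a resistor
# circuit; nodes carrying much "current" between communities score highly.
G = nx.karate_club_graph()
scores = nx.current_flow_betweenness_centrality(G)

top5 = sorted(scores, key=scores.get, reverse=True)[:5]
print("highest current-flow betweenness:", top5)
```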
A Remote Sensing-Based Tool for Assessing Rainfall-Driven Hazards
Wright, Daniel B.; Mantilla, Ricardo; Peters-Lidard, Christa D.
2018-01-01
RainyDay is a Python-based platform that couples rainfall remote sensing data with Stochastic Storm Transposition (SST) for modeling rainfall-driven hazards such as floods and landslides. SST effectively lengthens the extreme rainfall record through temporal resampling and spatial transposition of observed storms from the surrounding region to create many extreme rainfall scenarios. Intensity-Duration-Frequency (IDF) curves are often used for hazard modeling but require long records to describe the distribution of rainfall depth and duration and do not provide information regarding rainfall space-time structure, limiting their usefulness to small scales. In contrast, RainyDay can be used for many hazard applications with 1-2 decades of data, and output rainfall scenarios incorporate detailed space-time structure from remote sensing. Thanks to global satellite coverage, RainyDay can be used in inaccessible areas and developing countries lacking ground measurements, though results are impacted by remote sensing errors. RainyDay can be useful for hazard modeling under nonstationary conditions. PMID:29657544
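The resampling step at the heart of SST can be sketched in a few lines: synthetic years are built by drawing a random number of storms from an observed catalog, and the annual maxima of many such years yield an empirical frequency curve. The sketch below is a toy illustration with made-up numbers, not the RainyDay code, and it omits the spatial transposition step.

```python
import numpy as np

rng = np.random.default_rng(1)
storm_catalog_mm = rng.gamma(shape=2.0, scale=25.0, size=300)  # hypothetical storm depths
mean_storms_per_year = 15
n_years = 1000

# Build synthetic years by resampling storms from the catalog; keep the annual maximum.
annual_max = np.array([
    rng.choice(storm_catalog_mm, size=max(1, rng.poisson(mean_storms_per_year))).max()
    for _ in range(n_years)
])

# Empirical return periods from the lengthened synthetic record.
sorted_max = np.sort(annual_max)[::-1]
return_period = (n_years + 1) / np.arange(1, n_years + 1)
idx_100yr = np.argmin(np.abs(return_period - 100))
print(f"~100-year rainfall depth: {sorted_max[idx_100yr]:.0f} mm")
```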
Moriello, Gabriele; Denio, Christopher; Abraham, Megan; DeFrancesco, Danielle; Townsley, Jill
2013-10-01
The purpose of this case report was to document outcomes following an intense exercise program integrating yoga with physical therapy exercise in a male with Parkinson's disease. The participant performed an intense 1½-hour program (Phase A) incorporating strengthening, balance, agility and yoga exercises twice weekly for 12 weeks. He then completed a new home exercise program developed by the researchers (Phase B) for 12 weeks. His score on the Parkinson's Disease Questionnaire improved 16 points, while his score on the High Level Mobility Assessment tool improved 11 points. There were also improvements in the muscle length of several lower extremity muscles, in upper and lower extremity muscle strength, and in dynamic balance, and he continues to work full time 29 months later. There were no improvements in thoracic posture or aerobic power. This intense program was an effective dose of exercise for someone with Parkinson's disease and allowed him to continue to participate in work, leisure, and community activities. Copyright © 2013 Elsevier Ltd. All rights reserved.
A Remote Sensing-Based Tool for Assessing Rainfall-Driven Hazards.
Wright, Daniel B; Mantilla, Ricardo; Peters-Lidard, Christa D
2017-04-01
RainyDay is a Python-based platform that couples rainfall remote sensing data with Stochastic Storm Transposition (SST) for modeling rainfall-driven hazards such as floods and landslides. SST effectively lengthens the extreme rainfall record through temporal resampling and spatial transposition of observed storms from the surrounding region to create many extreme rainfall scenarios. Intensity-Duration-Frequency (IDF) curves are often used for hazard modeling but require long records to describe the distribution of rainfall depth and duration and do not provide information regarding rainfall space-time structure, limiting their usefulness to small scales. In contrast, RainyDay can be used for many hazard applications with 1-2 decades of data, and output rainfall scenarios incorporate detailed space-time structure from remote sensing. Thanks to global satellite coverage, RainyDay can be used in inaccessible areas and developing countries lacking ground measurements, though results are impacted by remote sensing errors. RainyDay can be useful for hazard modeling under nonstationary conditions.
Quasinormal Modes and Strong Cosmic Censorship.
Cardoso, Vitor; Costa, João L; Destounis, Kyriakos; Hintz, Peter; Jansen, Aron
2018-01-19
The fate of Cauchy horizons, such as those found inside charged black holes, is intrinsically connected to the decay of small perturbations exterior to the event horizon. As such, the validity of the strong cosmic censorship (SCC) conjecture is tied to how effectively the exterior damps fluctuations. Here, we study massless scalar fields in the exterior of Reissner-Nordström-de Sitter black holes. Their decay rates are governed by quasinormal modes of the black hole. We identify three families of modes in these spacetimes: one directly linked to the photon sphere, well described by standard WKB-type tools; another family whose existence and time scale is closely related to the de Sitter horizon; finally, a third family which dominates for near-extremally charged black holes and which is also present in asymptotically flat spacetimes. The last two families of modes seem to have gone unnoticed in the literature. We give a detailed description of linear scalar perturbations of such black holes, and conjecture that SCC is violated in the near extremal regime.
Quasinormal Modes and Strong Cosmic Censorship
NASA Astrophysics Data System (ADS)
Cardoso, Vitor; Costa, João L.; Destounis, Kyriakos; Hintz, Peter; Jansen, Aron
2018-01-01
The fate of Cauchy horizons, such as those found inside charged black holes, is intrinsically connected to the decay of small perturbations exterior to the event horizon. As such, the validity of the strong cosmic censorship (SCC) conjecture is tied to how effectively the exterior damps fluctuations. Here, we study massless scalar fields in the exterior of Reissner-Nordström-de Sitter black holes. Their decay rates are governed by quasinormal modes of the black hole. We identify three families of modes in these spacetimes: one directly linked to the photon sphere, well described by standard WKB-type tools; another family whose existence and time scale is closely related to the de Sitter horizon; finally, a third family which dominates for near-extremally charged black holes and which is also present in asymptotically flat spacetimes. The last two families of modes seem to have gone unnoticed in the literature. We give a detailed description of linear scalar perturbations of such black holes, and conjecture that SCC is violated in the near extremal regime.
A Remote Sensing-Based Tool for Assessing Rainfall-Driven Hazards
NASA Technical Reports Server (NTRS)
Wright, Daniel B.; Mantilla, Ricardo; Peters-Lidard, Christa D.
2017-01-01
RainyDay is a Python-based platform that couples rainfall remote sensing data with Stochastic Storm Transposition (SST) for modeling rainfall-driven hazards such as floods and landslides. SST effectively lengthens the extreme rainfall record through temporal resampling and spatial transposition of observed storms from the surrounding region to create many extreme rainfall scenarios. Intensity-Duration-Frequency (IDF) curves are often used for hazard modeling but require long records to describe the distribution of rainfall depth and duration and do not provide information regarding rainfall space-time structure, limiting their usefulness to small scales. In contrast, RainyDay can be used for many hazard applications with 1-2 decades of data, and output rainfall scenarios incorporate detailed space-time structure from remote sensing. Thanks to global satellite coverage, RainyDay can be used in inaccessible areas and developing countries lacking ground measurements, though results are impacted by remote sensing errors. RainyDay can be useful for hazard modeling under nonstationary conditions.
Mo/Si and Mo/Be multilayer thin films on Zerodur substrates for extreme-ultraviolet lithography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mirkarimi, Paul B.; Bajt, Sasa; Wall, Mark A.
2000-04-01
Multilayer-coated Zerodur optics are expected to play a pivotal role in an extreme-ultraviolet (EUV) lithography tool. Zerodur is a multiphase, multicomponent material that is a much more complicated substrate than commonly used single-crystal Si or fused-silica substrates. We investigate the effect of Zerodur substrates on the performance of high-EUV reflectance Mo/Si and Mo/Be multilayer thin films. For Mo/Si the EUV reflectance had a nearly linear dependence on substrate roughness for roughness values of 0.06-0.36 nm rms, and the FWHM of the reflectance curves (spectral bandwidth) was essentially constant over this range. For Mo/Be the EUV reflectance was observed to decrease more steeply than Mo/Si for roughness values greater than approximately 0.2-0.3 nm. Little difference was observed in the EUV reflectivity of multilayer thin films deposited on different substrates as long as the substrate roughness values were similar. (c) 2000 Optical Society of America.
Mo/Si and Mo/Be multilayer thin films on Zerodur substrates for extreme-ultraviolet lithography.
Mirkarimi, P B; Bajt, S; Wall, M A
2000-04-01
Multilayer-coated Zerodur optics are expected to play a pivotal role in an extreme-ultraviolet (EUV) lithography tool. Zerodur is a multiphase, multicomponent material that is a much more complicated substrate than commonly used single-crystal Si or fused-silica substrates. We investigate the effect of Zerodur substrates on the performance of high-EUV reflectance Mo/Si and Mo/Be multilayer thin films. For Mo/Si the EUV reflectance had a nearly linear dependence on substrate roughness for roughness values of 0.06-0.36 nm rms, and the FWHM of the reflectance curves (spectral bandwidth) was essentially constant over this range. For Mo/Be the EUV reflectance was observed to decrease more steeply than Mo/Si for roughness values greater than approximately 0.2-0.3 nm. Little difference was observed in the EUV reflectivity of multilayer thin films deposited on different substrates as long as the substrate roughness values were similar.
Hu, Haixiang; Zhang, Xin; Ford, Virginia; Luo, Xiao; Qi, Erhui; Zeng, Xuefeng; Zhang, Xuejun
2016-11-14
Edge effect is regarded as one of the most difficult technical issues in a computer controlled optical surfacing (CCOS) process. Traditional opticians have to balance the consequences of the two following cases. Operating CCOS in a large overhang condition affects the accuracy of material removal, while in a small overhang condition it achieves a more accurate performance but leaves a narrow rolled-up edge, which takes time and effort to remove. In order to control the edge residuals in the latter case, we present a new concept of the 'heterocercal' tool influence function (TIF). Generated from compound motion equipment, this type of TIF can 'transfer' the material removal from the inner place to the edge, meanwhile maintaining the high accuracy and efficiency of CCOS. We call it the 'heterocercal' TIF because of the inspiration from the heterocercal tails of sharks, whose upper lobe provides most of the explosive power. The heterocercal TIF was theoretically analyzed and physically realized in CCOS facilities. Experimental and simulation results showed good agreement. It enables significant control of the edge effect and convergence of entire surface errors in large tool-to-mirror size-ratio conditions. This improvement will largely help manufacturing efficiency in some extremely large optical system projects, like the tertiary mirror of the Thirty Meter Telescope.
A NOISE ADAPTIVE FUZZY EQUALIZATION METHOD FOR PROCESSING SOLAR EXTREME ULTRAVIOLET IMAGES
DOE Office of Scientific and Technical Information (OSTI.GOV)
Druckmueller, M., E-mail: druckmuller@fme.vutbr.cz
A new image enhancement tool ideally suited for the visualization of fine structures in extreme ultraviolet images of the corona is presented in this paper. The Noise Adaptive Fuzzy Equalization method is particularly suited for the exceptionally high dynamic range images from the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory. This method produces artifact-free images and gives significantly better results than methods based on convolution or Fourier transform which are often used for that purpose.
Effective Tooling for Linked Data Publishing in Scientific Research
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purohit, Sumit; Smith, William P.; Chappell, Alan R.
Challenges that make it difficult to find, share, and combine published data, such as data heterogeneity and resource discovery, have led to increased adoption of semantic data standards and data publishing technologies. To make data more accessible, interconnected and discoverable, some domains are being encouraged to publish their data as Linked Data. Consequently, this trend greatly increases the amount of data that semantic web tools are required to process, store, and interconnect. In attempting to process and manipulate large data sets, tools ranging from simple text editors to modern triplestores eventually break down upon reaching undefined thresholds. This paper offers a systematic approach that data publishers can use to categorize suitable tools to meet their data publishing needs. We present a real-world use case, the Resource Discovery for Extreme Scale Collaboration (RDESC), which features a scientific dataset (maximum size of 1.4 billion triples) used to evaluate a toolbox for data publishing in climate research. This paper also introduces a semantic data publishing software suite developed for the RDESC project.
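To ground the terminology, the sketch below shows the smallest possible Linked Data publishing step: building a handful of triples and serializing them as Turtle with rdflib. The namespace, resource names, and predicates are hypothetical and are not taken from the RDESC vocabulary or software suite.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

# Hypothetical vocabulary and identifiers, used only to illustrate the workflow.
EX = Namespace("http://example.org/vocab/")
dataset = URIRef("http://example.org/dataset/climate-run-42")

g = Graph()
g.add((dataset, RDF.type, EX.Dataset))
g.add((dataset, EX.domain, Literal("climate research")))
g.add((dataset, EX.tripleCount, Literal(1400000000)))

# Serialize to Turtle; a real pipeline would load the result into a triplestore.
print(g.serialize(format="turtle"))
```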
A Comprehensive Look at Polypharmacy and Medication Screening Tools for the Older Cancer Patient
DeGregory, Kathlene A.; Morris, Amy L.; Ramsdale, Erika E.
2016-01-01
Inappropriate medication use and polypharmacy are extremely common among older adults. Numerous studies have discussed the importance of a comprehensive medication assessment in the general geriatric population. However, only a handful of studies have evaluated inappropriate medication use in the geriatric oncology patient. Almost a dozen medication screening tools exist for the older adult. Each available tool has the potential to improve aspects of the care of older cancer patients, but no single tool has been developed for this population. We extensively reviewed the literature (MEDLINE, PubMed) to evaluate and summarize the most relevant medication screening tools for older patients with cancer. Findings of this review support the use of several screening tools concurrently for the elderly patient with cancer. A deprescribing tool should be developed and included in a comprehensive geriatric oncology assessment. Finally, prospective studies are needed to evaluate such a tool to determine its feasibility and impact in older patients with cancer. Implications for Practice: The prevalence of polypharmacy increases with advancing age. Older adults are more susceptible to adverse effects of medications. “Prescribing cascades” are common, whereas “deprescribing” remains uncommon; thus, older patients tend to accumulate medications over time. Older patients with cancer are at high risk for adverse drug events, in part because of the complexity and intensity of cancer treatment. Additionally, a cancer diagnosis often alters assessments of life expectancy, clinical status, and competing risk. Screening for polypharmacy and potentially inappropriate medications could reduce the risk for adverse drug events, enhance quality of life, and reduce health care spending for older cancer patients. PMID:27151653
Ensemble reconstruction of spatio-temporal extreme low-flow events in France since 1871
NASA Astrophysics Data System (ADS)
Caillouet, Laurie; Vidal, Jean-Philippe; Sauquet, Eric; Devers, Alexandre; Graff, Benjamin
2017-06-01
The length of streamflow observations is generally limited to the last 50 years even in data-rich countries like France. It therefore offers too small a sample of extreme low-flow events to properly explore the long-term evolution of their characteristics and associated impacts. To overcome this limit, this work first presents a daily 140-year ensemble reconstructed streamflow dataset for a reference network of near-natural catchments in France. This dataset, called SCOPE Hydro (Spatially COherent Probabilistic Extended Hydrological dataset), is based on (1) a probabilistic precipitation, temperature, and reference evapotranspiration downscaling of the Twentieth Century Reanalysis over France, called SCOPE Climate, and (2) continuous hydrological modelling using SCOPE Climate as forcings over the whole period. This work then introduces tools for defining spatio-temporal extreme low-flow events. Extreme low-flow events are first locally defined through the sequent peak algorithm using a novel combination of a fixed threshold and a daily variable threshold. A dedicated spatial matching procedure is then established to identify spatio-temporal events across France. This procedure is furthermore adapted to the SCOPE Hydro 25-member ensemble to characterize in a probabilistic way unrecorded historical events at the national scale. Extreme low-flow events are described and compared in a spatially and temporally homogeneous way over 140 years on a large set of catchments. Results highlight well-known recent events like 1976 or 1989-1990, but also older and relatively forgotten ones like the 1878 and 1893 events. These results contribute to improving our knowledge of historical events and provide a selection of benchmark events for climate change adaptation purposes. Moreover, this study allows for further detailed analyses of the effect of climate variability and anthropogenic climate change on low-flow hydrology at the scale of France.
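The local step of identifying low-flow events can be pictured with a simple threshold rule: days when flow drops below a low-flow threshold are grouped into contiguous deficit spells. The sketch below uses synthetic daily flows and a single fixed threshold; it is a simplification of the paper's combined fixed/variable-threshold and sequent peak algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic daily streamflow: seasonal cycle plus noise over 10 years.
t = np.arange(3650)
daily_flow = 10 + 5 * np.sin(2 * np.pi * t / 365) + rng.normal(0, 1, t.size)

threshold = np.percentile(daily_flow, 10)   # fixed low-flow threshold (~Q90)
below = daily_flow < threshold

# Group consecutive below-threshold days into low-flow events.
events = []
start = None
for day, flag in enumerate(below):
    if flag and start is None:
        start = day
    elif not flag and start is not None:
        events.append((start, day - 1))
        start = None
if start is not None:
    events.append((start, below.size - 1))

durations = [end - begin + 1 for begin, end in events]
print(f"{len(events)} low-flow events, longest lasting {max(durations)} days")
```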
Extreme storm surge and wind wave climate scenario simulations at the Venetian littoral
NASA Astrophysics Data System (ADS)
Lionello, P.; Galati, M. B.; Elvini, E.
Scenario climate projections for extreme marine storms producing storm surges and wind waves are very important for the northern flat coast of the Adriatic Sea, where the area at risk includes a unique cultural and environmental heritage, and important economic activities. This study uses a shallow water model and a spectral wave model for computing the storm surge and the wind wave field, respectively, from the sea level pressure and wind fields that have been computed by the RegCM regional climate model. Simulations cover the period 1961-1990 for the present climate (control simulations) and the period 2071-2100 for the A2 and B2 scenarios. Generalized Extreme Value analysis is used for estimating values for the 10- and 100-year return periods. The adequacy of these modeling tools for a reliable estimation of the climate change signal, without needing further downscaling, is shown. However, this study has mainly a methodological value, because issues such as interdecadal variability and intermodel variability cannot be addressed, since the analysis is based on single-model 30-year long simulations. The control simulation looks reasonably accurate for extreme value analysis, though it overestimates/underestimates the frequency of high/low surge and wind wave events with respect to observations. Scenario simulations suggest a higher frequency of intense storms for the B2 scenario, but not for the A2. Likely, these differences are not the effect of climate change, but of climate multidecadal variability. Extreme storms are stronger in future scenarios, but the differences are not statistically significant. Therefore this study does not provide convincing evidence for more stormy conditions in future scenarios.
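The GEV step mentioned above amounts to fitting the three GEV parameters to a block-maxima sample and inverting the fitted distribution at the desired non-exceedance probability. A minimal sketch with synthetic surge maxima (assumed values, not the paper's data) follows.

```python
import numpy as np
from scipy.stats import genextreme

# Synthetic 30-year record of annual maximum surge heights (metres), for illustration.
rng = np.random.default_rng(3)
annual_max_surge_m = genextreme(c=-0.1, loc=0.8, scale=0.3).rvs(size=30, random_state=rng)

# Fit the GEV and read off return levels: the T-year level is the quantile
# with non-exceedance probability 1 - 1/T.
shape, loc, scale = genextreme.fit(annual_max_surge_m)
for T in (10, 100):
    level = genextreme.ppf(1 - 1 / T, shape, loc=loc, scale=scale)
    print(f"{T:>3}-year return level: {level:.2f} m")
```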
Dallinga, Joan M; Benjaminse, Anne; Lemmink, Koen A P M
2012-09-01
Injuries to lower extremities are common in team sports such as soccer, basketball, volleyball, football and field hockey. Considering personal grief, disabling consequences and high costs caused by injuries to lower extremities, the importance for the prevention of these injuries is evident. From this point of view it is important to know which screening tools can identify athletes who are at risk of injury to their lower extremities. The aim of this article is to determine the predictive values of anthropometric and/or physical screening tests for injuries to the leg, anterior cruciate ligament (ACL), knee, hamstring, groin and ankle in team sports. A systematic review was conducted in MEDLINE (1966 to September 2011), EMBASE (1989 to September 2011) and CINAHL (1982 to September 2011). Based on inclusion criteria defined a priori, titles, abstracts and full texts were analysed to find relevant studies. The analysis showed that different screening tools can be predictive for injuries to the knee, ACL, hamstring, groin and ankle. For injuries in general there is some support in the literature to suggest that general joint laxity is a predictive measure for leg injuries. An anterior right/left reach distance difference >4 cm and a composite reach distance <94.0% of limb length in girls, measured with the star excursion balance test (SEBT), may predict leg injuries. Furthermore, an increasing age, a lower hamstring/quadriceps (H : Q) ratio and a decreased range of motion (ROM) of hip abduction may predict the occurrence of leg injuries. Hyperextension of the knee, side-to-side differences in anterior-posterior knee laxity and differences in knee abduction moment between both legs are suggested to be predictive tests for sustaining an ACL injury and height was a predictive screening tool for knee ligament injuries. There is some evidence that when age increases, the probability of sustaining a hamstring injury increases. Debate exists in the analysed literature regarding measurement of the flexibility of the hamstring as a predictive screening tool, as well as using the H : Q ratio. Hip-adduction-to-abduction strength is a predictive test for hip adductor muscle strain. Studies do not agree on whether ROM of the hamstring is a predictive screening tool for groin injury. Body mass index and the age of an athlete could contribute to an ankle sprain. There is support in the literature to suggest that greater strength of the plantar flexors may be a predictive measure for sustaining an ankle injury. Furthermore, there is some agreement that the measurement of postural sway is a predictive test for an ankle injury. The screening tools mentioned above can be recommended to medical staff and coaches for screening their athletes. Future research should focus on prospective studies in larger groups and should follow athletes over several seasons.
A Semi-passive Planar Manipulandum for Upper-Extremity Rehabilitation.
Chang, Chih-Kang; Washabaugh, Edward P; Gwozdziowski, Andrew; Remy, C David; Krishnan, Chandramouli
2018-07-01
Robotic rehabilitation is a promising approach to treat individuals with neurological or orthopedic disorders. However, despite significant advancements in the field of rehabilitation robotics, this technology has found limited traction in clinical practice. A key reason for this issue is that most robots are expensive, bulky, and not scalable for in-home rehabilitation. Here, we introduce a semi-passive rehabilitation robot (SepaRRo) that uses controllable passive actuators (i.e., brakes) to provide controllable resistances at the end-effector over a large workspace in a manner that is cost-effective and safe for in-home use. We also validated the device through theoretical analyses, hardware experiments, and human subject experiments. We found that by including kinematic redundancies in the robot's linkages, the device was able to provide controllable resistances to purely resist the movement of the end-effector, or to gently steer (i.e., perturb) its motion away from the intended path. When testing these capabilities on human subjects, we found that many of the upper-extremity muscles could be selectively targeted based on the forcefield prescribed to the user. These results indicate that SepaRRo could serve as a low-cost therapeutic tool for upper-extremity rehabilitation; however, further testing is required to evaluate its therapeutic benefits in patient population.
NASA Astrophysics Data System (ADS)
Panulla, Brian J.; More, Loretta D.; Shumaker, Wade R.; Jones, Michael D.; Hooper, Robert; Vernon, Jeffrey M.; Aungst, Stanley G.
2009-05-01
Rapid improvements in communications infrastructure and sophistication of commercial hand-held devices provide a major new source of information for assessing extreme situations such as environmental crises. In particular, ad hoc collections of humans can act as "soft sensors" to augment data collected by traditional sensors in a net-centric environment (in effect, "crowd-sourcing" observational data). A need exists to understand how to task such soft sensors, characterize their performance and fuse the data with traditional data sources. In order to quantitatively study such situations, as well as study distributed decision-making, we have developed an Extreme Events Laboratory (EEL) at The Pennsylvania State University. This facility provides a network-centric, collaborative situation assessment and decision-making capability by supporting experiments involving human observers, distributed decision making and cognition, and crisis management. The EEL spans the information chain from energy detection via sensors, human observations, signal and image processing, pattern recognition, statistical estimation, multi-sensor data fusion, visualization and analytics, and modeling and simulation. The EEL command center combines COTS and custom collaboration tools in innovative ways, providing capabilities such as geo-spatial visualization and dynamic mash-ups of multiple data sources. This paper describes the EEL and several on-going human-in-the-loop experiments aimed at understanding the new collective observation and analysis landscape.
NASA Astrophysics Data System (ADS)
Sun, N.; Yearsley, J. R.; Nijssen, B.; Lettenmaier, D. P.
2014-12-01
Urban stream quality is particularly susceptible to extreme precipitation events and land use change. Although the projected effects of extreme events and land use change on hydrology have been reasonably well studied, the impacts on urban water quality have not been widely examined, due in part to the scale mismatch between global climate models and the spatial scales required to represent urban hydrology and water quality signals. Here we describe a grid-based modeling system that integrates the Distributed Hydrology Soil Vegetation Model (DHSVM) with an urban water quality module adapted from EPA's Storm Water Management Model (SWMM) and the Soil and Water Assessment Tool (SWAT). Using the model system, we evaluate, for four partially urbanized catchments within the Puget Sound basin, urban water quality under current climate conditions, and projected potential changes in urban water quality associated with future changes in climate and land use. We examine in particular total suspended solids, total nitrogen, total phosphorus, and coliform bacteria, with catchment representations at a 150-meter spatial resolution and a sub-daily timestep. We report long-term streamflow and water quality predictions in response to extreme precipitation events of varying magnitudes in the four partially urbanized catchments. Our simulations show that urban water quality is highly sensitive to both climatic and land use change.
Nie, Bingbing; Zhou, Qing
2016-10-02
Pedestrian lower extremity represents the most frequently injured body region in car-to-pedestrian accidents. The European Directive concerning pedestrian safety was established in 2003 for evaluating pedestrian protection performance of car models. However, design changes have not been quantified since then. The goal of this study was to investigate front-end profiles of representative passenger car models and the potential influence on pedestrian lower extremity injury risk. The front-end styling of sedans and sport utility vehicles (SUV) released from 2008 to 2011 was characterized by the geometrical parameters related to pedestrian safety and compared to representative car models before 2003. The influence of geometrical design change on the resultant risk of injury to pedestrian lower extremity-that is, knee ligament rupture and long bone fracture-was estimated by a previously developed assessment tool assuming identical structural stiffness. Based on response surface generated from simulation results of a human body model (HBM), the tool provided kinematic and kinetic responses of pedestrian lower extremity resulted from a given car's front-end design. Newer passenger cars exhibited a "flatter" front-end design. The median value of the sedan models provided 87.5 mm less bottom depth, and the SUV models exhibited 94.7 mm less bottom depth. In the lateral impact configuration similar to that in the regulatory test methods, these geometrical changes tend to reduce the injury risk of human knee ligament rupture by 36.6 and 39.6% based on computational approximation. The geometrical changes did not significantly influence the long bone fracture risk. The present study reviewed the geometrical changes in car front-ends along with regulatory concerns regarding pedestrian safety. A preliminary quantitative benefit of the lower extremity injury reduction was estimated based on these geometrical features. Further investigation is recommended on the structural changes and inclusion of more accident scenarios.
Estensoro, Itziar; Ballester-Lozano, Gabriel; Benedito-Palos, Laura; Grammes, Fabian; Martos-Sitcha, Juan Antonio; Mydland, Liv-Torunn; Calduch-Giner, Josep Alvar; Fuentes, Juan; Karalazos, Vasileios; Ortiz, Álvaro; Øverland, Margareth; Pérez-Sánchez, Jaume
2016-01-01
There is a constant need to find feed additives that improve health and nutrition of farmed fish and lessen the intestinal inflammation induced by plant-based ingredients. The objective of this study was to evaluate the effects of adding an organic acid salt to alleviate some of the detrimental effects of extreme plant-ingredient substitution of fish meal (FM) and fish oil (FO) in gilthead sea bream diet. Three experiments were conducted. In a first trial (T1), the best dose (0.4%) of sodium butyrate (BP-70 ®NOREL) was chosen after a short (9-weeks) feeding period. In a second longer trial (T2) (8 months), four diets were used: a control diet containing 25% FM (T2-D1) and three experimental diets containing 5% FM (T2-D2, T2-D3, T2-D4). FO was the only added oil in D1, while a blend of plant oils replaced 58% and 84% of FO in T2-D2, and T2-D3 and T2-D4, respectively. The latter was supplemented with 0.4% BP-70. In a third trial (T3), two groups of fish were fed for 12 and 38 months with D1, D3 and D4 diets of T2. The effects of dietary changes were studied using histochemical, immunohistochemical, molecular and electrophysiological tools. The extreme diet (T2-D3) modified significantly the transcriptomic profile, especially at the anterior intestine, up-regulating the expression of inflammatory markers, in coincidence with a higher presence of granulocytes and lymphocytes in the submucosa, and changing genes involved in antioxidant defences, epithelial permeability and mucus production. Trans-epithelial electrical resistance (Rt) was also decreased (T3-D3). Most of these modifications were returned to control values with the addition of BP-70. None of the experimental diets modified the staining pattern of PCNA, FABP2 or ALPI. These results further confirm the potential of this additive to improve or reverse the detrimental effects of extreme fish diet formulations. PMID:27898676
Estensoro, Itziar; Ballester-Lozano, Gabriel; Benedito-Palos, Laura; Grammes, Fabian; Martos-Sitcha, Juan Antonio; Mydland, Liv-Torunn; Calduch-Giner, Josep Alvar; Fuentes, Juan; Karalazos, Vasileios; Ortiz, Álvaro; Øverland, Margareth; Sitjà-Bobadilla, Ariadna; Pérez-Sánchez, Jaume
2016-01-01
There is a constant need to find feed additives that improve health and nutrition of farmed fish and lessen the intestinal inflammation induced by plant-based ingredients. The objective of this study was to evaluate the effects of adding an organic acid salt to alleviate some of the detrimental effects of extreme plant-ingredient substitution of fish meal (FM) and fish oil (FO) in gilthead sea bream diet. Three experiments were conducted. In a first trial (T1), the best dose (0.4%) of sodium butyrate (BP-70 ®NOREL) was chosen after a short (9-weeks) feeding period. In a second longer trial (T2) (8 months), four diets were used: a control diet containing 25% FM (T2-D1) and three experimental diets containing 5% FM (T2-D2, T2-D3, T2-D4). FO was the only added oil in D1, while a blend of plant oils replaced 58% and 84% of FO in T2-D2, and T2-D3 and T2-D4, respectively. The latter was supplemented with 0.4% BP-70. In a third trial (T3), two groups of fish were fed for 12 and 38 months with D1, D3 and D4 diets of T2. The effects of dietary changes were studied using histochemical, immunohistochemical, molecular and electrophysiological tools. The extreme diet (T2-D3) modified significantly the transcriptomic profile, especially at the anterior intestine, up-regulating the expression of inflammatory markers, in coincidence with a higher presence of granulocytes and lymphocytes in the submucosa, and changing genes involved in antioxidant defences, epithelial permeability and mucus production. Trans-epithelial electrical resistance (Rt) was also decreased (T3-D3). Most of these modifications were returned to control values with the addition of BP-70. None of the experimental diets modified the staining pattern of PCNA, FABP2 or ALPI. These results further confirm the potential of this additive to improve or reverse the detrimental effects of extreme fish diet formulations.
NASA Astrophysics Data System (ADS)
Tryby, M.; Fries, J. S.; Baranowski, C.
2014-12-01
Extreme precipitation events can cause significant impacts to drinking water and wastewater utilities, including facility damage, water quality impacts, service interruptions and potential risks to human health and the environment due to localized flooding and combined sewer overflows (CSOs). These impacts will become more pronounced with the projected increases in frequency and intensity of extreme precipitation events due to climate change. To model the impacts of extreme precipitation events, wastewater utilities often develop Intensity, Duration, and Frequency (IDF) rainfall curves and "design storms" for use in the U.S. Environmental Protection Agency's (EPA) Storm Water Management Model (SWMM). Wastewater utilities use SWMM for planning, analysis, and facility design related to stormwater runoff, combined and sanitary sewers, and other drainage systems in urban and non-urban areas. SWMM tracks (1) the quantity and quality of runoff made within each sub-catchment; and (2) the flow rate, flow depth, and quality of water in each pipe and channel during a simulation period made up of multiple time steps. In its current format, EPA SWMM does not consider climate change projection data. Climate change may affect the relationship between intensity, duration, and frequency described by past rainfall events. Therefore, EPA is integrating climate projection data available in the Climate Resilience Evaluation and Awareness Tool (CREAT) into SWMM. CREAT is a climate risk assessment tool for utilities that provides downscaled climate change projection data for changes in the amount of rainfall in a 24-hour period for various extreme precipitation events (e.g., from 5-year to 100-year storm events). Incorporating climate change projections into SWMM will provide wastewater utilities with more comprehensive data they can use in planning for future storm events, thereby reducing the impacts to the utility and customers served from flooding and stormwater issues.
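The adjustment described above can be boiled down to applying a projected change factor to a historical 24-hour design-storm depth for each return period. The sketch below is a hedged illustration of that arithmetic; the depths and factors are assumptions, not CREAT projections or SWMM inputs.

```python
# Adjust historical 24-hour design-storm depths with projected change factors.
# All numbers are assumed illustrative values (return period -> depth in inches).
historical_24h_depth_in = {5: 3.2, 25: 4.6, 100: 6.1}
projected_change_factor = {5: 1.10, 25: 1.15, 100: 1.20}   # assumed mid-century scenario

future_24h_depth_in = {
    T: round(depth * projected_change_factor[T], 2)
    for T, depth in historical_24h_depth_in.items()
}
print(future_24h_depth_in)   # e.g. {5: 3.52, 25: 5.29, 100: 7.32}
# The adjusted depths would then drive the design storms used in a SWMM run.
```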
Battistoni, Andrea; Bencivenga, Filippo; Fioretto, Daniele; Masciovecchio, Claudio
2014-10-15
In this Letter, we present a simple method to avoid the well-known spurious contributions in the Brillouin light scattering (BLS) spectrum arising from the finite aperture of collection optics. The method relies on the use of special spatial filters able to select the scattered light with arbitrary precision around a given value of the momentum transfer (Q). We demonstrate the effectiveness of such filters by analyzing the BLS spectra of a reference sample as a function of scattering angle. This practical and inexpensive method could be an extremely useful tool to fully exploit the potentiality of Brillouin acoustic spectroscopy, as it will easily allow for effective Q-variable experiments with unparalleled luminosity and resolution.
Acclimatization to extreme heat
NASA Astrophysics Data System (ADS)
Warner, M. E.; Ganguly, A. R.; Bhatia, U.
2017-12-01
Heat extremes throughout the globe, as well as in the United States, are expected to increase. These heat extremes have been shown to impact human health, resulting in some of the highest levels of lives lost as compared with similar natural disasters. But in order to inform decision makers and best understand future mortality and morbidity, adaptation and mitigation must be considered. Defined as the ability for individuals or society to change behavior and/or adapt physiologically, acclimatization encompasses the gradual adaptation that occurs over time. Therefore, this research aims to account for acclimatization to extreme heat by using a hybrid methodology that incorporates future air conditioning use and installation patterns with future temperature-related time series data. While previous studies have not accounted for energy usage patterns and market saturation scenarios, we integrate such factors to compare the impact of air conditioning as a tool for acclimatization, with a particular emphasis on mortality within vulnerable communities.
Facilitating Co-Design for Extreme-Scale Systems Through Lightweight Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Engelmann, Christian; Lauer, Frank
This work focuses on tools for investigating algorithm performance at extreme scale with millions of concurrent threads and for evaluating the impact of future architecture choices to facilitate the co-design of high-performance computing (HPC) architectures and applications. The approach focuses on lightweight simulation of extreme-scale HPC systems with the needed amount of accuracy. The prototype presented in this paper is able to provide this capability using a parallel discrete event simulation (PDES), such that a Message Passing Interface (MPI) application can be executed at extreme scale, and its performance properties can be evaluated. The results of an initial prototype are encouraging, as a simple 'hello world' MPI program could be scaled up to 1,048,576 virtual MPI processes on a four-node cluster, and the performance properties of two MPI programs could be evaluated at up to 16,384 virtual MPI processes on the same system.
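For reference, the kind of minimal MPI program scaled in that experiment looks like the sketch below, shown here with mpi4py purely as an illustration; the prototype itself is not Python-specific.

```python
from mpi4py import MPI

# Minimal MPI "hello world": each rank reports its identity.
comm = MPI.COMM_WORLD
print(f"hello from rank {comm.Get_rank()} of {comm.Get_size()}")
# Launched under a real MPI runtime, e.g.: mpiexec -n 4 python hello_mpi.py
```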
Unveiling non-stationary coupling between Amazon and ocean during recent extreme events
NASA Astrophysics Data System (ADS)
Ramos, Antônio M. de T.; Zou, Yong; de Oliveira, Gilvan Sampaio; Kurths, Jürgen; Macau, Elbert E. N.
2018-02-01
The interplay between extreme events in the Amazon's precipitation and the anomaly in the temperature of the surrounding oceans is not fully understood, especially its causal relations. In this paper, we investigate the climatic interaction between these regions from 1999 until 2012 using modern tools of complex system science. We identify the time scale of the coupling quantitatively and unveil the non-stationary influence of the ocean's temperature. The findings consistently show the distinctions between the coupling in the recent major extreme events in Amazonia, such as the two droughts that happened in 2005 and 2010 and the three floods during 1999, 2009 and 2012. Interestingly, the results also reveal that the influence on the anomalous precipitation of the Southwest Amazon has become increasingly lagged. The analysis can shed light on the underlying dynamics of the climate network system and consequently can improve predictions of extreme rainfall events.
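A first-pass way to see such lagged coupling is a lagged correlation between an ocean temperature anomaly index and a precipitation anomaly series; the sketch below uses synthetic monthly series and is only a crude proxy for the paper's complex-system analysis.

```python
import numpy as np

rng = np.random.default_rng(4)
n_months = 168  # 14 years of monthly anomalies, matching the 1999-2012 window

# Synthetic series: precipitation responds to the ocean anomaly with a 3-month lag.
sst_anomaly = rng.normal(size=n_months)
precip_anomaly = -0.6 * np.roll(sst_anomaly, 3) + rng.normal(scale=0.5, size=n_months)

lags = range(0, 7)
corr = [np.corrcoef(sst_anomaly[:n_months - lag], precip_anomaly[lag:])[0, 1]
        for lag in lags]
best = max(lags, key=lambda lag: abs(corr[lag]))
print(f"strongest coupling at a lag of {best} months (r = {corr[best]:.2f})")
```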
NASA Astrophysics Data System (ADS)
O'Neill, B. C.; Kauffman, B.; Lawrence, P.
2016-12-01
Integrated analysis of questions regarding land, water, and energy resources often requires integration of models of different types. One type of integration is between human and earth system models, since both societal and physical processes influence these resources. For example, human processes such as changes in population, economic conditions, and policies govern the demand for land, water and energy, while the interactions of these resources with physical systems determine their availability and environmental consequences. We have begun to develop and use a toolkit for linking human and earth system models called the Toolbox for Human-Earth System Integration and Scaling (THESIS). THESIS consists of models and software tools to translate, scale, and synthesize information from and between human system models and earth system models (ESMs), with initial application to linking the NCAR integrated assessment model, iPETS, with the NCAR earth system model, CESM. Initial development is focused on urban areas and agriculture, sectors that are explicitly represented in both CESM and iPETS. Tools are being made available to the community as they are completed (see https://www2.cgd.ucar.edu/sections/tss/iam/THESIS_tools). We discuss four general types of functions that THESIS tools serve (Spatial Distribution, Spatial Properties, Consistency, and Outcome Evaluation). Tools are designed to be modular and can be combined in order to carry out more complex analyses. We illustrate their application both to the exposure of population to climate extremes and to the evaluation of climate impacts on the agriculture sector. For example, projecting exposure to climate extremes involves use of THESIS tools for spatial population, spatial urban land cover, the characteristics of both, and a tool to bring urban climate information together with spatial population information. Development of THESIS tools is continuing and open to the research community.
NASA Technical Reports Server (NTRS)
Sundaram, Meenakshi
2005-01-01
NASA and the aerospace industry are extremely serious about reducing the cost and improving the performance of launch vehicles, both manned and unmanned. In the aerospace industry, sharing infrastructure for manufacturing more than one type of spacecraft is becoming a trend to achieve economy of scale. An example is the Boeing Decatur facility, where both Delta II and Delta IV launch vehicles are made. The author is not sure how Boeing estimates the costs of each spacecraft made in the same facility. Regardless of how a contractor estimates the cost, NASA, in its popular cost estimating tool, the NASA/Air Force Cost Model (NAFCOM), has to have a method built in to account for the effect of infrastructure sharing. Since there is no provision in the most recent version of NAFCOM2002 to take care of this, the Engineering Cost Community at MSFC has found that the tool overestimates the manufacturing cost by as much as 30%. Therefore, the objective of this study is to develop a methodology to assess the impact of infrastructure sharing so that better operations cost estimates may be made.
NASA Technical Reports Server (NTRS)
Ramesham, Rajeshuni
2012-01-01
This paper provides the experimental test results of advanced CCGA packages tested in extreme temperature thermal environments. Standard optical inspection and x-ray non-destructive inspection tools were used to assess the reliability of high density CCGA packages for deep space extreme temperature missions. Ceramic column grid array (CCGA) packages have been increasing in use based on their advantages such as high interconnect density, very good thermal and electrical performances, compatibility with standard surface-mount packaging assembly processes, and so on. CCGA packages are used in space applications such as in logic and microprocessor functions, telecommunications, payload electronics, and flight avionics. As these packages tend to have less solder joint strain relief than leaded packages or more strain relief over lead-less chip carrier packages, the reliability of CCGA packages is very important for short-term and long-term deep space missions. We have employed high density CCGA 1152 and 1272 daisy chained electronic packages in this preliminary reliability study. Each package is divided into several daisy-chained sections. The physical dimensions of the CCGA1152 package are 35 mm x 35 mm with a 34 x 34 array of columns with a 1 mm pitch. The dimension of the CCGA1272 package is 37.5 mm x 37.5 mm with a 36 x 36 array with a 1 mm pitch. The columns are made up of 80% Pb/20%Sn material. CCGA interconnect electronic package printed wiring polyimide boards have been assembled and inspected using non-destructive x-ray imaging techniques. The assembled CCGA boards were subjected to extreme temperature thermal atmospheric cycling to assess their reliability for future deep space missions. The resistance of daisy-chained interconnect sections was monitored continuously during thermal cycling. Keywords: Extreme temperatures, High density CCGA qualification, CCGA reliability, solder joint failures, optical inspection, and x-ray inspection.
Comprehensive evaluation of transportation projects : a toolkit for sketch planning.
DOT National Transportation Integrated Search
2010-10-01
A quick-response project-planning tool can be extremely valuable in anticipating the congestion, safety, emissions, and other impacts of large-scale network improvements and policy implementations. This report identifies the advantages and limita...
Rand, Debbie; Zeilig, Gabi; Kizony, Rachel
2015-06-18
Impaired dexterity of the weaker upper extremity is common post stroke and it is recommended that these individuals practice many repetitions of movement to regain function. However, stroke rehabilitation methods do not achieve the required intensity to be effective. Touchscreen tablet technology may be used as a motivating tool for self-training impaired dexterity of the weaker upper extremity post stroke. Rehab-let is a self-training protocol utilizing game apps on a touchscreen for practicing movement of the weaker upper extremity. We will conduct a pilot randomized controlled trial to assess Rehab-let compared to traditional self-training to improve dexterity of the weaker hand, and to increase self-training time and satisfaction in individuals with subacute stroke. Forty individuals with stroke undergoing subacute rehabilitation will be randomly allocated to Rehab-let or a traditional self-training program using therapeutic aids such as balls, blocks and pegs. All participants will be requested to perform self-training for 60 minutes a day, 5 times a week for 4 weeks. Dexterity assessed by The Nine Hole Peg Test is the main outcome measure. Assessments will be administered pre and post the self-training intervention by assessors blind to the group allocation. The outcomes of this study will inform the design of a fully powered randomized controlled trial to evaluate the effectiveness of Rehab-let. If found to be effective, Rehab-let can be used during subacute rehabilitation to increase treatment intensity and improve dexterity. Potentially, Rehab-let can also be used after discharge and might be ideal for individuals with mild stroke who are often not referred to formal rehabilitation. Current Controlled Trials NCT02136433 registered on 17 September 2014.
Jensen, Mads R; Birkballe, Susanne; Nørregaard, Susan; Karlsmark, Tonny
2012-07-01
Tissue dielectric constant (TDC) measurement may become an important tool in the clinical evaluation of chronic lower extremity swelling in women; however, several factors are known to influence TDC measurements, and comparative data on healthy lower extremities are few. Thirty-four healthy women volunteered. Age, BMI, moisturizer use and hair removal were registered. Three blinded investigators performed TDC measurements in a randomized sequence on clearly marked locations on the foot, the ankle and the lower leg. The effective measuring depth was 2.5 mm. The mean TDC was 37.8 ± 5.5 (mean ± SD) on the foot, 29.0 ± 3.1 on the ankle and 30.5 ± 3.9 on the lower leg. TDC was highly dependent on measuring site (P<0.001) but did not vary significantly between investigators (P=0.127). Neither age, BMI, hair removal nor moisturizer use had any significant effect on the lower leg TDC. Intraclass correlation coefficients were 0.77 for the foot, 0.94 for the ankle and 0.94 for the lower leg. The TDC on the foot was significantly higher compared with ankle and lower leg values. Foot measurements should be interpreted cautiously because of questionable interobserver agreement. The interobserver agreement was high on lower leg and ankle measurements. Neither age, BMI, hair removal nor moisturizer use had any significant effect on the lower leg TDC. TDC values of 35.2 for the ankle and 38.3 for the lower leg are suggested as upper normal reference limits in women. © 2012 The Authors Clinical Physiology and Functional Imaging © 2012 Scandinavian Society of Clinical Physiology and Nuclear Medicine.
Usability of Operational Performance Support Tools - Findings from Sea Test II
NASA Technical Reports Server (NTRS)
Byrne, Vicky; Litaker, Harry; McGuire, Kerry
2014-01-01
Sea Test II, aka NASA Extreme Environment Mission Operations 17 (NEEMO 17), took place in the Florida Aquarius undersea habitat. This confined underwater environment provides an excellent analog for space habitation, offering similarities such as a hostile environment, difficult logistics, autonomous operations, and remote communications. This study collected subjective feedback on the usability of two performance support tools, Google Glass and an iPad, during the Sea Test II mission, Sept 10-14, 2013. The two main objectives: - Assess the overall functionality and usability of each performance support tool in a mission analog environment. - Assess the advantages and disadvantages of each tool when performing operational procedures and Just-In-Time-Training (JITT).
The Durham Adaptive Optics Simulation Platform (DASP): Current status
NASA Astrophysics Data System (ADS)
Basden, A. G.; Bharmal, N. A.; Jenkins, D.; Morris, T. J.; Osborn, J.; Peng, J.; Staykov, L.
2018-01-01
The Durham Adaptive Optics Simulation Platform (DASP) is a Monte-Carlo modelling tool used for the simulation of astronomical and solar adaptive optics systems. In recent years, this tool has been used to predict the expected performance of the forthcoming extremely large telescope adaptive optics systems, and has seen the addition of several modules with new features, including Fresnel optics propagation and extended object wavefront sensing. Here, we provide an overview of the features of DASP and the situations in which it can be used. Additionally, the user tools for configuration and control are described.
A computer tool to support in design of industrial Ethernet.
Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues
2009-04-01
This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply using the POE "Power over Ethernet" concept, and wireless) and the occupation rate (the amount of information transmitted on the network versus the controller network scan time). These functions are accomplished without a single physical element installed in the network, using only simulation. The tool presents a detailed view of the network to the user, flags possible problems in the network, and offers an extremely friendly environment.
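The occupation-rate check that such a tool automates reduces to simple arithmetic: traffic generated per scan cycle divided by what the link can carry in that cycle. The sketch below uses assumed example values, not figures from the paper.

```python
# Occupation rate = bits transmitted per scan cycle / bits the link can carry per cycle.
# All numbers below are assumed illustrative values.
frame_bytes = 100          # per-device payload plus Ethernet/IP/UDP overhead
devices = 40               # devices reporting every scan
scan_time_s = 0.010        # controller network scan time
link_rate_bps = 100e6      # 100 Mbit/s industrial Ethernet link

bits_per_scan = frame_bytes * 8 * devices
link_capacity_per_scan = link_rate_bps * scan_time_s
occupation_rate = bits_per_scan / link_capacity_per_scan
print(f"occupation rate: {occupation_rate:.1%}")   # ~3.2% with these assumptions
```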
OPTIMIZING THROUGH CO-EVOLUTIONARY AVALANCHES
DOE Office of Scientific and Technical Information (OSTI.GOV)
S. BOETTCHER; A. PERCUS
2000-08-01
We explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by ''self-organized criticality,'' a concept introduced to describe emergent complexity in many physical systems. In contrast to Genetic Algorithms, which operate on an entire ''gene pool'' of possible solutions, extremal optimization successively replaces extremely undesirable elements of a sub-optimal solution with new, random ones. Large fluctuations, called ''avalanches,'' ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements approximation methods inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Those phase transitions are found in the parameter space of most optimization problems, and have recently been conjectured to be the origin of some of the hardest instances in computational complexity. We will demonstrate how extremal optimization can be implemented for a variety of combinatorial optimization problems. We believe that extremal optimization will be a useful tool in the investigation of phase transitions in combinatorial optimization problems, hence valuable in elucidating the origin of computational complexity.
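The core update rule is compact enough to sketch: rank the solution's components by a local fitness, pick one from the low end with a power-law bias controlled by the single parameter tau, and replace it unconditionally. The toy below applies this tau-EO scheme to a random MAX-CUT instance; it illustrates the idea and is not the authors' implementation.

```python
import random

random.seed(0)
n, tau = 60, 1.4
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if random.random() < 0.1]
spin = [random.choice([-1, 1]) for _ in range(n)]
neighbors = [[] for _ in range(n)]
for i, j in edges:
    neighbors[i].append(j)
    neighbors[j].append(i)

def fitness(v):
    # Local fitness: fraction of v's edges currently cut (higher is better).
    if not neighbors[v]:
        return 1.0
    return sum(spin[v] != spin[u] for u in neighbors[v]) / len(neighbors[v])

def cut_size():
    return sum(spin[i] != spin[j] for i, j in edges)

best = cut_size()
for _ in range(20000):
    ranked = sorted(range(n), key=fitness)           # worst local fitness first
    k = min(int(random.paretovariate(tau - 1)), n)   # power-law rank: P(k) ~ k**(-tau)
    spin[ranked[k - 1]] *= -1                        # unconditionally flip that node
    best = max(best, cut_size())
print(f"{len(edges)} edges, best cut found: {best}")
```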
NASA Astrophysics Data System (ADS)
Ford, Sean M.; McCandless, Andrew B.; Liu, Xuezhu; Soper, Steven A.
2001-09-01
In this paper we present embossing tools that were fabricated using both UV and X-ray lithography. The embossing tools created were used to emboss microfluidic channels for bioanalytical applications. Specifically, two tools were fabricated. One, using x-ray lithography, was fabricated for electrophoretic separations in DNA restriction fragment analysis. A second tool, fabricated using SU8, was designed for micro PCR applications. The depths of both tools were approximately 100 micrometers. Both tools were made by directly electroforming nickel on a stainless steel base. Fabrication time for the tool fabricated using x-ray lithography was less than 1 week, and largely depended on the availability of the x-ray source. The SU8 embossing tool was fabricated in less than 24 hours. The resulting nickel electroforms from both processes were extremely robust and did not fail under embossing conditions required for PMMA and/or polycarbonate. Some problems removing SU8 after electroforming were seen for smaller size gaps between nickel structures.
Wogensen, Elise; Malá, Hana
2015-01-01
The objective of the present paper is to review the current status of exercise as a tool to promote cognitive rehabilitation after acquired brain injury (ABI) in animal model-based research. Searches were conducted on the PubMed, Scopus, and psycINFO databases in February 2014. Search strings used were: exercise (and) animal model (or) rodent (or) rat (and) traumatic brain injury (or) cerebral ischemia (or) brain irradiation. Studies were selected if they were (1) in English, (2) used adult animals subjected to acquired brain injury, (3) used exercise as an intervention tool after inflicted injury, (4) used exercise paradigms demanding movement of all extremities, (5) had exercise intervention effects that could be distinguished from other potential intervention effects, and (6) contained at least one measure of cognitive and/or emotional function. Out of 2308 hits, 22 publications fulfilled the criteria. The studies were examined relative to cognitive effects associated with three themes: exercise type (forced or voluntary), timing of exercise (early or late), and dose-related factors (intensity, duration, etc.). The studies indicate that exercise in many cases can promote cognitive recovery after brain injury. However, the optimal parameters to ensure cognitive rehabilitation efficacy still elude us, due to considerable methodological variations between studies. PMID:26509085
A Novel Web Application to Analyze and Visualize Extreme Heat Events
NASA Astrophysics Data System (ADS)
Li, G.; Jones, H.; Trtanj, J.
2016-12-01
Extreme heat is the leading cause of weather-related deaths in the United States annually and is expected to increase with our warming climate. However, most of these deaths are preventable with proper tools and services to inform the public about heat waves. In this project, we have investigated the key indicators of a heat wave, the vulnerable populations, and the data visualization strategies through which those populations most effectively absorb heat wave data. A map-based web app has been created that allows users to search and visualize historical heat waves in the United States incorporating these strategies. This app utilizes daily maximum temperature data from the NOAA Global Historical Climatology Network, which contains about 2.7 million data points from over 7,000 stations per year. The point data are spatially aggregated into county-level data using county geometry from the US Census Bureau and stored in a Postgres database with PostGIS spatial capability. GeoServer, a powerful map server, is used to serve the image and data layers (WMS and WFS). The JavaScript-based web-mapping platform Leaflet is used to display the temperature layers. A number of functions have been implemented for search and display. Users can search for extreme heat events by county or by date. The "by date" option allows a user to select a date and a Tmax threshold, which then highlights all of the areas on the map that meet those date and temperature parameters. The "by county" option allows the user to select a county on the map, which then retrieves a list of heat wave dates and daily Tmax measurements. This visualization is clean, user-friendly, and novel because, while this sort of time, space, and temperature information can be found by querying meteorological datasets, no existing tool neatly packages it together in an easily accessible and non-technical manner, especially at a time when climate change urges a better understanding of heat waves.
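The county-level "by date" search described above amounts to a point-in-polygon aggregation plus a threshold filter. A hedged sketch of such a query issued from Python follows; the table and column names are hypothetical and do not reflect the project's actual schema.

```python
import psycopg2

# Hypothetical schema: station Tmax points joined to county polygons (US Census geometry)
QUERY = """
SELECT c.county_name, MAX(t.tmax_f) AS county_tmax
FROM ghcn_daily_tmax AS t
JOIN county_boundaries AS c
  ON ST_Contains(c.geom, t.geom)          -- spatial aggregation of points into counties
WHERE t.obs_date = %s AND t.tmax_f >= %s  -- "by date" search with a Tmax threshold
GROUP BY c.county_name
ORDER BY county_tmax DESC;
"""

def counties_exceeding(conn_str, date, tmax_threshold_f):
    """Return (county, max Tmax) rows whose stations reached the threshold on a date."""
    with psycopg2.connect(conn_str) as conn, conn.cursor() as cur:
        cur.execute(QUERY, (date, tmax_threshold_f))
        return cur.fetchall()

# e.g. counties_exceeding("dbname=heat", "1995-07-13", 100)
```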
Socio-economic Impact Analysis for Near Real-Time Flood Detection in the Lower Mekong River Basin
NASA Astrophysics Data System (ADS)
Oddo, P.; Ahamed, A.; Bolten, J. D.
2017-12-01
Flood events pose a severe threat to communities in the Lower Mekong River Basin. The combination of population growth, urbanization, and economic development exacerbate the impacts of these flood events. Flood damage assessments are frequently used to quantify the economic losses in the wake of storms. These assessments are critical for understanding the effects of flooding on the local population, and for informing decision-makers about future risks. Remote sensing systems provide a valuable tool for monitoring flood conditions and assessing their severity more rapidly than traditional post-event evaluations. The frequency and severity of extreme flood events are projected to increase, further illustrating the need for improved flood monitoring and impact analysis. In this study we implement a socio-economic damage model into a decision support tool with near real-time flood detection capabilities (NASA's Project Mekong). Surface water extent for current and historical floods is found using multispectral Moderate-resolution Imaging Spectroradiometer (MODIS) 250-meter imagery and the spectral Normalized Difference Vegetation Index (NDVI) signatures of permanent water bodies (MOD44W). Direct and indirect damages to populations, infrastructure, and agriculture are assessed using the 2011 Southeast Asian flood as a case study. Improved land cover and flood depth assessments result in a more refined understanding of losses throughout the Mekong River Basin. Results suggest that rapid initial estimates of flood impacts can provide valuable information to governments, international agencies, and disaster responders in the wake of extreme flood events.
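The surface-water detection step relies on the fact that open water has a low or negative NDVI; differencing a flood-date water mask against a permanent-water mask (such as MOD44W) isolates newly inundated pixels. The sketch below is illustrative, with an assumed NDVI threshold, and is not Project Mekong's operational algorithm.

```python
import numpy as np

def flood_extent(red, nir, permanent_water_mask, ndvi_threshold=0.1):
    """Flag newly inundated pixels from MODIS surface reflectance arrays.

    red, nir: 2-D reflectance arrays on the same grid; permanent_water_mask:
    boolean array of known water bodies (e.g. derived from MOD44W). The NDVI
    threshold is an illustrative assumption and would be tuned regionally.
    """
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)   # avoid divide-by-zero
    water = ndvi < ndvi_threshold                          # low NDVI ~ open water
    return water & ~permanent_water_mask                   # exclude permanent water bodies
```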
NASA Technical Reports Server (NTRS)
Frady, Greg; Nesman, Thomas; Zoladz, Thomas; Szabo, Roland
2010-01-01
For many years, the capability to determine the root cause of component failures has been limited by the available analytical tools and the state of the art in data acquisition systems. With this limited capability, many anomalies were resolved by adding material to the design to increase robustness, without the ability to determine whether the design solution was satisfactory until after a series of expensive test programs was complete. The risk of failure and of multiple design, test, and redesign cycles was high. During the Space Shuttle Program, many crack investigations in high energy density turbomachines, such as the SSME turbopumps, and in high energy flows in the main propulsion system led to the discovery of numerous root-cause failures and anomalies arising from the coexistence of acoustic forcing functions, structural natural modes, and a high energy excitation, such as an edge tone or shedding flow. This led the technical community to understand many of the primary contributors to extremely high frequency, high cycle fatigue fluid-structure interaction anomalies. These contributors have been identified using advanced analysis tools and verified during component ground tests, systems tests, and flight. The structural dynamics and fluid dynamics communities have developed a special sensitivity to fluid-structure interaction problems and have been able to adjust and solve these problems in a timely manner to meet the budget and schedule deadlines of operational vehicle programs, such as the Space Shuttle Program, over the years.
WE-G-BRC-02: Risk Assessment for HDR Brachytherapy
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayadev, J.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one's own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.
WE-G-BRC-01: Risk Assessment for Radiosurgery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, G.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one's own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.
WE-G-BRC-03: Risk Assessment for Physics Plan Review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parker, S.
2016-06-15
Failure Mode and Effects Analysis (FMEA) originated as an industrial engineering technique used for risk management and safety improvement of complex processes. In the context of radiotherapy, the AAPM Task Group 100 advocates FMEA as the framework of choice for establishing clinical quality management protocols. However, there is concern that widespread adoption of FMEA in radiation oncology will be hampered by the perception that implementation of the tool will have a steep learning curve, be extremely time consuming and labor intensive, and require additional resources. To overcome these preconceptions and facilitate the introduction of the tool into clinical practice, the medical physics community must be educated in the use of this tool and the ease with which it can be implemented. Organizations with experience in FMEA should share their knowledge with others in order to increase the implementation, effectiveness and productivity of the tool. This session will include a brief, general introduction to FMEA followed by a focus on practical aspects of implementing FMEA for specific clinical procedures including HDR brachytherapy, physics plan review and radiosurgery. A description of common equipment and devices used in these procedures and how to characterize new devices for safe use in patient treatments will be presented. This will be followed by a discussion of how to customize FMEA techniques and templates to one's own clinic. Finally, cases of common failure modes for specific procedures (described previously) will be shown and recommended intervention methodologies and outcomes reviewed. Learning Objectives: understand the general concept of failure mode and effects analysis; learn how to characterize new equipment for safety; be able to identify potential failure modes for specific procedures and learn mitigation techniques; be able to customize FMEA examples and templates for use in any clinic.
Rapid Fabrication of Lightweight SiC Optics using Reactive Atom Plasma (RAP) Processing
NASA Technical Reports Server (NTRS)
Fiske, Peter S.
2006-01-01
Reactive Atom Plasma (RAP) processing is a non-contact, plasma-based processing technology that can be used to generate damage-free optical surfaces. We have developed tools and processes using RAP that allow us to shape extremely lightweight mirror surfaces made from extremely hard-to-machine materials (e.g. SiC). We will describe our latest results using RAP in combination with other technologies to produce finished lightweight SiC mirrors and also discuss applications for RAP in the rapid fabrication of mirror segments for reflective and grazing incidence telescopes.
Attai, Deanna J; Cowher, Michael S; Al-Hamadani, Mohammed; Schoger, Jody M; Staley, Alicia C; Landercasper, Jeffrey
2015-07-30
Despite reported benefits, many women do not attend breast cancer support groups. Abundant online resources for support exist, but information regarding the effectiveness of participation is lacking. We report the results of a Twitter breast cancer support community participant survey. The aim was to determine the effectiveness of social media as a tool for breast cancer patient education and decreasing anxiety. The Breast Cancer Social Media Twitter support community (#BCSM) began in July 2011. Institutional review board approval with a waiver of informed consent was obtained for a deidentified survey that was posted for 2 weeks on Twitter and on the #BCSM blog and Facebook page. There were 206 respondents to the survey. In all, 92.7% (191/206) were female. Respondents reported increased knowledge about breast cancer in the following domains: overall knowledge (80.9%, 153/189), survivorship (85.7%, 162/189), metastatic breast cancer (79.4%, 150/189), cancer types and biology (70.9%, 134/189), clinical trials and research (66.1%, 125/189), treatment options (55.6%, 105/189), breast imaging (56.6%, 107/189), genetic testing and risk assessment (53.9%, 102/189), and radiotherapy (43.4%, 82/189). Participation led 31.2% (59/189) to seek a second opinion or bring additional information to the attention of their treatment team and 71.9% (136/189) reported plans to increase their outreach and advocacy efforts as a result of participation. Levels of reported anxiety before and after participation were analyzed: 29 of 43 (67%) patients who initially reported "high or extreme" anxiety reported "low or no" anxiety after participation (P<.001). Also, no patients initially reporting low or no anxiety before participation reported an increase to high or extreme anxiety after participation. This study demonstrates that breast cancer patients' perceived knowledge increases and their anxiety decreases by participation in a Twitter social media support group.
Platform of integrated tools to support environmental studies and management of dredging activities.
Feola, Alessandra; Lisi, Iolanda; Salmeri, Andrea; Venti, Francesco; Pedroncini, Andrea; Gabellini, Massimo; Romano, Elena
2016-01-15
Dredging activities can cause environmental impacts due to, among other factors, the increase of the Suspended Solid Concentration (SSC) and its subsequent dispersion and deposition (DEP) far from the dredging point. The dynamics of the resulting dredging plume can differ strongly in spatial and temporal evolution. This evolution, for both conventional mechanical and hydraulic dredges, depends on the different mechanisms of sediment release in the water column and the site-specific environmental conditions. Several numerical models are currently in use to simulate dredging plume dynamics. Model results can be analysed to study dispersion and advection processes at different depths and distances from the dredging source. Usually, scenarios with frequent and extreme meteo-marine conditions are chosen and extreme values of parameters (i.e. maximum intensity or total duration) are evaluated for environmental assessment. This paper presents a flexible, consistent and integrated methodological approach. Statistical parameters and indexes are derived from the analysis of SSC and DEP simulated time-series to numerically estimate their spatial (vertical and horizontal) and seasonal variability, thereby allowing a comparison of the effects of hydraulic and mechanical dredges. Events that exceed defined thresholds are described in terms of magnitude, duration and frequency. A new integrated index combining these parameters, SSCnum, is proposed for environmental assessment. Maps representing the proposed parameters allow direct comparison of effects due to different (mechanical and hydraulic) dredges at progressive distances from the dredging zone. Results can contribute towards identification and assessment of the potential environmental effects of a proposed dredging project. This framework allows a suitable evaluation of alternative technical choices and of appropriate mitigation, management and monitoring measures. Environmental Risk Assessment and Decision Support Systems (DSS) may take advantage of the proposed tool. The approach is applied to a hypothetical dredging project in the Augusta Harbour (eastern coast of Sicily, Italy). Copyright © 2015 Elsevier Ltd. All rights reserved.
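The magnitude/duration/frequency characterization of threshold-exceedance events can be sketched as below for a single model output point; the simple event definition and the synthetic series are assumptions for illustration and do not reproduce the authors' SSCnum index.

```python
import numpy as np

def exceedance_events(ssc, threshold, dt_hours=1.0):
    """Summarize SSC threshold exceedances by frequency, duration and peak magnitude.

    ssc: 1-D array of simulated suspended-solid concentrations (mg/L) at one
    point; threshold: SSC limit (mg/L). Returns (n_events, durations_h, peaks).
    """
    events, run = [], []
    for value in ssc:
        if value > threshold:
            run.append(value)
        elif run:                                  # an exceedance run just ended
            events.append((len(run) * dt_hours, max(run)))
            run = []
    if run:                                        # series ended while still exceeding
        events.append((len(run) * dt_hours, max(run)))
    durations = [d for d, _ in events]
    peaks = [p for _, p in events]
    return len(events), durations, peaks

# Example with a synthetic hourly SSC series and a 35 mg/L threshold
rng = np.random.default_rng(0)
series = 20 + 25 * rng.random(24 * 30)             # one month of hourly values
n, durations, peaks = exceedance_events(series, threshold=35.0)
print(n, max(durations), round(max(peaks), 1))
```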
NASA Astrophysics Data System (ADS)
Gaetani, Francesco; Baptiste Filippi, Jean; Simeoni, Albert; D'Andrea, Mirko
2010-05-01
The Haines Index (HI) was developed by the USDA Forest Service to measure the atmosphere's contribution to the growth potential of a wildfire. The Haines Index combines two atmospheric factors that are known to have an effect on wildfires: stability and dryness. As an operational tool, the HI has proved its ability to predict plume-dominated, high-intensity wildfires. However, since the HI does not take into account fuel continuity, composition and moisture conditions, or the effects of wind and topography on fire behaviour, its use as a forecasting tool should be carefully considered. In this work we propose the use of the HI, predicted from high-resolution Limited Area Model forecasts, coupled with a fire weather model (the RISICO system) fully operational in Italy since 2003. RISICO is based on dynamic models able to represent in space and time the effects that the environment and plant physiology have on fuels and, in turn, on the potential behaviour of wildfires. The system automatically acquires from remote databases a thorough data set of input information of both in situ and spatial nature. Meteorological observations, radar data, Limited Area Model weather forecasts, EO data, and fuel data are managed by a unified interface able to process a wide set of different data. Specific semi-physical models are used in the system to simulate the dynamics of the fuels (load and moisture contents of dead and live fuel) and the potential fire behaviour (rate of spread and linear intensity). A preliminary validation of this approach will be provided with reference to Sardinia and Corsica, two major islands of the Mediterranean Sea frequently affected by extreme plume-dominated wildfires. A time series of about 3000 wildfires burnt in Sardinia and Corsica in 2007 and 2008 will be used to evaluate the capability of the HI, coupled with the outputs of the fire weather model, to forecast the actual risk in time and space.
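For reference, the mid-level variant of the Haines Index adds a stability term (850-700 hPa temperature difference) to a moisture term (850 hPa dewpoint depression), each scored 1 to 3. The sketch below uses commonly published mid-level thresholds, which should be checked against the operational definition before any use; it is not the RISICO implementation.

```python
def haines_index_mid(t850_c, t700_c, td850_c):
    """Mid-level Haines Index: stability term + moisture term, each scored 1-3.

    Thresholds follow commonly published values for the mid-level variant and
    are an assumption here; verify against the operational USDA definition.
    """
    lapse = t850_c - t700_c                 # stability: 850-700 hPa temperature difference
    dewpoint_depression = t850_c - td850_c  # dryness at 850 hPa

    if lapse < 6:
        a = 1
    elif lapse <= 10:
        a = 2
    else:
        a = 3

    if dewpoint_depression < 6:
        b = 1
    elif dewpoint_depression <= 12:
        b = 2
    else:
        b = 3

    return a + b   # 2-3 very low, 4 low, 5 moderate, 6 high potential for large plume-dominated fire

print(haines_index_mid(t850_c=20.0, t700_c=8.0, td850_c=4.0))  # -> 6 (high)
```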
Use of Magnetic Resonance Imaging to Monitor Iron Overload
Wood, John C.
2014-01-01
Treatment of iron overload requires robust estimates of total body iron burden and its response to iron chelation therapy. Compliance with chelation therapy varies considerably among patients and individual reporting is notoriously unreliable. Even with perfect compliance, intersubject variability in chelator effectiveness is extremely high, necessitating reliable iron estimates to guide dose titration. In addition, each chelator has a unique profile with respect to clearing iron stores from different organs. This chapter will present the tools available to clinicians monitoring their patients, focusing on non-invasive magnetic resonance imaging methods because they have become the de facto standard of care. PMID:25064711
Neuromodulation of the lumbar spinal locomotor circuit.
AuYong, Nicholas; Lu, Daniel C
2014-01-01
The lumbar spinal cord contains the necessary circuitry to independently drive locomotor behaviors. This function is retained following spinal cord injury (SCI) and is amenable to rehabilitation. Although the effectiveness of task-specific training and pharmacologic modulation has been repeatedly demonstrated in animal studies, results from human studies are less striking. Recently, lumbar epidural stimulation (EDS) along with locomotor training was shown to restore weight-bearing function and lower-extremity voluntary control in a chronic, motor-complete human SCI subject. Related animal studies incorporating EDS as part of the therapeutic regimen are also encouraging. EDS is emerging as a promising neuromodulatory tool for SCI. Copyright © 2014 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gamble, John King; Nielsen, Erik; Baczewski, Andrew David
This paper describes our work over the past few years to use tools from quantum chemistry to describe electronic structure of nanoelectronic devices. These devices, dubbed "artificial atoms", comprise a few electrons, confined by semiconductor heterostructures, impurities, and patterned electrodes, and are of intense interest due to potential applications in quantum information processing, quantum sensing, and extreme-scale classical logic. We detail two approaches we have employed: finite-element and Gaussian basis sets, exploring the interesting complications that arise when techniques that were intended to apply to atomic systems are instead used for artificial, solid-state devices.
Overview of Fundamental High-Lift Research for Transport Aircraft at NASA
NASA Technical Reports Server (NTRS)
Leavitt, L. D.; Washburn, A. E.; Wahls, R. A.
2007-01-01
NASA has had a long history in fundamental and applied high lift research. Current programs provide a focus on the validation of technologies and tools that will enable extremely short take off and landing coupled with efficient cruise performance, simple flaps with flow control for improved effectiveness, circulation control wing concepts, some exploration into new aircraft concepts, and partnership with Air Force Research Lab in mobility. Transport high-lift development testing will shift more toward mid and high Rn facilities at least until the question: "How much Rn is required" is answered. This viewgraph presentation provides an overview of High-Lift research at NASA.
Taylor, Louise H.; Wallace, Ryan M.; Balaram, Deepashree; Lindenmayer, Joann M.; Eckery, Douglas C.; Mutonono-Watkiss, Beryl; Parravani, Ellie; Nel, Louis H.
2017-01-01
Free-roaming dogs and rabies transmission are integrally linked across many low-income countries, and large unmanaged dog populations can be daunting to rabies control program planners. Dog population management (DPM) is a multifaceted concept that aims to improve the health and well-being of free-roaming dogs, reduce problems they may cause, and may also aim to reduce dog population size. In theory, DPM can facilitate more effective rabies control. Community engagement focused on promoting responsible dog ownership and better veterinary care could improve the health of individual animals and dog vaccination coverage, thus reducing rabies transmission. Humane DPM tools, such as sterilization, could theoretically reduce dog population turnover and size, allowing rabies vaccination coverage to be maintained more easily. However, it is important to understand local dog populations and community attitudes toward them in order to determine whether and how DPM might contribute to rabies control and which DPM tools would be most successful. In practice, there is very limited evidence of DPM tools achieving reductions in the size or turnover of dog populations in canine rabies-endemic areas. Different DPM tools are frequently used together and combined with rabies vaccinations, but full impact assessments of DPM programs are not usually available, and therefore, evaluation of tools is difficult. Surgical sterilization is the most frequently documented tool and has successfully reduced dog population size and turnover in a few low-income settings. However, DPM programs are mostly conducted in urban settings and are usually not government funded, raising concerns about their applicability in rural settings and sustainability over time. Technical demands, costs, and the time necessary to achieve population-level impacts are major barriers. Given their potential value, we urgently need more evidence of the effectiveness of DPM tools in the context of canine rabies control. Cheaper, less labor-intensive tools for dog sterilization will be extremely valuable in realizing the potential benefits of reduced population turnover and size. No one DPM tool will fit all situations, but if DPM objectives are achieved dog populations may be stabilized or even reduced, facilitating higher dog vaccination coverages that will benefit rabies elimination efforts. PMID:28740850
NASA Astrophysics Data System (ADS)
Tiwari, A.
2016-12-01
Coastal metropolitan areas in South Asia represent the most densely populated and congested urban spaces, ranking among the largest urban settlements on the planet. These megacities are characterized by inadequate infrastructure, a lack of mitigation tools, and weak resilience of urban ecosystems. Additionally, climate change has increased the vulnerability of poor and marginalized populations living in rapidly growing coastal megacities to the increased frequency, severity and intensity of extreme weather events. This has adversely affected local counter-strategies and adaptation tools, transforming such events into hazards amid an inability to respond and mitigate. The study aimed to develop a participatory framework for risk reduction in the Greater Mumbai Metropolitan area by Structure Remodeling (SR) in integrated GIS. The research utilized terrain analysis tools and vulnerability mapping, identified risk-susceptible fabric, and checked its scope for SR without: 1. adding to its (often) complex fragmentation, and 2. interfering with the ecosystem services accommodated by it. Surfaces available included paved ground, streetscapes, commercial facades, rooftops, public spaces, and open as well as dark spaces. Remodeling altered certain characteristics of the intrinsic or extrinsic cross-section profile, or of both (if suitable), with infrastructure measures (grey, green, blue) that collectively involved ecosystem services and maintained the natural hydrological connection. This method fairly reduced the exposure of vulnerable surfaces and minimized risk to achieve an extremity-neutral state. Harmonizing with public perception and incorporating the priorities of local authorities, the method is significant as it rises above the fundamental challenges arising during the management of the (often) conflicting perspectives and interests of the multiplicity of stakeholders involved at various levels in urban climate governance, while ensuring inclusive solutions with reduced vulnerability and increased resilience. Additionally, this method has vast potential to be replicated for climate-smart planning beyond the study region, as it clearly ensures a barrier-free climate-communication process for decision making while looking for long-term feasible outcomes of the remodeled surface through the most affordable and innovative tools.
Three Pillars of Success: The Partners, The Messenger, The Communication Strategies
NASA Astrophysics Data System (ADS)
Turrin, M.; Ryan, W. B. F.; Pfirman, S. L.
2017-12-01
Our ability to deal with climate impacts in coastal cities and bring change hinges on our ability to effectively communicate impacts. Incorporating sea level rise and climate impacts into city planning and community action plans is too often done in response to a devastating impact rather than through preventative planning. In New York the impact came in the form of Hurricane Sandy. Prior to Sandy, NYC, NY State and regional scientists had prepared planning documents, reports and communications directed at public officials and decision makers, warning of potential impacts from a changing climate. Presentations and reports identified the most exposed locations and infrastructure, but disbelief and a false sense of time mired any meaningful change. Effective communication about climate and impacts is at the root of planning and resilience. To be meaningful it must come from a trusted messenger, use well-vetted materials that address both larger climate processes and drivers and local impacts, be accessible to the non-science community, and incorporate multiple modes of communication. The Polar Explorer: Sea Level app is a tool that has been used to this end (http://www.polarexplorer.org). An interactive multi-layered communication tool, it uses vetted data structured through a series of commonly asked questions and displayed through visualizations. We have been partnering with New York State, local community groups, and state and educational organizations to reach a broad cross-section of the public with information useful for planning. We have co-presented at conferences for local planning and advisory groups and incorporated the use of the app into local planning charrettes, and have found that the visualizations, the interactivity of the delivery and the layered scaffolding make the app a useful tool for planners and decision makers. The app includes the physical science drivers of climate change and the social science impacts, and a look at the past, the present and future projections. For planners and coastal managers, the section on "Who is Vulnerable?", highlighting areas most often impacted by weather and extreme events, provides data useful in planning for extreme events. Whatever challenges a changing climate brings, we must address the challenge of communicating in an interactive, visual and accessible way.
Bright high-repetition-rate source of narrowband extreme-ultraviolet harmonics beyond 22 eV
Wang, He; Xu, Yiming; Ulonska, Stefan; Robinson, Joseph S.; Ranitovic, Predrag; Kaindl, Robert A.
2015-01-01
Novel table-top sources of extreme-ultraviolet light based on high-harmonic generation yield unique insight into the fundamental properties of molecules, nanomaterials or correlated solids, and enable advanced applications in imaging or metrology. Extending high-harmonic generation to high repetition rates portends great experimental benefits, yet efficient extreme-ultraviolet conversion of correspondingly weak driving pulses is challenging. Here, we demonstrate a highly-efficient source of femtosecond extreme-ultraviolet pulses at 50-kHz repetition rate, utilizing the ultraviolet second-harmonic focused tightly into Kr gas. In this cascaded scheme, a photon flux beyond ≈3 × 1013 s−1 is generated at 22.3 eV, with 5 × 10−5 conversion efficiency that surpasses similar harmonics directly driven by the fundamental by two orders-of-magnitude. The enhancement arises from both wavelength scaling of the atomic dipole and improved spatio-temporal phase matching, confirmed by simulations. Spectral isolation of a single 72-meV-wide harmonic renders this bright, 50-kHz extreme-ultraviolet source a powerful tool for ultrafast photoemission, nanoscale imaging and other applications. PMID:26067922
The effect of ergonomic laparoscopic tool handle design on performance and efficiency.
Tung, Kryztopher D; Shorti, Rami M; Downey, Earl C; Bloswick, Donald S; Merryweather, Andrew S
2015-09-01
Many factors can affect a surgeon's performance in the operating room; these may include surgeon comfort, ergonomics of tool handle design, and fatigue. A laparoscopic tool handle designed with ergonomic considerations (pistol grip) was tested against a current market tool with a traditional pinch grip handle. The goal of this study is to quantify the impact ergonomic design considerations which have on surgeon performance. We hypothesized that there will be measurable differences between the efficiency while performing FLS surgical trainer tasks when using both tool handle designs in three categories: time to completion, technical skill, and subjective user ratings. The pistol grip incorporates an ergonomic interface intended to reduce contact stress points on the hand and fingers, promote a more neutral operating wrist posture, and reduce hand tremor and fatigue. The traditional pinch grip is a laparoscopic tool developed by Stryker Inc. widely used during minimal invasive surgery. Twenty-three (13 M, 10 F) participants with no existing upper extremity musculoskeletal disorders or experience performing laparoscopic procedures were selected to perform in this study. During a training session prior to testing, participants performed practice trials in a SAGES FLS trainer with both tools. During data collection, participants performed three evaluation tasks using both handle designs (order was randomized, and each trial completed three times). The tasks consisted of FLS peg transfer, cutting, and suturing tasks. Feedback from test participants indicated that they significantly preferred the ergonomic pistol grip in every category (p < 0.05); most notably, participants experienced greater degrees of discomfort in their hands after using the pinch grip tool. Furthermore, participants completed cutting and peg transfer tasks in a shorter time duration (p < 0.05) with the pistol grip than with the pinch grip design; there was no significant difference between completion times for the suturing task. Finally, there was no significant interaction between tool type and errors made during trials. There was a significant preference for as well as lower pain experienced during use of the pistol grip tool as seen from the survey feedback. Both evaluation tasks (cutting and peg transfer) were also completed significantly faster with the pistol grip tool. Finally, due to the high degree of variability in the error data, it was not possible to draw any meaningful conclusions about the effect of tool design on the number or degree of errors made.
NASA Astrophysics Data System (ADS)
Holoien, Thomas W.-S.; Marshall, Philip J.; Wechsler, Risa H.
2017-06-01
We describe two new open-source tools written in Python for performing extreme deconvolution Gaussian mixture modeling (XDGMM) and using a conditioned model to re-sample observed supernova and host galaxy populations. XDGMM is a new program that uses Gaussian mixtures to perform density estimation of noisy data using extreme deconvolution (XD) algorithms. Additionally, it has functionality not available in other XD tools. It allows the user to select between the AstroML and Bovy et al. fitting methods and is compatible with scikit-learn machine learning algorithms. Most crucially, it allows the user to condition a model based on the known values of a subset of parameters. This gives the user the ability to produce a tool that can predict unknown parameters based on a model that is conditioned on known values of other parameters. EmpiriciSN is an exemplary application of this functionality, which can be used to fit an XDGMM model to observed supernova/host data sets and predict likely supernova parameters using a model conditioned on observed host properties. It is primarily intended to simulate realistic supernovae for LSST data simulations based on empirical galaxy properties.
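The conditioning step highlighted above follows from the standard conditional-Gaussian formulas applied component by component, with each component's weight re-scaled by its likelihood of the known values. The sketch below implements that calculation directly with NumPy/SciPy rather than calling the XDGMM package, whose exact API is not assumed here.

```python
import numpy as np
from scipy.stats import multivariate_normal

def condition_gmm(weights, means, covs, known_idx, known_vals):
    """Condition a Gaussian mixture on known values of a subset of dimensions.

    weights: (K,), means: (K, D), covs: (K, D, D). Returns the weights, means
    and covariances of the conditional mixture over the remaining dimensions.
    """
    known_idx = np.asarray(known_idx)
    free_idx = np.array([d for d in range(means.shape[1]) if d not in set(known_idx)])
    new_w, new_mu, new_cov = [], [], []
    for w, mu, cov in zip(weights, means, covs):
        s_kk = cov[np.ix_(known_idx, known_idx)]
        s_fk = cov[np.ix_(free_idx, known_idx)]
        s_ff = cov[np.ix_(free_idx, free_idx)]
        gain = s_fk @ np.linalg.inv(s_kk)
        # conditional mean and covariance of this component
        new_mu.append(mu[free_idx] + gain @ (known_vals - mu[known_idx]))
        new_cov.append(s_ff - gain @ s_fk.T)
        # reweight by how well this component explains the known values
        new_w.append(w * multivariate_normal.pdf(known_vals, mu[known_idx], s_kk))
    new_w = np.array(new_w)
    return new_w / new_w.sum(), np.array(new_mu), np.array(new_cov)
```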
NEEMO 21: Tools, Techniques, Technologies & Training for Science Exploration EVA
NASA Technical Reports Server (NTRS)
Graff, Trevor
2016-01-01
The 21st mission of the NASA Extreme Environment Mission Operations (NEEMO) was a highly integrated operational test and evaluation of tools, techniques, technologies, and training for science-driven exploration during Extravehicular Activity (EVA). The 16-day mission was conducted from the Aquarius habitat, an underwater laboratory, off the coast of Key Largo, FL. The unique facility, authentic science objectives, and diverse skill sets of the crew/team facilitate the planning and design for future space exploration.
Estimation of toxicity using the Toxicity Estimation Software Tool (TEST)
Tens of thousands of chemicals are currently in commerce, and hundreds more are introduced every year. Since experimental measurements of toxicity are extremely time consuming and expensive, it is imperative that alternative methods to estimate toxicity are developed.
Citing ASA24® Dietary Assessment Tool in Publications & Presentations
In order to show and maintain support for ASA24, documenting its use through publications is extremely useful to the National Cancer Institute (NCI). Please cite ASA24 as follows, depending on the version used in your study.
NASA Astrophysics Data System (ADS)
Halperin, A.; Walton, P.
2015-12-01
As the science of extreme event attribution grows, there is an increasing need to understand how the public responds to this type of climate change communication. Extreme event attribution has the unprecedented potential to locate the effects of climate change in the here and now, but there is little information about how different facets of the public might respond to these local framings of climate change. Drawing on theories of place attachment and psychological distance, this paper explores how people with different beliefs and values shift their willingness to mitigate and adapt to climate change in response to local or global communication of climate change impacts. Results will be presented from a recent survey of over 600 Californians who were each presented with one of three experimental conditions: 1) a local framing of the role of climate change in the California drought 2) a global framing of climate change and droughts worldwide, or 3) a control condition of no text. Participants were categorized into groups based on their prior beliefs about climate change according to the Six Americas classification scheme (Leiserowitz et al., 2011). The results from the survey in conjunction with qualitative results from follow-up interviews shed insight into the importance of place in communicating climate change for people in each of the Six Americas. Additional results examine the role of gender and political affiliation in mediating responses to climate change communication. Despite research that advocates unequivocally for local framing of climate change, this study offers a more nuanced perspective of under which circumstances extreme event attribution might be an effective tool for changing behaviors. These results could be useful for scientists who wish to gain a better understanding of how their event attribution research is perceived or for educators who want to target their message to audiences where it could have the most impact.
Computer Games as Therapy for Persons with Stroke.
Lauterbach, Sarah A; Foreman, Matt H; Engsberg, Jack R
2013-02-01
Stroke affects approximately 800,000 individuals each year, with 65% having residual impairments. Studies have demonstrated that mass practice leads to regaining motor function in affected extremities; however, traditional therapy does not include the repetitions needed for this recovery. Videogames have been shown to be good motivators to complete repetitions. Advances in technology and low-cost hardware bring new opportunities to use computer games during stroke therapy. This study examined the use of the Microsoft (Redmond, WA) Kinect™ and Flexible Action and Articulated Skeleton Toolkit (FAAST) software as a therapy tool to play existing free computer games on the Internet. Three participants attended a 1-hour session where they played two games with upper extremity movements as game controls. Video was taken for analysis of movement repetitions, and questions were answered about participant history and their perceptions of the games. Participants remained engaged through both games; regardless of previous computer use all participants successfully played two games. Five minutes of game play averaged 34 repetitions of the affected extremity. The Intrinsic Motivation Inventory showed a high level of satisfaction in two of the three participants. The Kinect Sensor with the FAAST software has the potential to be an economical tool to be used alongside traditional therapy to increase the number of repetitions completed in a motivating and engaging way for clients.
White, M.A.; Schmidt, J.C.; Topping, D.J.
2005-01-01
Wavelet analysis is a powerful tool with which to analyse the hydrologic effects of dam construction and operation on river systems. Using continuous records of instantaneous discharge from the Lees Ferry gauging station and records of daily mean discharge from upstream tributaries, we conducted wavelet analyses of the hydrologic structure of the Colorado River in Grand Canyon. The wavelet power spectrum (WPS) of daily mean discharge provided a highly compressed and integrative picture of the post-dam elimination of pronounced annual and sub-annual flow features. The WPS of the continuous record showed the influence of diurnal and weekly power generation cycles, shifts in discharge management, and the 1996 experimental flood in the post-dam period. Normalization of the WPS by local wavelet spectra revealed the fine structure of modulation in discharge scale and amplitude and provides an extremely efficient tool with which to assess the relationships among hydrologic cycles and ecological and geomorphic systems. We extended our analysis to sections of the Snake River and showed how wavelet analysis can be used as a data mining technique. The wavelet approach is an especially promising tool with which to assess dam operation in less well-studied regions and to evaluate management attempts to reconstruct desired flow characteristics. Copyright © 2005 John Wiley & Sons, Ltd.
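A minimal sketch of the kind of continuous wavelet analysis described, applied to a daily mean discharge series with PyWavelets, is given below; the Morlet wavelet and the scale range are illustrative choices rather than the authors' exact configuration.

```python
import numpy as np
import pywt

def discharge_wps(daily_q, max_period_days=730):
    """Wavelet power spectrum of a daily mean discharge series (Morlet wavelet)."""
    scales = np.arange(2, max_period_days)
    coeffs, freqs = pywt.cwt(daily_q, scales, "morl", sampling_period=1.0)  # 1-day sampling
    power = np.abs(coeffs) ** 2           # WPS: squared coefficient magnitude per scale and day
    periods_days = 1.0 / freqs
    return power, periods_days

# Synthetic example: an annual cycle plus noise, 20 years of daily values
t = np.arange(20 * 365)
q = 100 + 40 * np.sin(2 * np.pi * t / 365.25) + 10 * np.random.randn(t.size)
power, periods = discharge_wps(q)
print(periods[np.argmax(power.mean(axis=1))])  # dominant period, roughly one year
```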
Predictive Modeling of Risk Associated with Temperature Extremes over Continental US
NASA Astrophysics Data System (ADS)
Kravtsov, S.; Roebber, P.; Brazauskas, V.
2016-12-01
We build an extremely statistically accurate, essentially bias-free empirical emulator of atmospheric surface temperature and apply it for meteorological risk assessment over the domain of continental US. The resulting prediction scheme achieves an order-of-magnitude or larger gain of numerical efficiency compared with the schemes based on high-resolution dynamical atmospheric models, leading to unprecedented accuracy of the estimated risk distributions. The empirical model construction methodology is based on our earlier work, but is further modified to account for the influence of large-scale, global climate change on regional US weather and climate. The resulting estimates of the time-dependent, spatially extended probability of temperature extremes over the simulation period can be used as a risk management tool by insurance companies and regulatory governmental agencies.
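As a generic illustration of converting simulated temperature series into exceedance risk (a common approach, not necessarily the authors' emulator-based scheme), one can fit a generalized extreme value distribution to annual maxima and evaluate tail probabilities:

```python
import numpy as np
from scipy.stats import genextreme

def exceedance_probability(annual_max_temps, threshold_c):
    """Fit a GEV to annual maximum temperatures and return P(annual max > threshold)."""
    shape, loc, scale = genextreme.fit(annual_max_temps)
    return genextreme.sf(threshold_c, shape, loc=loc, scale=scale)

# Synthetic example: 60 years of annual maxima around 38 C (illustrative values only)
rng = np.random.default_rng(1)
annual_max = 38 + 2.0 * rng.gumbel(size=60)
print(f"P(annual max > 43 C) = {exceedance_probability(annual_max, 43.0):.3f}")
```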
James, G. Andrew; Lu, Zhong-Lin; VanMeter, John W.; Sathian, K.; Hu, Xiaoping P.; Butler, Andrew J.
2013-01-01
Background A promising paradigm in human neuroimaging is the study of slow (<0.1 Hz) spontaneous fluctuations in the hemodynamic response measured by functional magnetic resonance imaging (fMRI). Spontaneous activity (i.e., resting state) refers to activity that cannot be attributed to specific inputs or outputs, that is, activity intrinsically generated by the brain. Method This article presents pilot data examining neural connectivity in patients with poststroke hemiparesis before and after 3 weeks of upper extremity rehabilitation in the Accelerated Skill Acquisition Program (ASAP). Resting-state fMRI data acquired pre and post therapy were analyzed using an exploratory adaptation of structural equation modeling (SEM) to evaluate therapy-related changes in motor network effective connectivity. Results Each ASAP patient showed behavioral improvement. ASAP patients also showed increased influence of the affected hemisphere premotor cortex (a-PM) upon the unaffected hemisphere premotor cortex (u-PM) following therapy. The influence of a-PM on affected hemisphere primary motor cortex (a-M1) also increased with therapy for 3 of 5 patients, including those with greatest behavioral improvement. Conclusions Our findings suggest that network analyses of resting-state fMRI constitute promising tools for functional characterization of functional brain disorders, for intergroup comparisons, and potentially for assessing effective connectivity within single subjects; all of which have important implications for stroke rehabilitation. PMID:19740732
Leonard, M; Graham, S; Bonacum, D
2004-10-01
Effective communication and teamwork is essential for the delivery of high quality, safe patient care. Communication failures are an extremely common cause of inadvertent patient harm. The complexity of medical care, coupled with the inherent limitations of human performance, make it critically important that clinicians have standardised communication tools, create an environment in which individuals can speak up and express concerns, and share common "critical language" to alert team members to unsafe situations. All too frequently, effective communication is situation or personality dependent. Other high reliability domains, such as commercial aviation, have shown that the adoption of standardised tools and behaviours is a very effective strategy in enhancing teamwork and reducing risk. We describe our ongoing patient safety implementation using this approach within Kaiser Permanente, a non-profit American healthcare system providing care for 8.3 million patients. We describe specific clinical experience in the application of surgical briefings, properties of high reliability perinatal care, the value of critical event training and simulation, and benefits of a standardised communication process in the care of patients transferred from hospitals to skilled nursing facilities. Additionally, lessons learned as to effective techniques in achieving cultural change, evidence of improving the quality of the work environment, practice transfer strategies, critical success factors, and the evolving methods of demonstrating the benefit of such work are described.
NASA Technical Reports Server (NTRS)
Saini, Subhash; Frumkin, Michael; Hribar, Michelle; Jin, Hao-Qiang; Waheed, Abdul; Yan, Jerry
1998-01-01
Porting applications to new high performance parallel and distributed computing platforms is a challenging task. Since writing parallel code by hand is extremely time consuming and costly, porting codes would ideally be automated by using parallelization tools and compilers. In this paper, we compare the performance of the hand-written NAS Parallel Benchmarks against three parallel versions generated with the help of tools and compilers: 1) CAPTools, an interactive computer-aided parallelization tool that generates message passing code, 2) the Portland Group's HPF compiler, and 3) compiler directives with the native FORTRAN77 compiler on the SGI Origin2000.
Timler, Dariusz; Bogusiak, Katarzyna; Kasielska-Trojan, Anna; Neskoromna-Jędrzejczak, Aneta; Gałązkowski, Robert; Szarpak, Łukasz
2016-02-01
The aim of the study was to verify the effectiveness of short text messages (short message service, or SMS) as an additional notification tool in case of fire or a mass casualty incident in a hospital. A total of 2242 SMS text messages were sent to 59 hospital workers divided into 3 groups (n=21, n=19, n=19). Messages were sent from a Samsung GT-S8500 Wave cell phone and Orange Poland was chosen as the telecommunication provider. During a 3-month trial period, messages were sent between 3:35 PM and midnight with no regular pattern. Employees were asked to respond by telling how much time it would take them to reach the hospital in case of a mass casualty incident. The mean reaction time (SMS reply) was 36.41 minutes. The mean declared time of arrival to the hospital was 100.5 minutes. After excluding 10% of extreme values for declared arrival time, the mean arrival time was estimated as 38.35 minutes. Short text messages (SMS) can be considered an additional tool for notifying medical staff in case of a mass casualty incident.
NASA Astrophysics Data System (ADS)
Kacprzyk, Janusz; Zadrożny, Sławomir
2010-05-01
We present how the conceptually and numerically simple concept of a fuzzy linguistic database summary can be a very powerful tool for gaining much insight into the very essence of data. The use of linguistic summaries provides tools for the verbalisation of data analysis (mining) results which, in addition to the more commonly used visualisation, e.g. via a graphical user interface, can contribute to an increased human consistency and ease of use, notably for supporting decision makers via the data-driven decision support system paradigm. Two new relevant aspects of the analysis are also outlined which were first initiated by the authors. First, following Kacprzyk and Zadrożny, it is further considered how linguistic data summarisation is closely related to some types of solutions used in natural language generation (NLG). This can make it possible to use more and more effective and efficient tools and techniques developed in NLG. Second, similar remarks are given on relations to systemic functional linguistics. Moreover, following Kacprzyk and Zadrożny, comments are given on an extremely relevant aspect of scalability of linguistic summarisation of data, using a new concept of a conceptual scalability.
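For concreteness, the degree of truth of a protoform summary such as "most records that are R are S" is computed with Zadeh's calculus of linguistically quantified propositions; the toy sketch below uses assumed membership functions and is not the authors' implementation.

```python
def truth_of_summary(records, summarizer, qualifier, quantifier):
    """Degree of truth of "Q records that are R are S" (Zadeh's calculus).

    summarizer, qualifier: functions record -> membership degree in [0, 1];
    quantifier: function of a proportion in [0, 1] -> degree in [0, 1].
    """
    num = sum(min(summarizer(r), qualifier(r)) for r in records)
    den = sum(qualifier(r) for r in records)
    return quantifier(num / den) if den > 0 else 0.0

# Toy example: "most high-salary employees are young" (membership functions assumed)
most = lambda p: max(0.0, min(1.0, 2 * p - 0.6))                       # fuzzy quantifier "most"
high_salary = lambda r: min(1.0, max(0.0, (r["salary"] - 40000) / 40000))
young = lambda r: min(1.0, max(0.0, (45 - r["age"]) / 20))
data = [{"salary": 90000, "age": 31}, {"salary": 55000, "age": 52}, {"salary": 30000, "age": 27}]
print(round(truth_of_summary(data, young, high_salary, most), 2))      # -> 0.42
```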
NASA Astrophysics Data System (ADS)
Shearer, E. J.; Nguyen, P.; Ombadi, M.; Palacios, T.; Huynh, P.; Furman, D.; Tran, H.; Braithwaite, D.; Hsu, K. L.; Sorooshian, S.; Logan, W. S.
2017-12-01
During the 2017 hurricane season, three major hurricanes (Harvey, Irma, and Maria) devastated the Atlantic coast of the US and the Caribbean Islands. Harvey set the record for the rainiest storm in continental US history, Irma was the longest-lived powerful hurricane ever observed, and Maria was the costliest storm in Puerto Rican history. The recorded maximum precipitation totals for these storms were 65, 16, and 20 inches, respectively. These events provided the Center for Hydrometeorology and Remote Sensing (CHRS) an opportunity to test its global real-time satellite precipitation observation system, iRain, for extreme storm events. The iRain system has been under development through a collaboration between CHRS at the University of California, Irvine (UCI) and UNESCO's International Hydrological Program (IHP). iRain provides near real-time high resolution (0.04°, approx. 4 km) global (60°N - 60°S) satellite precipitation data estimated by the PERSIANN-Cloud Classification System (PERSIANN-CCS) algorithm developed by scientists at CHRS. The user-interactive and web-accessible iRain system allows users to visualize and download real-time global satellite precipitation estimates and track the development and path of the current 50 largest storms globally from data generated by the PERSIANN-CCS algorithm. iRain continuously proves to be an effective tool for measuring real-time precipitation amounts of extreme storms, especially in locations that do not have extensive rain gauge or radar coverage. Such areas include large portions of the world's oceans and continents such as Africa and Asia. CHRS also created a mobile app version of the system named "iRain UCI", available for iOS and Android devices. During these storms, real-time rainfall data generated by PERSIANN-CCS were consistently comparable to radar and rain gauge data. This presentation evaluates iRain's efficiency as a tool for extreme precipitation monitoring and provides an evaluation of the PERSIANN-CCS real-time rainfall estimates during Hurricanes Harvey, Irma, and Maria against radar and rain gauge data using continuous (correlation, root mean square error, and bias) and categorical (POD and FAR) indices. These results present the relative skill of PERSIANN-CCS real-time data compared to radar and rain gauge data.
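The continuous (correlation, RMSE, bias) and categorical (POD, FAR) indices mentioned have standard definitions; a compact sketch follows, with the rain/no-rain threshold treated as an assumption.

```python
import numpy as np

def verification_scores(estimate, reference, rain_threshold=1.0):
    """Continuous (correlation, RMSE, bias) and categorical (POD, FAR) scores."""
    estimate, reference = np.asarray(estimate, float), np.asarray(reference, float)
    corr = np.corrcoef(estimate, reference)[0, 1]
    rmse = np.sqrt(np.mean((estimate - reference) ** 2))
    bias = np.sum(estimate) / np.sum(reference)              # multiplicative bias
    est_rain, ref_rain = estimate >= rain_threshold, reference >= rain_threshold
    hits = np.sum(est_rain & ref_rain)
    misses = np.sum(~est_rain & ref_rain)
    false_alarms = np.sum(est_rain & ~ref_rain)
    pod = hits / (hits + misses) if hits + misses else np.nan
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else np.nan
    return {"corr": corr, "rmse": rmse, "bias": bias, "POD": pod, "FAR": far}
```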
Demonstrating urban outdoor lighting for pedestrian safety and security : final report.
DOT National Transportation Integrated Search
2015-12-31
The goal of this project is to provide statistical inference for the community's willingness to pay for improvements in the resiliency of the transportation system in New York City to extreme events. This objective seeks to provide better tools ...
Analysis of extreme rainfall events using attributes control charts in temporal rainfall processes
NASA Astrophysics Data System (ADS)
Villeta, María; Valencia, Jose Luis; Saá-Requejo, Antonio; María Tarquis, Ana
2015-04-01
The impacts of the most intense rainfall events on agriculture and the insurance industry can be very severe. This research focuses on the analysis of extreme rainfall events through the use of attributes control charts, a standard tool in Statistical Process Control (SPC) that is unusual in climate studies. Here, series of daily precipitation for the years 1931-2009 within a Spanish region are analyzed, based on a new type of attributes control chart that takes into account the autocorrelation between extreme rainfall events. The aim is to determine whether there is evidence of a change in the extreme rainfall model of the considered series. After seasonally adjusting the precipitation series and considering the data of the first 30 years, a frequency-based criterion allowed fixing specification limits in order to discriminate between extreme and normal observed rainfall days. The autocorrelation amongst maximum precipitation is taken into account by a New Binomial Markov Extended Process obtained for each rainfall series. This modelling of the extreme rainfall processes provides a way to generate attributes control charts for the annual fraction of extreme rainfall days. The extreme rainfall processes over the rest of the years under study can then be monitored with such attributes control charts. The results of applying this methodology show evidence of change in the model of extreme rainfall events in some of the analyzed precipitation series. This suggests that the attributes control charts proposed for the analysis of the most intense precipitation events will be of practical interest to the agriculture and insurance sectors in the near future.
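A minimal sketch of the kind of attributes control chart described above is given below: it builds a p-chart for the annual fraction of extreme rainfall days, with specification limits fixed from the first 30 years of the record. The published method handles autocorrelation with a New Binomial Markov Extended Process; the independent-Binomial control limits used here, and all variable names, are simplifying assumptions.

```python
import numpy as np

def p_chart(extreme_days_per_year, days_per_year=365, baseline_years=30, k=3.0):
    counts = np.asarray(extreme_days_per_year, float)
    p = counts / days_per_year                      # annual fraction of extreme rainfall days
    p_bar = p[:baseline_years].mean()               # centre line from the first 30 years of record
    sigma = np.sqrt(p_bar * (1 - p_bar) / days_per_year)
    ucl, lcl = p_bar + k * sigma, max(p_bar - k * sigma, 0.0)
    out_of_control = (p > ucl) | (p < lcl)          # years signalling a change in the extreme-rainfall model
    return p, (lcl, p_bar, ucl), out_of_control
```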
Predictability and possible earlier awareness of extreme precipitation across Europe
NASA Astrophysics Data System (ADS)
Lavers, David; Pappenberger, Florian; Richardson, David; Zsoter, Ervin
2017-04-01
Extreme hydrological events can cause large socioeconomic damages in Europe. In winter, a large proportion of these flood episodes are associated with atmospheric rivers, regions of intense water vapour transport within the warm sector of extratropical cyclones. When preparing for such extreme events, forecasts of precipitation from numerical weather prediction models or river discharge forecasts from hydrological models are generally used. Given the strong link between water vapour transport (integrated vapour transport, IVT) and heavy precipitation, it is possible that IVT could be used to warn of extreme events. Furthermore, as IVT is located in extratropical cyclones, it is hypothesized to be a more predictable variable due to its link with synoptic-scale atmospheric dynamics. In this research, we firstly provide an overview of the predictability of IVT and precipitation forecasts, and secondly introduce and evaluate the ECMWF Extreme Forecast Index (EFI) for IVT. The EFI is a tool that has been developed to evaluate how ensemble forecasts differ from the model climate, thus revealing the extremeness of the forecast. The ability of the IVT EFI to capture extreme precipitation across Europe during winter 2013/14, 2014/15, and 2015/16 is presented. The results show that the IVT EFI is more capable than the precipitation EFI of identifying extreme precipitation in forecast week 2 during forecasts initialized in a positive North Atlantic Oscillation (NAO) phase. However, the precipitation EFI is superior during the negative NAO phase and at shorter lead times. An IVT EFI example is shown for storm Desmond in December 2015, highlighting its potential to identify upcoming hydrometeorological extremes.
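As a hedged illustration of the EFI idea described above, the sketch below implements the commonly cited EFI integral, comparing the ensemble forecast distribution of IVT with a model-climate sample at fixed probability levels. This is not ECMWF's operational code, and the array names are assumptions.

```python
import numpy as np

def extreme_forecast_index(ens_ivt, clim_ivt, n_p=99):
    """EFI in [-1, 1]; values near +1 flag forecasts far into the climate's upper tail."""
    ens_ivt = np.asarray(ens_ivt, float)
    p = np.arange(1, n_p + 1) / (n_p + 1)            # probability levels 0.01 ... 0.99
    q_clim = np.quantile(np.asarray(clim_ivt, float), p)  # model-climate quantiles
    # F_f(p): fraction of ensemble members falling below each climate quantile
    F_f = np.mean(ens_ivt[:, None] < q_clim[None, :], axis=0)
    integrand = (p - F_f) / np.sqrt(p * (1 - p))
    return (2.0 / np.pi) * np.trapz(integrand, p)
```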
Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR
NASA Astrophysics Data System (ADS)
Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.
2017-12-01
Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
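The composite-analysis step described above can be sketched in a few lines: average a gridded large-scale field (for example, 500-hPa geopotential height anomalies) over the days on which station precipitation exceeds a high percentile. Array shapes, names and the 95th-percentile threshold are assumptions for illustration.

```python
import numpy as np

def extreme_day_composite(precip, field, pct=95):
    """precip: (time,) station series; field: (time, lat, lon) gridded anomaly fields."""
    precip = np.asarray(precip, float)
    threshold = np.percentile(precip, pct)
    extreme_days = precip >= threshold
    composite = field[extreme_days].mean(axis=0)      # mean pattern on extreme-precipitation days
    background = field.mean(axis=0)                   # all-days mean for comparison
    return composite - background, int(extreme_days.sum())  # anomaly pattern, sample size
```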
Changes in the frequency of extreme air pollution events over the Eastern United States and Europe
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Fiore, A. M.; Fang, Y.; Staehelin, J.
2011-12-01
Over the past few decades, thresholds for national air quality standards, intended to protect public health and welfare, have been lowered repeatedly. At the same time, observations over Europe and the Eastern U.S. demonstrate that extreme air pollution events (high O3 and PM2.5) are typically associated with stagnation events. Recent work showed that in a changing climate high air pollution events are likely to increase in frequency and duration. Within this work we examine meteorological and surface ozone observations from CASTNet over the U.S. and EMEP over Europe and "idealized" simulations with the GFDL AM3 chemistry-climate model, which isolate the role of climate change on air quality. Specifically, we examine an "idealized 1990s" simulation, forced with 20-year mean monthly climatologies for sea surface temperatures and sea ice from observations for 1981-2000, and an "idealized 2090s" simulation forced by the observed climatologies plus the multi-model mean changes in sea surface temperature and sea ice simulated by 19 IPCC AR-4 models under the A1B scenario for 2081-2100. With innovative statistical tools (empirical orthogonal functions (EOFs) and extreme value theory (EVT)), we analyze the frequency distribution of past, present and future extreme air pollution events over the Eastern United States and Europe. The upper tail of observed values at individual stations (e.g., within the CASTNet), i.e., the extremes (maximum daily 8-hour average (MDA8) O3>60ppb), is poorly described by a Gaussian distribution. However, further analysis showed that applying Peak-Over-Threshold models better captures the extremes and allows us to estimate return levels of pollution events above threshold values of interest. We next apply EOF analysis to identify regions that vary coherently within the ground-based monitoring networks. Over the United States, the first EOF obtained from the model in both the 1990s and 2090s idealized simulations identifies the Northeast as a region that varies coherently. Correlation analysis reveals that this EOF pattern is most strongly expressed in association with high surface temperature and high surface pressure conditions, consistent with previous work showing that observed O3 episodes over this area reflect the combined impacts of stagnation and increased chemical production. Next steps include the extension of this analysis applying EVT tools to the principal component time series associated with this EOF. The combination of EOF and EVT tools applied to the GFDL AM3 1990s vs. 2090s idealized simulations will enable us to quantify changes in the return levels of air pollution extremes. Therefore the combination of observational data and numerical and statistical models should allow us to identify the key driving forces behind high air pollution events and to estimate changes in the frequency of such events under different climate change scenarios.
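To make the Peak-Over-Threshold step concrete, the sketch below fits a Generalized Pareto distribution to MDA8 O3 exceedances above 60 ppb with scipy and converts the fit into an m-year return level. The data array, record length and return period are assumptions, and the study's full EOF/EVT workflow is not reproduced.

```python
import numpy as np
from scipy.stats import genpareto

def pot_return_level(mda8_o3, years, threshold=60.0, return_period=10.0):
    """Return level (ppb) exceeded on average once per `return_period` years."""
    mda8_o3 = np.asarray(mda8_o3, float)
    excesses = mda8_o3[mda8_o3 > threshold] - threshold
    rate = len(excesses) / years                       # mean exceedances per year
    shape, _, scale = genpareto.fit(excesses, floc=0)  # fix location at 0 for excesses
    m = return_period * rate                           # expected exceedances in the return period
    if abs(shape) > 1e-6:
        return threshold + (scale / shape) * (m ** shape - 1.0)
    return threshold + scale * np.log(m)               # exponential-tail limit (shape -> 0)
```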
Effects of Extreme Temperatures on Cause-Specific Cardiovascular Mortality in China
Wang, Xuying; Li, Guoxing; Liu, Liqun; Westerdahl, Dane; Jin, Xiaobin; Pan, Xiaochuan
2015-01-01
Objective: Limited evidence is available for the effects of extreme temperatures on cause-specific cardiovascular mortality in China. Methods: We collected data from Beijing and Shanghai, China, during 2007–2009, including the daily mortality of cardiovascular disease, cerebrovascular disease, ischemic heart disease and hypertensive disease, as well as air pollution concentrations and weather conditions. We used Poisson regression with a distributed lag non-linear model to examine the effects of extremely high and low ambient temperatures on cause-specific cardiovascular mortality. Results: For all cause-specific cardiovascular mortality, Beijing had stronger cold and hot effects than those in Shanghai. The cold effects on cause-specific cardiovascular mortality reached the strongest at lag 0–27, while the hot effects reached the strongest at lag 0–14. The effects of extremely low and high temperatures differed by mortality types in the two cities. Hypertensive disease in Beijing was particularly susceptible to both extremely high and low temperatures; while for Shanghai, people with ischemic heart disease showed the greatest relative risk (RRs = 1.16, 95% CI: 1.03, 1.34) to extremely low temperature. Conclusion: People with hypertensive disease were particularly susceptible to extremely low and high temperatures in Beijing. People with ischemic heart disease in Shanghai showed greater susceptibility to extremely cold days. PMID:26703637
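A heavily simplified sketch of the time-series regression idea is given below: a Poisson GLM of daily cardiovascular deaths on temperature at lags 0-27 days, fitted with statsmodels. The published analysis used a full distributed lag non-linear model (DLNM) with non-linear cross-bases; the plain lag matrix and the variable names here are assumptions made to keep the example short.

```python
import pandas as pd
import statsmodels.api as sm

def fit_lagged_poisson(deaths, temperature, max_lag=27):
    """Poisson GLM of daily deaths on lagged daily mean temperature (lags 0..max_lag)."""
    df = pd.DataFrame({"deaths": deaths, "temp": temperature})
    for lag in range(max_lag + 1):                    # temperature at lags 0 ... 27 days
        df[f"temp_lag{lag}"] = df["temp"].shift(lag)
    df = df.dropna()
    X = sm.add_constant(df.filter(like="temp_lag"))
    model = sm.GLM(df["deaths"], X, family=sm.families.Poisson()).fit()
    return model                                      # model.params holds lag-specific log relative risks
```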
NASA Astrophysics Data System (ADS)
Alahmadi, F.; Rahman, N. A.; Abdulrazzak, M.
2014-09-01
Rainfall frequency analysis is an essential tool for the design of water-related infrastructure. It can be used to predict future flood magnitudes from the magnitude and frequency of extreme rainfall events. This study analyses the application of a rainfall partial duration series (PDS) in the fast-growing city of Madinah, located in the western part of Saudi Arabia. Different statistical distributions were applied (i.e. Normal, Log Normal, Extreme Value type I, Generalized Extreme Value, Pearson Type III, Log Pearson Type III) and their distribution parameters were estimated using L-moments methods. Different model selection criteria are also applied, e.g. the Akaike Information Criterion (AIC), Corrected Akaike Information Criterion (AICc), Bayesian Information Criterion (BIC) and Anderson-Darling Criterion (ADC). The analysis indicated the advantage of the Generalized Extreme Value distribution as the best fit for the Madinah partial duration daily rainfall series. The outcome of such an evaluation can contribute toward better design criteria for flood management, especially flood protection measures.
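As an illustrative sketch of candidate-distribution comparison for a partial duration series, the fragment below fits several of the distributions named above with scipy and ranks them by AIC. Note the assumptions that maximum-likelihood fitting stands in for the L-moments estimation used in the study, and that `pds` is an assumed array of daily rainfall exceedances.

```python
import numpy as np
from scipy import stats

CANDIDATES = {
    "GEV": stats.genextreme,
    "Gumbel (EV1)": stats.gumbel_r,
    "Log-Normal": stats.lognorm,
    "Pearson III": stats.pearson3,
}

def rank_by_aic(pds):
    """Fit each candidate distribution by MLE and rank by Akaike Information Criterion."""
    pds = np.asarray(pds, float)
    results = {}
    for name, dist in CANDIDATES.items():
        params = dist.fit(pds)
        loglik = np.sum(dist.logpdf(pds, *params))
        aic = 2 * len(params) - 2 * loglik             # smaller AIC = better trade-off
        results[name] = aic
    return sorted(results.items(), key=lambda kv: kv[1])
```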
Heat Vulnerability Index Mapping for Milwaukee and Wisconsin.
Christenson, Megan; Geiger, Sarah Dee; Phillips, Jeffrey; Anderson, Ben; Losurdo, Giovanna; Anderson, Henry A
Extreme heat waves elevate the population's risk for heat-related morbidity and mortality, specifically for vulnerable groups such as older adults and young children. In this context, we developed 2 Heat Vulnerability Indices (HVIs), one for the state of Wisconsin and one for the Milwaukee metropolitan area. Through the creation of an HVI, state and local agencies will be able to use the indices as a planning tool for extreme heat events. Data used for the HVIs were grouped into 4 categories: (1) population density; (2) health factors; (3) demographic and socioeconomic factors; and (4) natural and built environment factors. These categories were mapped at the Census block group level. Unweighted z-score data were used to determine index scores, which were then mapped by quantiles ranging from "high" to "low" vulnerability. Statewide, Menominee County exhibited the highest vulnerability to extreme heat. Milwaukee HVI findings indicated high vulnerability in the city's inner core versus low vulnerability along the lakeshore. Visualization of vulnerability could help local public health agencies prepare for future extreme heat events.
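The index construction described above (unweighted z-scores summed across indicators and classed by quantiles) can be sketched directly in pandas; the DataFrame of block-group indicators and the five class labels are assumptions for illustration, with indicators oriented so that higher values mean greater vulnerability.

```python
import pandas as pd

def heat_vulnerability_index(df):
    """df: one row per Census block group, one column per vulnerability indicator."""
    z = (df - df.mean()) / df.std(ddof=0)              # z-score each indicator
    hvi = z.sum(axis=1)                                # unweighted sum across indicators
    classes = pd.qcut(hvi, q=5,
                      labels=["low", "low-medium", "medium", "medium-high", "high"])
    return hvi, classes                                # index score and quantile class for mapping
```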
In-vehicle extremity injuries from improvised explosive devices: current and future foci
Ramasamy, Arul; Masouros, Spyros D.; Newell, Nicolas; Hill, Adam M.; Proud, William G.; Brown, Katherine A.; Bull, Anthony M. J.; Clasper, Jon C.
2011-01-01
The conflicts in Iraq and Afghanistan have been epitomized by the insurgents' use of the improvised explosive device against vehicle-borne security forces. These weapons, capable of causing multiple severely injured casualties in a single incident, pose the most prevalent single threat to Coalition troops operating in the region. Improvements in personal protection and medical care have resulted in increasing numbers of casualties surviving with complex lower limb injuries, often leading to long-term disability. Thus, there exists an urgent requirement to investigate and mitigate against the mechanism of extremity injury caused by these devices. This will necessitate an ontological approach, linking molecular, cellular and tissue interaction to physiological dysfunction. This can only be achieved via a collaborative approach between clinicians, natural scientists and engineers, combining physical and numerical modelling tools with clinical data from the battlefield. In this article, we compile existing knowledge on the effects of explosions on skeletal injury, review and critique relevant experimental and computational research related to lower limb injury and damage and propose research foci required to drive the development of future mitigation technologies. PMID:21149353
Studying Weather and Climate Extremes in a Non-stationary Framework
NASA Astrophysics Data System (ADS)
Wu, Z.
2010-12-01
The study of weather and climate extremes often uses the theory of extreme values. Such an approach has a major problem: to obtain the probability distribution of extremes, one has to implicitly assume the Earth's climate is stationary over a long period within which the climatology is defined. While such an approach makes some sense in a purely statistical view of stationary processes, it can lead to misleading statistical properties of weather and climate extremes caused by long term climate variability and change, and may also cause enormous difficulty in attributing and predicting these extremes. To alleviate this problem, here we report a novel framework for studying weather and climate extremes in a non-stationary setting. In this new framework, the weather and climate extremes will be defined as timescale-dependent quantities derived from the anomalies with respect to non-stationary climatologies of different timescales. With this non-stationary framework, the non-stationary and nonlinear nature of the climate system will be taken into account; and the attribution and the prediction of weather and climate extremes can then be separated into 1) the change of the statistical properties of the weather and climate extremes themselves and 2) the background climate variability and change. The new non-stationary framework will use the ensemble empirical mode decomposition (EEMD) method, which is a recent major improvement of the Hilbert-Huang Transform for time-frequency analysis. Using this tool, we will adaptively decompose various weather and climate data from observations and climate models in terms of the components of the various natural timescales contained in the data. With such decompositions, the non-stationary statistical properties (both spatial and temporal) of weather and climate anomalies and of their corresponding climatologies will be analyzed and documented.
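A hedged sketch of the EEMD-based decomposition is shown below using the third-party PyEMD package (distributed on PyPI as "EMD-signal"); the call pattern reflects its documented interface but should be treated as an assumption, as should the choice to treat the two slowest modes as the non-stationary climatology.

```python
import numpy as np
from PyEMD import EEMD   # assumed third-party dependency (pip install EMD-signal)

def timescale_anomalies(series, n_slow=2):
    """Decompose a climate series into IMFs and define anomalies against the slow modes."""
    series = np.asarray(series, float)
    eemd = EEMD(trials=100)                             # ensemble of noise-assisted decompositions
    imfs = eemd.eemd(series)                            # intrinsic mode functions, fast to slow
    climatology = imfs[-n_slow:].sum(axis=0)            # slowest modes ~ non-stationary climatology
    anomalies = series - climatology
    return imfs, climatology, anomalies
```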
Markin, Abraham; Barbero, Roxana; Leow, Jeffrey J; Groen, Reinou S; Perlman, Greg; Habermann, Elizabeth B; Apelgren, Keith N; Kushner, Adam L; Nwomeh, Benedict C
2014-09-01
In response to the need for simple, rapid means of quantifying surgical capacity in low resource settings, Surgeons OverSeas (SOS) developed the personnel, infrastructure, procedures, equipment and supplies (PIPES) tool. The present investigation assessed the inter-rater reliability of the PIPES tool. As part of a government assessment of surgical services in Santa Cruz, Bolivia, the PIPES tool was translated into Spanish and applied in interviews with physicians at 31 public hospitals. An additional interview was conducted with nurses at a convenience sample of 25 of these hospitals. Physician and nurse responses were then compared to generate an estimate of reliability. For dichotomous survey items, inter-rater reliability between physicians and nurses was assessed using Cohen's kappa statistic and percent agreement. The Pearson correlation coefficient was used to assess agreement for continuous items. Cohen's kappa was 0.46 for the infrastructure section, 0.43 for procedures, 0.26 for equipment, and 0 for supplies. The median correlation coefficient was 0.91 for continuous items. Correlation was 0.79 for the PIPES index, and ranged from 0.32 to 0.98 for continuous response items. Reliability of the PIPES tool was moderate for the infrastructure and procedures sections, fair for the equipment section, and poor for the supplies section when comparing surgeons' responses to nurses' responses, an extremely rigorous test of reliability. These results indicate that the PIPES tool is an effective measure of surgical capacity but that the equipment and supplies sections may need to be revised.
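A brief sketch of the reliability calculations named above, using scipy and scikit-learn; the paired physician/nurse response arrays are assumed inputs.

```python
import numpy as np
from scipy.stats import pearsonr
from sklearn.metrics import cohen_kappa_score

def dichotomous_reliability(physician, nurse):
    """Cohen's kappa and simple percent agreement for yes/no survey items."""
    kappa = cohen_kappa_score(physician, nurse)
    agreement = np.mean(np.asarray(physician) == np.asarray(nurse))
    return kappa, agreement

def continuous_reliability(physician, nurse):
    """Pearson correlation for continuous survey items."""
    r, _ = pearsonr(physician, nurse)
    return r
```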
King, Trevor K; Severin, Colette N; Van Eerd, Dwayne; Ibrahim, Selahadin; Cole, Donald; Amick, Ben; Steenstra, Ivan A
2013-01-01
A pilot study examined the effectiveness of a biofeedback mouse in reducing upper extremity pain and discomfort in office workers; in addition, relative mouse use (RMU), satisfaction and the feasibility of running a randomised controlled trial (RCT) in a workplace setting were evaluated. The mouse would gently vibrate if the hand was idle for more than 12 s. The feedback reminded users to rest the arm in neutral, supported postures. Analysis showed a statistically significant reduction in shoulder pain and discomfort for the intervention group at T2 (38.7% lower than controls). Statistically significant differences in RMU time between groups were seen post intervention (-7% at T1 and +15% at T2 for the intervention group). Fifty-five percent of the intervention group was willing to continue using the mouse. It appears feasible to perform an RCT for this type of intervention in a workplace setting. Further study including more participants is suggested. The study findings support the feasibility of conducting randomised controlled trials in office settings to evaluate ergonomics interventions. The intervention resulted in reduced pain and discomfort in the shoulder. The intervention could be a relevant tool in the reduction of upper extremity musculoskeletal disorders. Further research will better explain the study's preliminary findings.
Rojo, Nuria; Amengual, Julian; Juncadella, Montserrat; Rubio, Francisco; Camara, Estela; Marco-Pallares, Josep; Schneider, Sabine; Veciana, Misericordia; Montero, Jordi; Mohammadi, Bahram; Altenmüller, Eckart; Grau, Carles; Münte, Thomas F; Rodriguez-Fornells, Antoni
2011-01-01
Music-Supported Therapy (MST) has been developed recently in order to improve the use of the affected upper extremity after stroke. This study investigated the neuroplastic mechanisms underlying effectiveness in a patient with chronic stroke. MST uses musical instruments, a midi piano and an electronic drum set emitting piano sounds, to retrain fine and gross movements of the paretic upper extremity. Data are presented from a patient with a chronic stroke (20 months post-stroke) with residual right-sided hemiparesis who took part in 20 MST sessions over the course of 4 weeks. Post-therapy, a marked improvement of movement quality, assessed by 3D movement analysis, was observed. Moreover, functional magnetic resonance imaging (fMRI) of a sequential hand movement revealed distinct therapy-related changes in the form of a reduction of excess contralateral and ipsilateral activations. This was accompanied by changes in cortical excitability evidenced by transcranial magnetic stimulation (TMS). Functional MRI in a music listening task suggests that one of the effects of MST is the task-dependent coupling of auditory and motor cortical areas. The MST appears to be a useful neurorehabilitation tool in patients with chronic stroke and leads to neural reorganization in the sensorimotor cortex.
Imholte, Gregory; Gottardo, Raphael
2017-01-01
The peptide microarray immunoassay simultaneously screens sample serum against thousands of peptides, determining the presence of antibodies bound to array probes. Peptide microarrays tiling immunogenic regions of pathogens (e.g. envelope proteins of a virus) are an important high throughput tool for querying and mapping antibody binding. Because of the assay's many steps, from probe synthesis to incubation, peptide microarray data can be noisy with extreme outliers. In addition, subjects may produce different antibody profiles in response to an identical vaccine stimulus or infection, due to variability among subjects' immune systems. We present a robust Bayesian hierarchical model for peptide microarray experiments, pepBayes, to estimate the probability of antibody response for each subject/peptide combination. Heavy-tailed error distributions accommodate outliers and extreme responses, and tailored random effect terms automatically incorporate technical effects prevalent in the assay. We apply our model to two vaccine trial datasets to demonstrate model performance. Our approach enjoys high sensitivity and specificity when detecting vaccine induced antibody responses. A simulation study shows an adaptive thresholding classification method has appropriate false discovery rate control with high sensitivity, and receiver operating characteristics generated on vaccine trial data suggest that pepBayes clearly separates responses from non-responses. PMID:27061097
Funding of community-based interventions for HIV prevention.
Poku, Nana K; Bonnel, René
2016-07-01
Since the start of the HIV epidemic, community responses have been at the forefront of the response. Following the extraordinary expansion of global resources, the funding of community responses rose to reach at least US$690 million per year in the period 2005-2009. Since then, many civil society organisations (CSOs) have reported a drop in funding. Yet, the need for strong community responses is even more urgent, as shown by their role in reaching the Joint United Nations Programme on HIV/AIDS (UNAIDS) Fast-Track targets. In the case of antiretroviral treatment, interventions need to be adopted by most people at risk of HIV in order to have a substantial effect on the prevention of HIV at the population level. This paper reviews the published literature on community responses, funding and effectiveness. Additional funding is certainly needed to increase the coverage of community-based interventions (CBIs), but current evidence on their effectiveness is extremely mixed, which does not provide clear guidance to policy makers. This is especially an issue for adolescent girls and young women in Eastern and Southern Africa, who face extremely high infection risk, but the biomedical prevention tools that have been proven effective for the general population still remain pilot projects for this group. Research is especially needed to isolate the factors affecting the likelihood that interventions targeting this group are consistently successful. Such work could be focused on the community organisations that are currently involved in delivering gender-sensitive interventions.
The IUE Science Operations Ground System
NASA Technical Reports Server (NTRS)
Pitts, Ronald E.; Arquilla, Richard
1994-01-01
The International Ultraviolet Explorer (IUE) Science Operations System provides full realtime operations capabilities and support to the operations staff and astronomer users. The components of this very diverse and extremely flexible hardware and software system have played a major role in maintaining the scientific efficiency and productivity of the IUE. The software provides the staff and user with all the tools necessary for pre-visit and real-time planning and operations analysis for any day of the year. Examples of such tools include the effects of spacecraft constraints on target availability, maneuver times between targets, availability of guide stars, target identification, coordinate transforms, e-mail transfer of Observatory forms and messages, and quick-look analysis of image data. Most of this extensive software package can also be accessed remotely by individual users for information, scheduling of shifts, pre-visit planning, and actual observing program execution. Astronomers, with a modest investment in hardware and software, may establish remote observing sites. We currently have over 20 such sites in our remote observers' network.
The UK Healthy Universities Self-Review Tool: Whole System Impact.
Dooris, Mark; Farrier, Alan; Doherty, Sharon; Holt, Maxine; Monk, Robert; Powell, Susan
2018-06-01
Over recent years, there has been growing interest in Healthy Universities, evidenced by an increased number of national networks and the participation of 375 participants from over 30 countries in the 2015 International Conference on Health Promoting Universities and Colleges, which also saw the launch of the Okanagan Charter. This paper reports on research exploring the use and impact of the UK Healthy Universities Network's self review tool, specifically examining whether this has supported universities to understand and embed a whole system approach. The research study comprised two stages, the first using an online questionnaire and the second using focus groups. The findings revealed a wide range of perspectives under five overarching themes: motivations; process; outcomes/benefits; challenges/suggested improvements; and future use. In summary, the self review tool was extremely valuable and, when engaged with fully, offered significant benefits to universities seeking to improve the health and wellbeing of their communities. These benefits were felt by institutions at different stages in the journey and spanned outcome and process dimensions: not only did the tool offer an engaging and user-friendly means of undertaking internal benchmarking, generating an easy-to-understand report summarizing strengths and weaknesses; it also proved useful in building understanding of the whole system Healthy Universities approach and served as a catalyst to effective cross-university and cross-sectoral partnership working. Additionally, areas for potential enhancement were identified, offering opportunities to increase the tool's utility further whilst engaging actively in the development of a global movement for Healthy Universities.
Forecasting extreme temperature health hazards in Europe
NASA Astrophysics Data System (ADS)
Di Napoli, Claudia; Pappenberger, Florian; Cloke, Hannah L.
2017-04-01
Extreme hot temperatures, such as those experienced during a heat wave, represent a dangerous meteorological hazard to human health. Heat disorders such as sunstroke are harmful to people of all ages and responsible for excess mortality in the affected areas. In 2003 more than 50,000 people died in western and southern Europe because of a severe and sustained episode of summer heat [1]. Furthermore, according to the Intergovernmental Panel on Climate Change, heat waves are expected to become more frequent in the future, thus posing an increasing threat to human lives. Developing appropriate tools for the prediction of extreme hot temperatures is therefore mandatory to increase public preparedness and mitigate heat-induced impacts. A recent study has shown that forecasts of the Universal Thermal Climate Index (UTCI) provide a valid overview of extreme temperature health hazards on a global scale [2]. UTCI is a parameter related to the temperature of the human body and its regulatory responses to the surrounding atmospheric environment. UTCI is calculated using an advanced thermo-physiological model that includes the human heat budget, physiology and clothing. To forecast UTCI the model uses meteorological inputs, such as 2m air temperature, 2m water vapour pressure and wind velocity at body height derived from 10m wind speed, from NWP models. Here we examine the potential of UTCI as an extreme hot temperature prediction tool for the European area. UTCI forecasts calculated using the above-mentioned parameters from ECMWF models are presented. The skill in predicting UTCI for medium lead times is also analysed and discussed for implementation in international health-hazard warning systems. This research is supported by the ANYWHERE project (EnhANcing emergencY management and response to extreme WeatHER and climate Events), which is funded by the European Commission's HORIZON2020 programme. [1] Koppe C. et al., Heat waves: risks and responses. World Health Organization. Health and Global Environmental Change, Series No. 2, Copenhagen, Denmark, 2004. [2] Pappenberger F. et al., Global forecasting of thermal health hazards: the skill of probabilistic predictions of the Universal Thermal Climate Index (UTCI), International Journal of Biometeorology 59(3): 311-323, 2015.
Capacity Building to Support Governmental Meteorological and Agricultural Communities in East Africa
NASA Astrophysics Data System (ADS)
Granger, S. L.; Macharia, D.; Das, N.; Andreadis, K.; Ines, A.
2016-12-01
There is a recognized need for data to support decision-making and planning in East Africa, where people and national economies depend on rain-fed agriculture and are vulnerable to a changing climate and extreme weather events. However, capacity to use existing global data stores and transition promising tools is a gap that severely limits the use and adoption of these data and tools. Although most people think of capacity building as simply training, it is really much more than that and has been more thoroughly described in the public health community as "the process of developing and strengthening the skills, instincts, abilities, processes and resources that organizations and communities need to survive, adapt, and thrive in the fast-changing world." Data and tools from NASA and other providers are often not used as they could be, for technical and institutional reasons. On the technical side, there is the perception that global data stores are impenetrable, requiring special expertise to access them; even if the data can be accessed, the technical expertise to understand and use the data and tools may be lacking; and there can be a mismatch between science data and existing user tools. On the institutional side, it may be perceived that remote sensing data and tools are too "expensive", support from upper management may be non-existent due to limited resources or lack of interest, and there can be a lack of appreciation of data and statistics in decision making. How do we overcome some of these barriers to advance the use of remote sensing for applications and to ease the transition of data and tools to stakeholders? Experience from recent capacity building efforts in East Africa in support of a NASA-SERVIR Applied Science Project to provide estimates of hydrologic extremes tied to crop yield will be discussed.
EXTATIC: ASML's α-tool development for EUVL
NASA Astrophysics Data System (ADS)
Meiling, Hans; Benschop, Jos P.; Hartman, Robert A.; Kuerz, Peter; Hoghoj, Peter; Geyl, Roland; Harned, Noreen
2002-07-01
Within the recently initiated EXTATIC project, a complete full-field lithography exposure tool for the 50-nm technology node is being developed. The goal is to demonstrate the feasibility of extreme UV lithography (EUVL) for 50-nm imaging and to reduce technological risks in the development of EUVL production tools. We describe the EUV MEDEA+ framework in which EXTATIC is executed, and give an update on the status of the α-tool development. A brief summary of our in-house source-collector module development is given, and the general vacuum architecture of the α-tool is discussed. We discuss defect-free reticle handling, and investigate the use of V-grooved brackets glued to the side of the reticle to reduce particle generation during takeovers. These takeovers do not only occur in the exposure tool, but also in multilayer deposition equipment, e-beam pattern writers, inspection tools, etc., where similar requirements on particle contamination are present. Finally, we present an update of mirror fabrication technology and show improved mirror figuring and finishing results.
ASCI visualization tool evaluation, Version 2.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kegelmeyer, P.
1997-04-01
The charter of the ASCI Visualization Common Tools subgroup was to investigate and evaluate 3D scientific visualization tools. As part of that effort, a Tri-Lab evaluation effort was launched in February of 1996. The first step was to agree on a thoroughly documented list of 32 features against which all tool candidates would be evaluated. These evaluation criteria were both gleaned from a user survey and determined from informed extrapolation into the future, particularly as concerns the 3D nature and extremely large size of ASCI data sets. The second step was to winnow a field of 41 candidate tools down to 11. The selection principle was to be as inclusive as practical, retaining every tool that seemed to hold any promise of fulfilling all of ASCI's visualization needs. These 11 tools were then closely investigated by volunteer evaluators distributed across LANL, LLNL, and SNL. This report contains the results of those evaluations, as well as a discussion of the evaluation philosophy and criteria.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver
2009-11-20
Knowledge discovery from large and complex collections of today's scientific datasets is a challenging task. With the ability to measure and simulate more processes at increasingly finer spatial and temporal scales, the increasing number of data dimensions and data objects is presenting tremendous challenges for data analysis and effective data exploration methods and tools. Researchers are overwhelmed with data and standard tools are often insufficient to enable effective data analysis and knowledge discovery. The main objective of this thesis is to provide important new capabilities to accelerate scientific knowledge discovery from large, complex, and multivariate scientific data. The research covered in this thesis addresses these scientific challenges using a combination of scientific visualization, information visualization, automated data analysis, and other enabling technologies, such as efficient data management. The effectiveness of the proposed analysis methods is demonstrated via applications in two distinct scientific research fields, namely developmental biology and high-energy physics. Advances in microscopy, image analysis, and embryo registration enable for the first time measurement of gene expression at cellular resolution for entire organisms. Analysis of high-dimensional spatial gene expression datasets is a challenging task. By integrating data clustering and visualization, analysis of complex, time-varying, spatial gene expression patterns and their formation becomes possible. The MATLAB-based analysis framework and the visualization have been integrated, making advanced analysis tools accessible to biologists and enabling bioinformatics researchers to directly integrate their analysis with the visualization. Laser wakefield particle accelerators (LWFAs) promise to be a new compact source of high-energy particles and radiation, with wide applications ranging from medicine to physics. To gain insight into the complex physical processes of particle acceleration, physicists model LWFAs computationally. The datasets produced by LWFA simulations are (i) extremely large, (ii) of varying spatial and temporal resolution, (iii) heterogeneous, and (iv) high-dimensional, making analysis and knowledge discovery from complex LWFA simulation data a challenging task. To address these challenges this thesis describes the integration of the visualization system VisIt and the state-of-the-art index/query system FastBit, enabling interactive visual exploration of extremely large three-dimensional particle datasets. Researchers are especially interested in beams of high-energy particles formed during the course of a simulation. This thesis describes novel methods for automatic detection and analysis of particle beams enabling a more accurate and efficient data analysis process. By integrating these automated analysis methods with visualization, this research enables more accurate, efficient, and effective analysis of LWFA simulation data than previously possible.
Exploring tool innovation: a comparison of Western and Bushman children.
Nielsen, Mark; Tomaselli, Keyan; Mushin, Ilana; Whiten, Andrew
2014-10-01
A capacity for constructing new tools, or using old tools in new ways, to solve novel problems is a core feature of what it means to be human. Yet current evidence suggests that young children are surprisingly poor at innovating tools. However, all studies of tool innovation to date have been conducted with children from comparatively privileged Western backgrounds. This raises questions as to whether or not previously documented tool innovation failure is culturally and economically specific. In the current study, thus, we explored the innovation capacities of children from Westernized urban backgrounds and from remote communities of South African Bushmen. Consistent with past research, we found tool innovation to occur at extremely low rates and that cultural background had no bearing on this. The current study is the first to empirically test tool innovation in children from non-Western backgrounds, with our data being consistent with the view that despite its key role in human evolution, a capacity for innovation in tool making remains remarkably undeveloped during early childhood. Copyright © 2014 Elsevier Inc. All rights reserved.
Glycine and GABAA Ultra-Sensitive Ethanol Receptors as Novel Tools for Alcohol and Brain Research
Naito, Anna; Muchhala, Karan H.; Asatryan, Liana; Trudell, James R.; Homanics, Gregg E.; Perkins, Daya I.; Alkana, Ronald L.
2014-01-01
A critical obstacle to developing effective medications to prevent and/or treat alcohol use disorders is the lack of specific knowledge regarding the plethora of molecular targets and mechanisms underlying alcohol (ethanol) action in the brain. To identify the role of individual receptor subunits in ethanol-induced behaviors, we developed a novel class of ultra-sensitive ethanol receptors (USERs) that allow activation of a single receptor subunit population sensitized to extremely low ethanol concentrations. USERs were created by mutating as few as four residues in the extracellular loop 2 region of glycine receptors (GlyRs) or γ-aminobutyric acid type A receptors (GABAARs), which are implicated in causing many behavioral effects linked to ethanol abuse. USERs, expressed in Xenopus oocytes and tested using two-electrode voltage clamp, demonstrated an increase in ethanol sensitivity of 100-fold over wild-type receptors by significantly decreasing the threshold and increasing the magnitude of ethanol response, without altering general receptor properties including sensitivity to the neurosteroid, allopregnanolone. These profound changes in ethanol sensitivity were observed across multiple subunits of GlyRs and GABAARs. Collectively, our studies set the stage for using USER technology in genetically engineered animals as a unique tool to increase understanding of the neurobiological basis of the behavioral effects of ethanol. PMID:25245406
On the abundance of extreme voids II: a survey of void mass functions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chongchitnan, Siri; Hunt, Matthew, E-mail: s.chongchitnan@hull.ac.uk, E-mail: m.d.hunt@2012.hull.ac.uk
2017-03-01
The abundance of cosmic voids can be described by an analogue of halo mass functions for galaxy clusters. In this work, we explore a number of void mass functions: from those based on excursion-set theory to new mass functions obtained by modifying halo mass functions. We show how different void mass functions vary in their predictions for the largest void expected in an observational volume, and compare those predictions to observational data. Our extreme-value formalism is shown to be a new practical tool for testing void theories against simulation and observation.
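As a toy numerical sketch of the extreme-value logic above, the fragment below converts an assumed cumulative void abundance n(>R) into the distribution of the largest void expected in a survey volume, using the Poisson approximation P(R_max < R) = exp(-n(>R) V). The power-law abundance and the volume are purely illustrative assumptions, not results from the paper.

```python
import numpy as np

def largest_void_cdf(cumulative_density, volume_mpc3):
    """P(R_max < R) = exp(-n(>R) * V) under Poisson-distributed void counts."""
    expected_counts = cumulative_density * volume_mpc3   # N(>R) expected in the survey
    return np.exp(-expected_counts)

# Example with an assumed toy abundance n(>R) = 1e-6 * (R/10 Mpc)^-6 per Mpc^3:
radii = np.linspace(10.0, 80.0, 200)                     # void radii in Mpc
n_gt_R = 1e-6 * (radii / 10.0) ** -6.0
cdf = largest_void_cdf(n_gt_R, volume_mpc3=1e9)
median_largest = radii[np.searchsorted(cdf, 0.5)]        # median of the extreme-value CDF
```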
Instrument control software requirement specification for Extremely Large Telescopes
NASA Astrophysics Data System (ADS)
Young, Peter J.; Kiekebusch, Mario J.; Chiozzi, Gianluca
2010-07-01
Engineers in several observatories are now designing the next generation of optical telescopes, the Extremely Large Telescopes (ELT). These are very complex machines that will host sophisticated astronomical instruments to be used for a wide range of scientific studies. In order to carry out scientific observations, a software infrastructure is required to orchestrate the control of the multiple subsystems and functions. This paper will focus on describing the considerations, strategies and main issues related to the definition and analysis of the software requirements for the ELT's Instrument Control System using modern development processes and modelling tools like SysML.
Overview of Heatshield for Extreme Entry Environment Technology (HEEET)
NASA Technical Reports Server (NTRS)
Driver, David M.; Ellerby, Donald T.; Gasch, Matthew J.; Mahzari, Milad; Milos, Frank S.; Nishioka, Owen S.; Stackpoole, Margaret M.; Venkatapathy, Ethiraj; Young, Zion W.; Gage, Peter J.;
2018-01-01
The Heatshield for Extreme Entry Environment Technology (HEEET) project's objective is to mature a 3-D Woven Thermal Protection System (TPS) to Technical Readiness Level (TRL) 6 to support future NASA missions to destinations such as Venus and Saturn. The scope of the project, the status of which will be discussed, encompasses development of manufacturing and integration processes, fabrication of a prototype 1 m diameter engineering test unit (ETU) that will undergo a series of structural tests, characterization of material aerothermal performance including development of a material response model, and structural testing and analysis to develop tools to support design and establish system capability.
NASA Astrophysics Data System (ADS)
Bouchet, F.; Laurie, J.; Zaboronski, O.
2012-12-01
We describe transitions between attractors with either one, two or more zonal jets in models of turbulent atmosphere dynamics. Those transitions are extremely rare, and occur over time scales of centuries or millennia. They are extremely hard to observe in direct numerical simulations, because they require, on the one hand, extremely good resolution in order to simulate the turbulence accurately and, on the other hand, simulations performed over an extremely long time. Those conditions are usually not met together in any realistic model. However, many examples of transitions between turbulent attractors in geophysical flows are known to exist (paths of the Kuroshio, Earth's magnetic field reversal, atmospheric flows, and so on). Their study through numerical computations is inaccessible using conventional means. We present an alternative approach, based on instanton theory and large deviations. Instanton theory provides a way to compute (both numerically and theoretically) extremely rare transitions between turbulent attractors. This tool, developed in field theory, and justified in some cases through the large deviation theory in mathematics, can be applied to models of turbulent atmosphere dynamics. It provides both new theoretical insights and a new type of numerical algorithm. Those algorithms can predict transition histories and transition rates using numerical simulations run over only hundreds of typical model dynamical times, which is several orders of magnitude shorter than the typical transition time. We illustrate the power of those tools in the framework of quasi-geostrophic models. We show regimes where two or more attractors coexist. Those attractors correspond to turbulent flows dominated by either one or more zonal jets similar to midlatitude atmosphere jets. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable ones. Moreover, we also determine the transition rates, which correspond to transition times several orders of magnitude larger than a typical time determined from the jet structure. We discuss the medium-term generalization of those results to models with more complexity, like primitive equations or GCMs.
NASA Astrophysics Data System (ADS)
Schneider, Bastian; Hoffmann, Gösta
2017-04-01
The shores of the Northern Indian Ocean were exposed to extreme wave inundation in the past. Two relevant hazards, storm surges triggered by tropical cyclones and tsunamis, are known to occur in the region but are rarely instrumentally recorded. Various sediment deposits along the coast are the only remnants of those past events. A profound understanding of return periods and magnitudes of past events is essential for developing land-use planning and risk mitigation measures in Oman and neighboring countries. A detailed investigation of these deposits, in this case primarily blocks and boulder trains but also fine-grained sediments, provides insight into parameters such as wave height and inundation distance. These parameters can then be used for modeling inundation scenarios superimposed on modern infrastructure. We are investigating the spatial 3D distribution of the extreme wave event sediments along the coastline through a high-precision survey of the event deposits using a Faro Focus 3D X330 TLS. A TLS is capable of recording high-detail, colored point clouds, which allows detailed measurements and has proved to be a powerful tool in the geosciences. These multi-parameter point clouds, in combination with dating results, serve as a basis for extreme wave event return period and magnitude estimations. Relevant parameters for large sediments are size, shape, volume and mass, as well as relative arrangement, sorting and orientation. Furthermore, the TLS data is used to distinguish between the various boulder lithologies using a multi-scale supervised classification. Surface roughness as a result of weathering can serve as an indicator of the exposure time of boulders and hint at various generations of extreme wave events. The distribution of the boulders relative to the site from which they were quarried indicates the flow direction of the waves and consequently might help to distinguish between storm and tsunami waves.
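One small, hedged step of such a survey workflow is estimating a boulder's volume and mass from its segmented TLS point cloud; the convex-hull approach and the 2700 kg/m3 density below are illustrative assumptions (real boulders are non-convex, so the hull volume is an upper-bound approximation).

```python
import numpy as np
from scipy.spatial import ConvexHull

def boulder_mass_from_points(points_xyz, density_kg_m3=2700.0):
    """points_xyz: (n, 3) array of TLS coordinates in metres for one segmented boulder."""
    points_xyz = np.asarray(points_xyz, float)
    hull = ConvexHull(points_xyz)          # 3-D convex hull of the point cloud
    volume_m3 = hull.volume                # hull volume overestimates an irregular boulder
    mass_kg = density_kg_m3 * volume_m3
    return volume_m3, mass_kg
```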
An independent assessment of the monthly PRISM gridded precipitation product in central Oklahoma
USDA-ARS?s Scientific Manuscript database
The development of climate-informed decision support tools for agricultural management requires long-duration location-specific climatologies due to the extreme spatiotemporal variability of precipitation. The traditional source of precipitation data (rain gauges) are too sparsely located to fill t...
Functional Measurement: An Incredibly Flexible Tool
ERIC Educational Resources Information Center
Mullet, Etienne; Morales Martinez, Guadalupe Elizabeth; Makris, Ioannis; Roge, Bernadette; Munoz Sastre, Maria Teresa
2012-01-01
Functional Measurement (FM) has been applied to a variety of settings that can be considered as "extreme" settings; that is, settings involving participants with severe cognitive disabilities or involving unusual stimulus material. FM has, for instance, been successfully applied for analyzing (a) numerosity judgments among children as…
Stochastic sensitivity measure for mistuned high-performance turbines
NASA Technical Reports Server (NTRS)
Murthy, Durbha V.; Pierre, Christophe
1992-01-01
A stochastic measure of sensitivity is developed in order to predict the effects of small random blade mistuning on the dynamic aeroelastic response of turbomachinery blade assemblies. This sensitivity measure is based solely on the nominal system design (i.e., on tuned system information), which makes it extremely easy and inexpensive to calculate. The measure has the potential to become a valuable design tool that will enable designers to evaluate mistuning effects at a preliminary design stage and thus assess the need for a full mistuned rotor analysis. The predictive capability of the sensitivity measure is illustrated by examining the effects of mistuning on the aeroelastic modes of the first stage of the oxidizer turbopump in the Space Shuttle Main Engine. Results from a full analysis of mistuned systems confirm that the simple stochastic sensitivity measure consistently predicts the drastic changes due to mistuning and the localization of aeroelastic vibration to a few blades.
Kaakinen, M; Huttunen, S; Paavolainen, L; Marjomäki, V; Heikkilä, J; Eklund, L
2014-01-01
Phase-contrast illumination is a simple and the most commonly used microscopic method for observing nonstained living cells. Automatic cell segmentation and motion analysis provide tools to analyze single cell motility in large cell populations. However, the challenge is to find a sophisticated method that is sufficiently accurate to generate reliable results, robust enough to function under the wide range of illumination conditions encountered in phase-contrast microscopy, and also computationally light for efficient analysis of large numbers of cells and image frames. To develop better automatic tools for analysis of low magnification phase-contrast images in time-lapse cell migration movies, we investigated the performance of a cell segmentation method that is based on the intrinsic properties of maximally stable extremal regions (MSER). MSER was found to be reliable and effective in a wide range of experimental conditions. When compared to the commonly used segmentation approaches, MSER required negligible preoptimization steps, thus dramatically reducing the computation time. To analyze cell migration characteristics in time-lapse movies, the MSER-based automatic cell detection was accompanied by a Kalman filter multiobject tracker that efficiently tracked individual cells even in confluent cell populations. This allowed quantitative cell motion analysis resulting in accurate measurements of the migration magnitude and direction of individual cells, as well as characteristics of collective migration of cell groups. Our results demonstrate that MSER accompanied by temporal data association is a powerful tool for accurate and reliable analysis of the dynamic behaviour of cells in phase-contrast image sequences. These techniques tolerate varying and nonoptimal imaging conditions and, due to their relatively light computational requirements, they should help to resolve problems in computationally demanding and often time-consuming large-scale dynamical analysis of cultured cells. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
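A minimal sketch of MSER-based cell detection on a single phase-contrast frame is shown below using OpenCV; the Kalman-filter data-association step described above is omitted, and the file name and area limits are illustrative assumptions.

```python
import cv2
import numpy as np

def detect_cells(gray_frame, min_area=30, max_area=2000):
    """Return region centroids and bounding boxes from MSER detection on a grayscale frame."""
    mser = cv2.MSER_create()
    mser.setMinArea(min_area)                          # suppress tiny noise regions
    mser.setMaxArea(max_area)                          # suppress background-sized regions
    regions, bboxes = mser.detectRegions(gray_frame)   # maximally stable extremal regions
    centroids = np.array([r.mean(axis=0) for r in regions])  # one (x, y) per detected region
    return centroids, bboxes

# Illustrative usage (hypothetical file name):
# frame = cv2.imread("frame_0001.png", cv2.IMREAD_GRAYSCALE)
# centroids, boxes = detect_cells(frame)
```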
NASA Astrophysics Data System (ADS)
Deo, Ravinesh C.; Şahin, Mehmet
2015-02-01
The prediction of future drought is an effective mitigation tool for assessing adverse consequences of drought events on vital water resources, agriculture, ecosystems and hydrology. Data-driven model predictions using machine learning algorithms are promising tenets for these purposes as they require less developmental time, minimal inputs and are relatively less complex than dynamic or physical models. This paper evaluates a computationally simple, fast and efficient non-linear algorithm known as the extreme learning machine (ELM) for the prediction of the Effective Drought Index (EDI) in eastern Australia, using input data from 1957-2008 for training and the monthly EDI predicted over the period 2009-2011. The predictive variables for the ELM model were the rainfall and mean, minimum and maximum air temperatures, supplemented by the large-scale climate mode indices of interest as regression covariates, namely the Southern Oscillation Index, Pacific Decadal Oscillation, Southern Annular Mode and the Indian Ocean Dipole moment. To demonstrate the effectiveness of the proposed data-driven model, a performance comparison in terms of the prediction capabilities and learning speeds was conducted between the proposed ELM algorithm and the conventional artificial neural network (ANN) algorithm trained with Levenberg-Marquardt back propagation. The prediction metrics confirmed an excellent performance of the ELM over the ANN model for the overall test sites, yielding Mean Absolute Errors, Root-Mean Square Errors, Coefficients of Determination and Willmott's Indices of Agreement of 0.277, 0.008, 0.892 and 0.93 (for the ELM) and 0.602, 0.172, 0.578 and 0.92 (for the ANN) models. Moreover, the ELM model was executed with a learning speed 32 times faster and a training speed 6.1 times faster than the ANN model. An improvement in the prediction capability of the drought duration and severity by the ELM model was achieved. Based on these results we aver that, of the two machine learning algorithms tested, the ELM was the more expeditious tool for prediction of drought and its related properties.
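Because the ELM itself is only a few linear-algebra operations, a self-contained sketch is easy to give: random, untrained hidden-layer weights, a sigmoid activation, and a pseudo-inverse solve for the output weights. The hyperparameters and variable names below are illustrative assumptions, not those used in the study.

```python
import numpy as np

class ELMRegressor:
    """Minimal extreme learning machine for regression (single hidden layer)."""

    def __init__(self, n_hidden=100, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))     # sigmoid hidden layer

    def fit(self, X, y):
        n_features = X.shape[1]
        self.W = self.rng.normal(size=(n_features, self.n_hidden))  # random weights, never trained
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        self.beta = np.linalg.pinv(H) @ y                        # analytic least-squares output weights
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta

# Illustrative usage with assumed arrays:
# model = ELMRegressor(n_hidden=50).fit(X_train, edi_train)
# edi_pred = model.predict(X_test)
```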
Climate Variability and Weather Extremes: Model-Simulated and Historical Data. Chapter 9
NASA Technical Reports Server (NTRS)
Schubert, Siegfried D.; Lim, Young-Kwon
2012-01-01
Extremes in weather and climate encompass a wide array of phenomena including tropical storms, mesoscale convective systems, snowstorms, floods, heat waves, and drought. Understanding how such extremes might change in the future requires an understanding of their past behavior including their connections to large-scale climate variability and trends. Previous studies suggest that the most robust findings concerning changes in short-term extremes are those that can be most directly (though not completely) tied to the increase in global mean temperatures. These include the findings that (IPCC 2007): There has been a widespread reduction in the number of frost days in mid-latitude regions in recent decades, an increase in the number of warm extremes, particularly warm nights, and a reduction in the number of cold extremes, particularly cold nights. For North America in particular (CCSP SAP 3.3, 2008): There are fewer unusually cold days during the last few decades. The last 10 years have seen a lower number of severe cold waves than for any other 10-year period in the historical record that dates back to 1895. There has been a decrease in the number of frost days and a lengthening of the frost-free season, particularly in the western part of North America. Other aspects of extremes such as the changes in storminess have a less clear signature of long term change, with considerable interannual and decadal variability that can obscure any climate change signal. Nevertheless, regarding extratropical storms (CCSP SAP 3.3, 2008): The balance of evidence suggests that there has been a northward shift in the tracks of strong low pressure systems (storms) in both the North Atlantic and North Pacific basins. For North America: Regional analyses suggest that there has been a decrease in snowstorms in the South and lower Midwest of the United States, and an increase in snowstorms in the upper Midwest and Northeast. Despite the progress already made, our understanding of the basic mechanisms by which extremes vary is incomplete. As noted in IPCC (2007), "Incomplete global data sets and remaining model uncertainties still restrict understanding of changes in extremes and attribution of changes to causes, although understanding of changes in the intensity, frequency and risk of extremes has improved." Separating decadal and other shorter-term variability from climate change impacts on extremes requires a better understanding of the processes responsible for the changes. In particular, the physical processes linking sea surface temperature changes to regional climate changes, and a basic understanding of the inherent variability in weather extremes and how that is impacted by atmospheric circulation changes at subseasonal to decadal and longer time scales, are still inadequately understood. Given the fundamental limitations in the time span and quality of global observations, substantial progress on these issues will rely increasingly on improvements in models, with observations continuing to play a critical role, though less as a detection tool, and more as a tool for addressing physical processes, and to ensure the quality of the climate models and the verisimilitude of the simulations (CCSP SAP 1.3, 2008).
Design, implementation and migration of security systems as an extreme project.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scharmer, Carol; Trujillo, David
2010-08-01
Decision trees, algorithms, software code, risk management, reports, plans, drawings, change control, presentations, and analysis - all are useful tools and efforts, but they are time consuming, resource intensive, and potentially costly for projects that have absolute schedule and budget constraints. What are necessary and prudent efforts when a customer calls with a major security problem that needs to be fixed with a proven, off-the-approval-list, multi-layered integrated system, with high visibility and limited funding that expires at the end of the fiscal year? Whether driven by budget cycles, safety, or management decree, many such projects begin with generic scopes and funding allocated based on a rapid management 'guestimate.' A Project Manager (PM) is then assigned a project with a predefined and potentially limited scope, a compressed schedule, and potentially insufficient funding. The PM is tasked to rapidly and cost-effectively coordinate a requirements-based design, implementation, test, and turnover of a fully operational system to the customer, all while the customer is operating and maintaining an existing security system. Many project management manuals call this an impossible project that should not be attempted. However, security is serious business, and the reality is that rapid deployment of proven systems via an 'Extreme Project' is sometimes necessary. Extreme Projects can be wildly successful, but they require a dedicated team of security professionals led by an experienced project manager using a highly tailored and agile project management process, with management support at all levels, combined with significant interface with the customer. This paper does not advocate such projects or condone eliminating valuable analysis and project management techniques. Indeed, having worked on a well-planned project provides the basis for experienced team members to complete Extreme Projects. This paper does, however, provide insight into what it takes for projects to be successfully implemented and accepted when completed under extreme conditions.
Kunda, Z; Oleson, K C
1997-05-01
The authors examined how the extent to which counterstereotypic individuals deviate from perceivers' stereotypes affects their impact on these stereotypes, and found that extremely deviant group members provoke less stereotype assimilation than do moderately deviant ones. Extremely deviant examples can even provoke boomerang effects, that is, enhance the very stereotype that they violate. When participants whose prior stereotypes were moderate or extreme were exposed to moderately or extremely deviant examples, the deviant examples' impact on stereotypes depended both on their extremity and on the extremity of perceivers' prior stereotypes. Boomerang effects were obtained only for extreme-stereotype participants exposed to extremely deviant examples and were mediated by perceptions of the typicality of the deviant examples. Open-ended explanations revealed that the atypicality of extremely deviant examples was used as grounds for dismissing them.
Task parameters affecting ergonomic demands and productivity of HVAC duct installation.
Mitropoulos, Panagiotis; Hussain, Sanaa; Guarascio-Howard, Linda; Memarian, Babak
2014-01-01
Mechanical installation workers experience work-related musculoskeletal disorders (WMSDs) at high rates. The objectives of this study were to (1) quantify the ergonomic demands during HVAC installation, (2) identify the tasks and task parameters that generate extreme ergonomic demands, and (3) propose improvements to reduce WMSDs among mechanical workers. The study focused on the installation of rectangular ductwork components using ladders and analyzed five operations by two mechanical contractors. Using continuous time observational assessment, the videotaped operations were analyzed along two dimensions: (1) the production tasks and their durations, and (2) the ergonomic demands for four body regions (neck, arms/shoulders, back, and knees). The analysis identified tasks with a low proportion of productive time and a high proportion of extreme postures, and the task parameters that generated extreme postures. Duct alignment was the task with the highest proportion of extreme postures. The position of the ladder (angle and distance from the duct) was a task parameter that strongly influenced extreme postures of the back, neck and shoulders. Other contributing factors included the difficulty of reaching hand tools when working on the ladder, the congestion of components in the ceiling, and the space between the duct and the ceiling. The identified tasks and factors provide directions for improvement.
Combining local search with co-evolution in a remarkably simple way
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boettcher, S.; Percus, A.
2000-05-01
The authors explore a new general-purpose heuristic for finding high-quality solutions to hard optimization problems. The method, called extremal optimization, is inspired by self-organized criticality, a concept introduced to describe emergent complexity in physical systems. In contrast to genetic algorithms, which operate on an entire gene pool of possible solutions, extremal optimization successively replaces extremely undesirable elements of a single sub-optimal solution with new, random ones. Large fluctuations, or avalanches, ensue that efficiently explore many local optima. Drawing upon models used to simulate far-from-equilibrium dynamics, extremal optimization complements heuristics inspired by equilibrium statistical physics, such as simulated annealing. With only one adjustable parameter, its performance has proved competitive with more elaborate methods, especially near phase transitions. Phase transitions are found in many combinatorial optimization problems, and have been conjectured to occur in the region of parameter space containing the hardest instances. We demonstrate how extremal optimization can be implemented for a variety of hard optimization problems. We believe that this will be a useful tool in the investigation of phase transitions in combinatorial optimization, thereby helping to elucidate the origin of computational complexity.
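To make the replacement rule described above concrete, here is a toy tau-EO sketch (my illustration, not the authors' implementation) applied to a random spin-glass energy: each spin gets a local "fitness", and a poorly fit spin chosen from a power law over the fitness ranking is flipped unconditionally, producing the large fluctuations the abstract mentions.

import numpy as np

def extremal_optimization(J, n_steps=5000, tau=1.4, seed=0):
    """Toy tau-EO on a random spin glass: repeatedly flip a poorly
    'fit' spin chosen from a power law over the fitness ranking."""
    rng = np.random.default_rng(seed)
    n = J.shape[0]
    s = rng.choice([-1, 1], size=n)
    energy = lambda s: -0.5 * s @ J @ s
    best_s, best_e = s.copy(), energy(s)
    ranks = np.arange(1, n + 1, dtype=float)
    p = ranks ** (-tau)
    p /= p.sum()                      # power-law rank-selection probabilities
    for _ in range(n_steps):
        fitness = s * (J @ s)         # each spin's local contribution (higher = fitter)
        order = np.argsort(fitness)   # worst (lowest fitness) first
        k = rng.choice(n, p=p)        # favour low ranks, but allow large jumps
        s[order[k]] *= -1             # unconditionally flip the selected spin
        e = energy(s)
        if e < best_e:
            best_s, best_e = s.copy(), e
    return best_s, best_e

J = np.random.default_rng(1).standard_normal((60, 60))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
print(extremal_optimization(J)[1])

The single adjustable parameter referred to in the abstract corresponds here to tau, which controls how strongly the selection concentrates on the worst element.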
Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patchett, John M; Ahrens, James P; Lo, Li - Ta
2010-10-15
Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on the scalability of rendering algorithms and architectures envisioned for exascale datasets. To understand the tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find that software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.
Cumulative hazard: The case of nuisance flooding
NASA Astrophysics Data System (ADS)
Moftakhari, Hamed R.; AghaKouchak, Amir; Sanders, Brett F.; Matthew, Richard A.
2017-02-01
The cumulative cost of frequent events (e.g., nuisance floods) over time may exceed the costs of the extreme but infrequent events for which societies typically prepare. Here we analyze the likelihood of exceedances above mean higher high water and the corresponding property value exposure for minor, major, and extreme coastal floods. Our results suggest that, in response to sea level rise, nuisance flooding (NF) could generate property value exposure comparable to, or larger than, extreme events. Determining whether (and when) low cost, nuisance incidents aggregate into high cost impacts and deciding when to invest in preventive measures are among the most difficult decisions for policymakers. It would be unfortunate if efforts to protect societies from extreme events (e.g., 0.01 annual probability) left them exposed to a cumulative hazard with enormous costs. We propose a Cumulative Hazard Index (CHI) as a tool for framing the future cumulative impact of low cost incidents relative to infrequent extreme events. CHI suggests that in New York, NY, Washington, DC, Miami, FL, San Francisco, CA, and Seattle, WA, a careful consideration of socioeconomic impacts of NF for prioritization is crucial for sustainable coastal flood risk management.
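The paper's exact formulation of the Cumulative Hazard Index is not given in the abstract above, so the following Python snippet is only a hypothetical illustration of the underlying comparison: expected annual exposure accumulated from frequent nuisance floods versus the expected exposure contributed by a rare extreme event. All numbers are placeholders.

# Hypothetical illustration of the idea behind a cumulative hazard index:
# compare expected annual exposure from frequent, low-cost nuisance floods
# with that of a rare extreme event. The formula is illustrative only and
# is not taken from the paper.
nuisance_events_per_year = 20        # assumed frequency of minor exceedances
nuisance_exposure = 2.0e6            # assumed property-value exposure per event (USD)
extreme_probability = 0.01           # 100-year flood
extreme_exposure = 1.5e9             # assumed exposure of the extreme event (USD)

chi = (nuisance_events_per_year * nuisance_exposure) / (extreme_probability * extreme_exposure)
print(f"Illustrative CHI = {chi:.2f}  (>1 means nuisance flooding dominates expected exposure)")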
Individualized Behavioral Health Monitoring Tool
NASA Technical Reports Server (NTRS)
Mollicone, Daniel
2015-01-01
Behavioral health risks during long-duration space exploration missions are among the most difficult to predict, detect, and mitigate. Given the anticipated extended duration of future missions and their isolated, extreme, and confined environments, there is the possibility that behavior conditions and mental disorders will develop among astronaut crew. Pulsar Informatics, Inc., has developed a health monitoring tool that provides a means to detect and address behavioral disorders and mental conditions at an early stage. The tool integrates all available behavioral measures collected during a mission to identify possible health indicator warning signs within the context of quantitatively tracked mission stressors. It is unobtrusive and requires minimal crew time and effort to train and utilize. The monitoring tool can be deployed in space analog environments for validation testing and ultimate deployment in long-duration space exploration missions.
Final Report: Correctness Tools for Petascale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellor-Crummey, John
2014-10-27
In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, the University of Maryland, the University of Wisconsin-Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about the research and development work at Rice University as part of this project.
A narrow open tubular column for high efficiency liquid chromatographic separation.
Chen, Huang; Yang, Yu; Qiao, Zhenzhen; Xiang, Piliang; Ren, Jiangtao; Meng, Yunzhu; Zhang, Kaiqi; Juan Lu, Joann; Liu, Shaorong
2018-04-30
We report a great feature of open tubular liquid chromatography when it is run using an extremely narrow (e.g., 2 μm inner diameter) open tubular column: more than 10 million plates per meter can be achieved in less than 10 min and under an elution pressure of ca. 20 bar. The column is coated with octadecylsilane and both isocratic and gradient separations are performed. We reveal a focusing effect that may be used to interpret the efficiency enhancement. We also demonstrate the feasibility of using this technique for separating complex peptide samples. This high-resolution and fast separation technique is promising and can lead to a powerful tool for trace sample analysis.
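The efficiency figure quoted above ("plates per meter") is conventionally derived from the half-height plate-count formula; the short sketch below shows that calculation under that assumption, with placeholder retention time, peak width and column length that are not values reported in the paper.

# Plate count from the standard half-height formula, N = 5.54 * (tR / w_half)**2,
# then plates per metre for a column of length L. Numbers are placeholders,
# not values reported in the paper.
t_r = 300.0        # retention time, s
w_half = 0.35      # peak width at half height, s
length_m = 0.45    # column length, m

N = 5.54 * (t_r / w_half) ** 2
print(f"N = {N:.3e} plates, {N / length_m:.3e} plates per metre")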
Esterhuyse, Surina; Avenant, Marinda; Redelinghuys, Nola; Kijko, Andrzej; Glazewski, Jan; Plit, Lisa; Kemp, Marthie; Smit, Ansie; Vos, A Tascha; Williamson, Richard
2016-12-15
The impacts associated with unconventional oil and gas (UOG) extraction will be cumulative in nature and will most likely occur on a regional scale, highlighting the importance of using strategic decision-making and management tools. Managing possible impacts responsibly is extremely important in a water-scarce country such as South Africa, compared with countries where more water may be available for UOG extraction activities. This review article explains the possible biophysical and socio-economic impacts associated with UOG extraction within the South African context and how these complex impacts are interlinked. Relevant policy and governance frameworks to manage these impacts are also highlighted. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Rianna, Guido; Roca Collell, Marta; Uzielli, Marco; Van Ruiten, Kees; Mercogliano, Paola; Ciervo, Fabio; Reder, Alfredo
2017-04-01
In the Campania Region (Southern Italy), expected increases in heavy rainfall events under climate change, combined with demographic pressure, could increase the occurrence of weather-induced landslides and the associated damages. Indeed, in recent years the pyroclastic covers mantling the slopes across a large part of the Region have been affected by numerous events, often causing casualties and damage to the infrastructure serving urban centers. Due to the strategic relevance of the area, landslide events affecting volcanic layers in the Campania Region are one of the five case studies investigated in the FP7 European Project INTACT on the impacts of extreme weather on critical infrastructure. The main aim of the INTACT project is to increase the resilience of critical infrastructures (CI) facing extreme weather events by improving the awareness of stakeholders and asset managers about such phenomena and their potential variations due to climate change, and by providing tools to support risk management strategies. A WIKI has been designed as a remote support for all stages of the risk process, through brief theoretical explanations (in Wiki style) of the tools and methods proposed and through reports on the findings and hints returned by the case study investigations. In order to have a product tailored to the needs and background of CI owners, managers and policy makers, an intense effort of knowledge co-production between researchers and stakeholders has been carried out in the different case studies through questionnaires, meetings, workshops and/or one-to-one interviews. This work presents the different tools and approaches adopted to facilitate the exchange with stakeholders in the Campanian case study, such as the "Storytelling approach", which stresses the need for a comprehensive and overall approach to the issue across the different disaster management phases (mitigation, preparedness, response and recovery) and actors; the CIRCLE approach developed by Deltares, a partner in the INTACT consortium, which investigates the direct and cascading effects induced by landslide events in pyroclastic covers; pairwise comparisons to identify the most relevant parameters of protection actions against landslide events in pyroclastic soils; and cumulative distribution functions returned by multi-model climate simulation ensembles, which display the occurrence probability of fixed variations in weather proxies for landslide events and provide a reliable frame of the current uncertainties in climate projections. The main findings achieved through the application of these tools and methods to the Campanian test case are illustrated and discussed.
You, Zhu-Hong; Lei, Ying-Ke; Zhu, Lin; Xia, Junfeng; Wang, Bing
2013-01-01
Protein-protein interactions (PPIs) play crucial roles in the execution of various cellular processes and form the basis of biological mechanisms. Although a large amount of PPI data for different species has been generated by high-throughput experimental techniques, the PPI pairs obtained with experimental methods cover only a fraction of the complete PPI networks, and the experimental methods for identifying PPIs are both time-consuming and expensive. Hence, it is urgent and challenging to develop automated computational methods to efficiently and accurately predict PPIs. We present here a novel hierarchical PCA-EELM (principal component analysis-ensemble extreme learning machine) model to predict protein-protein interactions using only the information of protein sequences. In the proposed method, 11188 protein pairs retrieved from the DIP database were encoded into feature vectors by using four kinds of protein sequence information. Focusing on dimension reduction, an effective feature extraction method, PCA, was then employed to construct the most discriminative new feature set. Finally, multiple extreme learning machines were trained and then aggregated into a consensus classifier by majority voting. The ensembling of extreme learning machines removes the dependence of the results on the initial random weights and improves the prediction performance. When applied to the PPI data of Saccharomyces cerevisiae, the proposed method achieved 87.00% prediction accuracy with 86.15% sensitivity at a precision of 87.59%. Extensive experiments were performed to compare our method with the state-of-the-art Support Vector Machine (SVM) technique. Experimental results demonstrate that the proposed PCA-EELM outperforms the SVM method under 5-fold cross-validation. Besides, PCA-EELM performs faster than the PCA-SVM based method. Consequently, the proposed approach can be considered a promising and powerful tool for predicting PPIs with excellent performance and less time.
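A minimal sketch of the PCA-plus-ensemble-of-ELMs pattern described above (not the authors' pipeline): sequence-derived descriptors are replaced by random features, PCA reduces the dimension, several ELMs are trained with different random hidden weights, and their predictions are combined by majority vote.

import numpy as np
from sklearn.decomposition import PCA

def elm_classifier(X, y, n_hidden=200, rng=None):
    """Train one ELM for binary classification (labels in {0, 1})."""
    rng = rng or np.random.default_rng()
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y.astype(float), rcond=None)
    return lambda Xn: (np.tanh(Xn @ W + b) @ beta > 0.5).astype(int)

# Random features stand in for the four kinds of sequence descriptors used in the paper.
rng = np.random.default_rng(0)
X, y = rng.standard_normal((1000, 400)), rng.integers(0, 2, 1000)

X_reduced = PCA(n_components=50).fit_transform(X)      # discriminative low-dimensional features
ensemble = [elm_classifier(X_reduced, y, rng=np.random.default_rng(i)) for i in range(9)]

votes = np.array([clf(X_reduced) for clf in ensemble])  # shape (n_models, n_samples)
prediction = (votes.mean(axis=0) > 0.5).astype(int)     # majority vote
print("training accuracy:", (prediction == y).mean())

Averaging the votes of independently initialized ELMs is what removes the dependence on any single set of random hidden weights, as the abstract notes.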
Global Warming Denial: The Human Brain on Extremes
NASA Astrophysics Data System (ADS)
Marrouch, N.; Johnson, B. T.; Slawinska, J. M.
2016-12-01
Future assessments of climate change rely on multi-model intercomparisons, and projections of the frequency of extreme events are of particular interest because such events are associated with significant economic costs and social threats. Notably, systematically simulated increases in the number of extreme weather events agree well with observational data over the last decade. At the same time, as the climate grows more volatile, widespread denial of climate change and its anthropogenic causes continues to proliferate (based on nationally representative U.S. polls). The simultaneous increase in both high-impact exposure and its denial is in stark contrast with our knowledge of socio-natural dynamics and its models. Disentangling this paradox requires an understanding of the origins of global warming denial at the individual level, and of how it subsequently propagates across social networks of many scales, shaping global policies. However, because the real world and its dynamical models are complex (high-dimensional and coupled), separating the particular feedback of interest remains a challenge. Here, we demonstrate this feedback in a controlled experiment, in which increasing unpredictability using helplessness-training paradigms induces changes in global warming denial and in the endorsement of conservative ideology. We explain these results in the context of evolutionary theory, framing self-deception and denial as remnants of evolutionary processes that shaped and facilitated the survival of the human species. Further, we link these findings to changes in neural and higher-level cognitive processes in response to unpredictable stimuli. We argue that climate change denial is an example of an extreme belief system that carries the potential to threaten the wellbeing of humans and other species alike. It is therefore crucial to better quantify climate denial using social informatics tools that provide the means to improve its representation in coupled socio-geophysical models, in order to mitigate its effects on global and local policies.
NASA Astrophysics Data System (ADS)
Kim, Byung Sik; Jeung, Se Jin; Lee, Dong Seop; Han, Woo Suk
2015-04-01
As abnormal rainfall events have become more frequent and severe under climate change and climate variability, the question has arisen of whether drainage systems are designed to cope with such conditions. Drainage systems have usually been designed from rainfall I-D-F (Intensity-Duration-Frequency) curves under the assumption that the I-D-F curve is stationary. This design approach cannot account for extreme rainfall conditions for which the I-D-F curve is non-stationary due to climate change and variability. The assumption of a stationary I-D-F curve may therefore no longer be valid under climate change, because climate change has made the characteristics of extreme rainfall events non-stationary. In this paper, design rainfall by rainfall duration and non-stationary I-D-F curves are derived from a conditional GEV distribution that accounts for the non-stationarity of rainfall characteristics. Furthermore, the effect of increased rainfall intensity on the design peak flow was analyzed with a distributed rainfall-runoff model, S-RAT (Spatial Runoff Assessment Tool). Although there are some differences by rainfall duration, the traditional I-D-F curves underestimate extreme rainfall events for high-frequency rainfall conditions. As a result, this paper suggests that traditional I-D-F curves may not be suitable for the design of drainage systems under climate change. Keywords: drainage system, climate change, non-stationary, I-D-F curves. This research was supported by a grant, 'Development of multi-function debris flow control technique considering extreme rainfall event' [NEMA-Natural-2014-74], from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of KOREA.
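The non-stationary GEV idea behind such I-D-F analyses can be sketched briefly. The snippet below is a minimal illustration, not the paper's conditional GEV formulation: the location parameter is made a linear function of time and fitted by maximum likelihood with scipy, and a 100-year design intensity is then evaluated at the end of the record. The data are synthetic.

import numpy as np
from scipy.optimize import minimize
from scipy.stats import genextreme

# Annual-maximum rainfall intensities for one duration; synthetic data with an
# upward trend stand in for observations.
years = np.arange(1973, 2013)
t = (years - years[0]) / len(years)
x = genextreme.rvs(c=-0.1, loc=40 + 15 * t, scale=8, random_state=0)

def nll(params):
    """Negative log-likelihood of a GEV whose location drifts linearly in time."""
    mu0, mu1, sigma, xi = params
    if sigma <= 0:
        return np.inf
    # scipy's shape parameter c is the negative of the usual GEV xi convention
    return -genextreme.logpdf(x, c=-xi, loc=mu0 + mu1 * t, scale=sigma).sum()

fit = minimize(nll, x0=[np.mean(x), 0.0, np.std(x), 0.1], method="Nelder-Mead")
mu0, mu1, sigma, xi = fit.x

# Non-stationary 100-year design intensity evaluated at the end of the record.
design = genextreme.ppf(1 - 1 / 100, c=-xi, loc=mu0 + mu1 * t[-1], scale=sigma)
print(f"location trend: {mu1:.2f} per record length; 100-yr intensity now: {design:.1f}")

Repeating the fit for each rainfall duration and plotting the quantiles against duration would yield a non-stationary I-D-F curve of the kind the paper compares against traditional stationary curves.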
Improving operating room productivity via parallel anesthesia processing.
Brown, Michael J; Subramanian, Arun; Curry, Timothy B; Kor, Daryl J; Moran, Steven L; Rohleder, Thomas R
2014-01-01
Parallel processing of regional anesthesia may improve operating room (OR) efficiency for patients undergoing upper extremity surgical procedures. The purpose of this paper is to evaluate whether performing regional anesthesia outside the OR, in parallel, increases total cases per day and improves efficiency and productivity. Data from all adult patients who underwent regional anesthesia as their primary anesthetic for upper extremity surgery over a one-year period were used to develop a simulation model. The model evaluated pure operating modes of regional anesthesia performed within the OR and outside the OR in a parallel manner. The scenarios were used to evaluate how many surgeries could be completed in a standard work day (555 minutes) and, assuming a standard three cases per day, the predicted end-of-day overtime. Modeling results show that parallel processing of regional anesthesia increases the average cases per day for all surgeons included in the study. The average increase was 0.42 surgeries per day. When it was assumed that three cases per day would be performed by all surgeons, the number of days going to overtime was reduced by 43 percent with the parallel block. Overtime with parallel anesthesia was also projected to be 40 minutes less per day per surgeon. Key limitations include the assumption that all cases used regional anesthesia in the comparisons; many days may have included both regional and general anesthesia. Also, as a single-center case study, the research may have limited generalizability. Perioperative care providers should consider parallel administration of regional anesthesia where there is a desire to increase daily upper extremity surgical case capacity. Where there are sufficient resources for parallel anesthesia processing, efficiency and productivity can be significantly improved. Simulation modeling can be an effective tool to show the effects of practice changes at a system-wide level.
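A toy Monte Carlo sketch of the comparison described above (not the authors' simulation model): in the serial mode the regional block occupies the OR, in the parallel mode it is performed outside the OR, and we count the cases that fit in a 555-minute day. The block, surgery and turnover durations are assumptions, not study data.

import numpy as np

rng = np.random.default_rng(0)
DAY_MIN, N_DAYS = 555, 10000

def cases_per_day(parallel, block=25, surgery=90, turnover=20):
    """Count cases finishing within the working day for one simulated day."""
    done, clock = 0, 0.0
    while True:
        b = rng.gamma(4, block / 4)        # regional block duration (min)
        s = rng.gamma(4, surgery / 4)      # surgical duration (min)
        case = s if parallel else b + s    # parallel block happens outside the OR
        if clock + case > DAY_MIN:
            return done
        clock += case + turnover
        done += 1

serial = np.mean([cases_per_day(False) for _ in range(N_DAYS)])
parallel = np.mean([cases_per_day(True) for _ in range(N_DAYS)])
print(f"serial {serial:.2f} vs parallel {parallel:.2f} cases/day (+{parallel - serial:.2f})")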
What's new in well logging and formation evaluation
Prensky, S.
2011-01-01
A number of significant new developments are emerging in well logging and formation evaluation. Some of the new developments include an ultrasonic wireline imager, an electromagnetic free-point indicator, wired and fiber-optic coiled tubing systems, and extreme-temperature logging-while-drilling (LWD) tools. The continued consolidation of logging and petrophysical service providers in 2010 means that these innovations are increasingly being provided by a few large companies. Weatherford International has launched a slimhole cross-dipole tool as part of the company's line of compact logging tools. The 26-ft-long Compact Cross-Dipole Sonic (CXD) tool can be run as part of a quad-combo compact logging string. Halliburton has introduced a version of its circumferential acoustic scanning tool (CAST) that runs on monoconductor cable (CAST-M) to provide high-resolution images in open hole and in cased hole for casing and cement evaluation.
Spark and HPC for High Energy Physics Data Analyses
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sehrish, Saba; Kowalkowski, Jim; Paterno, Marc
A full High Energy Physics (HEP) data analysis is divided into multiple data reduction phases. Processing within these phases is extremely time consuming; therefore intermediate results are stored in files held in mass storage systems and referenced as part of large datasets. This processing model limits what can be done with interactive data analytics. Growth in the size and complexity of experimental datasets, along with emerging big data tools, is beginning to change the traditional ways of doing data analyses. The use of big data tools for HEP analysis looks promising, mainly because extremely large HEP datasets can be represented and held in memory across a system and accessed interactively by encoding an analysis using high-level programming abstractions. The mainstream tools, however, are not designed for scientific computing or for exploiting the available HPC platform features. We use an example from the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) in Geneva, Switzerland. The LHC is the highest energy particle collider in the world. Our use case focuses on searching for new types of elementary particles explaining Dark Matter in the universe. We use HDF5 as our input data format, and Spark to implement the use case. We show the benefits and limitations of using Spark with HDF5 on Edison at NERSC.
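A minimal PySpark sketch of the general pattern described above (HDF5 input, partitioned reads held in memory, interactive reduction). The file path, dataset name and selection cut are hypothetical; the real CMS dark-matter analysis is far more involved than this.

import h5py
import numpy as np
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("hep-hdf5-sketch").getOrCreate()
sc = spark.sparkContext

FILE = "events.h5"          # hypothetical HDF5 file of reconstructed events
DATASET = "missing_et"      # hypothetical per-event missing transverse energy
N_PARTITIONS = 64

with h5py.File(FILE, "r") as f:
    n_events = f[DATASET].shape[0]

# Each task opens the file and reads only its own slice of the dataset,
# so the (potentially huge) array ends up held in memory across the cluster.
bounds = np.linspace(0, n_events, N_PARTITIONS + 1, dtype=int)
slices = list(zip(bounds[:-1], bounds[1:]))

def read_slice(rng):
    start, stop = rng
    with h5py.File(FILE, "r") as f:
        return f[DATASET][start:stop]

chunks = sc.parallelize(slices, N_PARTITIONS).map(read_slice).cache()

# Interactive-style reduction: how many events pass a missing-energy cut?
n_selected = chunks.map(lambda a: int((a > 200.0).sum())).sum()
print("events with missing ET > 200 GeV:", n_selected)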
Applications of Microcomputers in the Education of the Physically Disabled Child.
ERIC Educational Resources Information Center
Foulds, Richard A.
1982-01-01
Microcomputers can serve as expressive communication tools for severely physically disabled persons. Features such as single input devices, direct selection aids, and speech synthesis capabilities can be extremely useful. The trend toward portable battery-operated computers will make the technology even more accessible. (CL)
DOT National Transportation Integrated Search
2015-12-31
The goal of this project is to provide statistical inference for the community's willingness to pay for improvements in the resiliency to extreme events of the transportation system in New York City. This objective seeks to provide better tools for...
ERIC Educational Resources Information Center
Hogan, Kevin
2008-01-01
This article presents this year's winners of Tech & Learning's student photography contest. Selected from an overwhelming six thousand entries, these images are proof that "the kids today" are not only extremely talented, but also powerfully enabled by digital tools that can help them express and communicate those talents. The theme of the…
Sharpening ball-nose mill cutters
NASA Technical Reports Server (NTRS)
Burch, C. F.
1977-01-01
Economical attachment allows faster, more precise grinding. A vibration-free and rigid relation between grinding wheel and cutter allows extremely fine finishes and accurate grinding. A leveling device aligns the flutes with respect to the toolholder rotation that generates the ball-nose radius. Constant relief around the entire profile of the cutting edge produces longer tool life.
A Generalized Framework for Non-Stationary Extreme Value Analysis
NASA Astrophysics Data System (ADS)
Ragno, E.; Cheng, L.; Sadegh, M.; AghaKouchak, A.
2017-12-01
Empirical trends in climate variables, including precipitation, temperature, and snow-water equivalent at regional to continental scales, are evidence of changes in climate over time. Evolving climate conditions and human activity-related factors such as urbanization and population growth can exert further changes in weather and climate extremes. As a result, the scientific community faces an increasing demand for updated appraisals of time-varying climate extremes. The purpose of this study is to offer a robust and flexible statistical tool for non-stationary extreme value analysis that can better characterize the severity and likelihood of extreme climatic variables. This is critical to ensure a more resilient environment in a changing climate. Following the positive feedback on the first version of the Non-Stationary Extreme Value Analysis (NEVA) toolbox by Cheng et al. (2014), we present an improved version, NEVA2.0. The upgraded version builds upon a newly developed hybrid evolution Markov Chain Monte Carlo (MCMC) approach for numerical parameter estimation and uncertainty assessment. This addition leads to more robust uncertainty estimates of return levels, return periods, and risks of climatic extremes under both stationary and non-stationary assumptions. Moreover, NEVA2.0 is flexible in incorporating any user-specified covariate other than the default time covariate (e.g., CO2 emissions, large-scale climatic oscillation patterns). The new feature will allow users to examine the non-stationarity of extremes induced by physical conditions that underlie the extreme events (e.g., antecedent soil moisture deficit, large-scale climatic teleconnections, urbanization). In addition, the new version offers an option to generate stationary and/or non-stationary rainfall Intensity-Duration-Frequency (IDF) curves that are widely used for risk assessment and infrastructure design. Finally, a Graphical User Interface (GUI) for the package is provided, making NEVA accessible to a broader audience.
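The Bayesian MCMC estimation idea behind NEVA can be illustrated with a simple random-walk Metropolis sampler (a stand-in for the toolbox's hybrid-evolution MCMC, not its actual code): posterior samples of stationary GEV parameters are drawn from synthetic annual maxima and converted into an uncertainty band for the 100-year return level.

import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
x = genextreme.rvs(c=-0.1, loc=50, scale=10, size=60, random_state=1)  # synthetic annual maxima

def log_post(theta):
    mu, log_sigma, xi = theta
    # flat priors; shape restricted to a plausible range
    if not (-0.5 < xi < 0.5):
        return -np.inf
    return genextreme.logpdf(x, c=-xi, loc=mu, scale=np.exp(log_sigma)).sum()

# Random-walk Metropolis sampler.
theta = np.array([np.mean(x), np.log(np.std(x)), 0.05])
lp = log_post(theta)
samples = []
for i in range(20000):
    prop = theta + rng.normal(scale=[1.0, 0.05, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i > 5000 and i % 10 == 0:          # discard burn-in, thin the chain
        samples.append(theta.copy())
samples = np.array(samples)

# Posterior distribution of the 100-year return level.
rl = genextreme.ppf(0.99, c=-samples[:, 2], loc=samples[:, 0], scale=np.exp(samples[:, 1]))
print("100-yr return level, 5-95% credible interval:", np.percentile(rl, [5, 95]))

Making the location (or scale) parameter a function of a covariate, as in the non-stationary GEV example earlier in this section, extends the same sampler to the covariate-dependent case NEVA2.0 supports.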
Evaluating the Large-Scale Environment of Extreme Events Using Reanalyses
NASA Astrophysics Data System (ADS)
Bosilovich, M. G.; Schubert, S. D.; Koster, R. D.; da Silva, A. M., Jr.; Eichmann, A.
2014-12-01
Extreme conditions and events have long been a concern in weather forecasting and national security. While some evidence indicates that extreme weather will increase under global change scenarios, extremes are often related to the large-scale atmospheric circulation and occur infrequently. Reanalyses assimilate substantial amounts of weather data, and a primary strength of reanalysis data is the representation of the large-scale atmospheric environment. In this effort, we link the occurrences of extreme events or climate indicators to the underlying regional and global weather patterns. Now, with more than 30 years of data, reanalyses can include multiple cases of extreme events, and thereby identify commonality among the weather patterns to better characterize the large-scale to global environment linked to the indicator or extreme event. Since these features are certainly regionally dependent, and the indicators of climate are continually being developed, we outline various methods to analyze the reanalysis data and the development of tools to support regional evaluation of the data. Here, we provide examples of both individual case studies and composite studies of similar events. For example, we compare the large-scale environment of Northeastern US extreme precipitation with that of the highest mean precipitation seasons. Likewise, southerly winds can be shown to be a major contributor to very warm days in the Northeast winter. While most of our development has involved NASA's MERRA reanalysis, we are also looking forward to MERRA-2, which includes several new features that greatly improve the representation of weather and climate, especially for the regions and sectors involved in the National Climate Assessment.
NASA Astrophysics Data System (ADS)
de Vries, A. J.; Ouwersloot, H. G.; Feldstein, S. B.; Riemer, M.; El Kenawy, A. M.; McCabe, M. F.; Lelieveld, J.
2018-01-01
Extreme precipitation events in the otherwise arid Middle East can cause flooding with dramatic socioeconomic impacts. Most of these events are associated with tropical-extratropical interactions, whereby a stratospheric potential vorticity (PV) intrusion reaches deep into the subtropics and forces an incursion of high poleward vertically integrated water vapor transport (IVT) into the Middle East. This study presents an object-based identification method for extreme precipitation events based on the combination of these two larger-scale meteorological features. The general motivation for this approach is that precipitation is often poorly simulated in relatively coarse weather and climate models, whereas the synoptic-scale circulation is much better represented. The algorithm is applied to ERA-Interim reanalysis data (1979-2015) and detects 90% (83%) of the 99th (97.5th) percentile of extreme precipitation days in the region of interest. Our results show that stratospheric PV intrusions and IVT structures are intimately connected to extreme precipitation intensity and seasonality. The farther south a stratospheric PV intrusion reaches, the larger the IVT magnitude, and the longer the duration of their combined occurrence, the more extreme the precipitation. Our algorithm detects a large fraction of the climatological rainfall amounts (40-70%), heavy precipitation days (50-80%), and the top 10 extreme precipitation days (60-90%) at many sites in southern Israel and the northern and western parts of Saudi Arabia. This identification method provides a new tool for future work to disentangle teleconnections, assess medium-range predictability, and improve understanding of climatic changes of extreme precipitation in the Middle East and elsewhere.
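A schematic sketch of the combined-feature criterion described above. The thresholds, field names and the simple "same-day, anywhere-in-domain" logic are my assumptions for illustration; the paper's object-based identification and tracking are considerably more sophisticated.

import numpy as np

# Gridded daily fields over a Middle East domain (time x lat x lon); random values
# stand in for ERA-Interim isentropic PV and poleward IVT.
rng = np.random.default_rng(0)
lat = np.linspace(10, 40, 31)
pv = rng.normal(0.8, 0.8, size=(365, 31, 61))        # PVU, hypothetical
ivt_north = rng.normal(50, 120, size=(365, 31, 61))  # kg m-1 s-1, hypothetical

PV_THRESH = 2.0       # "stratospheric" air, PVU (assumed)
IVT_THRESH = 250.0    # strong poleward moisture transport (assumed)
LAT_SUBTROPICS = 30.0

def candidate_days(pv, ivt_north):
    """Flag days when stratospheric PV reaches south of 30N while strong
    poleward IVT occurs anywhere in the domain on the same day."""
    south = lat < LAT_SUBTROPICS
    pv_intrusion = (pv[:, south, :] >= PV_THRESH).any(axis=(1, 2))
    ivt_incursion = (ivt_north >= IVT_THRESH).any(axis=(1, 2))
    return np.flatnonzero(pv_intrusion & ivt_incursion)

print("candidate extreme-precipitation days:", candidate_days(pv, ivt_north)[:10])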
Frank, Dorothea; Reichstein, Markus; Bahn, Michael; Thonicke, Kirsten; Frank, David; Mahecha, Miguel D; Smith, Pete; van der Velde, Marijn; Vicca, Sara; Babst, Flurin; Beer, Christian; Buchmann, Nina; Canadell, Josep G; Ciais, Philippe; Cramer, Wolfgang; Ibrom, Andreas; Miglietta, Franco; Poulter, Ben; Rammig, Anja; Seneviratne, Sonia I; Walz, Ariane; Wattenbach, Martin; Zavala, Miguel A; Zscheischler, Jakob
2015-01-01
Extreme droughts, heat waves, frosts, precipitation, wind storms and other climate extremes may impact the structure, composition and functioning of terrestrial ecosystems, and thus carbon cycling and its feedbacks to the climate system. Yet, the interconnected avenues through which climate extremes drive ecological and physiological processes and alter the carbon balance are poorly understood. Here, we review the literature on carbon cycle relevant responses of ecosystems to extreme climatic events. Given that impacts of climate extremes are considered disturbances, we assume the respective general disturbance-induced mechanisms and processes to also operate in an extreme context. The paucity of well-defined studies currently renders a quantitative meta-analysis impossible, but permits us to develop a deductive framework for identifying the main mechanisms (and coupling thereof) through which climate extremes may act on the carbon cycle. We find that ecosystem responses can exceed the duration of the climate impacts via lagged effects on the carbon cycle. The expected regional impacts of future climate extremes will depend on changes in the probability and severity of their occurrence, on the compound effects and timing of different climate extremes, and on the vulnerability of each land-cover type modulated by management. Although processes and sensitivities differ among biomes, based on expert opinion, we expect forests to exhibit the largest net effect of extremes due to their large carbon pools and fluxes, potentially large indirect and lagged impacts, and long recovery time to regain previous stocks. At the global scale, we presume that droughts have the strongest and most widespread effects on terrestrial carbon cycling. Comparing impacts of climate extremes identified via remote sensing vs. ground-based observational case studies reveals that many regions in the (sub-)tropics are understudied. Hence, regional investigations are needed to allow a global upscaling of the impacts of climate extremes on global carbon–climate feedbacks. PMID:25752680
Tools used by the insurance industry to assess risk from hydroclimatic extremes
NASA Astrophysics Data System (ADS)
Higgs, Stephanie; McMullan, Caroline
2016-04-01
Probabilistic catastrophe models are widely used within the insurance industry to assess and price the risk of natural hazards, from individual residences through to portfolios of millions of properties. Over the relatively short period that catastrophe models have been available (almost 30 years), the insurance industry has built up financial resilience to key natural hazards in certain areas (e.g. US tropical cyclone, European extra-tropical cyclone and flood). However, due to the rapidly expanding global population and increase in wealth, together with uncertainties in the behaviour of meteorological phenomena introduced by climate change, the domain in which natural hazards impact society is growing. As a result, the insurance industry faces new challenges in assessing the risk and uncertainty from natural hazards. As a catastrophe modelling company, AIR Worldwide has a toolbox of options available to help the insurance industry assess extreme climatic events and their associated uncertainty. Here we discuss several of these tools: helping analysts understand how uncertainty is inherently built into probabilistic catastrophe models; alternative stochastic catalogs for tropical cyclones based on climate conditioning; stochastic extreme disaster events, such as those provided through AIR's catalogs or through the Lloyd's of London marketplace (RDSs), which provide useful benchmarks for the loss exceedance probability and tail-at-risk metrics output by catastrophe models; and the visualisation of 1000+ year event footprints and hazard intensity maps. Ultimately, the increased transparency of catastrophe models and the flexibility of a software platform that allows for customisation of modelled and non-modelled risks will drive a greater understanding of extreme hydroclimatic events within the insurance industry.
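The loss exceedance probability and tail metrics mentioned above are commonly derived by ranking annual losses from a stochastic catalogue; the sketch below shows that generic calculation with synthetic losses. It is the standard empirical formulation, not AIR's proprietary methodology.

import numpy as np

# Synthetic annual losses from a 10,000-year stochastic catalogue (USD).
rng = np.random.default_rng(0)
annual_loss = rng.lognormal(mean=16, sigma=1.2, size=10000)

# Aggregate EP curve: rank annual losses and assign empirical exceedance probabilities.
losses = np.sort(annual_loss)[::-1]
exceed_prob = np.arange(1, losses.size + 1) / (losses.size + 1)

def loss_at_return_period(rp):
    return np.interp(1.0 / rp, exceed_prob, losses)

for rp in (100, 250, 500):
    print(f"{rp}-year loss: {loss_at_return_period(rp) / 1e6:,.0f} M")

# Tail value-at-risk beyond the 1-in-100 level.
var100 = loss_at_return_period(100)
tvar100 = losses[losses >= var100].mean()
print(f"TVaR(100): {tvar100 / 1e6:,.0f} M")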
weather@home 2: validation of an improved global-regional climate modelling system
NASA Astrophysics Data System (ADS)
Guillod, Benoit P.; Jones, Richard G.; Bowery, Andy; Haustein, Karsten; Massey, Neil R.; Mitchell, Daniel M.; Otto, Friederike E. L.; Sparrow, Sarah N.; Uhe, Peter; Wallom, David C. H.; Wilson, Simon; Allen, Myles R.
2017-05-01
Extreme weather events can have large impacts on society and, in many regions, are expected to change in frequency and intensity with climate change. Owing to the relatively short observational record, climate models are useful tools as they allow for the generation of a larger sample of extreme events, the attribution of recent events to anthropogenic climate change, and the projection of changes in such events into the future. The modelling system known as weather@home, consisting of a global climate model (GCM) with a nested regional climate model (RCM) and driven by sea surface temperatures, allows one to generate a very large ensemble with the help of volunteer distributed computing, and is a key tool for understanding many aspects of extreme events. Here, a new version of the weather@home system (weather@home 2) with a higher-resolution RCM over Europe is documented and a broad validation of the climate is performed. The new model includes a more recent land-surface scheme in both GCM and RCM, where subgrid-scale land-surface heterogeneity is newly represented using tiles, and an increase in RCM resolution from 50 to 25 km. The GCM performs similarly to the previous version, with some improvements in the representation of mean climate. The European RCM temperature biases are overall reduced, in particular the warm bias over eastern Europe, but large biases remain. Precipitation is improved over the Alps in summer, with mixed changes in other regions and seasons. The model is shown to represent the main classes of regional extreme events reasonably well and shows a good sensitivity to its drivers. In particular, given the improvements in this version of the weather@home system, it is likely that more reliable statements can be made with regard to impacts, especially at more localized scales.
Rosetta Structure Prediction as a Tool for Solving Difficult Molecular Replacement Problems.
DiMaio, Frank
2017-01-01
Molecular replacement (MR), a method for solving the crystallographic phase problem using phases derived from a model of the target structure, has proven extremely valuable, accounting for the vast majority of structures solved by X-ray crystallography. However, when the resolution of data is low, or the starting model is very dissimilar to the target protein, solving structures via molecular replacement may be very challenging. In recent years, protein structure prediction methodology has emerged as a powerful tool in model building and model refinement for difficult molecular replacement problems. This chapter describes some of the tools available in Rosetta for model building and model refinement specifically geared toward difficult molecular replacement cases.
NASA Technical Reports Server (NTRS)
Wissler, Steven S.; Maldague, Pierre; Rocca, Jennifer; Seybold, Calina
2006-01-01
The Deep Impact mission was ambitious and challenging. JPL's well proven, easily adaptable multi-mission sequence planning tools combined with integrated spacecraft subsystem models enabled a small operations team to develop, validate, and execute extremely complex sequence-based activities within very short development times. This paper focuses on the core planning tool used in the mission, APGEN. It shows how the multi-mission design and adaptability of APGEN made it possible to model spacecraft subsystems as well as ground assets throughout the lifecycle of the Deep Impact project, starting with models of initial, high-level mission objectives, and culminating in detailed predictions of spacecraft behavior during mission-critical activities.
Colossal Tooling Design: 3D Simulation for Ergonomic Analysis
NASA Technical Reports Server (NTRS)
Hunter, Steve L.; Dischinger, Charles; Thomas, Robert E.; Babai, Majid
2003-01-01
The application of high-level 3D simulation software to the design phase of colossal mandrel tooling for composite aerospace fuel tanks was accomplished to discover and resolve safety and human engineering problems. The analyses were conducted to determine safety, ergonomic and human engineering aspects of the disassembly process of the fuel tank composite shell mandrel. Three-dimensional graphics high-level software, incorporating various ergonomic analysis algorithms, was utilized to determine if the process was within safety and health boundaries for the workers carrying out these tasks. In addition, the graphical software was extremely helpful in the identification of material handling equipment and devices for the mandrel tooling assembly/disassembly process.
Dynamical ocean-atmospheric drivers of floods and droughts
NASA Astrophysics Data System (ADS)
Perdigão, Rui A. P.; Hall, Julia
2014-05-01
The present study contributes to a better depiction and understanding of the "facial expression" of the Earth in terms of the dynamical ocean-atmospheric processes associated with both floods and droughts. For this purpose, the study focuses on nonlinear dynamical and statistical analysis of the ocean-atmospheric mechanisms contributing to hydrological extremes, broadening the analytical hydro-meteorological perspective of floods and hydrological droughts to driving mechanisms and feedbacks at the global scale. In doing so, the analysis of the climate-related causality of hydrological extremes is not limited to the synoptic situation in the region where the events take place. Rather, it goes further up the chain of causality, peering into dynamical interactions between planetary-scale ocean and atmospheric processes that drive weather regimes and influence the antecedent and event conditions associated with hydrological extremes. In order to illustrate the approach, dynamical ocean-atmospheric drivers are investigated for a selection of floods and droughts. Despite occurring in different regions with different timings, common underlying mechanisms are identified for both kinds of hydrological extremes. For instance, several of the analysed events are seen to have resulted from a large-scale atmospheric situation consisting of standing planetary waves encircling the northern hemisphere. These correspond to wider vortices locked in phase, resulting in wider and more persistent synoptic weather patterns, i.e. with larger spatial and temporal coherence. A standing train of anticyclones and depressions thus encircled the mid and upper latitudes of the northern hemisphere. The stationary regime of planetary waves occurs when the mean eastward zonal flow decreases to the point at which it no longer exceeds the westward phase propagation of the Rossby waves produced by the latitude-varying Coriolis effect. The ocean-atmospheric causes of this behaviour and its consequences for hydrological extremes are investigated, and the findings are supported with spatiotemporal geostatistical analysis and nonlinear geophysical models. Overall, the study provides a three-fold contribution to research on hydrological extremes. Firstly, it improves their physical attribution by better understanding the dynamical reasons behind the meteorological drivers. Secondly, it brings out fundamental early warning signs of potential hydrological extremes, by identifying global ocean-atmospheric features that manifest themselves much earlier than the regional weather patterns. Thirdly, it provides tools for addressing and understanding hydrological regime changes at wider spatiotemporal scales, by providing links to planetary-scale dynamical processes that play a crucial role in multi-decadal global climate variability.
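For reference, the stationarity condition paraphrased in the abstract above corresponds to the textbook barotropic Rossby wave dispersion relation (a standard result, not a formula taken from the study):

\[
  c \;=\; U \;-\; \frac{\beta}{k^{2} + l^{2}},
  \qquad
  c = 0 \;\Longleftrightarrow\; K_s \equiv \sqrt{k^{2}+l^{2}} \;=\; \sqrt{\frac{\beta}{U}},
\]

where \(c\) is the zonal phase speed, \(U\) the mean eastward zonal flow, \(\beta\) the meridional gradient of the Coriolis parameter, and \(k, l\) the zonal and meridional wavenumbers. Waves become stationary when the eastward advection by \(U\) exactly balances their westward phase propagation, which is the regime of phase-locked, persistent wave trains described in the abstract.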
[transcutaneous oximetry--between theory and practice].
Zulec, Mirna
2014-10-01
Transcutaneous oximetry is a procedure used to measure the pressure of oxygen in tissue and to determine the level of oxygenation. It is essential for determining the state of the microcirculation and is used to assess the necessity and level of amputation, to evaluate the effect of revascularization procedures, and as a predictor of wound healing and of the effectiveness of hyperbaric oxygen therapy (HBOT). The measurement is done by applying an electrode to the measuring point, and the result is expressed in mm Hg. Tissue with an adequate oxygen level has a value greater than 50 mm Hg. Values between 20 and 40 mm Hg are considered hypoxic, while those below 20 mm Hg indicate extreme hypoxia. In Croatia, TcPO2 is commonly used for HBOT assessment, but there is a need for broader application to objectify and facilitate procedures in the care of persons with impaired microcirculation.
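The thresholds quoted above map directly onto a simple classification rule; the tiny sketch below mirrors the abstract's categories and is for illustration only, not clinical guidance.

def classify_tcpo2(mmhg):
    """Map a TcPO2 reading (mm Hg) onto the categories given in the abstract."""
    if mmhg > 50:
        return "adequate oxygenation"
    if mmhg < 20:
        return "extreme hypoxia"
    if mmhg <= 40:
        return "hypoxic"
    return "borderline (40-50 mm Hg, not categorized in the abstract)"

print(classify_tcpo2(35))   # -> hypoxic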
Fernández-Sierra, Mónica; Quiñones, Edwin
2015-03-15
Here we characterize the fluorescence of the YOYO dye as a tool for studying DNA-protein interactions in real time and present two continuous YOYO-based assays for sensitively monitoring the kinetics of DNA digestion by λ-exonuclease and the endonuclease EcoRV. The described assays rely on the different fluorescence intensities between single- and double-stranded DNA-YOYO complexes, allowing straightforward determination of nuclease activity and quantitative determination of reaction products. The assays were also employed to assess the effect of single-stranded DNA-binding proteins on the λ-exonuclease reaction kinetics, showing that the extreme thermostable single-stranded DNA-binding protein (ET-SSB) significantly reduced the reaction rate, while the recombination protein A (RecA) displayed no effect. Copyright © 2015 Elsevier Inc. All rights reserved.
A new approach to pattern metrology
NASA Astrophysics Data System (ADS)
Ausschnitt, Christopher P.
2004-05-01
We describe an approach to pattern metrology that enables the simultaneous determination of critical dimensions, overlay and film thickness. A single optical system captures nonzero- and zero-order diffracted signals from illuminated grating targets, as well as from unpatterned regions of the surrounding substrate. Differential targets provide in situ dimensional calibration. CD target signals are analyzed to determine average dimension, profile attributes, and effective dose and defocus. In turn, effective dose and defocus determine all CDs pre-correlated to the dose and focus settings of the exposure tool. Overlay target signals are analyzed to determine the relative reflectivity of the layer pair and the overlay error between them. Compared to commercially available pattern metrology (SEM, optical microscopy, AFM, scatterometry and schnitzlometry), our approach promises improved signal-to-noise, higher throughput and smaller targets. We have dubbed this optical chimera MOXIE (Metrology Of eXtremely Irrational Exuberance).
CRISPR/Cas9 mediates efficient conditional mutagenesis in Drosophila.
Xue, Zhaoyu; Wu, Menghua; Wen, Kejia; Ren, Menda; Long, Li; Zhang, Xuedi; Gao, Guanjun
2014-09-05
Existing transgenic RNA interference (RNAi) methods greatly facilitate functional genome studies via controlled silencing of targeted mRNA in Drosophila. Although the RNAi approach is extremely powerful, concerns still linger about its low efficiency. Here, we developed a CRISPR/Cas9-mediated conditional mutagenesis system by combining tissue-specific expression of Cas9 driven by the Gal4/upstream activating site system with various ubiquitously expressed guide RNA transgenes to effectively inactivate gene expression in a temporally and spatially controlled manner. Furthermore, by including multiple guide RNAs in a transgenic vector to target a single gene, we achieved a high degree of gene mutagenesis in specific tissues. The CRISPR/Cas9-mediated conditional mutagenesis system provides a simple and effective tool for gene function analysis, and complements the existing RNAi approach. Copyright © 2014 Xue et al.
Lorantfy, Bettina; Seyer, Bernhard; Herwig, Christoph
2014-01-25
Extreme halophilic Archaea are extremophile species that can thrive in hypersaline environments of up to 3-5 M sodium chloride concentration. Although their ecology and physiology are well characterized at the microbiological level, little emphasis has been placed on quantitative bioprocess development with extreme halophiles. The goal of this study was to establish, on the one hand, a methodological basis for quantitative bioprocess analysis of extreme halophilic Archaea, using an extreme halophilic strain as an example. Firstly, a corrosion-resistant bioreactor setup for extreme halophiles was implemented as a novel application. Then, paying special attention to total bioprocess quantification approaches, an indirect method for biomass quantification using on-line process signals was introduced. Subsequently, robust quantitative data evaluation methods for halophiles could be developed, providing defined and controlled cultivation conditions in the bioreactor and therefore suitable quality of on-line as well as off-line datasets. On the other hand, new physiological results for extreme halophiles in the bioreactor were also obtained based on these quantitative methodological tools. For the first time, quantitative data on stoichiometry and kinetics were collected and evaluated on different carbon sources. The results on various substrates were interpreted, with proposed metabolic mechanisms, by linking them to the reported primary carbon metabolism of extreme halophilic Archaea. Moreover, results of chemostat cultures demonstrated that extreme halophilic organisms show Monod kinetics on different sole carbon sources. A diauxic growth pattern was described on a mixture of substrates in batch cultivations. In addition, the methodologies presented here enable one to characterize the utilized strain Haloferax mediterranei (HFX) as a potential new host organism. Thus, this study offers a strong methodological basis as well as a fundamental physiological assessment for bioreactor quantification of extreme halophiles that can serve as primary knowledge for applications of extreme halophiles in biotechnology. Copyright © 2013 Elsevier B.V. All rights reserved.
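The Monod kinetics mentioned above can be sketched in a few lines. The snippet evaluates the chemostat steady-state condition, where the dilution rate equals the specific growth rate and therefore fixes the residual substrate concentration; the kinetic parameters are placeholders, not measurements from the study.

import numpy as np

def monod_mu(S, mu_max, Ks):
    """Monod specific growth rate, mu = mu_max * S / (Ks + S)."""
    return mu_max * S / (Ks + S)

# Placeholder kinetic parameters for a sole carbon source (not from the study).
mu_max, Ks = 0.08, 0.5          # 1/h, g/L

# At chemostat steady state D = mu(S*), so S* = Ks * D / (mu_max - D).
for D in (0.02, 0.04, 0.06):
    S_star = Ks * D / (mu_max - D)
    print(f"D = {D:.2f} 1/h -> residual substrate {S_star:.2f} g/L, "
          f"mu(S*) = {monod_mu(S_star, mu_max, Ks):.2f} 1/h")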
Influence of extreme weather disasters on global crop production.
Lesk, Corey; Rowhani, Pedram; Ramankutty, Navin
2016-01-07
In recent years, several extreme weather disasters have partially or completely damaged regional crop production. While detailed regional accounts of the effects of extreme weather disasters exist, the global scale effects of droughts, floods and extreme temperature on crop production are yet to be quantified. Here we estimate for the first time, to our knowledge, national cereal production losses across the globe resulting from reported extreme weather disasters during 1964-2007. We show that droughts and extreme heat significantly reduced national cereal production by 9-10%, whereas our analysis could not identify an effect from floods and extreme cold in the national data. Analysing the underlying processes, we find that production losses due to droughts were associated with a reduction in both harvested area and yields, whereas extreme heat mainly decreased cereal yields. Furthermore, the results highlight ~7% greater production damage from more recent droughts and 8-11% more damage in developed countries than in developing ones. Our findings may help to guide agricultural priorities in international disaster risk reduction and adaptation efforts.
77 FR 51807 - Agency Forms Undergoing Paperwork Reduction Act Review
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-27
... Minimum Data Elements (MDEs) for the National Breast and Cervical Cancer Early Detection Program (NBCCEDP... screening and early detection tests for breast and cervical cancer. Mammography is extremely valuable as an early detection tool because it can detect breast cancer well before the woman can feel the lump, when...
Using Digital Storytelling to Improve Literacy Skills
ERIC Educational Resources Information Center
Menezes, Helena
2012-01-01
The paper shows the importance of Storybird, an online platform, for developing writing and storytelling among young learners of a foreign language. Storybird is an extremely engaging collaborative storywriting website that embodies three ideas--creating, reading, and sharing. It is also a collaborative storytelling tool that allows students to…
Literacy and Informational Interviews
ERIC Educational Resources Information Center
Decarie, Christina
2010-01-01
Informational interviews are valuable tools for improving writing, editing, and interviewing skills, and they are also extremely valuable in improving the soft skills that are valued by employers, such as confidence, adaptability, the ability to set and keep deadlines, the ability to manage risk, and so on. These soft skills, this article argues,…
The assessment of land use and land cover is an extremely important activity for contemporary land management. A large body of current literature suggests that human land-use practice is the most important factor influencing natural resource management and environmental condition...
The power of counselling--for better or worse?
Cohen, J
1994-04-01
Counselling is an extremely powerful tool, a vital skill to acquire in general practice today. Nevertheless, it must be used wisely and not prescribed wholesale. This article discusses counselling in general practice in relation to a series of cases where a presentation of nightmares disguised underlying repressed traumas.
Evaluation of climate variability on drought occurrence in an agricultural watershed
USDA-ARS?s Scientific Manuscript database
Changes in the future hydrologic cycle due to changes in precipitation and temperature are likely to be associated with increases in hydrologic extremes. This study evaluates the impacts of climate variability on drought using the Soil and Water Assessment Tool (SWAT) in the Goodwater Creek Experim...
Predicting drought in an agricultural watershed given climate variability
USDA-ARS?s Scientific Manuscript database
Changes in the future hydrologic cycle due to changes in temperature (T) and precipitation (P) are likely to be associated with increases in hydrologic extremes. This study evaluates the impacts of climate variability on drought using the Soil and Water Assessment Tool (SWAT) in Goodwater Creek Expe...
False-color near-infrared (CIR) aerial photography of seven Oregon estuaries was acquired at extreme low tides and digitally orthorectified with a ground pixel resolution of 25 cm to provide data for intertidal vegetation mapping. Exposed, semi-exposed and some submerged eelgras...
ERIC Educational Resources Information Center
Emery, Jill
2009-01-01
Twitter provides rapid information in a short form, and it is extremely easy to follow the updates of others because of myriad software applications with which it works on both mobile devices and traditional computing hardware. Currently, most academic librarians are using Twitter primarily as a tool at library conferences and seminars to capture…
Burken, J.G.; Vroblesky, D.A.; Balouet, J.-C.
2011-01-01
As plants evolved to be extremely proficient in mass transfer with their surroundings and to survive as Earth's dominant biomass, they also accumulate and store some contaminants from their surroundings, acting as passive samplers. Novel applications and analytical methods have been utilized to gain information about a wide range of contaminants in the biosphere (soil, water, and air), with information available on both the past (dendrochemistry) and the present (phytoscreening). Collectively these sampling approaches provide rapid, cheap, ecologically friendly, and overall "green" tools termed "Phytoforensics". © 2011 American Chemical Society.
Improved kinect-based spatiotemporal and kinematic treadmill gait assessment.
Eltoukhy, Moataz; Oh, Jeonghoon; Kuenze, Christopher; Signorile, Joseph
2017-01-01
A cost-effective, clinician-friendly gait assessment tool that can automatically track patients' anatomical landmarks can provide practitioners with important information that is useful in prescribing rehabilitative and preventive therapies. This study investigated the validity and reliability of the Microsoft Kinect v2 as a potential inexpensive gait analysis tool. Ten healthy subjects walked on a treadmill at 1.3 and 1.6 m·s⁻¹, as spatiotemporal parameters and kinematics were extracted concurrently using the Kinect and three-dimensional motion analysis. Spatiotemporal measures included step length and width, step and stride times, vertical and mediolateral pelvis motion, and foot swing velocity. Kinematic outcomes included hip, knee, and ankle joint angles in the sagittal plane. The absolute agreement and relative consistency between the two systems were assessed using intraclass correlation coefficients (ICC(2,1)), while reproducibility between systems was established using Lin's Concordance Correlation Coefficient (rc). Comparison of ensemble curves and associated 90% confidence intervals (CI90) of the hip, knee, and ankle joint angles was performed to investigate whether the Kinect sensor could consistently and accurately assess lower extremity joint motion throughout the gait cycle. Results showed that the Kinect v2 sensor has the potential to be an effective clinical assessment tool for sagittal plane knee and hip joint kinematics, as well as some spatiotemporal variables including pelvis displacement and step characteristics during the gait cycle. Copyright © 2016 Elsevier B.V. All rights reserved.
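As a pointer to the agreement statistic named above, the following is a minimal sketch of Lin's concordance correlation coefficient; the paired step-length arrays are hypothetical, not study data.

```python
# Minimal sketch of Lin's concordance correlation coefficient (rc), the
# agreement statistic named in the abstract; the two arrays below are
# hypothetical paired step-length measurements, not study data.
import numpy as np

def lins_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()           # population (biased) variances
    sxy = np.mean((x - mx) * (y - my))  # covariance
    return 2 * sxy / (vx + vy + (mx - my) ** 2)

kinect = [0.62, 0.58, 0.65, 0.60, 0.59, 0.63]   # hypothetical Kinect step lengths (m)
mocap  = [0.63, 0.57, 0.66, 0.61, 0.58, 0.64]   # hypothetical motion-capture values (m)
print(f"Lin's rc = {lins_ccc(kinect, mocap):.3f}")
```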
Togawa, Yoichiro; Nunoshiba, Tatsuo; Hiratsu, Keiichiro
2018-02-01
Markerless gene-disruption technology is particularly useful for effective genetic analyses of Thermus thermophilus (T. thermophilus), which has a limited number of selectable markers. In an attempt to develop a novel system for the markerless disruption of genes in T. thermophilus, we applied a Cre/lox system to construct a triple gene disruptant. To achieve this, we constructed two genetic tools, a loxP-htk-loxP cassette and a cre-expressing plasmid, pSH-Cre, for gene disruption and removal of the selectable marker by Cre-mediated recombination. We found that the Cre/lox system was compatible with the proliferation of the T. thermophilus HB27 strain at its lowest growth temperature (50 °C), and thus succeeded in establishing a triple gene disruptant, the (∆TTC1454::loxP, ∆TTC1535KpnI::loxP, ∆TTC1576::loxP) strain, without leaving behind a selectable marker. During the sequential disruption of multiple genes, we observed undesired deletions and inversions of the chromosomal region between multiple loxP sites induced by Cre-mediated recombination. Therefore, we examined the effects of a lox66-htk-lox71 cassette, exploiting the mutant lox sites lox66 and lox71 instead of native loxP sites. We successfully constructed a (∆TTC1535::lox72, ∆TTC1537::lox72) double gene disruptant without inducing the undesired deletion of the 0.7-kbp region between the two directly oriented lox72 sites created by the Cre-mediated recombination of the lox66-htk-lox71 cassette. This is the first demonstration that a Cre/lox system can be applied to the genetic manipulation of extreme thermophiles. Our results indicate that this system is a powerful tool for multiple markerless gene disruptions in T. thermophilus.
Evaluation of friction heating in cavitating high pressure Diesel injector nozzles
NASA Astrophysics Data System (ADS)
Salemi, R.; Koukouvinis, P.; Strotos, G.; McDavid, R.; Wang, Lifeng; Li, Jason; Marengo, M.; Gavaises, M.
2015-12-01
Variation of fuel properties occurring during extreme fuel pressurisation in Diesel fuel injectors, relative to those under atmospheric pressure and room temperature conditions, may significantly affect fuel delivery, fuel injection temperature, injector durability and thus engine performance. Indicative results of flow simulations during the full injection event of a Diesel injector are presented. In addition to the Navier-Stokes equations, the enthalpy conservation equation is considered for predicting the fuel temperature. Cavitation is simulated using an Eulerian-Lagrangian cavitation model fully coupled with the flow equations. Compressible bubble dynamics based on the Rayleigh-Plesset equation also account for thermal effects. Variable fuel properties, as functions of the local pressure and temperature, are taken from the literature and correspond to a reference so-called summer Diesel fuel. Fuel pressurisation up to 3000 bar is considered, while various wall temperature boundary conditions are tested in order to compare their effect relative to that of the fuel heating caused by the depressurisation of the fuel as it passes through the injection orifices. The results indicate the formation of strong temperature gradients inside the fuel injector, while heating resulting from the extreme friction may lead to local temperatures above the fuel's boiling point. Predictions indicate a bulk fuel temperature increase of more than 100°C during the opening phase of the needle valve. Overall, it is concluded that such effects are significant for injector performance and should be considered in relevant simulation tools.
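For orientation, the sketch below integrates a simplified, isothermal Rayleigh-Plesset equation with a constant far-field pressure; it is not the coupled Eulerian-Lagrangian, thermally resolved model of the paper, and all property values are generic placeholders rather than Diesel-fuel data.

```python
# Minimal sketch of Rayleigh-Plesset bubble dynamics (isothermal gas, constant
# far-field pressure), a simplified stand-in for the compressible, thermally
# coupled bubble model described in the abstract. All values are placeholders.
import numpy as np
from scipy.integrate import solve_ivp

rho   = 830.0      # liquid density, kg/m^3 (placeholder)
mu    = 2.0e-3     # dynamic viscosity, Pa*s (placeholder)
sigma = 0.025      # surface tension, N/m (placeholder)
p_inf = 5.0e4      # far-field pressure after depressurisation, Pa (placeholder)
p_v   = 2.0e3      # vapour pressure, Pa (placeholder)
R0    = 5.0e-6     # initial bubble radius, m
p0    = 1.0e5      # initial ambient pressure, Pa (placeholder)
p_g0  = p0 - p_v + 2 * sigma / R0   # initial gas pressure of an equilibrium bubble

def rayleigh_plesset(t, y):
    R, Rdot = y
    p_b = p_v + p_g0 * (R0 / R) ** 3            # isothermal gas law inside the bubble
    Rddot = ((p_b - p_inf - 2 * sigma / R - 4 * mu * Rdot / R) / rho
             - 1.5 * Rdot ** 2) / R
    return [Rdot, Rddot]

sol = solve_ivp(rayleigh_plesset, (0.0, 2.0e-5), [R0, 0.0],
                method="LSODA", max_step=1.0e-8)
print(f"max radius = {sol.y[0].max() * 1e6:.2f} um")
```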
Improving Ramsey spectroscopy in the extreme-ultraviolet region with a random-sampling approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eramo, R.; Bellini, M.; European Laboratory for Non-linear Spectroscopy
2011-04-15
Ramsey-like techniques, based on the coherent excitation of a sample by delayed and phase-correlated pulses, are promising tools for high-precision spectroscopic tests of QED in the extreme-ultraviolet (xuv) spectral region, but currently suffer experimental limitations related to long acquisition times and critical stability issues. Here we propose a random subsampling approach to Ramsey spectroscopy that, by allowing experimentalists to reach a given spectral resolution goal in a fraction of the usual acquisition time, leads to substantial improvements in high-resolution spectroscopy and may open the way to a widespread application of Ramsey-like techniques to precision measurements in the xuv spectral region.
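As a toy illustration of the subsampling idea (not the authors' xuv experiment), the sketch below recovers the frequency of a noisy fringe signal from a random 20% subset of the nominal delay grid using a Lomb-Scargle periodogram; all signal parameters are arbitrary.

```python
# Toy illustration of random subsampling: a sinusoidal "Ramsey fringe" sampled
# at a random 20% of the nominal delay grid still shows a clear peak at the
# true frequency in a Lomb-Scargle periodogram. Parameters are arbitrary.
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(0)
t_full = np.linspace(0.0, 100.0, 2000)          # nominal delay grid (arb. units)
f_true = 0.73                                   # fringe frequency (arb. units)
signal = np.cos(2 * np.pi * f_true * t_full) + 0.2 * rng.standard_normal(t_full.size)

keep = rng.choice(t_full.size, size=t_full.size // 5, replace=False)  # random 20%
t_sub, y_sub = t_full[keep], signal[keep]

freqs = np.linspace(0.01, 2.0, 4000)            # trial frequencies (cycles / unit)
power = lombscargle(t_sub, y_sub - y_sub.mean(), 2 * np.pi * freqs)
print(f"recovered frequency ~ {freqs[np.argmax(power)]:.3f} (true {f_true})")
```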
The prevalence of the Middle-Eastern extreme ideologies among some Canadians.
Loza, Wagdy
2011-05-01
A total of 183 Canadian participants of different religious backgrounds completed the Belief Diversity Scale (BDS). The BDS is an 80-item, 6-subscale instrument designed to quantitatively measure the religious attitudes, beliefs, and ideologies of Middle-Eastern extremists in risk areas reported in the literature. The results demonstrated the reliability and validity of the BDS and indicated the prevalence of Middle-Eastern extremists' ideologies among Muslim Canadians. Results were similar to those obtained from a similar study completed with South African participants. These findings suggest that the BDS has the potential to be used as an objective tool to measure Middle-Eastern religious extremism.
NASA Astrophysics Data System (ADS)
Aziz, Nur Liyana Afiqah Abdul; Siah Yap, Keem; Afif Bunyamin, Muhammad
2013-06-01
This paper presents a new approach to fault detection for improving the efficiency of the circulating water system (CWS) in a power generation plant, using a hybrid of a Fuzzy Logic System (FLS) and an Extreme Learning Machine (ELM) neural network. The FLS is a mathematical tool for handling the uncertainties of real-world problems where both precision and significance matter. It is based on natural language and has the ability of "computing with words". The ELM is an extremely fast learning algorithm for neural networks that can complete the training cycle in a very short time. By combining the FLS and the ELM, a new hybrid model, the FLS-ELM, is developed. The applicability of this proposed hybrid model is validated on fault detection in the CWS, which may help to improve the overall efficiency of the power generation plant and hence consume fewer natural resources and produce less pollution.
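A minimal sketch of the ELM component is given below: a random hidden layer followed by a single least-squares solve for the output weights. The two-class data are synthetic stand-ins for the CWS fault signals, and the fuzzy-logic front end of the FLS-ELM hybrid is not reproduced.

```python
# Minimal extreme learning machine (ELM) sketch: random hidden layer, then a
# least-squares solve for the output weights. Synthetic 2-class data stand in
# for the circulating-water-system fault signals.
import numpy as np

rng = np.random.default_rng(1)
n, d, hidden = 400, 6, 64

X = rng.standard_normal((n, d))
y = (X[:, 0] + 0.5 * X[:, 1] ** 2 > 0.5).astype(int)      # synthetic fault label
T = np.eye(2)[y]                                          # one-hot targets

W = rng.standard_normal((d, hidden))                      # random input weights
b = rng.standard_normal(hidden)                           # random biases
H = np.tanh(X @ W + b)                                    # hidden-layer outputs
beta = np.linalg.pinv(H) @ T                              # output weights (single solve)

pred = np.argmax(np.tanh(X @ W + b) @ beta, axis=1)
print(f"training accuracy: {np.mean(pred == y):.3f}")
```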
Classroom Live: a software-assisted gamification tool
NASA Astrophysics Data System (ADS)
de Freitas, Adrian A.; de Freitas, Michelle M.
2013-06-01
Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an aesthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.
2009-04-01
Over the last few decades negative trends in stratospheric ozone have been studied because of the direct link between decreasing stratospheric ozone and increasing surface UV radiation. Recently a discussion on ozone recovery has begun. Long-term measurements of total ozone extending back earlier than 1958 are limited and only available from a few stations in the northern hemisphere. The world's longest total ozone record is available from Arosa, Switzerland (Staehelin et al., 1998a,b). At this site total ozone measurements have been made from late 1926 through the present day. Within this study (Rieder et al., 2009) new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007) are applied to select mathematically well-defined thresholds for extreme low and extreme high total ozone. A heavy-tail focused approach is used by fitting the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a sufficiently high (or below a sufficiently low) threshold (Coles, 2001). More precisely, the GPD is the limiting distribution of normalized excesses over a threshold, as the threshold approaches the endpoint of the distribution. In practice, GPD parameters are fitted to exceedances by maximum likelihood or other methods, such as probability-weighted moments. A preliminary step consists in defining an appropriate threshold for which the asymptotic GPD approximation holds. Suitable tools for threshold selection, such as the MRL plot (mean residual life plot) and the TC plot (stability plot) from the POT package (Ribatet, 2007), are presented. The frequency distribution of extremes in low (termed ELOs) and high (termed EHOs) total ozone and their influence on the long-term changes in total ozone are analyzed. Further it is shown that from the GPD model the distribution of so-called ozone mini-holes (e.g. Bojkov and Balis, 2001) can be precisely estimated and that the "extremes concept" provides new information on the data distribution and variability within the Arosa record as well as on the influence of ELOs and EHOs on the long-term trends of the ozone time series. References: Bojkov, R. D., and Balis, D. S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Pickands, J.: Statistical inference using extreme order statistics, Ann. Stat., 3, 1, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H. E., Staehelin, J., Maeder, J. A., Stübi, R., Weihs, P., Holawe, F., and Ribatet, M.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non-stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
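The peaks-over-threshold step described above can be sketched as follows: choose a high threshold, fit a GPD to the exceedances by maximum likelihood, and inspect a simple mean-excess table as a threshold diagnostic. The data below are synthetic draws, not the Arosa record, and the diagnostics are far cruder than the POT-package plots cited in the abstract.

```python
# Minimal peaks-over-threshold sketch: pick a high threshold, fit a generalized
# Pareto distribution (GPD) to the exceedances by maximum likelihood, and print
# a crude mean-residual-life table as a threshold diagnostic. Synthetic data.
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(2)
data = rng.gumbel(loc=330.0, scale=25.0, size=5000)   # synthetic "total ozone" (DU)

u = np.quantile(data, 0.95)                           # candidate high threshold
excess = data[data > u] - u
shape, loc, scale = genpareto.fit(excess, floc=0.0)   # MLE fit, location fixed at 0
print(f"threshold u={u:.1f} DU, GPD shape={shape:.3f}, scale={scale:.2f}")

# mean excess should be roughly linear in u where the GPD approximation holds
for q in (0.90, 0.925, 0.95, 0.975):
    uq = np.quantile(data, q)
    print(f"u={uq:6.1f}  mean excess={np.mean(data[data > uq] - uq):6.2f}")
```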
Developmental Testing of Habitability and Human Factors Tools and Methods During Neemo 15
NASA Technical Reports Server (NTRS)
Thaxton, S. S.; Litaker, H. L., Jr.; Holden, K. L.; Adolf, J. A.; Pace, J.; Morency, R. M.
2011-01-01
Currently, no established methods exist to collect real-time human factors and habitability data while crewmembers are living aboard the International Space Station (ISS), traveling aboard other space vehicles, or living in remote habitats. Human factors and habitability data regarding space vehicles and habitats are instead acquired at the end of missions during postflight crew debriefs. These debriefs occur weeks or often longer after events have occurred, which forces a significant reliance on incomplete and imperfect human memory. Without a means to collect real-time data, small issues may have a cumulative effect and continue to cause crew frustration and inefficiencies, and without timely and appropriate reporting methodologies, issues may be repeated or lost. TOOL DEVELOPMENT AND EVALUATION: As part of a directed research project (DRP) aiming to develop and validate tools and methods for collecting near real-time human factors and habitability data, a preliminary set of tools and methods was developed. These tools and methods were evaluated during the NASA Extreme Environments Mission Operations (NEEMO) 15 mission in October 2011. Two versions of a software tool were used to collect observational data from NEEMO crewmembers, together with targeted strategies for using video cameras to collect observations. The Space Habitability Observation Reporting Tool (SHORT) was created based on a tool previously developed by NASA to capture human factors and habitability issues during spaceflight. SHORT uses a web-based interface that allows users to enter a text description of any observations they wish to report and to assign a priority level if changes are needed. In addition to the web-based format, a mobile Apple (iOS) format was implemented, referred to as iSHORT. iSHORT allows users to provide text, audio, photograph, and video data to report observations. iSHORT can be deployed on an iPod Touch, iPhone, or iPad; for NEEMO 15, the app was provided on an iPad 2.
Web-based Tool Suite for Plasmasphere Information Discovery
NASA Astrophysics Data System (ADS)
Newman, T. S.; Wang, C.; Gallagher, D. L.
2005-12-01
A suite of tools that enables discovery of terrestrial plasmasphere characteristics from NASA IMAGE Extreme Ultraviolet (EUV) images is described. The tool suite is web-accessible, allowing easy remote access without the need for any software installation on the user's computer. The features supported by the tool suite include reconstruction of the plasmasphere plasma density distribution from a short sequence of EUV images, semi-automated selection of the plasmapause boundary in an EUV image, and mapping of the selected boundary to the geomagnetic equatorial plane. EUV image upload and result download are also supported. The tool suite's plasmapause mapping feature is achieved via the Roelof and Skinner (2000) Edge Algorithm. The plasma density reconstruction is achieved through a tomographic technique that exploits physical constraints to allow for a moderate-resolution result. The tool suite's software architecture uses Java Server Pages (JSP) and Java applets on the front end for user-software interaction and Java servlets on the server side for task execution. The compute-intensive components of the tool suite are implemented in C++ and invoked by the server via the Java Native Interface (JNI).
A Job Monitoring and Accounting Tool for the LSF Batch System
NASA Astrophysics Data System (ADS)
Sarkar, Subir; Taneja, Sonia
2011-12-01
This paper presents a web-based job monitoring and group-and-user accounting tool for the LSF Batch System. The user-oriented job monitoring displays a simple and compact quasi-real-time overview of the batch farm for both local and Grid jobs. For Grid jobs the Distinguished Name (DN) of the Grid user is shown. The overview monitor provides the most up-to-date status of a batch farm at any time. The accounting tool works with the LSF accounting log files. The accounting information is shown for a few pre-defined time periods by default; however, one can also compute the same information for any arbitrary time window. The tool has already proved to be an extremely useful means of validating the more extensive accounting tools available in the Grid world. Several sites are already using the present tool and more sites running the LSF batch system have shown interest. We shall discuss the various aspects that make the tool essential for site administrators and end-users alike and outline the current status of development as well as future plans.
Onozuka, Daisuke; Hagihara, Akihito
2015-07-01
Although the impact of extreme heat and cold on mortality has been documented in recent years, few studies have investigated whether variation in susceptibility to extreme temperatures has changed in Japan. We used data on daily total mortality and mean temperatures in Fukuoka, Japan, for 1973-2012. We used time-series analysis to assess the effects of extreme high and low temperatures on all-cause mortality, stratified by decade, gender, and age, adjusting for time trends. We used a multivariate meta-analysis with a distributed lag non-linear model to estimate pooled non-linear lag-response relationships between extreme temperatures and mortality. The relative risk of mortality increased during heat extremes in all decades, with a declining trend over time. The mortality risk was higher during cold extremes for the entire study period, with a dispersed pattern across decades. Meta-analysis showed that both heat and cold extremes increased the risk of mortality. Cold effects were delayed and lasted for several days, whereas heat effects appeared quickly and did not last long. Our study provides quantitative evidence that extreme high and low temperatures were significantly and non-linearly associated with an increased risk of mortality, with substantial variation over time. Our results suggest that timely preventive measures are important for extreme high temperatures, whereas several days of protection should be provided for extreme low temperatures. Copyright © 2015 Elsevier Inc. All rights reserved.
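As a much-simplified stand-in for the distributed lag non-linear model and meta-analysis used in the study, the sketch below fits a linear distributed-lag Poisson regression of daily deaths on current and lagged temperature; the daily series are simulated, not the Fukuoka data.

```python
# Simplified distributed-lag sketch: Poisson regression of daily deaths on
# current and lagged temperature. A linear stand-in for the distributed lag
# non-linear model of the study; the series below are simulated.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
days = 2000
temp = 16 + 10 * np.sin(2 * np.pi * np.arange(days) / 365.25) + rng.normal(0, 2, days)
# simulate deaths with a same-day heat effect and a lag-1 cold effect
lam = np.exp(2.5 + 0.01 * np.maximum(temp - 27, 0)
             + 0.02 * np.maximum(10 - np.roll(temp, 1), 0))
deaths = rng.poisson(lam)

max_lag = 3
lags = np.column_stack([np.roll(temp, k) for k in range(max_lag + 1)])[max_lag:]
X = sm.add_constant(lags)                      # columns: const, lag0..lag3 temperature
fit = sm.GLM(deaths[max_lag:], X, family=sm.families.Poisson()).fit()
print(fit.params)                              # log-rate change per degree at each lag
```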
Ni, Qian Qian; Tang, Chun Xiang; Zhao, Yan E; Zhou, Chang Sheng; Chen, Guo Zhong; Lu, Guang Ming; Zhang, Long Jiang
2016-05-25
Aneurysmal subarachnoid hemorrhage has an extremely high case fatality in clinical practice, so early and rapid identification of ruptured intracranial aneurysms is especially important. Here we evaluate the clinical value of single-phase contrast-enhanced dual-energy CT angiography (DE-CTA) as a one-stop-shop tool for detecting aneurysmal subarachnoid hemorrhage. One hundred and five patients who underwent true non-enhanced CT (TNCT), contrast-enhanced DE-CTA and digital subtraction angiography (DSA) were included. Image quality and detectability of intracranial hemorrhage were evaluated and compared between virtual non-enhanced CT (VNCT) images reconstructed from DE-CTA and TNCT. There was no statistical difference in image quality (P > 0.05) between VNCT and TNCT. The agreement between VNCT and TNCT in detecting intracranial hemorrhage reached 98.1% on a per-patient basis. With DSA as the reference standard, per-patient sensitivity and specificity were 98.3% and 97.9% for DE-CTA in intracranial aneurysm detection. The effective dose of DE-CTA was reduced by 75.0% compared to conventional digital subtraction CTA. Thus, single-phase contrast-enhanced DE-CTA is an optimal and reliable one-stop-shop tool for detecting intracranial hemorrhage with VNCT and intracranial aneurysms with DE-CTA, with a substantial radiation dose reduction compared with conventional digital subtraction CTA.
Enabling Efficient Climate Science Workflows in High Performance Computing Environments
NASA Astrophysics Data System (ADS)
Krishnan, H.; Byna, S.; Wehner, M. F.; Gu, J.; O'Brien, T. A.; Loring, B.; Stone, D. A.; Collins, W.; Prabhat, M.; Liu, Y.; Johnson, J. N.; Paciorek, C. J.
2015-12-01
A typical climate science workflow often involves a combination of acquisition of data, modeling, simulation, analysis, visualization, publishing, and storage of results. Each of these tasks provides a myriad of challenges when running in a high performance computing environment such as Hopper or Edison at NERSC. Hurdles such as data transfer and management, job scheduling, parallel analysis routines, and publication require a lot of forethought and planning to ensure that proper quality control mechanisms are in place. These steps require effectively utilizing a combination of well-tested and newly developed functionality to move data, perform analysis, apply statistical routines, and finally, serve results and tools to the greater scientific community. As part of the CAlibrated and Systematic Characterization, Attribution and Detection of Extremes (CASCADE) project we highlight a stack of tools our team utilizes and has developed to make large-scale simulation and analysis work commonplace, providing operations that assist in everything from generation/procurement of data (HTAR/Globus) to automating publication of results to portals like the Earth System Grid Federation (ESGF), all while executing everything in between in a scalable environment in a task-parallel way (MPI). We highlight the use and benefit of these tools by showing several climate science analysis use cases to which they have been applied.
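The task-parallel (MPI) pattern mentioned above might look like the following minimal mpi4py sketch, in which rank 0 scatters a list of hypothetical file names and each rank analyzes its share; this is an illustration of the pattern, not the actual CASCADE tool stack.

```python
# Minimal task-parallel sketch with mpi4py: rank 0 scatters a list of
# (hypothetical) input file names and each rank processes its share
# independently. Run with e.g.: mpirun -n 4 python analyze_chunks.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:
    files = [f"sim_output_{i:03d}.nc" for i in range(32)]   # hypothetical paths
    chunks = [files[i::size] for i in range(size)]          # round-robin split
else:
    chunks = None

my_files = comm.scatter(chunks, root=0)

def analyze(path):
    # placeholder for a per-file statistic (e.g. a seasonal extreme index)
    return len(path)

local = [analyze(f) for f in my_files]
all_results = comm.gather(local, root=0)
if rank == 0:
    flat = [r for part in all_results for r in part]
    print(f"processed {len(flat)} files across {size} ranks")
```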
Extreme events in total ozone over Arosa - Part 1: Application of extreme value theory
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Peter, T.; Ribatet, M.; Davison, A. C.; Stübi, R.; Weihs, P.; Holawe, F.
2010-10-01
In this study ideas from extreme value theory are for the first time applied in the field of stratospheric ozone research, because statistical analysis showed that previously used concepts assuming a Gaussian distribution (e.g. fixed deviations from mean values) of total ozone data do not adequately address the structure of the extremes. We show that statistical extreme value methods are appropriate to identify ozone extremes and to describe the tails of the Arosa (Switzerland) total ozone time series. In order to accommodate the seasonal cycle in total ozone, a daily moving threshold was determined and used, with tools from extreme value theory, to analyse the frequency of days with extreme low (termed ELOs) and high (termed EHOs) total ozone at Arosa. The analysis shows that the Generalized Pareto Distribution (GPD) provides an appropriate model for the frequency distribution of total ozone above or below a mathematically well-defined threshold, thus providing a statistical description of ELOs and EHOs. The results show an increase in ELOs and a decrease in EHOs during the last decades. The fitted model represents the tails of the total ozone data set with high accuracy over the entire range (including absolute monthly minima and maxima), and enables a precise computation of the frequency distribution of ozone mini-holes (using constant thresholds). Analyzing the tails instead of a small fraction of days below constant thresholds provides deeper insight into the time series properties. Fingerprints of dynamical (e.g. ENSO, NAO) and chemical features (e.g. strong polar vortex ozone loss), and major volcanic eruptions, can be identified in the observed frequency of extreme events throughout the time series. Overall the new approach to analysis of extremes provides more information on time series properties and variability than previous approaches that use only monthly averages and/or mini-holes and mini-highs.
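A minimal sketch of the seasonally moving threshold idea is given below: for each day of year, high and low quantiles are taken over a window of days across all years, and exceedances above/below them mark EHO/ELO candidate days. The series is synthetic and the paper's exact threshold construction may differ.

```python
# Minimal sketch of a seasonally moving threshold: for each day of year, take
# high and low quantiles of the values in a window around that day, so that
# exceedances above/below define EHO/ELO candidate days. Synthetic data.
import numpy as np

rng = np.random.default_rng(4)
n_years, doy = 40, np.arange(365)
ozone = (330 + 40 * np.cos(2 * np.pi * (doy - 30) / 365)           # seasonal cycle
         + rng.normal(0, 20, (n_years, 365))).ravel()               # + daily noise
day = np.tile(doy, n_years)

half_win = 15
hi_thr, lo_thr = np.empty(365), np.empty(365)
for d in range(365):
    window = np.abs((day - d + 182) % 365 - 182) <= half_win        # circular window
    hi_thr[d] = np.quantile(ozone[window], 0.99)
    lo_thr[d] = np.quantile(ozone[window], 0.01)

ehos = ozone > hi_thr[day]
elos = ozone < lo_thr[day]
print(f"EHO days: {ehos.sum()}, ELO days: {elos.sum()} out of {ozone.size}")
```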
NASA Astrophysics Data System (ADS)
Rajib, M. A.; Merwade, V.; Zhao, L.; Song, C.
2014-12-01
Explaining the complex cause-and-effect relationships in the hydrologic cycle can be challenging in a classroom using traditional teaching approaches. With the availability of observed rainfall, streamflow and other hydrology data on the internet, it is possible to provide students with the tools to explore these relationships and enhance their learning experience. From this perspective, a new online educational tool, called RWater, has been developed using Purdue University's HUBzero technology. RWater's unique features include: (i) accessibility, including the R software, from any Java-supported web browser; (ii) no installation of any software on the user's computer; (iii) storage of all work and resulting data in the user's working directory on the RWater server; and (iv) no need for prior programming experience with the R software. In its current version, RWater can dynamically extract streamflow data from any USGS gaging station, without any need for post-processing, for use in the educational modules. By following data-driven modules, students can write small scripts in R and thereby create visualizations to identify the effect of rainfall distribution and watershed characteristics on runoff generation, investigate the impacts of land use and climate change on streamflow, and explore changes in extreme hydrologic events at actual locations. Each module contains relevant definitions, instructions on data extraction and coding, and conceptual questions based on the analyses the students perform. In order to assess its suitability for classroom implementation, and to evaluate users' perception of its utility, the current version of RWater has been tested with three different groups: (i) high school students; (ii) middle and high school teachers; and (iii) upper undergraduate/graduate students. The survey results from these trials suggest that RWater has the potential to improve students' understanding of various relationships in the hydrologic cycle, supporting effective dissemination of hydrology education from K-12 to the graduate level. RWater is publicly available at: https://mygeohub.org/tools/rwater.
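The kind of exercise such a module describes might look like the sketch below, which extracts annual peak flows and threshold-exceedance counts from a daily streamflow series; the series is synthetic, and RWater itself works in R against real USGS records rather than in Python.

```python
# Illustrative sketch of a student-style streamflow analysis: from a daily
# discharge series, extract annual maxima and count days above a high "flood"
# threshold. The series is synthetic; RWater pulls real USGS records and uses R.
import numpy as np

rng = np.random.default_rng(5)
years = np.repeat(np.arange(1990, 2020), 365)
flow = rng.lognormal(mean=3.0, sigma=0.8, size=years.size)   # synthetic daily flow (cfs)

annual_max = {yr: flow[years == yr].max() for yr in np.unique(years)}
threshold = np.quantile(flow, 0.99)
exceed_days = {yr: int((flow[years == yr] > threshold).sum()) for yr in np.unique(years)}

for yr in list(annual_max)[:5]:
    print(f"{yr}: peak={annual_max[yr]:7.1f} cfs, days above {threshold:.0f} cfs: {exceed_days[yr]}")
```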
NASA Astrophysics Data System (ADS)
Field, C. B.; Stocker, T. F.; Barros, V. R.; Qin, D.; Ebi, K. L.; Midgley, P. M.
2011-12-01
The Summary for Policy Makers of the IPCC Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation will be approved by the world governments in November 2011. The focus of the Special Report is on climate change and its role in altering the frequency, severity, and impact of extreme events or disasters, and on the costs of both impacts and the actions taken to prepare for, respond to, and recover from extreme events and disasters. The emphasis is on understanding the factors that make people and infrastructure vulnerable to extreme events, on recent and future changes in the relationship between climate change and extremes, and on managing the risks of disasters over a wide range of spatial and temporal scales. The assessment considers a broad suite of adaptations and explores the limits to adaptation. The assessment was designed to build durable links and foundations for partnerships between the stakeholder communities focused on climate change and those focused on disaster risk reduction. The Special Report begins with material that frames the issues, followed by an assessment of the reasons that communities are vulnerable. Two chapters assess the role of past and future climate change in altering extremes and the impact of these on the physical environment and human systems. Three chapters assess available knowledge on impacts and adaptation, with separate chapters considering the literature, stakeholder relationships, and potential policy tools relevant to the local, national, and international scales. Longer-term components of adaptation to weather and climate extremes and disasters are assessed in the context of moving toward sustainability. The final chapter provides case studies that integrate themes across several chapters or are so unique that they need to be considered separately.
Lowes, Linda P; Alfano, Lindsay N; Yetter, Brent A; Worthen-Chaudhari, Lise; Hinchman, William; Savage, Jordan; Samona, Patrick; Flanigan, Kevin M; Mendell, Jerry R
2013-03-14
Individuals with dystrophinopathy lose upper extremity strength in proximal muscles, followed by those more distal. Current upper extremity evaluation tools fail to fully capture changes in upper extremity strength and function across the disease spectrum, as they tend to focus solely on distal ability. The Kinect by Microsoft is a gaming interface that can gather positional information about an individual's upper extremity movement, which can be used to determine functional reaching volume, velocity of movement, and rate of fatigue while the individual plays an engaging video game. The purpose of this study was to determine the feasibility of using the Kinect platform to assess upper extremity function in individuals with dystrophinopathy across the spectrum of abilities. Investigators developed a proof-of-concept device, ACTIVE (Abilities Captured Through Interactive Video Evaluation), to measure functional reaching volume, movement velocity, and rate of fatigue. Five subjects with dystrophinopathy and five normal controls were tested using ACTIVE during one testing session. A single subject with dystrophinopathy was simultaneously tested with ACTIVE and a marker-based motion analysis system to establish preliminary validity of the measurements. The ACTIVE proof-of-concept ranked the upper extremity abilities of subjects with dystrophinopathy by Brooke score, and also differentiated them from the performance of normal controls on the functional reaching volume and velocity tests. Preliminary test-retest reliability of ACTIVE over two sequential trials was excellent for functional reaching volume (ICC=0.986, p<0.001) and velocity trials (ICC=0.963, p<0.001). The data from our pilot study with the ACTIVE proof-of-concept demonstrate that newly available gaming technology has the potential to be used to create a low-cost, widely accessible and functional upper extremity outcome measure for use with children and adults with dystrophinopathy.
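One plausible way to compute a "functional reaching volume" is as the volume of the convex hull of tracked 3-D hand positions, as sketched below; the point cloud is random, and ACTIVE's actual metric definition and Kinect pipeline may differ.

```python
# Minimal sketch of a "functional reaching volume" metric: the volume of the
# convex hull of tracked 3-D hand positions. The point cloud below is random;
# ACTIVE's exact metric and the Kinect data pipeline are not reproduced here.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(6)
# hypothetical hand positions (metres) sampled while the subject reaches around
hand_xyz = rng.normal(loc=[0.0, 0.4, 1.2], scale=[0.25, 0.20, 0.15], size=(500, 3))

hull = ConvexHull(hand_xyz)
print(f"reaching volume ~ {hull.volume:.3f} m^3, surface area ~ {hull.area:.3f} m^2")
```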
NASA Astrophysics Data System (ADS)
Jiménez-Forteza, Xisco; Keitel, David; Husa, Sascha; Hannam, Mark; Khan, Sebastian; Pürrer, Michael
2017-03-01
Numerical relativity is an essential tool in studying the coalescence of binary black holes (BBHs). It is still computationally prohibitive to cover the BBH parameter space exhaustively, making phenomenological fitting formulas for BBH waveforms and final-state properties important for practical applications. We describe a general hierarchical bottom-up fitting methodology to design and calibrate fits to numerical relativity simulations for the three-dimensional parameter space of quasicircular nonprecessing merging BBHs, spanned by mass ratio and by the individual spin components orthogonal to the orbital plane. Particular attention is paid to incorporating the extreme-mass-ratio limit and to the subdominant unequal-spin effects. As an illustration of the method, we provide two applications, to the final spin and final mass (or equivalently: radiated energy) of the remnant black hole. Fitting to 427 numerical relativity simulations, we obtain results broadly consistent with previously published fits, but improving in overall accuracy and particularly in the approach to extremal limits and for unequal-spin configurations. We also discuss the importance of data quality studies when combining simulations from diverse sources, how detailed error budgets will be necessary for further improvements of these already highly accurate fits, and how this first detailed study of unequal-spin effects helps in choosing the most informative parameters for future numerical relativity runs.
Effect of Using Extreme Years in Hydrologic Model Calibration Performance
NASA Astrophysics Data System (ADS)
Goktas, R. K.; Tezel, U.; Kargi, P. G.; Ayvaz, T.; Tezyapar, I.; Mesta, B.; Kentel, E.
2017-12-01
Hydrological models are useful in predicting system behaviour and developing management strategies to control it. Specifically, they can be used to evaluate streamflow at ungaged catchments, the effects of climate change or best management practices on water resources, or to identify pollution sources in a watershed. This study is part of a TUBITAK project named "Development of a geographical information system based decision-making tool for water quality management of Ergene Watershed using pollutant fingerprints". Within the scope of this project, the water resources of the Ergene Watershed are studied first. Streamgages in the basin are identified and daily streamflow measurements are obtained from the State Hydraulic Works of Turkey. The streamflow data are analysed using box-whisker plots, hydrographs and flow-duration curves, focusing on the identification of dry or wet extreme periods. A hydrological model is then developed for the Ergene Watershed using HEC-HMS in the Watershed Modeling System (WMS) environment. The model is calibrated for various time periods, including dry and wet ones, and the calibration performance is evaluated using the Nash-Sutcliffe Efficiency (NSE), correlation coefficient, percent bias (PBIAS) and root mean square error. It is observed that the calibration period affects model performance, and that the main purpose of developing the hydrological model should guide the selection of the calibration period. Acknowledgement: This study is funded by The Scientific and Technological Research Council of Turkey (TUBITAK) under Project Number 115Y064.
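The calibration metrics named above can be computed as in the following sketch; the observed and simulated flow arrays are placeholders, not Ergene Watershed data.

```python
# Minimal sketch of the calibration metrics named in the abstract:
# Nash-Sutcliffe efficiency (NSE), percent bias (PBIAS) and RMSE.
import numpy as np

def nse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def pbias(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)   # positive = underestimation

def rmse(obs, sim):
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return float(np.sqrt(np.mean((obs - sim) ** 2)))

obs = [12.1, 8.4, 15.3, 30.2, 22.7, 10.9]   # hypothetical daily flows (m^3/s)
sim = [11.4, 9.0, 14.1, 27.8, 24.5, 12.0]
print(f"NSE={nse(obs, sim):.3f}  PBIAS={pbias(obs, sim):+.1f}%  RMSE={rmse(obs, sim):.2f}")
```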
NASA Astrophysics Data System (ADS)
Pang, Linyong; Hu, Peter; Satake, Masaki; Tolani, Vikram; Peng, Danping; Li, Ying; Chen, Dongxue
2011-11-01
According to the ITRS roadmap, mask defects are among the top technical challenges to introducing extreme ultraviolet (EUV) lithography into production. Making a multilayer defect-free EUV blank is not possible today, and is unlikely to happen in the next few years. This means that EUV must work with multilayer defects present on the mask. The method proposed by Luminescent is to compensate for the effects of multilayer defects on images by modifying the absorber patterns. The effect of a multilayer defect is to distort the images of adjacent absorber patterns. Although the defect cannot be repaired, the images may be restored to their desired targets by changing the absorber patterns. This method was first introduced in our paper at BACUS 2010, which described a simple pixel-based compensation algorithm using a fast multilayer model. The fast model made it possible to complete the compensation calculations in seconds, instead of the days or weeks required for rigorous finite-difference time-domain (FDTD) simulations. Our SPIE 2011 paper introduced an advanced compensation algorithm using the level set method for 2D absorber patterns. In this paper the method is extended to consider the process window and to allow repair-tool constraints, such as permitting etching but not deposition. The multilayer defect growth model is also enhanced so that the multilayer defect can be "inverted", i.e., recovered from the top-layer profile using a calibrated model.
Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Zhao, Xue-Hua
2014-01-01
A novel hybrid method named SCFW-KELM, which integrates an effective subtractive clustering features weighting (SCFW) scheme with a fast kernel-based extreme learning machine (KELM) classifier, is introduced for the diagnosis of Parkinson's disease (PD). In the proposed method, SCFW is used as a data preprocessing tool that aims to decrease the variance in features of the PD dataset, in order to further improve the diagnostic accuracy of the KELM classifier. The impact of the type of kernel function on the performance of KELM is investigated in detail. The efficiency and effectiveness of the proposed method are rigorously evaluated on the PD dataset in terms of classification accuracy, sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (AUC), f-measure, and kappa statistic. Experimental results demonstrate that the proposed SCFW-KELM significantly outperforms SVM-based, KNN-based, and ELM-based approaches and other methods in the literature, and achieves the highest classification results reported so far under a 10-fold cross-validation scheme, with a classification accuracy of 99.49%, sensitivity of 100%, specificity of 99.39%, AUC of 99.69%, f-measure of 0.9964, and kappa value of 0.9867. The proposed method might thus serve as a promising new candidate for the diagnosis of PD.
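A minimal kernel ELM sketch is shown below: with an RBF kernel, the output weights come from a single regularized linear solve, alpha = (K + I/C)^(-1)·T. The two-class data are synthetic stand-ins for the PD feature set, and the subtractive-clustering feature-weighting step is not reproduced.

```python
# Minimal kernel extreme learning machine (KELM) sketch with an RBF kernel:
# output weights from a single regularized linear solve. Synthetic data stand
# in for the PD feature set; the SCFW preprocessing step is omitted.
import numpy as np

rng = np.random.default_rng(7)
n, d, C, gamma = 300, 10, 10.0, 0.1

X = rng.standard_normal((n, d))
y = (X[:, :3].sum(axis=1) > 0).astype(int)
T = np.eye(2)[y]                                     # one-hot targets

def rbf(A, B):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

K = rbf(X, X)
alpha = np.linalg.solve(K + np.eye(n) / C, T)        # (K + I/C)^-1 T
pred = np.argmax(rbf(X, X) @ alpha, axis=1)          # in-sample prediction
print(f"training accuracy: {np.mean(pred == y):.3f}")
```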
K-wire assisted split-thickness skin graft harvesting from the anterior trunk.
Yontar, Yalcin; Coruh, Atilla; Severcan, Mehmet
2016-02-01
Split-thickness skin graft (STSG) harvesting from the anterior chest and abdominal wall skin is quite a difficult process. The main reason for this difficulty is the unsuitable anatomic characteristics of the anterior trunk, such as the irregular, wavy surface over the ribs and the lax abdominal wall skin, which collapses under the downward force applied by the skin graft dermatome because it lacks adequate underlying support. The lower extremity, and especially the thigh, is generally chosen as the donor site from which STSGs are easily harvested. However, extensive lower extremity burns, with or without burns of other regions, preclude harvesting autologous STSGs from this invaluable anatomic site. We harvested K-wire assisted STSGs from the anterior chest and abdominal wall skin of 7 patients with lower extremity burns and one patient who had sustained a motor vehicle collision. We encountered no problems in any of our patients, either intra- or postoperatively, using K-wire assisted STSG harvesting. All of the STSG donor sites healed uneventfully without complications. In our opinion, K-wire assisted STSG harvesting should always be in the toolbox of any surgeon who deals with extensive burns, with or without lower extremity involvement, or with extensive trauma of the lower extremities. Copyright © 2015 Elsevier Ltd and ISBI. All rights reserved.
Quintela-del-Río, Alejandro; Francisco-Fernández, Mario
2011-02-01
The study of extreme values and prediction of ozone data is an important topic of research when dealing with environmental problems. Classical extreme value theory is usually used in air-pollution studies. It consists in fitting a parametric generalised extreme value (GEV) distribution to a data set of extreme values, and using the estimated distribution to compute return levels and other quantities of interest. Here, we propose to estimate these values using nonparametric functional data methods. Functional data analysis is a relatively new statistical methodology that generally deals with data consisting of curves or multi-dimensional variables. In this paper, we use this technique, jointly with nonparametric curve estimation, to provide alternatives to the usual parametric statistical tools. The nonparametric estimators are applied to real samples of maximum ozone values obtained from several monitoring stations belonging to the Automatic Urban and Rural Network (AURN) in the UK. The results show that nonparametric estimators work satisfactorily, outperforming the behaviour of classical parametric estimators. Functional data analysis is also used to predict stratospheric ozone concentrations. We show an application, using the data set of mean monthly ozone concentrations in Arosa, Switzerland, and the results are compared with those obtained by classical time series (ARIMA) analysis. Copyright © 2010 Elsevier Ltd. All rights reserved.
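For reference, the classical parametric approach the paper compares against can be sketched as fitting a GEV distribution to annual maxima and reading off return levels as high quantiles; the maxima below are synthetic, not AURN or Arosa data.

```python
# Minimal sketch of the classical parametric approach: fit a generalized
# extreme value (GEV) distribution to annual maxima and read off return levels
# as high quantiles. The maxima are synthetic, not real ozone data.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(8)
annual_max = rng.gumbel(loc=90.0, scale=12.0, size=40)   # synthetic yearly maxima

shape, loc, scale = genextreme.fit(annual_max)
for T in (10, 50, 100):
    z = genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
    print(f"{T:4d}-year return level ~ {z:.1f}")
```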
NASA Astrophysics Data System (ADS)
Rieder, Harald E.; Jancso, Leonhardt M.; Staehelin, Johannes; Maeder, Jörg A.; Ribatet, Mathieu; Peter, Thomas; Davison, Anthony C.
2010-05-01
In this study we analyze the frequency distribution of extreme events in low and high total ozone (termed ELOs and EHOs) for 5 long-term stations in the northern mid-latitudes in Europe (Belsk, Poland; Hradec Kralove, Czech Republic; Hohenpeissenberg and Potsdam, Germany; and Uccle, Belgium). Further, the influence of these extreme events on annual and seasonal mean values and trends is analysed. The applied method follows the new "ozone extreme concept", which is based on tools from extreme value theory [Coles, 2001; Ribatet, 2007], recently developed by Rieder et al. [2010a, b]. Mathematically, the decisive feature within the extremes concept is the Generalized Pareto Distribution (GPD). In this analysis the long-term trends needed to be removed first, in contrast to the treatment of Rieder et al. [2010a, b], in which the time series of Arosa, covering many decades of measurements in the anthropogenically undisturbed stratosphere, was analysed. In contrast to previous studies focusing only on so-called ozone mini-holes and mini-highs, the "ozone extreme concept" provides a statistical description of the tails of total ozone distributions (i.e. extreme low and high values). It is shown that this concept is not only an appropriate method to describe the frequency and distribution of extreme events, but also provides new information on time series properties and internal variability. Furthermore, it allows the detection of fingerprints of physical (e.g. El Niño, NAO) and chemical (e.g. polar vortex ozone loss) features in the Earth's atmosphere as well as of major volcanic eruptions (e.g. El Chichón, Mt. Pinatubo). It is shown that mean values and trends in total ozone are strongly influenced by extreme events. Trend calculations (for the period 1970-1990) are performed for the entire as well as the extremes-removed time series. The results after excluding extremes show that annual trends are most reduced at Hradec Kralove (by about a factor of 3), followed by Potsdam (a factor of 2.5), and Hohenpeissenberg and Belsk (both by about a factor of 2). In general the reduction of the trend is strongest during winter and spring. At all stations the influence of ELOs on observed trends is larger than that of EHOs. Especially from the 1990s on, ELOs dominate the picture, as only a relatively small fraction of EHOs can be observed in the records (due to the strong influence of the Mt. Pinatubo eruption and polar vortex ozone loss contributions). Additionally, it is shown that the number of observed mini-holes can be estimated highly accurately with the GPD model. Overall the results of this thesis show that extreme events play a major role in total ozone and that the "ozone extremes concept" provides deeper insight into the influence of chemical and physical features on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H. E., Staehelin, J., Maeder, J. A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A. C. (2010): Extreme events in total ozone over Arosa - Part I: Application of extreme value theory, to be submitted to ACPD.
Rieder, H. E., Staehelin, J., Maeder, J. A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A. C. (2010): Extreme events in total ozone over Arosa - Part II: Fingerprints of atmospheric dynamics and chemistry and effects on mean values and long-term changes, to be submitted to ACPD.
NASA Astrophysics Data System (ADS)
Bradshaw, S. J.
2009-07-01
Context: The effects of non-equilibrium processes on the ionisation state of strongly emitting elements in the solar corona can be extremely difficult to assess and yet they are critically important. For example, there is much interest in dynamic heating events localised in the solar corona because they are believed to be responsible for its high temperature, and yet recent work has shown that the hottest (≥10^7 K) emission predicted to be associated with these events can be observationally elusive due to the difficulty of creating the highly ionised states from which the expected emission arises. This leads to the possibility of observing instruments missing such heating events entirely. Aims: The equations describing the evolution of the ionisation state are a very stiff system of coupled, partial differential equations whose solution can be numerically challenging and time-consuming. Without access to specialised codes and significant computational resources it is extremely difficult to avoid the assumption of an equilibrium ionisation state even when it clearly cannot be justified. The aim of the current work is to develop a computational tool to allow straightforward calculation of the time-dependent ionisation state for a wide variety of physical circumstances. Methods: A numerical model comprising the system of time-dependent ionisation equations for a particular element and tabulated values of plasma temperature as a function of time is developed. The tabulated values can be the solutions of an analytical model, the output from a numerical code or a set of observational measurements. An efficient numerical method to solve the ionisation equations is implemented. Results: A suite of tests is designed and run to demonstrate that the code provides reliable and accurate solutions for a number of scenarios including equilibration of the ion population and rapid heating followed by thermal conductive cooling. It is found that the solver can evolve the ionisation state to recover exactly the equilibrium state found by an independent, steady-state solver for all temperatures, resolve the extremely small ionisation/recombination timescales associated with rapid temperature changes at high densities, and provide stable and accurate solutions for both dominant and minor ion population fractions. Rapid heating and cooling of low- to moderate-density plasma is characterised by significant non-equilibrium ionisation conditions. The effective ionisation temperatures are significantly lower than the electron temperature and the values found are in close agreement with the previous work of others. At the very highest densities included in the present study an assumption of equilibrium ionisation is found to be robust. Conclusions: The computational tool presented here provides a straightforward and reliable way to calculate ionisation states for a wide variety of physical circumstances. The numerical code gives results that are accurate and consistent with previous studies, has relatively undemanding computational requirements and is freely available from the author.
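To illustrate why a stiff, implicit solver is needed, the toy model below integrates a two-state ionization/recombination balance through a sudden temperature jump with a BDF method; the rate coefficients are arbitrary placeholders, not atomic data for any real element, and the actual code solves a far larger coupled system.

```python
# Toy illustration of a stiff ionization-balance problem: two ion population
# fractions coupled by ionization/recombination rates that change abruptly at a
# sudden temperature jump, integrated with an implicit (BDF) solver.
# The rate coefficients are arbitrary placeholders, not real atomic data.
import numpy as np
from scipy.integrate import solve_ivp

def rates(t):
    hot = t > 1.0                      # sudden heating at t = 1 (arbitrary units)
    ion = 1.0e4 if hot else 1.0e-2     # ionization rate
    rec = 1.0e-2 if hot else 1.0e4     # recombination rate
    return ion, rec

def dydt(t, y):
    f0, f1 = y                         # fractions of the "low" and "high" charge state
    ion, rec = rates(t)
    return [rec * f1 - ion * f0, ion * f0 - rec * f1]

sol = solve_ivp(dydt, (0.0, 3.0), [1.0, 0.0], method="BDF",
                t_eval=np.linspace(0.0, 3.0, 7), rtol=1e-8, atol=1e-12)
for t, f0, f1 in zip(sol.t, *sol.y):
    print(f"t={t:4.1f}  f0={f0:.4f}  f1={f1:.4f}  sum={f0 + f1:.6f}")
```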
Robot-Aided Neurorehabilitation: A Pediatric Robot for Ankle Rehabilitation
Michmizos, Konstantinos P.; Rossi, Stefano; Castelli, Enrico; Cappa, Paolo; Krebs, Hermano Igo
2015-01-01
This paper presents the pediAnklebot, an impedance-controlled, low-friction, backdrivable robotic device developed at the Massachusetts Institute of Technology that trains the ankle of neurologically impaired children aged 6-10 years. The design attempts to overcome the known limitations of lower extremity robotics and the unknown difficulties of what constitutes an appropriate therapeutic interaction with children. The robot's pilot clinical evaluation is ongoing and incorporates our recent findings on ankle sensorimotor control in neurologically intact subjects, namely the speed-accuracy tradeoff, the deviation from an ideally smooth ankle trajectory, and the reaction time. We used these concepts to develop the kinematic and kinetic performance metrics that guided the ankle therapy, in a similar fashion to what we have done for our upper extremity devices. Here we report on the use of the device in at least 9 training sessions for 3 neurologically impaired children. Results demonstrated a statistically significant improvement in the performance metrics assessing explicit and implicit motor learning. Based on these initial results, we are confident that the device will become an effective tool that harnesses plasticity to guide habilitation during childhood. PMID:25769168
Time-Dependent Density Functional Theory for Extreme Environments
NASA Astrophysics Data System (ADS)
Baczewski, Andrew; Magyar, Rudolph; Shulenburger, Luke
2013-10-01
In recent years, DFT-MD has been shown to be a powerful tool for calculating the equation of state and constitutive properties of warm dense matter (WDM). These studies are validated through a number of experiments, including recently developed X-Ray Thomson Scattering (XRTS) techniques. Here, electronic temperatures and densities of WDM are accessible through x-ray scattering data, which are related to the system's dynamic structure factor (DSF), a quantity that is accessible through DFT-MD calculations. Previous studies predict the DSF within the Born-Oppenheimer approximation, with the electronic state computed using Mermin DFT. A capability for including more general coupled electron-ion dynamics is desirable, to study both the effect on XRTS observables and the broader problem of electron-ion energy transfer under extreme WDM conditions. Progress towards such a capability will be presented, in the form of an Ehrenfest MD framework using TDDFT. Computational challenges and open theoretical questions will be discussed. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Mathematical aspects of assessing extreme events for the safety of nuclear plants
NASA Astrophysics Data System (ADS)
Potempski, Slawomir; Borysiewicz, Mieczyslaw
2015-04-01
This paper reviews, from the perspective of the safety assessment of nuclear plants, the mathematical methodologies applied for assessing the low frequencies of rare natural events such as earthquakes, tsunamis, hurricanes or tornadoes, floods (in particular flash floods and storm surges), lightning, and solar flares. The statistical methods are usually based on extreme value theory, which deals with the analysis of extreme deviations from the median (or the mean). In this respect various mathematical tools can be useful: the Fisher-Tippett-Gnedenko extreme value theorem, which leads to possible choices of generalized extreme value distributions; the Pickands-Balkema-de Haan theorem for tail fitting; and methods related to large deviation theory. The most important stochastic distributions relevant for the statistical analysis of rare events are presented. This concerns, for example, the analysis of annual extreme values (maxima, the "Annual Maxima Series", or minima), of peak values exceeding given thresholds over periods of interest ("Peak Over Threshold"), and the estimation of the size of exceedance. Despite the lack of sufficient statistical data directly containing rare events, in some cases it is still possible to extract useful information from existing larger data sets. As an example one can consider data sets available from web sites for floods, earthquakes or natural hazards in general. Some aspects of such data sets are also presented, taking into account their usefulness for the practical assessment of the risk to nuclear power plants from extreme weather conditions.
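As a complement to the peaks-over-threshold sketch earlier in this collection, the Annual Maxima Series route mentioned above can be illustrated by fitting a generalized extreme value distribution to block maxima and reading off a return level. The synthetic 60-year record and the 100-year return period are illustrative assumptions, not values from the paper.
# Minimal Annual Maxima Series sketch with a GEV fit (synthetic data, illustrative only)
import numpy as np
from scipy import stats
rng = np.random.default_rng(1)
annual_max = rng.gumbel(loc=100.0, scale=20.0, size=60)   # e.g. 60 years of annual peak values
shape, loc, scale = stats.genextreme.fit(annual_max)
# Return level for a T-year return period: exceeded on average once every T years.
T = 100.0
return_level = stats.genextreme.ppf(1.0 - 1.0 / T, shape, loc=loc, scale=scale)
print(f"estimated {T:.0f}-year return level ~ {return_level:.1f}")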
Primary chondrosarcoma of the trachea with extensive extraluminal growth.
Ryabov, Andrey; Pikin, Oleg; Sokolov, Victor; Volchenko, Nadezda
2017-09-01
Primary chondrosarcoma of the trachea is an extremely rare non-epithelial neoplasm with only a few cases published in the literature. We present a rare case of tracheal chondrosarcoma with extensive extraluminal growth. We operated on a patient with an obstructive tumour of the upper third of the trachea via partial sternotomy. Before surgery, a Hanarostent was placed in the trachea to treat a life-threatening stenosis. The postoperative period was uneventful. We discuss the incidence, clinical presentation and treatment options in patients with rare tracheal tumours. In some cases, a multidisciplinary approach (endoscopic intervention followed by surgical resection) is an effective treatment tool. © The Author 2017. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.
Gianni, Stefano; Dogan, Jakob; Jemth, Per
2014-01-01
The Φ value analysis is a method to analyze the structure of metastable states in reaction pathways. The methodology is based on the quantitative analysis of the effect of point mutations on the kinetics and thermodynamics of the probed reaction. Φ value analysis is routinely used in protein folding studies and is potentially an extremely powerful tool for analyzing the mechanism of binding-induced folding of intrinsically disordered proteins. In this review we recapitulate the key equations and experimental advice for performing Φ value analysis, with attention to the possible caveats arising in intrinsically disordered systems. Finally, we briefly discuss a few examples already available in the literature.
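One common two-state formulation of the Φ value, the ratio of the transition-state destabilisation to the equilibrium destabilisation caused by a mutation, can be sketched as below. The function name, sign conventions and example numbers are illustrative assumptions, not equations reproduced from the review.
# Minimal Φ-value sketch assuming two-state kinetics and ΔG given as folding free energies (negative = stable)
import math
R = 8.314e-3   # kJ mol^-1 K^-1
T = 298.0      # K
def phi_value(kf_wt, kf_mut, dG_fold_wt, dG_fold_mut):
    """Phi from folding rate constants (1/s) and equilibrium folding free energies (kJ/mol)."""
    ddG_ts = R * T * math.log(kf_wt / kf_mut)       # destabilisation of the transition state
    ddG_eq = dG_fold_mut - dG_fold_wt               # loss of equilibrium stability upon mutation
    return ddG_ts / ddG_eq
# Example: a mutation that slows folding 5-fold and destabilises the protein by 4 kJ/mol gives Phi ~ 1.
print(phi_value(kf_wt=100.0, kf_mut=20.0, dG_fold_wt=-30.0, dG_fold_mut=-26.0))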
CrossCheck plagiarism screening : Experience of the Journal of Epidemiology
NASA Astrophysics Data System (ADS)
Hashimoto, Katsumi
Due to technological advances in the past two decades, researchers now have unprecedented access to a tremendous amount of useful information. However, because of the extreme pressure to publish, this abundance of information can sometimes tempt researchers to commit scientific misconduct. A serious form of such misconduct is plagiarism. Editors are always concerned about the possibility of publishing plagiarized manuscripts. The plagiarism detection tool CrossCheck allows editors to scan and analyze manuscripts effectively. The Journal of Epidemiology took part in a trial of CrossCheck, and this article discusses the concerns journal editors might have regarding the use of CrossCheck and its analysis. In addition, potential problems identified by CrossCheck, including self-plagiarism, are introduced.
Habitability and performance issues for long duration space flights.
Whitmore, M; McQuilkin, M L; Woolford, B J
1998-09-01
Advancing technology, coupled with the desire to explore space, has resulted in increasingly longer manned space missions. Although the Long Duration Space Flights (LDSF) have provided a considerable amount of scientific research on human ability to function in extreme environments, findings indicate that long duration missions take a toll on the individual, both physiologically and psychologically. These physiological and psychological issues manifest themselves in performance decrements and could lead to serious errors endangering the mission, spacecraft and crew. The purpose of this paper is threefold: 1) to document existing knowledge of the effects of LDSF on performance, habitability, and workload, 2) to identify and assess potential tools designed to address these decrements, and 3) to propose an implementation plan to address these habitability, performance and workload issues.
Habitability and Performance Issues for Long Duration Space Flights
NASA Technical Reports Server (NTRS)
Whitmore, Mihriban; McQuilkin, Meredith L.; Woolford, Barbara J.
1997-01-01
Advancing technology, coupled with the desire to explore space, has resulted in increasingly longer manned space missions. Although the Long Duration Space Flights (LDSF) have provided a considerable amount of scientific research on human ability to function in extreme environments, findings indicate that long duration missions take a toll on the individual, both physiologically and psychologically. These physiological and psychological issues manifest themselves in performance decrements and could lead to serious errors endangering the mission, spacecraft and crew. The purpose of this paper is to document existing knowledge of the effects of LDSF on performance, habitability, and workload, to identify and assess potential tools designed to address these decrements, and to propose an implementation plan to address the habitability, performance and workload issues.
Stocking chart for upland central hardwoods
Martin E. Dale; Donald E. Hilt
1989-01-01
The upland hardwoods stocking chart, introduced by Gingrich in 1967, has become one of the forest manager's most useful tools. The chart allows you to determine the condition of the present stand in relation to a stocking standard. The stocking of a stand is extremely helpful in prescribing various silvicultural treatments such as intermediate thinnings,...
Experimental Economics for Teaching the Functioning of Electricity Markets
ERIC Educational Resources Information Center
Guevara-Cedeno, J. Y.; Palma-Behnke, R.; Uribe, R.
2012-01-01
In the field of electricity markets, the development of training tools for engineers has been extremely useful. A novel experimental economics approach based on a computational Web platform of an electricity market is proposed here for the practical teaching of electrical engineering students. The approach is designed to diminish the gap that…
A Social Tool: Why and How ESOL Students Use Facebook
ERIC Educational Resources Information Center
Mitchell, Kathleen
2012-01-01
English language learners in the United States and abroad have begun to utilize Facebook, a social networking site, which since its inception in 2004 has been extremely popular with American college students. This qualitative case study with participants from an intensive English program in the US explores seven ESOL students' motivations for…
The Early Childhood Community Gives Back: Exchange Center Makeover Project
ERIC Educational Resources Information Center
Exchange: The Early Childhood Leaders' Magazine Since 1978, 2010
2010-01-01
The early childhood community gives back. With the extreme generosity of the "Exchange" Makeover Project partners, Jewel's Learning Center, selected as the winner of the Center Makeover, will be awarded with new and innovative tools to help build an even stronger educational foundation for the children attending the center. Included will be new…
Using Humor in Physical Education
ERIC Educational Resources Information Center
Barney, David; Christenson, Robert
2013-01-01
Humor can be extremely beneficial in everyday life, whether giving or receiving it. It can be used to lighten the mood, give encouragement, or make corrections. Humor in physical education is no exception. Physical educators can use humor as a teaching tool and to create an environment for students to acquire the knowledge to practice a lifetime…
Fabrication of dense wavelength division multiplexing filters with large useful area
NASA Astrophysics Data System (ADS)
Lee, Cheng-Chung; Chen, Sheng-Hui; Hsu, Jin-Cherng; Kuo, Chien-Cheng
2006-08-01
Dense Wavelength Division Multiplexers (DWDM), a kind of narrow band-pass filter, are extremely sensitive to the optical thickness error in each composite layer. Therefore, achieving a large useful coating area is extremely difficult because of the uniformity problem. To enlarge the useful coating area it is necessary to improve both the design and the fabrication. In this study, we discuss how the tooling factors at different positions and for different materials are related to the optical performance of the design. 100 GHz DWDM filters were fabricated by E-gun evaporation with ion-assisted deposition (IAD). To improve the coating uniformity, an analysis technique called the shaping tooling factor (STF) was used to analyze the deviation of the optical thickness in different materials so as to enlarge the useful coating area. A technique of etching the deposited layers with oxygen ions was also introduced. When the above techniques were applied in the fabrication of 100 GHz DWDM filters, the uniformity was better than ±0.002% over an area of 72 mm in diameter and better than ±0.0006% over 20 mm in diameter.
Scaling a Survey Course in Extreme Weather
NASA Astrophysics Data System (ADS)
Samson, P. J.
2013-12-01
"Extreme Weather" is a survey-level course offered at the University of Michigan that is broadcast via the web and serves as a research testbed to explore best practices for large class conduct. The course has led to the creation of LectureTools, a web-based student response and note-taking system that has been shown to increase student engagement dramatically in multiple courses by giving students more opportunities to participate in class. Included in this is the capacity to pose image-based questions (see image where question was "Where would you expect winds from the south") as well as multiple choice, ordered list, free response and numerical questions. Research in this class has also explored differences in learning outcomes from those who participate remotely versus those who physically come to class and found little difference. Moreover the technologies used allow instructors to conduct class from wherever they are while the students can still answer questions and engage in class discussion from wherever they are. This presentation will use LectureTools to demonstrate its features. Attendees are encouraged to bring a mobile device to the session to participate.
Hoyt, Anne L; Bushman, Don; Lewis, Nathan; Faber, Robert
2012-01-01
How can a formulator have confidence that a preservative system will perform as expected under adverse conditions? Extreme conditions that can lead to the development of "off odors" in the product can be a serious challenge for companies providing home care products in the global market. Formulation and stability testing occur under controlled parameters that simulate limited environmental conditions and microbial challenges are typically performed with a standard inoculum level. While this is an acceptable and dependable process, it does not necessarily assess how well a preservative system can perform under extreme environmental conditions or against unusually high levels of bacterial challenges. This is especially true when formulations are diluted and stored by the end-user. By modifying microbial challenge testing of a liquid dishwashing product to include unexpected dilution schemes, increased microbial assaults, and elevated temperatures, a pattern of preservative efficacy was established. The resulting approach proved to be a useful tool when developing use directions, recommended dilution levels, the overall surfactant system, preservative type, and storage restrictions.
Astrobiology as a tool for getting high school students interested in science
NASA Astrophysics Data System (ADS)
Van der Meer, B. W.; Alletto, James J.; Bryant, Dudley; Carini, Mike; Elliott, Larry; Gelderman, Richard; Mason, Wayne; McDaniel, Kerrie; McGruder, Charles H.; Rinehart, Claire; Tyler, Rico; Walker, Linda
2000-12-01
A workshop was held (10/99) for high school students and teachers on astrobiology. NASA provided support through an IDEAS grant. Out of 63 qualified applicants, 29 were accepted: 22 students (11 minorities) and 7 teachers. The workshop was held on 2 successive weekends. Activities included: culturing microbes from human skin, discussing 'what is life?', building and using a 2-inch refracting telescope and a van Leeuwenhoek-type microscope (each participant built and kept them), and listening to lectures by Dr. Richard Gelderman on detecting extrasolar planets and by Dr. Richard Hoover on life in extreme environments. Other activities included: collecting samples and isolating micro-organisms from the Lost River Cave, studying microbial life from extreme environments in the laboratory, using the internet as a research tool and debating the logistics and feasibility of a lunar colony. Written evaluations of the workshop led to the following conclusions: 48% of the students considered a possible career in the biological and/or astrophysical sciences, and half of these stated they were spurred on by the workshop itself.
Automatic Fault Characterization via Abnormality-Enhanced Classification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bronevetsky, G; Laguna, I; de Supinski, B R
Enterprise and high-performance computing systems are growing extremely large and complex, employing hundreds to hundreds of thousands of processors and software/hardware stacks built by many people across many organizations. As the growing scale of these machines increases the frequency of faults, system complexity makes these faults difficult to detect and to diagnose. Current system management techniques, which focus primarily on efficient data access and query mechanisms, require system administrators to examine the behavior of various system services manually. Growing system complexity is making this manual process unmanageable: administrators require more effective management tools that can detect faults and help to identify their root causes. System administrators need timely notification when a fault is manifested that includes the type of fault, the time period in which it occurred and the processor on which it originated. Statistical modeling approaches can accurately characterize system behavior. However, the complex effects of system faults make these tools difficult to apply effectively. This paper investigates the application of classification and clustering algorithms to fault detection and characterization. We show experimentally that naively applying these methods achieves poor accuracy. Further, we design novel techniques that combine classification algorithms with information on the abnormality of application behavior to improve detection and characterization accuracy. Our experiments demonstrate that these techniques can detect and characterize faults with 65% accuracy, compared to just 5% accuracy for naive approaches.
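The general idea of combining abnormality information with a classifier can be sketched as follows: per-metric z-scores computed against fault-free baseline runs are appended to the raw monitoring features before training. This is only an illustration of the concept under assumed, synthetic data; it is not the authors' specific algorithm, feature set, or fault taxonomy.
# Minimal abnormality-augmented classification sketch (hypothetical monitoring data)
import numpy as np
from sklearn.ensemble import RandomForestClassifier
rng = np.random.default_rng(2)
baseline = rng.normal(0.0, 1.0, size=(500, 8))      # assumed fault-free baseline runs
X_train = rng.normal(0.0, 1.0, size=(300, 8))       # labeled training windows
y_train = rng.integers(0, 3, size=300)              # 0 = normal, 1/2 = fault classes (placeholder labels)
mu, sigma = baseline.mean(axis=0), baseline.std(axis=0)
def add_abnormality(X):
    """Append absolute z-score abnormality features computed against the baseline."""
    return np.hstack([X, np.abs((X - mu) / sigma)])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(add_abnormality(X_train), y_train)
X_new = rng.normal(0.0, 1.0, size=(5, 8))
print(clf.predict(add_abnormality(X_new)))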
Miniature Compressive Ultra-spectral Imaging System Utilizing a Single Liquid Crystal Phase Retarder
NASA Astrophysics Data System (ADS)
August, Isaac; Oiknine, Yaniv; Abuleil, Marwan; Abdulhalim, Ibrahim; Stern, Adrian
2016-03-01
Spectroscopic imaging has been proved to be an effective tool for many applications in a variety of fields, such as biology, medicine, agriculture, remote sensing and industrial process inspection. However, due to the demand for high spectral and spatial resolution it became extremely challenging to design and implement such systems in a miniaturized and cost effective manner. Using a Compressive Sensing (CS) setup based on a single variable Liquid Crystal (LC) retarder and a sensor array, we present an innovative Miniature Ultra-Spectral Imaging (MUSI) system. The LC retarder acts as a compact wide band spectral modulator. Within the framework of CS, a sequence of spectrally modulated images is used to recover ultra-spectral image cubes. Using the presented compressive MUSI system, we demonstrate the reconstruction of gigapixel spatio-spectral image cubes from spectral scanning shots numbering an order of magnitude less than would be required using conventional systems.
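The compressive recovery step described above, reconstructing a sparse spectrum from a small number of spectrally modulated shots, can be sketched with a generic iterative soft-thresholding (ISTA) solver. The random modulation matrix, sparsity level and regularisation weight are illustrative assumptions; they do not represent the MUSI system's actual LC-retarder transmission curves or reconstruction algorithm.
# Minimal compressive-sensing recovery sketch: y = A x with sparse x, solved by ISTA
import numpy as np
rng = np.random.default_rng(5)
n_bands, n_shots = 200, 40                                  # many spectral bands, few retarder states (assumed)
A = rng.normal(size=(n_shots, n_bands)) / np.sqrt(n_shots)  # assumed spectral modulation matrix
x_true = np.zeros(n_bands)
x_true[rng.choice(n_bands, 5, replace=False)] = rng.uniform(0.5, 1.0, 5)   # sparse synthetic spectrum
y = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2                      # step size below 1/||A||^2 for convergence
lam = 1e-3
x = np.zeros(n_bands)
for _ in range(2000):                                       # ISTA: gradient step + soft threshold
    x = x - step * (A.T @ (A @ x - y))
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)
print("relative reconstruction error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))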
Small Sample Reactivity Measurements in the RRR/SEG Facility: Reanalysis using TRIPOLI-4
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummel, Andrew; Palmiotti, Guiseppe
2016-08-01
This work involved reanalyzing the RRR/SEG integral experiments performed at the Rossendorf facility in Germany throughout the 1970s and 80s. These small sample reactivity worth measurements were carried out using the pile oscillator technique for many different fission products, structural materials, and standards. The coupled fast-thermal system was designed such that the measurements would provide insight into elemental data, specifically the competing effects between neutron capture and scatter. Comparing the measured to calculated reactivity values can then provide adjustment criteria to ultimately improve nuclear data for fast reactor designs. Due to the extremely small reactivity effects measured (typically less than 1 pcm) and the specific heterogeneity of the core, the tool chosen for this analysis was TRIPOLI-4. This code allows for high fidelity 3-dimensional geometric modeling, and the most recent, unreleased version is capable of exact perturbation theory.
Womack, Sarah K; Armstrong, Thomas J
2005-09-01
The present study evaluates the effectiveness of a decision support system used to evaluate and control physical job stresses and prevent re-injury of workers who have experienced or are concerned about work-related musculoskeletal disorders. The software program is a database that stores detailed job information such as standardized work data, videos, and upper-extremity physical stress ratings for over 400 jobs in the plant. Additionally, the database users were able to record comments about the jobs and related control issues. The researchers investigated the utility and effectiveness of the software by analyzing its use over a 20-month period. Of the 197 comments entered by the users, 25% pertained to primary prevention, 75% pertained to secondary prevention, and 94 comments (47.7%) described ergonomic interventions. Use of the software tool improved primary and secondary prevention by improving the quality and efficiency of the ergonomic job analysis process.
Kobylinski, Kevin C.; Alout, Haoues; Foy, Brian D.; Clements, Archie; Adisakwattana, Poom; Swierczewski, Brett E.; Richardson, Jason H.
2014-01-01
Recently there have been calls for the eradication of malaria and the elimination of soil-transmitted helminths (STHs). Malaria and STHs overlap in distribution, and STH infections are associated with increased risk for malaria. Indeed, there is evidence that suggests that STH infection may facilitate malaria transmission. Malaria and STH coinfection may exacerbate anemia, especially in pregnant women, leading to worsened child development and more adverse pregnancy outcomes than these diseases would cause on their own. Ivermectin mass drug administration (MDA) to humans for malaria parasite transmission suppression is being investigated as a potential malaria elimination tool. Adding albendazole to ivermectin MDAs would maximize effects against STHs. A proactive, integrated control platform that targets malaria and STHs would be extremely cost-effective and simultaneously reduce human suffering caused by multiple diseases. This paper outlines the benefits of adding albendazole to ivermectin MDAs for malaria parasite transmission suppression. PMID:25070998
Rugged, Low Cost, Environmental Sensors for a Turbulent World
NASA Astrophysics Data System (ADS)
Schulz, B.; Sandell, C. T.; Wickert, A. D.
2017-12-01
Ongoing scientific research and resource management require a diverse range of high-quality and low-cost sensors to maximize the number and type of measurements that can be obtained. To accomplish this, we have developed a series of diversified sensors for common environmental applications. The TP-DownHole is an ultra-compact temperature and pressure sensor designed for use in CMT (Continuous Multi-channel Tubing) multi-level wells. Its 1 mm water depth resolution, 30 cm altitude resolution, and rugged design make it ideal for both water level measurements and monitoring barometric pressure and associated temperature changes. The TP-DownHole sensor has also been incorporated into a self-contained, fully independent data recorder for extreme and remote environments. This device (the TP-Solo) is based on the TP-DownHole design, but has self-contained power and data storage and is designed to collect data independently for up to 6 months (logging once an hour), creating a specialized tool for extreme environment data collection. To gather spectral information, we have also developed a very low cost photodiode-based lux sensor to measure spectral irradiance; while this does not measure the entire solar radiation spectrum, simple modeling to rescale the remainder of the solar spectrum makes this a cost-effective alternative to a thermopile pyranometer. Lastly, we have developed an instrumentation amplifier which is designed to interface a wide range of sensitive instruments to common data logging systems, such as thermopile pyranometers, thermocouples, and many other analog output sensors. These three instruments are the first in a diverse family aimed at giving researchers a set of powerful and low-cost tools for environmental instrumentation.
The Evaluative Lexicon 2.0: The measurement of emotionality, extremity, and valence in language.
Rocklage, Matthew D; Rucker, Derek D; Nordgren, Loran F
2017-10-19
The rapid expansion of the Internet and the availability of vast repositories of natural text provide researchers with the immense opportunity to study human reactions, opinions, and behavior on a massive scale. To help researchers take advantage of this new frontier, the present work introduces and validates the Evaluative Lexicon 2.0 (EL 2.0)-a quantitative linguistic tool that specializes in the measurement of the emotionality of individuals' evaluations in text. Specifically, the EL 2.0 utilizes natural language to measure the emotionality, extremity, and valence of evaluative reactions and attitudes. The present article describes how we used a combination of 9 million real-world online reviews and over 1,500 participant judges to construct the EL 2.0 and an additional 5.7 million reviews to validate it. To assess its unique value, the EL 2.0 is compared with two other prominent text analysis tools-LIWC and Warriner et al.'s (Behavior Research Methods, 45, 1191-1207, 2013) wordlist. The EL 2.0 is comparatively distinct in its ability to measure emotionality and explains a significantly greater proportion of the variance in individuals' evaluations. The EL 2.0 can be used with any data that involve speech or writing and provides researchers with the opportunity to capture evaluative reactions both in the laboratory and "in the wild." The EL 2.0 wordlist and normative emotionality, extremity, and valence ratings are freely available from www.evaluativelexicon.com .
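Wordlist-based scoring of the kind described above can be illustrated with a minimal sketch: average the valence and emotionality ratings of any rated words found in a text. The tiny dictionary below is entirely hypothetical; the actual EL 2.0 wordlist and norms are the ones distributed at www.evaluativelexicon.com.
# Minimal wordlist-scoring sketch (made-up ratings, not the EL 2.0 norms)
import re
EL_DEMO = {  # word: (valence 0-9, emotionality 0-9), placeholder values
    "amazing":   (8.6, 7.9),
    "wonderful": (8.4, 7.5),
    "useful":    (7.1, 3.2),
    "terrible":  (1.2, 7.8),
    "adequate":  (5.4, 2.0),
}
def score_text(text):
    words = re.findall(r"[a-z']+", text.lower())
    hits = [EL_DEMO[w] for w in words if w in EL_DEMO]
    if not hits:
        return None
    valence = sum(v for v, _ in hits) / len(hits)
    emotionality = sum(e for _, e in hits) / len(hits)
    return {"valence": valence, "emotionality": emotionality, "n_rated_words": len(hits)}
print(score_text("The service was amazing and the app is genuinely useful."))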
A fast and complete GEANT4 and ROOT Object-Oriented Toolkit: GROOT
NASA Astrophysics Data System (ADS)
Lattuada, D.; Balabanski, D. L.; Chesnevskaya, S.; Costa, M.; Crucillà, V.; Guardo, G. L.; La Cognata, M.; Matei, C.; Pizzone, R. G.; Romano, S.; Spitaleri, C.; Tumino, A.; Xu, Y.
2018-01-01
Present and future gamma-beam facilities represent a great opportunity to validate and evaluate the cross-sections of many photonuclear reactions at near-threshold energies. Monte Carlo (MC) simulations are very important to evaluate the reaction rates and to maximize the detection efficiency but, unfortunately, they can be very cputime-consuming and in some cases very hard to reproduce, especially when exploring near-threshold cross-section. We developed a software that makes use of the validated tracking GEANT4 libraries and the n-body event generator of ROOT in order to provide a fast, realiable and complete MC tool to be used for nuclear physics experiments. This tool is indeed intended to be used for photonuclear reactions at γ-beam facilities with ELISSA (ELI Silicon Strip Array), a new detector array under development at the Extreme Light Infrastructure - Nuclear Physics (ELI-NP). We discuss the results of MC simulations performed to evaluate the effects of the electromagnetic induced background, of the straggling due to the target thickness and of the resolution of the silicon detectors.
NASA Astrophysics Data System (ADS)
Ivannikova, E.; Kruglyakov, M.; Kuvshinov, A. V.; Rastaetter, L.; Pulkkinen, A. A.; Ngwira, C. M.
2017-12-01
During extreme space weather events electric currents in the Earth's magnetosphere and ionosphere experience large variations, which leads to dramatic intensification of the fluctuating magnetic field at the surface of the Earth. According to Faraday's law of induction, the fluctuating geomagnetic field in turn induces electric field that generates harmful currents (so-called "geomagnetically induced currents"; GICs) in grounded technological systems. Understanding (via modeling) of the spatio-temporal evolution of the geoelectric field during enhanced geomagnetic activity is a key consideration in estimating the hazard to technological systems from space weather. We present the results of ground geoelectric field modeling for the Northeast United States, which is performed with the use of our novel numerical tool based on integral equation approach. The tool exploits realistic regional three-dimensional (3-D) models of the Earth's electrical conductivity and realistic global models of the spatio-temporal evolution of the magnetospheric and ionospheric current systems responsible for geomagnetic disturbances. We also explore in detail the manifestation of the coastal effect (anomalous intensification of the geoelectric field near the coasts) in this region.
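The induction step described above can be illustrated with a minimal one-dimensional plane-wave sketch, in which the horizontal geoelectric field spectrum is the magnetic field spectrum multiplied by a uniform half-space surface impedance. This only illustrates the physics under assumed resistivity and synthetic data; it does not reproduce the 3-D integral-equation solver or realistic conductivity models used in the study.
# Minimal 1-D plane-wave geoelectric field sketch (assumed half-space resistivity, synthetic B data)
import numpy as np
mu0 = 4e-7 * np.pi
rho = 100.0                                        # assumed uniform ground resistivity [ohm m]
dt = 10.0                                          # sampling interval [s]
rng = np.random.default_rng(3)
Bx = 1e-8 * np.cumsum(rng.standard_normal(8640))   # synthetic northward B [T], ~1 day at 10 s cadence
freqs = np.fft.rfftfreq(Bx.size, d=dt)
omega = 2.0 * np.pi * freqs
Z = np.sqrt(1j * omega * mu0 * rho)                # half-space surface impedance Z(w) = sqrt(i w mu0 rho)
Z[0] = 0.0                                         # no DC response
# E_y(w) = -Z(w) * H_x(w) = -Z(w) * B_x(w) / mu0 (sign convention varies with coordinate choice)
Ey = np.fft.irfft(-Z * np.fft.rfft(Bx) / mu0, n=Bx.size)
print("peak |Ey| ~", np.max(np.abs(Ey)), "V/m")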
Schmieder, George J; Huang, Eugene Y; Jarratt, Michael
2012-12-01
Photodynamic therapy (PDT) with aminolevulinic acid (ALA) has been shown to be safe and effective in the treatment of actinic keratoses (AKs) of the face and scalp. A recent small study has suggested that ALA-PDT can be effective for AKs of the dorsal hands/forearms. However, studies designed to provide sufficient statistical power to test this hypothesis are lacking in the literature. To determine and compare the safety and efficacy of blue light ALA-PDT vs blue light placebo vehicle (VEH) in the treatment of AKs of the upper extremities and to evaluate the effect of occlusion after application of ALA vs VEH. ALA or VEH was applied to both dorsal hands/forearms for the 3-hour incubation period before blue light treatment (10 J/cm2). One extremity of each subject was covered with occlusive dressing during the incubation period. Treatment was repeated at week 8 if any AK lesions remained. The median AK lesion clearance rate at week 12 was 88.7% for extremities treated with occluded ALA (ALA+OCC), 70.0% for extremities treated with nonoccluded ALA, 16.7% for extremities treated with occluded VEH (VEH+OCC), and 5.6% for extremities treated with nonoccluded VEH (P<.0001). ALA+OCC resulted in a significantly higher clearance rate compared with the nonoccluded extremity at weeks 8 (P=.0006) and 12 (P=.0029). Thirty-four percent (12/35) of extremities treated with ALA+OCC had complete clearance of lesions at week 12 compared with 0% (0/35) of extremities treated with VEH+OCC (P=.0002). The safety profile in this study is consistent with previously reported side effects of the therapy. Blue light ALA-PDT following a 3-hour incubation appears efficacious for AK clearance of the upper extremities. Incubation using an occlusive dressing significantly increases the efficacy of the procedure and also increases the incidence and severity of some acute inflammatory side effects of PDT.
Extreme fluctuations in stochastic network coordination with time delays
NASA Astrophysics Data System (ADS)
Hunt, D.; Molnár, F.; Szymanski, B. K.; Korniss, G.
2015-12-01
We study the effects of uniform time delays on the extreme fluctuations in stochastic synchronization and coordination problems with linear couplings in complex networks. We obtain the average size of the fluctuations at the nodes from the behavior of the underlying modes of the network. We then obtain the scaling behavior of the extreme fluctuations with system size, as well as the distribution of the extremes on complex networks, and compare them to those on regular one-dimensional lattices. For large complex networks, when the delay is not too close to the critical one, fluctuations at the nodes effectively decouple, and the limit distributions converge to the Fisher-Tippett-Gumbel density. In contrast, fluctuations in low-dimensional spatial graphs are strongly correlated, and the limit distribution of the extremes is the Airy density. Finally, we also explore the effects of nonlinear couplings on the stability and on the extremes of the synchronization landscapes.
The Youth Throwing Score: Validating Injury Assessment in Young Baseball Players.
Ahmad, Christopher S; Padaki, Ajay S; Noticewala, Manish S; Makhni, Eric C; Popkin, Charles A
2017-02-01
Epidemic levels of shoulder and elbow injuries have been reported recently in youth and adolescent baseball players. Despite the concerning frequency of these injuries, no instrument has been validated to assess upper extremity injury in this patient population. Purpose/Hypothesis: The purpose of this study was to validate an upper extremity assessment tool specifically designed for young baseball players. We hypothesized that this tool will be both reliable and valid. Cohort study (diagnosis); Level of evidence, 2. The Youth Throwing Score (YTS) was constructed by an interdisciplinary team of providers and coaches as a tool to assess upper extremity injury in youth and adolescent baseball players (age range, 10-18 years). The psychometric properties of the test were then determined. A total of 223 players completed the final survey. The players' mean age was 14.3 ± 2.7 years. Pilot analysis showed that none of the 14 questions received a mean athlete importance rating less than 3 of 5, and the final survey read at a Flesch-Kincaid level of 4.1, which is appropriate for patients aged 9 years and older. The players self-assigned their injury status, resulting in a mean instrument score of 59.7 ± 8.4 for the 148 players "playing without pain," 42.0 ± 11.5 for the 60 players "playing with pain," and 40.4 ± 10.5 for the 15 players "not playing due to pain." Players playing without pain scored significantly higher than those playing with pain and those not playing due to pain ( P < .001). Psychometric analysis showed a test-retest intraclass correlation coefficient of 0.90 and a Cronbach alpha intra-item reliability coefficient of 0.93, indicating excellent reliability and internal consistency. Pearson correlation coefficients of 0.65, 0.62, and 0.31 were calculated between the YTS and the Pediatric Outcomes Data Collection Instrument sports/physical functioning module, the Kerlan-Jobe Orthopaedic Clinic Shoulder and Elbow score, and the Quick Disabilities of the Arm, Shoulder, and Hand (QuickDASH) score, respectively. Injured players scored a mean of 9.4 points higher after treatment ( P < .001), and players who improved in their self-assigned pain categorization scored 16.5 points higher ( P < .001). The YTS is the first valid and reliable instrument for assessing young baseball players' upper extremity health.
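The internal-consistency statistic reported above, Cronbach's alpha, is computed from the item variances and the variance of the summed scores. The sketch below uses randomly generated placeholder responses, not the YTS survey data, purely to show the calculation.
# Minimal Cronbach's alpha sketch on a respondents-by-items score matrix (placeholder data)
import numpy as np
def cronbach_alpha(scores):
    """scores: 2-D array, rows = respondents, columns = survey items."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)
    total_var = scores.sum(axis=1).var(ddof=1)
    return (k / (k - 1.0)) * (1.0 - item_vars.sum() / total_var)
rng = np.random.default_rng(4)
latent = rng.normal(size=(223, 1))                    # shared "arm health" factor (synthetic)
items = latent + 0.5 * rng.normal(size=(223, 14))     # 14 correlated items, as in a 14-question survey
print(f"alpha ~ {cronbach_alpha(items):.2f}")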
NASA Astrophysics Data System (ADS)
Panziera, Luca; Gabella, Marco; Zanini, Stefano; Hering, Alessandro; Germann, Urs; Berne, Alexis
2016-06-01
This paper presents a regional extreme rainfall analysis based on 10 years of radar data for the 159 regions adopted for official natural hazard warnings in Switzerland. Moreover, a nowcasting tool aimed at issuing heavy precipitation regional alerts is introduced. The two topics are closely related, since the extreme rainfall analysis provides the thresholds used by the nowcasting system for the alerts. Warm and cold seasons' monthly maxima of several statistical quantities describing regional rainfall are fitted to a generalized extreme value distribution in order to derive the precipitation amounts corresponding to sub-annual return periods for durations of 1, 3, 6, 12, 24 and 48 h. It is shown that regional return levels exhibit a large spatial variability in Switzerland, and that their spatial distribution strongly depends on the duration of the aggregation period: for accumulations of 3 h and shorter, the largest return levels are found over the northerly alpine slopes, whereas for longer durations the southern Alps exhibit the largest values. The inner alpine chain shows the lowest values, in agreement with previous rainfall climatologies. The nowcasting system presented here is aimed at issuing heavy rainfall alerts for a large variety of end users, who are interested in different precipitation characteristics and regions, such as small urban areas, remote alpine catchments or administrative districts. The alerts are issued not only if the rainfall measured in the immediate past or forecast in the near future exceeds predefined thresholds, but also as soon as the sum of past and forecast precipitation is larger than the threshold values. This precipitation total, in fact, has primary importance in applications for which antecedent rainfall is as important as the predicted rainfall, such as urban flood early warning systems. The rainfall fields, the statistical quantity representing regional rainfall and the frequency of alerts issued in case of continuous threshold exceedance are some of the configurable parameters of the tool. The analysis of the urban flood which occurred in the city of Schaffhausen in May 2013 suggests that this alert tool might have complementary skill with respect to radar-based thunderstorm nowcasting systems for storms which do not show a clear convective signature.
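The "past plus forecast" alert rule described above can be summarised in a few lines. The threshold and accumulation values below are illustrative only, not the operational configuration of the Swiss tool.
# Minimal sketch of the past + forecast exceedance rule (illustrative numbers)
def heavy_rain_alert(past_mm, forecast_mm, threshold_mm):
    """Return the triggering condition, or None if no alert is warranted."""
    if past_mm >= threshold_mm:
        return "observed"
    if forecast_mm >= threshold_mm:
        return "forecast"
    if past_mm + forecast_mm >= threshold_mm:
        return "observed+forecast"
    return None
# Example: 3 h regional accumulation checked against an assumed sub-annual return-period threshold of 40 mm.
print(heavy_rain_alert(past_mm=25.0, forecast_mm=18.0, threshold_mm=40.0))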
Beneficial effects of restoration practices can be thwarted by climate extremes.
Maccherini, Simona; Bacaro, Giovanni; Marignani, Michela
2018-06-01
The impacts of climate extremes on species, communities and ecosystems have become critical concerns to science and society. Under a changing climate, how restoration outcomes are affected by extreme climate variables is a largely unknown topic. We analyzed the effects of experimental factors (grazing and sowing of native species), extreme climate events (intense precipitation and extreme temperature indexes) and their combination on the restoration progress of a dry, calcareous grassland in Tuscany (Italy) with a before/after, control/impact (BACI) experiment comprising 1 year of monitoring before and 15 years of continuous annual monitoring after the intervention. Grazing had a beneficial effect on the diversity of the grassland, while sowing had a limited impact. The climatic index that most affected the entire plant community composition was the number of very heavy precipitation days. The interaction of grazing and extreme climatic indexes had a significant detrimental effect on restoration outcomes, increasing the cover of synanthropic and Cosmopolitan-Subcosmopolitan generalist species and decreasing the cover of more valuable species such as endemic species. In the richest grazed plots, species richness showed a lower sensitivity to the average precipitation per wet day, but in grazed sites restoration outcomes can be negatively influenced by the intensification of precipitation and temperature extremes. In a context of progressive tropicalization of the Mediterranean area, and to assist managers in setting achievable restoration goals, restoration practitioners should consider that climate extremes might interfere with the beneficial effects of restoration practices. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Shaw, C.
2016-12-01
Globally, higher daily peak temperatures and longer, more intense heat waves are becoming increasingly frequent due to climate change. India, with relatively low GDP per capita, high population density, and a tropical climate, is particularly vulnerable to these trends. In May 2015, one of the worst heat waves in world history hit the country, culminating in at least 2,300 officially reported deaths as temperatures in some regions reached 48°C. As a result of climate change, heat waves in this region will last longer, be more extreme, and occur with greater frequency in the coming years. Impacts will be felt most acutely by vulnerable populations, which include not only those with frail health, but also populations otherwise considered healthy whose livelihood involves working under exposure to high temperatures. The problem is exacerbated by low levels of economic development, particularly in the under-provision of medical services, a higher proportion of weather-reliant income sources, and the inability to recover quickly from shocks. Responding to these challenges requires collaboration among the disciplines of climate science, public health, economics, and public policy. This project, presented as an online web application using Esri's ArcGIS Story Map, covers 1) the impact of extreme heat on human mortality, 2) the impact of combined heat and humidity (as measured by wet bulb globe temperature) on labor productivity, and 3) emerging best practices in adaptation planning by local municipalities and NGOs. The work is presented in a format that is designed to allow policymakers to take a deeper dive into the literature linking extreme temperature to human health and labor productivity, combined with interactive mapping tools that allow planners to drill down to data at the district level across the country of India. Further, the work presents a case study of heat adaptation planning efforts that have already been implemented in the city of Ahmedabad, allowing planners to understand what adaptation options might be available to mitigate the risk. Taken together, the tool provides a means to stimulate adaptation efforts, helping society prepare for and cope with extreme heat events.
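The combined heat-and-humidity metric mentioned above, the wet bulb globe temperature, is a weighted combination of three measured temperatures. The sketch below uses the standard outdoor weighting (0.7 wet-bulb, 0.2 globe, 0.1 dry-bulb); the example values are illustrative and the input temperatures are assumed to be measured or modelled separately.
# Minimal outdoor WBGT sketch: WBGT = 0.7*T_wetbulb + 0.2*T_globe + 0.1*T_air (example values only)
def wbgt_outdoor(t_wetbulb_c, t_globe_c, t_air_c):
    return 0.7 * t_wetbulb_c + 0.2 * t_globe_c + 0.1 * t_air_c
# A humid 40 degC afternoon with strong sun can push WBGT past common work/rest guideline limits.
print(f"WBGT ~ {wbgt_outdoor(t_wetbulb_c=31.0, t_globe_c=55.0, t_air_c=40.0):.1f} degC")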
Bendixen, Roxanna M.; Butrum, Jocelyn; Jain, Mina S.; Parks, Rebecca; Hodsdon, Bonnie; Nichols, Carmel; Hsia, Michelle; Nelson, Leslie; Keller, Katherine C.; McGuire, Michelle; Elliott, Jeffrey S.; Linton, Melody M.; Arveson, Irene C.; Tounkara, Fatou; Vasavada, Ruhi; Harnett, Elizabeth; Punjabi, Monal; Donkervoort, Sandra; Dastgir, Jahannaz; Leach, Meganne E.; Rutkowski, Anne; Waite, Melissa; Collins, James; Bönnemann, Carsten G.; Meilleur, Katherine G.
2017-01-01
Purpose Congenital muscular dystrophy (CMD) comprises a rare group of genetic muscle diseases that present at birth or early during infancy. Two common subtypes of CMD are collagen VI-related muscular dystrophy (COL6-RD) and laminin alpha 2-related dystrophy (LAMA2-RD). Traditional outcome measures in CMD include gross motor and mobility assessments, yet significant motor declines underscore the need for valid upper extremity (UE) motor assessments as a clinical endpoint. This study validated a battery of UE measures in these two CMD subtypes for future clinical trials. Methods For this cross-sectional study, 42 participants were assessed over the same 2–5 day period at the National Institutes of Health Clinical Center (CC). All UE measures were correlated with the Motor Function Measure 32 (MFM32). The battery of UE assessments included the Jebsen Taylor Hand Function Test, Quality of Upper Extremity Skills Test (QUEST), hand held dynamometry, goniometry, and MyoSet Tools. Spearman Rho was used for correlations to the MFM32. Pearson was performed to correlate the Jebsen, QUEST, hand-held dynamometry, goniometry and the MyoSet Tools. Correlations were considered significant at the 0.01 level (2-tailed). Results Significant correlations were found between both the MFM32 and MFM Dimension 3 only (Distal Motor function) and the Jebsen, QUEST, MyoGrip and MyoPinch, elbow flexion/extension ROM and myometry. Additional correlations between the assessments are reported. Conclusions The Jebsen, the Grasp and Dissociated Movements domains of the QUEST, the MyoGrip and the MyoPinch tools, as well as elbow ROM and myometry were determined to be valid and feasible in this population, provided variation in test items, and assessed a range of difficulty in CMD. To move forward, it will be of utmost importance to determine whether these UE measures are reproducible and sensitive to change over time. PMID:28087121
A Conceptual Framework for Planning Systemic Human Adaptation to Global Warming
Tait, Peter W.; Hanna, Elizabeth G.
2015-01-01
Human activity is having multiple, inter-related effects on ecosystems. Greenhouse gas emissions persisting along current trajectories threaten to significantly alter human society. At 0.85 °C of anthropogenic warming, deleterious human impacts are acutely evident. Additional warming of 0.5 °C–1.0 °C from already emitted CO2 will further intensify extreme heat and damaging storm events. Failing to sufficiently address this trend will have a heavy human toll directly and indirectly on health. Along with mitigation efforts, societal adaptation to a warmer world is imperative. Adaptation efforts need to be significantly upscaled to prepare society to lessen the public health effects of rising temperatures. Modifying societal behaviour is inherently complex and presents a major policy challenge. We propose a social systems framework for conceptualizing adaptation that maps out three domains within the adaptation policy landscape: acclimatisation, behavioural adaptation and technological adaptation, which operate at societal and personal levels. We propose that overlaying this framework on a systems approach to societal change planning methods will enhance governments’ capacity and efficacy in strategic planning for adaptation. This conceptual framework provides a policy oriented planning assessment tool that will help planners match interventions to the behaviours being targeted for change. We provide illustrative examples to demonstrate the framework’s application as a planning tool. PMID:26334285
Centen, Andrew; Lowrey, Catherine R; Scott, Stephen H; Yeh, Ting-Ting; Mochizuki, George
2017-06-19
Spasticity is a common sequela of stroke. Traditional assessment methods include relatively coarse scales that may not capture all characteristics of elevated muscle tone. Thus, the aim of this study was to develop a tool to quantitatively assess post-stroke spasticity in the upper extremity. Ninety-six healthy individuals and 46 individuals with stroke participated in this study. The kinematic assessment of passive stretch (KAPS) protocol consisted of passive elbow stretch in flexion and extension across an 80° range in 5 movement durations. Seven parameters were identified and assessed to characterize spasticity (peak velocity, final angle, creep (or release), between-arm peak velocity difference, between-arm final angle, between-arm creep, and between-arm catch angle). The fastest movement duration (600 ms) was most effective at identifying impairment in each parameter associated with spasticity. A decrease in peak velocity during passive stretch between the affected and unaffected limb was most effective at identifying individuals as impaired. Spasticity was also associated with a decreased passive range (final angle) and a classic 'catch and release' as seen through between-arm catch and creep metrics. The KAPS protocol and robotic technology can provide a sensitive and quantitative assessment of post-stroke elbow spasticity not currently attainable through traditional measures.
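Two of the kinematic parameters named above, peak velocity and final angle, together with a between-arm difference, can be extracted from a sampled angle trace as in the sketch below. The sampling rate, the synthetic traces and the absence of filtering are assumptions for illustration; this is not the KAPS implementation.
# Minimal sketch of peak-velocity and final-angle metrics from a passive-stretch angle trace (synthetic data)
import numpy as np
fs = 1000.0                                    # assumed sampling rate [Hz]
def stretch_metrics(angle_deg):
    velocity = np.gradient(angle_deg) * fs     # deg/s, finite-difference velocity
    return {"peak_velocity": float(np.max(np.abs(velocity))),
            "final_angle": float(angle_deg[-1])}
t = np.linspace(0.0, 0.6, 600)                                     # one 600 ms passive stretch
unaffected = 80.0 * (1.0 - np.cos(np.pi * t / 0.6)) / 2.0          # smooth 80 deg excursion
affected = np.minimum(unaffected, 55.0)                            # movement arrested by a simulated "catch"
m_u, m_a = stretch_metrics(unaffected), stretch_metrics(affected)
print(m_u, m_a, "between-arm peak velocity difference:", m_u["peak_velocity"] - m_a["peak_velocity"])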
NASA Technical Reports Server (NTRS)
Tasciotti, Ennio (Inventor); Hu, Ye (Inventor); Ferrari, Mauro (Inventor); Bouamrani, Ali (Inventor); Liu, Xuewu (Inventor)
2014-01-01
A new fractionation device shows desirable features for exploratory screening and biomarker discovery. The constituent MSCs may be tailored for desired pore sizes and surface properties and for the sequestration and enrichment of extremely low abundance proteins and peptides in desired ranges of the mass/charge spectrum. The MSCs are effective in yielding reproducible extracts from complex biological samples as small as 10 microliters in a time as short as 30 minutes. They are inexpensive to manufacture, and allow for scaled-up production to attain the simultaneous processing of a large number of samples. The MSCs are multiplexed, label-free diagnostic tools with the potential of biological recognition moiety modification for enhanced specificity. The MSCs may store, protect and stabilize biological fluids, enabling the simplified and cost-effective collection and transportation of clinical samples. The MSC-based device may serve as a diagnostic tool to complement histopathology, imaging, and other conventional clinical techniques. The MSC-mediated identification of disease-specific protein signatures may help in the selection of personalized therapeutic combinations, in the real-time assessment of therapeutic efficacy and toxicity, and in the rational modulation of therapy based on the changes in the protein networks associated with the prognosis and the drug resistance of the disease.
A Conceptual Framework for Planning Systemic Human Adaptation to Global Warming.
Tait, Peter W; Hanna, Elizabeth G
2015-08-31
Human activity is having multiple, inter-related effects on ecosystems. Greenhouse gas emissions persisting along current trajectories threaten to significantly alter human society. At 0.85 °C of anthropogenic warming, deleterious human impacts are acutely evident. Additional warming of 0.5 °C-1.0 °C from already emitted CO₂ will further intensify extreme heat and damaging storm events. Failing to sufficiently address this trend will have a heavy human toll directly and indirectly on health. Along with mitigation efforts, societal adaptation to a warmer world is imperative. Adaptation efforts need to be significantly upscaled to prepare society to lessen the public health effects of rising temperatures. Modifying societal behaviour is inherently complex and presents a major policy challenge. We propose a social systems framework for conceptualizing adaptation that maps out three domains within the adaptation policy landscape: acclimatisation, behavioural adaptation and technological adaptation, which operate at societal and personal levels. We propose that overlaying this framework on a systems approach to societal change planning methods will enhance governments' capacity and efficacy in strategic planning for adaptation. This conceptual framework provides a policy oriented planning assessment tool that will help planners match interventions to the behaviours being targeted for change. We provide illustrative examples to demonstrate the framework's application as a planning tool.
Risk Management for the International Space Station
NASA Technical Reports Server (NTRS)
Sebastian, J.; Brezovic, Philip
2002-01-01
The International Space Station (ISS) is an extremely complex system, both technically and programmatically. The Space Station must support a wide range of payloads and missions. It must be launched in numerous launch packages and be safely assembled and operated in the harsh environment of space. It is being designed and manufactured by many organizations, including the prime contractor, Boeing, the NASA institutions, and international partners and their contractors. Finally, the ISS has multiple customers (e.g., the Administration, Congress, users, the public, international partners, etc.) with contrasting needs and constraints. It is the ISS Risk Management Office strategy to proactively and systematically manage risks to help ensure ISS Program success. The ISS Program follows an integrated risk management process (both quantitative and qualitative) that is integrated into ISS project management. The process and tools are simple and seamless, permeate to the lowest levels (a level where effective management can be realized), and follow the continuous risk management methodology. The risk process continually assesses what could go wrong (risks), determines which risks need to be managed, implements strategies to deal with those risks, and measures the effectiveness of the implemented strategies. The process integrates all facets of risk, including cost, schedule and technical aspects. Supporting risk analysis tools like PRA are used to support programmatic decisions and assist in analyzing risks.
NASA Astrophysics Data System (ADS)
Chan, Y. David; Rastegar, Abbas; Yun, Henry; Putna, E. Steve; Wurm, Stefan
2010-04-01
Reducing mask blank and patterned mask defects is the number one challenge for extreme ultraviolet lithography. If the industry succeeds in reducing mask blank defects at the required rate of 10X every year for the next 2-3 years to meet high volume manufacturing defect requirements, new inspection and review tool capabilities will soon be needed to support this goal. This paper outlines the defect inspection and review tool technical requirements and suggests development plans to achieve pilot line readiness in 2011/12 and high volume manufacturing readiness in 2013. The technical specifications, tooling scenarios, and development plans were produced by a SEMATECH-led technical working group with broad industry participation from material suppliers, tool suppliers, mask houses, integrated device manufacturers, and consortia. The paper summarizes this technical working group's assessment of existing blank and mask inspection/review infrastructure capabilities to support pilot line introduction and outlines infrastructure development requirements and tooling strategies to support high volume manufacturing.
An application of eddy current damping effect on single point diamond turning of titanium alloys
NASA Astrophysics Data System (ADS)
Yip, W. S.; To, S.
2017-11-01
Titanium alloys such as Ti6Al4V (TC4) have been widely applied in many industries. They have superior material properties, including an excellent strength-to-weight ratio and corrosion resistance. However, they are regarded as difficult-to-cut materials; serious tool wear, a high level of cutting vibration and low surface integrity are always involved in machining processes, especially in ultra-precision machining (UPM). In this paper, a novel hybrid machining technology using the eddy current damping effect is introduced for the first time in UPM to suppress machining vibration and improve the machining performance of titanium alloys. A magnetic field was superimposed on the samples during single point diamond turning (SPDT) by placing the samples between two permanent magnets. When the titanium alloys were rotated within the magnetic field in SPDT, an eddy current was generated inside the titanium alloys by the stationary magnetic field. The eddy current generated its own magnetic field opposing the external magnetic field, producing a repulsive force that compensated for the machining vibration induced by the turning process. The experimental results showed a remarkable improvement in cutting force variation, a significant reduction in adhesive tool wear and extremely long chip formation in comparison to normal SPDT of titanium alloys, suggesting the enhancement of the machinability of titanium alloys using the eddy current damping effect. The eddy current damping effect is thus introduced for the first time in the area of UPM, delivering outstanding machining performance.
Tako, Elad; Bar, Haim; Glahn, Raymond P.
2016-01-01
Research methods that predict Fe bioavailability for humans can be extremely useful in evaluating food fortification strategies, developing Fe-biofortified enhanced staple food crops and assessing the Fe bioavailability of meal plans that include such crops. In this review, research from four recent poultry (Gallus gallus) feeding trials coupled with in vitro analyses of Fe-biofortified crops will be compared to the parallel human efficacy studies which used the same varieties and harvests of the Fe-biofortified crops. Similar to the human studies, these trials were aimed at assessing the potential effects of regular consumption of these enhanced staple crops on maintenance or improvement of iron status. The results demonstrate a strong agreement between the in vitro/in vivo screening approach and the parallel human studies. These observations therefore indicate that the in vitro/Caco-2 cell and Gallus gallus models can be integral tools to develop varieties of staple food crops and predict their effect on iron status in humans. The cost-effectiveness of this approach also means that it can be used to monitor the nutritional stability of a Fe-biofortified crop once a variety has been released and integrated into the food system. These screening tools therefore represent a significant advancement in the field of crop development and can be applied to ensure the sustainability of the biofortification approach. PMID:27869705
Gage for micromachining system
Miller, Donald M.
1979-02-27
A gage for measuring the contour of the surface of an element of a micromachining tool system and of a workpiece machined by the micromachining tool system. The gage comprises a glass plate containing two electrical contacts and supporting a steel ball resting against the contacts. As the element or workpiece is moved against the steel ball, the very slight contact pressure causes an extremely small movement of the steel ball which breaks the electrical circuit between the two contacts. The contour information is supplied to a dedicated computer controlling the micromachining tool so that the computer knows the contour of the element and the workpiece to an accuracy of ±25 nm. The micromachining tool system with X- and omega-axes is used to machine spherical, aspherical, and irregular surfaces with a maximum contour error of 100 nanometers (nm) and surface waviness of no more than 0.8 nm RMS.
EUVL mask dual pods to be used for mask shipping and handling in exposure tools
NASA Astrophysics Data System (ADS)
Gomei, Yoshio; Ota, Kazuya; Lystad, John; Halbmair, Dave; He, Long
2007-03-01
The concept of Extreme Ultra-Violet Lithography (EUVL) mask dual pods is proposed for use in both mask shipping and handling in exposure tools. The inner pod was specially designed to protect masks from particle contamination during shipping from mask houses to wafer factories. It can be installed in a load-lock chamber of exposure tools and evacuated while holding the mask inside. The inner pod upper cover is removed just before the mask is installed on the mask stage. Prototypes were manufactured and tested for shipping and for vacuum cycling. We counted particle adders throughout these operations at a detection threshold of 54 nm and above. The adder count was close to zero; that is, the result is within the noise level of our present evaluation environment. This indicates that the present concept is highly feasible for EUVL mask shipping and handling in exposure tools.
Hänsch, Theodor W.
2018-05-23
For more than three decades, the quest for ever higher precision in laser spectroscopy of the simple hydrogen atom has inspired many advances in laser, optical, and spectroscopic techniques, culminating in femtosecond laser optical frequency combs as perhaps the most precise measuring tools known to man. Applications range from optical atomic clocks and tests of QED and relativity to searches for time variations of fundamental constants. Recent experiments are extending frequency comb techniques into the extreme ultraviolet. Laser frequency combs can also control the electric field of ultrashort light pulses, creating powerful new tools for the emerging field of attosecond science.
Extremely high data-rate, reliable network systems research
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Maly, Kurt J.; Mukkamala, R.; Murray, Nicholas D.; Overstreet, C. Michael
1990-01-01
Significant progress was made over the year in the four focus areas of this research group: gigabit protocols, extensions of metropolitan protocols, parallel protocols, and distributed simulations. Two activities, a network management tool and the Carrier Sense Multiple Access with Collision Detection (CSMA/CD) protocol, have been developed to the point that a patent will be applied for in the next year; a tool set for distributed simulation using the language SIMSCRIPT also has commercial potential and is to be further refined. The year's results for each of these areas are summarized and next year's activities are described.
DIY: "Do Imaging Yourself" - Conventional microscopes as powerful tools for in vivo investigation.
Antunes, Maísa Mota; Carvalho, Érika de; Menezes, Gustavo Batista
2018-01-01
Intravital imaging has been increasingly employed in cell biology studies and it is becoming one of the most powerful tools for in vivo investigation. Although some protocols can be extremely complex, most intravital imaging procedures can be performed using basic surgery and animal maintenance techniques. More importantly, regular confocal microscopes - the same that are used for imaging immunofluorescence slides - can also acquire high quality intravital images and movies after minor adaptations. Here we propose minimal adaptations in stock microscopes that allow major improvements in different fields of scientific investigation. Copyright © 2017 Elsevier Ltd. All rights reserved.
Recent advances in genetic modification systems for Actinobacteria.
Deng, Yu; Zhang, Xi; Zhang, Xiaojuan
2017-03-01
Actinobacteria are extremely important to human health, agriculture, and forests. Because the characteristics of Actinobacteria vary widely, many genetic tools have been developed for manipulating them efficiently. Although there are many successful examples of engineering Actinobacteria, they remain more difficult to manipulate genetically than model microorganisms such as Saccharomyces cerevisiae, Escherichia coli, and Bacillus subtilis, owing to their diverse genomics and biochemical machinery. Here, we review the methods used to introduce heterologous DNA into Actinobacteria and the available genetic modification tools. The trends and problems in engineering Actinobacteria are also covered.
Evaluation of the Plastic Surgery In-Service Training Exam: Lower Extremity Questions.
Silvestre, Jason; Basta, Marten N; Serletti, Joseph M; Chang, Benjamin
2015-01-01
To facilitate the training of plastic surgery residents, we analyzed a knowledge-based curriculum for plastic and reconstructive surgery of the lower extremity. The Plastic Surgery In-Service Training Exam (PSITE) is a commonly used tool to assess medical knowledge in plastic surgery. We reviewed the lower extremity content on 6 consecutive score keys (2008-2013). Questions were classified by taxonomy, anatomy, and subject. Answer references were quantified by source and relative year of publication. In total, 107 questions (9.1% of all questions) related to the lower extremity, and 14 of these had an associated image (13.1%). Questions required decision making (49%) over interpretation (36%) and direct recall (15%) skills (p < 0.001). Conditions of the leg (42.1%) and thigh (24.3%) constituted most of the questions. Subject matter focused on flap reconstruction (38.3%), nerve injury (8.4%), and congenital deformity (6.5%). Analysis of 263 citations to 66 unique journals showed that Plastic and Reconstructive Surgery (54.9%) was the highest yield primary source. The median interval from publication to PSITE administration was 6 years (range: 1-58), with a mode of 2 years. Plastic Surgery by Mathes et al. was the most referenced textbook (21.9%). These data establish a benchmark for lower extremity training during plastic surgery residency. Study efforts focused on the most common topics and references will enhance trainee preparation for lower extremity PSITE questions. Copyright © 2015 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
Greve, Julia Maria D'Andréa; Santos, Luciana; Alonso, Angelica Castilho; Tate, Denise G
2015-01-01
Assessing the driving abilities of individuals with disabilities is often a very challenging task because each medical condition is accompanied by physical impairments and because relative individual functional performance may vary depending on personal characteristics. We identified existing driving evaluation modalities for able-bodied and lower extremity-impaired subjects (spinal cord injury patients and amputees) and evaluated the potential relationships between driving performance and the motor component of driving. An extensive scoping review of the literature was conducted to identify driving assessment tools that are currently used for able-bodied individuals and for those with spinal cord injury or lower extremity amputation. The literature search focused on the assessment of the motor component of driving. References were electronically obtained via Medline from the PubMed, Ovid, Web of Science and Google Scholar databases. This article compares the current assessments of driving performance for those with lower extremity impairments with the assessments used for able-bodied persons. Very few articles were found concerning “Lower Extremity Disabilities,” thus confirming the need for further studies that can provide evidence and guidance for such assessments in the future. Little is known about the motor component of driving and its association with the other driving domains, such as vision and cognition. The available research demonstrates the need for a more evidenced-based understanding of how to best evaluate persons with lower extremity impairment. PMID:26375567
Guo, Yuming; Li, Shanshan; Zhang, Yanshen; Armstrong, Ben; Jaakkola, Jouni J K; Tong, Shilu; Pan, Xiaochuan
2013-02-01
To examine the effects of extremely cold and hot temperatures on ischaemic heart disease (IHD) mortality in five cities (Beijing, Tianjin, Shanghai, Wuhan and Guangzhou) in China; and to examine the time relationships between cold and hot temperatures and IHD mortality for each city. A negative binomial regression model combined with a distributed lag non-linear model was used to examine city-specific temperature effects on IHD mortality up to 20 lag days. A meta-analysis was used to pool the cold effects and hot effects across the five cities. A total of 16,559 IHD deaths were monitored by a sentinel surveillance system in the five cities during 2004-2008. The relationships between temperature and IHD mortality were non-linear in all five cities. The minimum-mortality temperatures in northern cities were lower than in southern cities. In Beijing, Tianjin and Guangzhou, the effects of extremely cold temperatures were delayed, while Shanghai and Wuhan had immediate cold effects. The effects of extremely hot temperatures appeared immediately in all the cities except Wuhan. Meta-analysis showed that IHD mortality increased 48% at the 1st percentile of temperature (extremely cold temperature) compared with the 10th percentile, while IHD mortality increased 18% at the 99th percentile of temperature (extremely hot temperature) compared with the 90th percentile. Results indicate that both extremely cold and hot temperatures increase IHD mortality in China. Each city has its own pattern of temperature effects on IHD mortality. Policies for responding to climate change should consider local climate-IHD mortality relationships.
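As a rough illustration of the modelling approach named in the abstract (not the authors' code, which presumably implemented a full distributed lag non-linear model), the sketch below fits a negative binomial regression of daily IHD deaths on simple lagged temperature terms. The column names (`deaths`, `tmean`) and the synthetic data are assumptions for illustration; a real analysis would add spline bases for temperature and lag, plus seasonality and trend controls.

```python
# Minimal sketch: negative binomial regression of daily deaths on lagged temperature,
# a crude stand-in for the distributed lag non-linear model described in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def fit_lagged_temperature_model(df: pd.DataFrame, max_lag: int = 20):
    """df must contain daily columns 'deaths' and 'tmean' (mean temperature)."""
    data = df.copy()
    lag_cols = []
    for lag in range(max_lag + 1):
        col = f"tmean_lag{lag}"
        data[col] = data["tmean"].shift(lag)   # temperature 'lag' days earlier
        lag_cols.append(col)
    data = data.dropna()
    X = sm.add_constant(data[lag_cols])
    y = data["deaths"]
    model = sm.GLM(y, X, family=sm.families.NegativeBinomial())
    return model.fit()

# Synthetic example data, for illustration only
rng = np.random.default_rng(0)
n = 1500
tmean = 15 + 10 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 3, n)
deaths = rng.poisson(10 + 0.2 * np.clip(5 - tmean, 0, None))
result = fit_lagged_temperature_model(pd.DataFrame({"deaths": deaths, "tmean": tmean}))
print(result.params.head())
```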
Extreme Events: low and high total ozone over Arosa, Switzerland
NASA Astrophysics Data System (ADS)
Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Stübi, R.; Weihs, P.; Holawe, F.; Peter, T.; Davison, A. C.
2009-04-01
The frequency distribution of days with extreme low (termed ELOs) and high (termed EHOs) total ozone is analyzed for the world's longest total ozone record (Arosa, Switzerland - for details see Staehelin et al.,1998a,b), with new tools from extreme value theory (e.g. Coles, 2001; Ribatet, 2007). A heavy-tail focused approach is used through the fitting of the Generalized Pareto Distribution (GPD) to the Arosa time series. Asymptotic arguments (Pickands, 1975) justify the use of the GPD for modeling exceedances over a high (or below a low) enough threshold (Coles, 2001). The analysis shows that the GPD is appropriate for modeling the frequency distribution in total ozone above or below a mathematically well-defined threshold. While previous studies focused on so termed ozone mini-holes and mini-highs (e.g. Bojkov and Balis, 2001, Koch et al., 2005), this study is the first to present a mathematical description of extreme events in low and high total ozone for a northern mid-latitudes site (Rieder et al., 2009). The results show (a) an increase in days with extreme low (ELOs) and (b) a decrease in days with extreme high total ozone (EHOs) during the last decades, (c) that the general trend in total ozone is strongly determined by these extreme events and (d) that fitting the GPD is an appropriate method for the estimation of the frequency distribution of so-called ozone mini-holes. Furthermore, this concept allows one to separate the effect of Arctic ozone depletion from that of in situ mid-latitude ozone loss. As shown by this study, ELOs and EHOs have a strong influence on mean values in total ozone and the "extremes concept" could be further used also for validation of Chemistry-Climate-Models (CCMs) within the scientific community. References: Bojkov, R. D., and Balis, D.S.: Characteristics of episodes with extremely low ozone values in the northern middle latitudes 1975-2000, Ann. Geophys., 19, 797-807, 2001. Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Koch, G., H. Wernli, C. Schwierz, J. Staehelin, and T. Peter (2005), A composite study on the structure and formation of ozone miniholes and minihighs over central Europe, Geophys. Res. Lett., 32, L12810, doi:10.1029/2004GL022062. Pickands, J.: Statistical-Inference using extreme order Statistics, Ann. Stat., 3, 1, 119-131, 1975. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
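The peaks-over-threshold machinery described above can be illustrated with a short sketch (the authors worked with the R package POT; this is a generic Python analogue, not their code). It fits a Generalized Pareto Distribution to exceedances of a synthetic total ozone series over a high threshold, and handles extreme lows by fitting deficits below a low threshold; thresholds and data are illustrative assumptions.

```python
# Minimal peaks-over-threshold sketch: fit a Generalized Pareto Distribution (GPD)
# to exceedances of daily total ozone over a high threshold. For extreme lows (ELOs),
# the same machinery is applied to deficits below a low threshold.
import numpy as np
from scipy import stats

def fit_gpd_exceedances(values: np.ndarray, threshold: float):
    """Fit a GPD to (values - threshold) for values above the threshold."""
    excess = values[values > threshold] - threshold
    # loc fixed at 0 because excesses are measured from the threshold itself
    shape, loc, scale = stats.genpareto.fit(excess, floc=0)
    return shape, scale, excess.size

# Synthetic "total ozone" series in Dobson units, for illustration only
rng = np.random.default_rng(42)
ozone = rng.normal(330, 40, size=20000)

high_threshold = np.percentile(ozone, 99)      # EHO-style threshold
shape, scale, n_exc = fit_gpd_exceedances(ozone, high_threshold)
print(f"EHO fit: shape={shape:.3f}, scale={scale:.1f}, exceedances={n_exc}")

low_threshold = np.percentile(ozone, 1)        # ELO-style threshold (work with deficits)
shape, scale, n_exc = fit_gpd_exceedances(-ozone, -low_threshold)
print(f"ELO fit: shape={shape:.3f}, scale={scale:.1f}, exceedances={n_exc}")
```

The fitted shape and scale parameters then determine return levels and exceedance probabilities for the extreme tail, which is the basis of the frequency analysis described in the abstract.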
Do Extremely Violent Juveniles Respond Differently to Treatment?
Asscher, Jessica J.; Deković, M.; Van den Akker, Alithe L.; Prins, Pier J. M.; Van der Laan, Peter H.
2016-01-01
This study increases knowledge on the effectiveness of treatment for extremely violent (EV) youth by investigating their response to multisystemic therapy (MST). Using data from a randomized controlled trial on the effectiveness of MST, we investigated differences in treatment response between EV youth and not extremely violent (NEV) youth. Pre- to post-treatment comparison indicated that MST was equally effective for EV and NEV youth, whereas treatment as usual was not effective for either group. Growth curves of within-treatment changes indicated that EV youth responded differently to MST than NEV youth. The within-treatment change for EV youth was non-linear: initially they showed a deterioration, but after one month EV juveniles responded positively to MST, indicating that longer-lasting, intensive programs may be effective in treating extreme violence. PMID:27794135
Reusable science tools for analog exploration missions: xGDS Web Tools, VERVE, and Gigapan Voyage
NASA Astrophysics Data System (ADS)
Lee, Susan Y.; Lees, David; Cohen, Tamar; Allan, Mark; Deans, Matthew; Morse, Theodore; Park, Eric; Smith, Trey
2013-10-01
The Exploration Ground Data Systems (xGDS) project led by the Intelligent Robotics Group (IRG) at NASA Ames Research Center creates software tools to support multiple NASA-led planetary analog field experiments. The two primary tools that fall under the xGDS umbrella are the xGDS Web Tools (xGDS-WT) and Visual Environment for Remote Virtual Exploration (VERVE). IRG has also developed a hardware and software system, called Gigapan Voyage, that is closely integrated with our xGDS tools and is used in multiple field experiments. xGDS-WT, VERVE, and Gigapan Voyage are examples of IRG projects that improve the ratio of science return versus development effort by creating generic and reusable tools that leverage existing technologies in both hardware and software. xGDS Web Tools provides software for gathering and organizing mission data for science and engineering operations, including tools for planning traverses, monitoring autonomous or piloted vehicles, visualization, documentation, analysis, and search. VERVE provides high-performance three-dimensional (3D) user interfaces used by scientists, robot operators, and mission planners to visualize robot data in real time. Gigapan Voyage is a gigapixel image capturing and processing tool that improves situational awareness and scientific exploration in human and robotic analog missions. All of these technologies emphasize software reuse and leverage open source and/or commercial-off-the-shelf tools to greatly improve the utility and reduce the development and operational cost of future similar technologies. Over the past several years these technologies have been used in many NASA-led robotic field campaigns including the Desert Research and Technology Studies (DRATS), the Pavilion Lake Research Project (PLRP), the K10 Robotic Follow-Up tests, and most recently we have become involved in the NASA Extreme Environment Mission Operations (NEEMO) field experiments. A major objective of these joint robot and crew experiments is to improve NASA's understanding of how to most effectively execute and increase science return from exploration missions. This paper focuses on an integrated suite of xGDS software and compatible hardware tools: xGDS Web Tools, VERVE, and Gigapan Voyage, how they are used, and the design decisions that were made to allow them to be easily developed, integrated, tested, and reused by multiple NASA field experiments and robotic platforms.
NASA Astrophysics Data System (ADS)
Rosendahl, D. H.; Ćwik, P.; Martin, E. R.; Basara, J. B.; Brooks, H. E.; Furtado, J. C.; Homeyer, C. R.; Lazrus, H.; Mcpherson, R. A.; Mullens, E.; Richman, M. B.; Robinson-Cook, A.
2017-12-01
Extreme precipitation events cause significant damage to homes, businesses, infrastructure, and agriculture, as well as many injuries and fatalities as a result of fast-moving water or waterborne diseases. In the USA, these natural hazard events claimed the lives of more than 300 people during 2015-2016 alone, with total damage reaching $24.4 billion. Prior studies of extreme precipitation events have focused on the sub-daily to sub-weekly timeframes. However, many decisions for planning, preparing and resilience-building require sub-seasonal to seasonal timeframes (S2S; 14 to 90 days), but adequate forecasting tools for prediction do not exist. Therefore, the goal of this newly funded project is an enhancement in understanding of the large-scale forcing and dynamics of S2S extreme precipitation events in the United States, and improved capability for modeling and predicting such events. Here, we describe the project goals, objectives, and research activities that will take place over the next 5 years. In this project, a unique team of scientists and stakeholders will identify and understand weather and climate processes connected with the prediction of S2S extreme precipitation events by answering these research questions: 1) What are the synoptic patterns associated with, and characteristic of, S2S extreme precipitation events in the contiguous U.S.? 2) What role, if any, do large-scale modes of climate variability play in modulating these events? 3) How predictable are S2S extreme precipitation events across temporal scales? 4) How do we create an informative prediction of S2S extreme precipitation events for policymaking and planning? This project will use observational data, high-resolution radar composites, dynamical climate models and workshops that engage stakeholders (water resource managers, emergency managers and tribal environmental professionals) in co-production of knowledge. The overarching result of this project will be predictive models that reduce the societal and economic impacts of extreme precipitation events. Other outcomes will include statistical and co-production frameworks, which could be applied across other meteorological extremes, all time scales and in other parts of the world to increase resilience to extreme meteorological events.
NASA Astrophysics Data System (ADS)
Qiu, Hong; Tian, Linwei; Ho, Kin-fai; Yu, Ignatius T. S.; Thach, Thuan-Quoc; Wong, Chit-Ming
2016-05-01
The short-term effects of ambient cold temperature on mortality have been well documented in the literature worldwide. However, less is known about which subpopulations are more vulnerable to death related to extreme cold. We aimed to examine the personal characteristics and underlying causes of death that modified the association between extreme cold and mortality using a case-only approach. Individual information on 197,680 deaths from natural causes, daily temperature, and air pollution concentrations in the cool season (November-April) during 2002-2011 in Hong Kong was collected. Extreme cold was defined as days preceded by a week in which the daily maximum temperature was at or below the 1st percentile of its distribution. Logistic regression models were used to estimate effect modification, further controlling for age, seasonal pattern, and air pollution. Sensitivity analyses were conducted using the 5th percentile as the cutoff point for defining extreme cold. Subjects aged 85 and older were more vulnerable to extreme cold, with an odds ratio (OR) of 1.33 (95% confidence interval (CI), 1.22-1.45). A greater risk of extreme cold-related mortality was observed for total cardiorespiratory diseases and several specific causes including hypertensive diseases, stroke, congestive heart failure, chronic obstructive pulmonary disease (COPD), and pneumonia. Hypertensive diseases exhibited the greatest vulnerability to extreme cold exposure, with an OR of 1.37 (95% CI, 1.13-1.65). Sensitivity analyses showed the robustness of these effect modifications. This evidence on which subpopulations are vulnerable to the adverse effects of extreme cold is important to inform public health measures to minimize those effects.
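The case-only design described above can be sketched in a few lines (this is not the authors' code, and the variable names and synthetic data are assumptions): among deaths only, a modifier indicator such as age >= 85 is regressed on an extreme-cold-day indicator, and the exponentiated coefficient approximates the odds ratio for effect modification.

```python
# Minimal case-only sketch: exp(coefficient) estimates the effect-modification OR.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 5000
deaths = pd.DataFrame({
    "age": rng.integers(40, 100, n),
    # 1 if the week preceding death had max temperature <= 1st percentile
    "extreme_cold": rng.binomial(1, 0.05, n),
})
# Build in extra cold-related risk for the oldest group, for illustration only
old = (deaths["age"] >= 85).astype(int)
deaths.loc[(old == 1) & (rng.random(n) < 0.02), "extreme_cold"] = 1

X = sm.add_constant(deaths["extreme_cold"])
model = sm.Logit(old, X).fit(disp=0)
or_85plus = np.exp(model.params["extreme_cold"])
ci_low, ci_high = np.exp(model.conf_int().loc["extreme_cold"])
print(f"OR for age>=85 on extreme-cold days: {or_85plus:.2f} ({ci_low:.2f}-{ci_high:.2f})")
```

A real analysis would, as in the abstract, additionally adjust for age, seasonal pattern, and air pollution.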
Advanced imaging in acute and chronic deep vein thrombosis
Karande, Gita Yashwantrao; Sanchez, Yadiel; Baliyan, Vinit; Mishra, Vishala; Ganguli, Suvranu; Prabhakar, Anand M.
2016-01-01
Deep venous thrombosis (DVT) affecting the extremities is a common clinical problem. Prompt imaging aids in rapid diagnosis and adequate treatment. While ultrasound (US) remains the workhorse of detection of extremity venous thrombosis, CT and MRI are commonly used as the problem-solving tools either to visualize the thrombosis in central veins like superior or inferior vena cava (IVC) or to test for the presence of complications like pulmonary embolism (PE). The cross-sectional modalities also offer improved visualization of venous collaterals. The purpose of this article is to review the established modalities used for characterization and diagnosis of DVT, and further explore promising innovations and recent advances in this field. PMID:28123971
Gordillo, Gayle M; Sen, Chandan K
2009-06-01
Topical oxygen therapy provides another tool in the armamentarium of clinicians treating refractory lower extremity wounds. Devices suitable for providing topical oxygen therapy in a clinical setting have recently become available. This article reviews the evidence to justify the use of this treatment modality, including in vitro, preclinical, and clinical data. It also provides a protocol for how to administer topical oxygen therapy as well as guidance on patient selection and management to optimize outcomes. Randomized controlled trials have not yet been reported and are clearly necessary. The current body of evidence suggests that topical oxygen therapy may be considered as a second line of therapy for refractory wounds.
Extreme-value statistics of work done in stretching a polymer in a gradient flow.
Vucelja, M; Turitsyn, K S; Chertkov, M
2015-02-01
We analyze the statistics of work generated by a gradient flow to stretch a nonlinear polymer. We obtain the large deviation function (LDF) of the work in the full range of appropriate parameters by combining analytical and numerical tools. The LDF shows two distinct asymptotes: "near tails" are linear in work and dominated by coiled polymer configurations, while "far tails" are quadratic in work and correspond to preferentially fully stretched polymers. We find the extreme value statistics of work for several singular elastic potentials, as well as the mean and the dispersion of work near the coil-stretch transition. The dispersion shows a maximum at the transition.
Perspectives on biotechnological applications of archaea
Schiraldi, Chiara; Giuliano, Mariateresa; De Rosa, Mario
2002-01-01
Many archaea colonize extreme environments. They include hyperthermophiles, sulfur-metabolizing thermophiles, extreme halophiles and methanogens. Because extremophilic microorganisms have unusual properties, they are a potentially valuable resource in the development of novel biotechnological processes. Despite extensive research, however, there are few existing industrial applications of either archaeal biomass or archaeal enzymes. This review summarizes current knowledge about the biotechnological uses of archaea and archaeal enzymes with special attention to potential applications that are the subject of current experimental evaluation. Topics covered include cultivation methods, recent achievements in genomics, which are of key importance for the development of new biotechnological tools, and the application of wild-type biomasses, engineered microorganisms, enzymes and specific metabolites in particular bioprocesses of industrial interest. PMID:15803645
Laser beam shaping for studying thermally induced damage
NASA Astrophysics Data System (ADS)
Masina, Bathusile N.; Bodkin, Richard; Mwakikunga, Bonex; Forbes, Andrew
2011-10-01
This paper presents an implementation of a laser beam shaping system for both heating a diamond tool and measuring the resulting temperature optically. The influence the initial laser parameters have on the resultant temperature profiles is shown experimentally and theoretically. A CO2 laser beam was used as the source to raise the temperature of the diamond tool and the resultant temperature was measured by using the blackbody principle. We have successfully transformed a Gaussian beam profile into a flat-top beam profile by using a diffractive optical element as a phase element in conjunction with a Fourier transforming lens. In this paper, we have successfully demonstrated temperature profiles across the diamond tool surface using two laser beam profiles and two optical setups, thus allowing a study of temperature influences with and without thermal stress. The generation of such temperature profiles on the diamond tool in the laboratory is important in the study of changes that occur in diamond tools, particularly the reduced efficiency of such tools in applications where extreme heating due to friction is expected.
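The Gaussian-to-flat-top conversion mentioned above relies on a phase-only diffractive element followed by a Fourier-transforming lens. As a generic, heavily simplified illustration (not the authors' DOE design; grid size, beam widths, and iteration count are assumptions, and no physical scaling is applied), the sketch below runs a Gerchberg-Saxton-style loop to find such a phase mask numerically.

```python
# Minimal Gerchberg-Saxton sketch: find a phase-only element that reshapes a
# Gaussian input beam into an approximately flat-top intensity in the focal plane
# of a Fourier-transforming lens. Purely numerical illustration, no units.
import numpy as np

N = 256
x = np.linspace(-1, 1, N)
X, Y = np.meshgrid(x, x)
R2 = X**2 + Y**2

gaussian_amp = np.exp(-R2 / (2 * 0.25**2))          # input beam amplitude
target_amp = (R2 < 0.15**2).astype(float)           # desired flat-top in focal plane

phase = np.zeros((N, N))
for _ in range(100):
    # Propagate to the focal plane (the lens performs a Fourier transform)
    focal = np.fft.fftshift(np.fft.fft2(gaussian_amp * np.exp(1j * phase)))
    # Impose the target intensity, keep the computed phase
    focal = target_amp * np.exp(1j * np.angle(focal))
    # Propagate back and keep only the phase (phase-only DOE constraint)
    back = np.fft.ifft2(np.fft.ifftshift(focal))
    phase = np.angle(back)

result = np.abs(np.fft.fftshift(np.fft.fft2(gaussian_amp * np.exp(1j * phase))))**2
inside = result[R2 < 0.12**2]
print("flat-top uniformity (std/mean inside target):", inside.std() / inside.mean())
```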
Zwaan, Eva M; IJsselmuiden, Alexander J J; van Rosmalen, Joost; van Geuns, Robert-Jan M; Amoroso, Giovanni; Moerman, Esther; Ritt, Marco J P F; Schreuders, Ton A R; Kofflard, Marcel J M; Holtzer, Carlo A J
2016-12-01
The aim of this study is to provide complete insight into the access-site morbidity and upper extremity function after Transradial Percutaneous Coronary Intervention (TR-PCI). In percutaneous coronary intervention the Transradial Approach (TRA) is gaining popularity as a default technique. It is a very promising technique with respect to post-procedure complications, but the exact effects of TRA on upper extremity function are unknown. The effects of trAnsRadial perCUtaneouS coronary intervention on upper extremity function (ARCUS) trial is a multicenter prospective cohort study that will be conducted in all patients admitted for TR-PCI. Clinical outcomes will be monitored during a follow-up of 6 months, with the primary endpoint at two weeks of follow-up. To investigate complete upper extremity function, a combination of physical examinations and validated questionnaires will be used to provide information on anatomical integrity, strength, range of motion (ROM), coordination, sensibility, pain, and functioning in everyday life. Procedural and material specifications will be registered in order to include all possible aspects influencing upper extremity function. Results from this study will elucidate the effect of TR-PCI on upper extremity function. This creates the opportunity to further optimize TR-PCI, to make improvements in functional outcome and to prevent morbidity regarding full upper extremity function. © 2016 Wiley Periodicals, Inc.
2017-08-20
Effect of Extreme Cold Treatment on Morphology and Behavior of Hydrogels and Microgels (UNCLASSIFIED). Background: stimuli-responsive hydrogel systems... particularly for cold weather and Arctic uniforms; the effect of extreme cold on gel responsiveness, however, is not well studied. This project seeks... to understand the effect of cold temperature (down to -80 °C) on hydrogel and microgel particle properties and response to thermal stimuli. We...
Can Terrestrial Microbes Grow on Mars?
NASA Technical Reports Server (NTRS)
Rothschild, Lynn
2012-01-01
The theme for AbSciCon 2012 is "Exploring Life: Past and Present, Near and Far." The conference will address our current understanding of life - from processes at the molecular level to those which operate at planetary scales. Studying these aspects of life on Earth provides an essential platform from which to examine the potential for life on other worlds, both within our solar system and beyond. Mars exhibits a variety of extreme environments characterized by high UV and ionizing radiation flux, a low-pressure anoxic atmosphere, scarce or absent liquid water, extremely low temperatures, etc. The ability of terrestrial microorganisms to survive and adapt to the Mars environment has profound implications for astrobiology, planetary protection, and Mars life detection missions. At the NASA Ames Synthetic Biology Initiative, we believe that synthetic biology has the potential to revolutionize human space exploration. As such, the initiative is dedicated to applying the tools and techniques of synthetic biology to space exploration and astrobiology. Biological solutions will be invaluable for space exploration because they are not resource intensive, and they are versatile and self-renewing. An understanding of how to work with DNA in an unfavorable environment is paramount to utilizing biological tools on space missions. Furthermore, the ability to adjust life to the parameters of Mars is vital both to discovering what life on Mars might look like, and to using biological tools under such conditions. As a first step, we need an energy-efficient, low-cost means of transporting, storing, and protecting genomic DNA, DNA parts, and whole microbial strains. Our goal is to develop and demonstrate viable and superior alternatives to standard DNA storage methods, which can be optimized to the conditions of space exploration, using synthetic biology as a tool. This includes protocols and kit designs for easy and repeatable DNA and strain recovery from protective storage conditions. We are constructing newly engineered genetic parts for different valuable host organisms, designed to increase long-term survival and functional retention. These methods should be applied for DNA and strain storage and transportation. In parallel, we seek inspiration from natural organisms that have developed means for survival in extreme environmental conditions. We are utilizing novel techniques for analysis of lipid biomarkers in the Antarctic Dry Valleys in order to identify resident microbes in the Antarctic soil and permafrost, as well as biomarker fossils of organisms that survived in the valleys in ages past. Through the identification of these life forms, we hope to understand and draw on new biological tools and strategies for synthetic biological applications on Mars.
Wang, Jun Feng; Wu, Xue Zhong; Xiao, Rui; Dong, Pei Tao; Wang, Chao Guang
2014-01-01
A new high-performance surface-enhanced Raman scattering (SERS) substrate with extremely high SERS activity was produced. This SERS substrate combines the advantages of an Au film over nanosphere (AuFON) substrate and Ag nanoparticles (AgNPs). A three-order-of-magnitude SERS enhancement was observed when Rhodamine 6G (R6G) was used as a probe molecule to compare the SERS effects of the new substrate and a commonly used AuFON substrate. These new SERS substrates can detect R6G down to 1 nM. The new substrate was also utilized to detect melamine, and the limit of detection (LOD) is 1 ppb. A linear relationship was also observed between the SERS intensity at the 682 cm−1 Raman peak and the logarithm of melamine concentrations ranging from 10 ppm to 1 ppb. This ultrasensitive SERS substrate is a promising tool for detecting trace chemical molecules because of its simple and effective fabrication procedure, high sensitivity and high reproducibility of the SERS effect. PMID:24886913
Shock wave interactions in hypervelocity flow
NASA Astrophysics Data System (ADS)
Sanderson, S. R.; Sturtevant, B.
1994-08-01
The impingement of shock waves on blunt bodies in steady supersonic flow is known to cause extremely high local heat transfer rates and surface pressures. Although these problems have been studied in cold hypersonic flow, the effects of dissociative relaxation processes are unknown. In this paper we report a model aimed at determining the boundaries of the possible interaction regimes for an ideal dissociating gas. Local analysis about shock wave intersection points in the pressure-flow deflection angle plane with continuation of singular solutions is the fundamental tool employed. Further, we discuss an experimental investigation of the nominally two-dimensional mean flow that results from the impingement of an oblique shock wave on the leading edge of a cylinder. The effects of variations in shock impingement geometry were visualized using differential interferometry. Generally, real gas effects are seen to increase the range of shock impingement points for which enhanced heating occurs. They also reduce the type 4 interaction supersonic jet width and influence the type 2-3 transition process.
Azizi, Abu Bakar; Choy, May Yee; Noor, Zalina Mahmood; Noorlidah, Abdullah
2015-04-01
Spent Pleurotus sajor-caju compost mixed with livestock excreta (cow dung or goat manure) was contaminated with landfill leachate and vermiremediated over 75 days. Results showed an extreme decrease in heavy metals (Cd, Cr and Pb), with up to 99.81% removal, as an effect of the vermiconversion process employing the epigeic earthworm Lumbricus rubellus. In addition, Cu and Zn increased by 15.01% to 85.63%, which was expected because these metals do not accumulate in L. rubellus and are excreted into the vermicompost. This phenomenon is attributed to the dual effects of the heavy-metal excretion period and mineralisation. Nonetheless, the increments were 50-fold below the EU and USA compost limits and the Malaysian Recommended Site Screening Levels for Contaminated Land (SSLs). Moreover, the vermicompost C:N ratio ranged from 20.65 to 22.93, so it can be an advantageous tool to revitalise insalubrious soil by acting as a soil stabiliser or conditioner. Copyright © 2015 Elsevier Ltd. All rights reserved.
PUCHEROS: a cost-effective solution for high-resolution spectroscopy with small telescopes
NASA Astrophysics Data System (ADS)
Vanzi, L.; Chacon, J.; Helminiak, K. G.; Baffico, M.; Rivinius, T.; Štefl, S.; Baade, D.; Avila, G.; Guirao, C.
2012-08-01
We present PUCHEROS, a high-resolution echelle spectrograph developed at the Center of Astro-Engineering of Pontificia Universidad Catolica de Chile to provide an effective tool for research and teaching of astronomy. The instrument is fed by a single-channel optical fibre and covers the visible range from 390 to 730 nm in one shot, reaching a spectral resolution of about 20 000. In the era of extremely large telescopes, our instrument aims to exploit the capabilities offered by small telescopes in a cost-effective way, covering the observing needs of a community of astronomers, in Chile and elsewhere, who do not necessarily need large collecting areas for their research. In particular, the instrument is well suited for long-term spectroscopic monitoring of bright variable and transient targets down to a V magnitude of about 10. We describe the instrument and present a number of test case examples of observations obtained during commissioning and early science.
Lee, Suhyun; Kim, Yumi; Lee, Byoung-Hee
2016-12-01
In the present study, we aimed to investigate the effect of virtual reality-based bilateral upper extremity training (VRBT) on paretic upper limb function and muscle strength in patients with stroke. Eighteen stroke survivors were assigned to either the VRBT group (n = 10) or the bilateral upper limb training group (BT, n = 8). Patients in the VRBT group performed bilateral upper extremity exercises in a virtual reality environment, whereas those in the BT group performed conventional bilateral upper extremity exercises. All training was conducted for 30 minutes per day, 3 days a week, for a period of 6 weeks. Patients were assessed for upper extremity function and hand strength. Compared with the BT group, the VRBT group exhibited significant improvements in upper extremity function and muscle strength (p < 0.05) after the 6-week training programme. The Box and Block test results revealed that upper extremity function and elbow flexion in hand strength were significantly improved in terms of group, time and the interaction effect of group by time. Furthermore, the VRBT group demonstrated significant improvements in upper extremity function, as measured by the Jebsen Hand Function Test and Grooved Pegboard test, and in the hand strength test, as measured by elbow extension, grip, palmar pinch, lateral pinch and tip pinch, in both time and the interaction effect of group by time. These results suggest that VRBT is a feasible and beneficial means of improving upper extremity function and muscle strength in individuals following stroke. Copyright © 2016 John Wiley & Sons, Ltd.
Detection of meteorological extreme effect on historical crop yield anomaly
NASA Astrophysics Data System (ADS)
Kim, W.; Iizumi, T.; Nishimori, M.
2017-12-01
Meteorological extremes of temperature and precipitation are a critical issue in global climate change, and studies investigating how these extremes change with the changing climate are continuously reported. However, little is understood about how such extremes (heatwaves, cold waves, droughts, and floods) affect crop yield worldwide, although some local or national reports are available. Therefore, we globally investigated the effects of extremes on the variability of the historical yields of maize, rice, soy, and wheat using a standardized index and a historical yield anomaly. For the regression analysis, the standardized index is aggregated annually in consideration of a crop calendar, and the historical yield is detrended with a 5-year moving average. Throughout this investigation, we found that the relationship between the aggregated standardized index and the historical yield anomaly shows not only positive but also negative correlations for all crops across the globe. That is, the extremes decrease crop yield as expected, but contrastingly increase it in some regions. These results help us quantify the effect of extremes on historical crop yield anomalies.
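The detrending-and-aggregation procedure described above can be sketched as follows (not the authors' pipeline; the growing-season months, index choice, and all data here are illustrative assumptions): the yield anomaly is computed against a 5-year moving average, a monthly standardized index is averaged over an assumed crop calendar, and the two are regressed against each other.

```python
# Minimal sketch: 5-year moving-average yield anomaly vs. a season-aggregated index.
import numpy as np
import pandas as pd
from scipy import stats

def yield_anomaly(yield_series: pd.Series) -> pd.Series:
    """Fractional anomaly relative to a centered 5-year moving average."""
    trend = yield_series.rolling(window=5, center=True, min_periods=3).mean()
    return (yield_series - trend) / trend

def season_aggregate(index_monthly: pd.Series, months=(5, 6, 7, 8, 9)) -> pd.Series:
    """Mean of a monthly standardized index over an assumed growing season."""
    in_season = index_monthly[index_monthly.index.month.isin(months)]
    return in_season.groupby(in_season.index.year).mean()

# Synthetic example data, for illustration only
rng = np.random.default_rng(7)
years = np.arange(1982, 2017)
yields = pd.Series(3.0 + 0.02 * np.arange(len(years)) + rng.normal(0, 0.15, len(years)),
                   index=years)
dates = pd.date_range("1982-01-01", "2016-12-01", freq="MS")
spi = pd.Series(rng.normal(0, 1, len(dates)), index=dates)

anom = yield_anomaly(yields).dropna()
agg = season_aggregate(spi)
slope, intercept, r, p, se = stats.linregress(agg.loc[anom.index], anom)
print(f"correlation r={r:.2f}, p={p:.3f}")
```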
The Effects of Load Carriage and Muscle Fatigue on Lower-Extremity Joint Mechanics
ERIC Educational Resources Information Center
Wang, He; Frame, Jeff; Ozimek, Elicia; Leib, Daniel; Dugan, Eric L.
2013-01-01
Military personnel are commonly afflicted by lower-extremity overuse injuries. Load carriage and muscular fatigue are major stressors during military basic training. Purpose: To examine effects of load carriage and muscular fatigue on lower-extremity joint mechanics during walking. Method: Eighteen men performed the following tasks: unloaded…
Monocular tool control, eye dominance, and laterality in New Caledonian crows.
Martinho, Antone; Burns, Zackory T; von Bayern, Auguste M P; Kacelnik, Alex
2014-12-15
Tool use, though rare, is taxonomically widespread, but morphological adaptations for tool use are virtually unknown. We focus on the New Caledonian crow (NCC, Corvus moneduloides), which displays some of the most innovative tool-related behavior among nonhumans. One of their major food sources is larvae extracted from burrows with sticks held diagonally in the bill, oriented with individual, but not species-wide, laterality. Among possible behavioral and anatomical adaptations for tool use, NCCs possess unusually wide binocular visual fields (up to 60°), suggesting that extreme binocular vision may facilitate tool use. Here, we establish that during natural extractions, tool tips can only be viewed by the contralateral eye. Thus, maintaining binocular view of tool tips is unlikely to have selected for wide binocular fields; the selective factor is more likely to have been to allow each eye to see far enough across the midsagittal line to view the tool's tip monocularly. Consequently, we tested the hypothesis that tool side preference follows eye preference and found that eye dominance does predict tool laterality across individuals. This contrasts with humans' species-wide motor laterality and uncorrelated motor-visual laterality, possibly because bill-held tools are viewed monocularly and move in concert with eyes, whereas hand-held tools are visible to both eyes and allow independent combinations of eye preference and handedness. This difference may affect other models of coordination between vision and mechanical control, not necessarily involving tools. Copyright © 2014 Elsevier Ltd. All rights reserved.
Preventing heat-related morbidity and mortality: new approaches in a changing climate.
O'Neill, Marie S; Carter, Rebecca; Kish, Jonathan K; Gronlund, Carina J; White-Newsome, Jalonne L; Manarolla, Xico; Zanobetti, Antonella; Schwartz, Joel D
2009-10-20
Due to global climate change, the world will, on average, experience a higher number of heat waves, and the intensity and length of these heat waves are projected to increase. Knowledge about the implications of heat exposure for human health is growing, with excess mortality and illness occurring during hot weather in diverse regions. Certain groups, including the elderly, the urban poor, and those with chronic health conditions, are at higher risk. Preventive actions include: establishing heat wave warning systems; making cool environments available (through air conditioning or other means); public education; planting trees and other vegetation; and modifying the built environment to provide proper ventilation and use materials and colors that reduce heat build-up and optimize thermal comfort. However, to inspire local prevention activities, easily understood information about the strategies' benefits needs to be incorporated into decision tools. Integrating heat health information into a comprehensive adaptation planning process can alert local decision-makers to extreme heat risks and provide the information necessary to choose strategies that yield the largest health improvements and cost savings. Tools to enable this include web-based programs that illustrate effective methods for including heat health in comprehensive local-level adaptation planning and calculate the costs and benefits of several activities; maps showing zones of high potential heat exposure and vulnerable populations in a local area; and public awareness materials and training for implementing preventive activities. A new computer-based decision tool will enable local estimates of heat-related health effects and potential savings from implementing a range of prevention strategies.
Role of storms and forest practices in sedimentation of an Oregon Coast Range lake
NASA Astrophysics Data System (ADS)
Richardson, K.; Hatten, J. A.; Wheatcroft, R. A.; Guerrero, F. J.
2014-12-01
The design of better management practices in forested watersheds to face climate change and the associated increase in the frequency of extreme events requires a better understanding of watershed responses to extreme events in the past and also under management regimes. One of the most sensitive watershed processes affected is sediment yield. Lake sediments record events which occur in a watershed and provide an opportunity to examine the interaction of storms and forest management practices in the layers of the stratigraphy. We hypothesize that timber harvesting and road building since the 1900s have resulted in increases in sedimentation; however, the passage of the Oregon Forest Practices Act (OFPA) in 1972 has led to a decrease in sedimentation. Sediment cores were taken at Loon Lake in the Oregon Coast Range. The 32-m deep lake captures sediment from a catchment highly impacted by recent land use and episodic Pacific storms. We can use sedimentological tools to measure changes in sediment production driven by extreme floods before settlement, during a major timber harvesting period, and after installation of forestry Best Management Practices. Quantification of changes in particle size and elemental composition (C, N, C/N) throughout the cores can elucidate changes in watershed response to extreme events, as can changes in layer thickness. Age control in the cores is being established by Cesium-137 and radiocarbon dating. Given the instrumental meteorological data and decadal climate reconstructions, we will disentangle climate-driven signals from changes in land use practices. The sediment shows distinct laminations and varying thickness of layers throughout the cores. Background deposition is composed of thin layers (<0.5 cm) of fine silts and clays, punctuated by thicker layers (3-25 cm) every 10 to 75 cm. These thick layers consist of distinctly textured units, generally fining upward. We interpret the thick layers in Loon Lake to be deposited by sediment-producing floods throughout much of the 1500-year lifespan of this lake. We will explore the relationship between sedimentation, land use, and climate forcing events to determine if the OFPA is having an effect on reducing sedimentation rates resulting from extreme-magnitude storm events.
NASA Astrophysics Data System (ADS)
Gaonkar, Bilwaj; Hovda, David; Martin, Neil; Macyszyn, Luke
2016-03-01
Deep learning refers to a large set of neural-network-based algorithms that have emerged as promising machine-learning tools in the general imaging and computer vision domains. Convolutional neural networks (CNNs), a specific class of deep learning algorithms, have been extremely effective in object recognition and localization in natural images. A characteristic feature of CNNs is the use of a locally connected multilayer topology inspired by the animal visual cortex (the most powerful vision system in existence). While CNNs perform admirably in object identification and localization tasks, they typically require training on extremely large datasets. Unfortunately, in medical image analysis, large datasets are either unavailable or extremely expensive to obtain. Further, the primary tasks in medical imaging are organ identification and segmentation from 3D scans, which differ from the standard computer vision task of object recognition. Thus, in order to translate the advantages of deep learning to medical image analysis, there is a need to develop deep network topologies and training methodologies that are geared towards medical imaging tasks and can work in settings where dataset sizes are relatively small. In this paper, we present a technique for stacked supervised training of deep feed-forward neural networks for segmenting organs from medical scans. Each 'neural network layer' in the stack is trained to identify a sub-region of the original image that contains the organ of interest. By layering several such stacks together, a very deep neural network is constructed. Such a network can be used to identify extremely small regions of interest in extremely large images, in spite of a lack of clear contrast in the signal or easily identifiable shape characteristics. What is even more intriguing is that the network stack achieves accurate segmentation even when it is trained on a single image with manually labelled ground truth. We validate this approach using a publicly available head and neck CT dataset. We also show that a deep neural network of similar depth, if trained directly using backpropagation, cannot achieve the tasks achieved using our layer-wise training paradigm.
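The region-narrowing idea described above can be illustrated with a small sketch. This is not the paper's exact topology or its layer-wise training scheme: it shows a two-stage coarse-to-fine cascade trained jointly on a single synthetic image, with all sizes, channel counts and the bounding-box heuristic being assumptions for illustration.

```python
# Minimal two-stage cascade sketch in PyTorch: stage 1 predicts a coarse foreground
# mask, its bounding box crops the input, and stage 2 segments inside the crop.
import torch
import torch.nn as nn
import torch.nn.functional as F

def small_fcn(in_ch=1):
    """A tiny fully convolutional block that outputs a 1-channel logit map."""
    return nn.Sequential(
        nn.Conv2d(in_ch, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 16, 3, padding=1), nn.ReLU(),
        nn.Conv2d(16, 1, 1),
    )

def bbox_from_mask(mask: torch.Tensor, margin: int = 8):
    """Bounding box (y0, y1, x0, x1) of the positive region, padded by a margin."""
    ys, xs = torch.nonzero(mask, as_tuple=True)
    if ys.numel() == 0:                       # nothing detected: keep the whole image
        return 0, mask.shape[0], 0, mask.shape[1]
    y0 = max(int(ys.min()) - margin, 0)
    y1 = min(int(ys.max()) + margin + 1, mask.shape[0])
    x0 = max(int(xs.min()) - margin, 0)
    x1 = min(int(xs.max()) + margin + 1, mask.shape[1])
    return y0, y1, x0, x1

stage1, stage2 = small_fcn(), small_fcn()

# Synthetic single training image with a small bright "organ"
image = torch.zeros(1, 1, 128, 128)
label = torch.zeros(1, 1, 128, 128)
image[..., 60:80, 70:90] = 1.0
label[..., 60:80, 70:90] = 1.0

opt = torch.optim.Adam(list(stage1.parameters()) + list(stage2.parameters()), lr=1e-3)
for step in range(50):                        # single-image training, as in the paper's premise
    opt.zero_grad()
    coarse = stage1(image)                    # stage 1: whole image -> coarse logits
    y0, y1, x0, x1 = bbox_from_mask((coarse[0, 0] > 0).detach())
    fine = stage2(image[..., y0:y1, x0:x1])   # stage 2: segment inside the crop
    loss = F.binary_cross_entropy_with_logits(coarse, label) + \
           F.binary_cross_entropy_with_logits(fine, label[..., y0:y1, x0:x1])
    loss.backward()
    opt.step()
print("final loss:", float(loss))
```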
Uehara, Kosuke; Ogura, Koichi; Akiyama, Toru; Shinoda, Yusuke; Iwata, Shintaro; Kobayashi, Eisuke; Tanzawa, Yoshikazu; Yonemoto, Tsukasa; Kawano, Hirotaka; Kawai, Akira
2017-09-01
The Musculoskeletal Tumor Society (MSTS) scoring system developed in 1993 is a widely used disease-specific evaluation tool for assessment of physical function in patients with musculoskeletal tumors; however, only a few studies have confirmed its reliability and validity. The aim of this study was to validate the MSTS scoring system for the upper extremity (MSTS-UE) in Japanese patients with musculoskeletal tumors for use by others in research. Does the MSTS-UE have: (1) sufficient reliability and internal consistency; (2) adequate construct validity; and (3) reasonable criterion validity in comparison to the Toronto Extremity Salvage Score (TESS) or SF-36? Reliability was performed using test-retest analysis, and internal consistency was evaluated with Cronbach's alpha coefficient. Construct validity was evaluated using a scree plot to confirm the construct number and the Akaike information criterion network. Criterion validity was evaluated by comparing the MSTS-UE with the TESS and SF-36. The test-retest reliability with intraclass correlation coefficient (0.95; 95% CI, 0.91-0.97) was excellent, and internal consistency with Cronbach's α (0.7; 95% CI, 0.53-0.81) was acceptable. There were no ceiling and floor effects. The Akaike Information Criterion network showed that lifting ability, pain, and dexterity played central roles among the components. The MSTS-UE showed substantial correlation with the TESS scoring scale (r = 0.75; p < 0.001) and fair correlation with the SF-36 physical component summary (r = 0.37; p = 0.007). Although the MSTS-UE showed slight correlation with the SF-36 mental component summary, the emotional acceptance component of the MSTS-UE showed fair correlation (r = 0.29; p = 0.039). We can conclude that the MSTS is not an adequate measure of general health-related quality of life; however, this system was designed mainly to be a simple measure of function in a single extremity. To evaluate the mental state of patients with musculoskeletal tumors in the upper extremity, further study is needed.
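Of the psychometric quantities named above, Cronbach's alpha is the most straightforward to reproduce. The sketch below computes it from a subjects-by-items score matrix; the synthetic scores are purely illustrative and are not the MSTS-UE data (the ICC and correlation analyses would require the actual test-retest and TESS/SF-36 scores).

```python
# Minimal sketch: Cronbach's alpha for internal consistency.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: 2-D array, rows = subjects, columns = items."""
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()     # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)       # variance of the total score
    return (k / (k - 1)) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(0, 1, size=(60, 1))              # common "function" factor
items = latent + rng.normal(0, 0.8, size=(60, 6))    # 6 MSTS-style items, 60 subjects
print(f"alpha = {cronbach_alpha(items):.2f}")
```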
Effects of Climate Change on Extreme Streamflow Risks in the Olympic National Park
NASA Astrophysics Data System (ADS)
Tohver, I. M.; Lee, S.; Hamlet, A.
2011-12-01
Conventionally, natural resource management practices are designed within the framework that past conditions serve as a baseline for future conditions. However, the warmer future climate projected for the Pacific Northwest will alter the region's flood and low flow risks, posing considerable challenges to resource managers in the Olympic National Forest (ONF) and Olympic National Park (ONP). Shifts in extreme streamflow will influence two key management objectives in the ONF and ONP: the protection of wildlife and the maintenance of road infrastructure. The ONF is charged with managing habitat for species listed under the Endangered Species Act (ESA), and with maintaining the network of forest roads and culverts. Climate-induced increases in flood severity will introduce additional challenges in road and culvert design. Furthermore, the aging road infrastructure and more extreme summer low flows will compromise aquatic habitats, intrinsic to the health of threatened and endangered fish species listed under the ESA. Current practice uses estimates of Q100 (the peak flow with an estimated 100-year return frequency) as the standard metric for stream crossing design. Simple regression models relating annual precipitation and basin area to Q100 are used in the design process. Low flow estimates are based on historical streamflow data to calculate the 7-day consecutive lowest flow with a 10-year return interval, or 7Q10. Under projections of a changing climate, these methods for estimating extreme flows are ill-equipped to capture the complex and spatially varying effects of seasonal changes in temperature, precipitation, and snowpack on extreme flow risk. As an alternative approach, this study applies a physically-based hydrologic model to estimate historical and future flood risk at 1/16th degree (latitude/longitude) resolution (about 32 km2). We downscaled climate data derived from 10 global climate models to use as input for the Variable Infiltration Capacity (VIC) model, a macro-scale hydrologic model, which simulates various hydrologic variables at a daily time step. Using the VIC estimates for baseflow and runoff, we calculated Q100 and 7Q10 for the historical period and under two emission scenarios, A1B and B1, at three future time intervals: the 2020s, the 2040s and the 2080s. We also calculated Q100 and 7Q10 at the spatial scale of the 12-digit hydrologic unit codes (HUCs) as delineated by the United States Geological Survey. The results demonstrate the sensitivity of snowpack at mid-elevation basins to a warmer climate, resulting in more severe winter flooding and lower streamflows in the summertime. These ensemble estimates of extreme streamflows will serve as a tool for management practices by providing high-resolution maps of changing risk over the ONF and ONP.
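The Q100 and 7Q10 statistics named above can be estimated from a daily flow series with a short frequency analysis. The sketch below is not the study's procedure (which used VIC-simulated flows and its own fitting choices): it uses synthetic daily flows, a Gumbel fit to annual maxima for Q100, and a log-normal fit to annual 7-day minimum flows for 7Q10; distribution choices are illustrative assumptions (practice often prefers, e.g., log-Pearson III or Weibull).

```python
# Minimal sketch: Q100 from annual peak flows, 7Q10 from 7-day low-flow minima.
import numpy as np
import pandas as pd
from scipy import stats

rng = np.random.default_rng(11)
dates = pd.date_range("1951-01-01", "2000-12-31", freq="D")
seasonal = 20 + 15 * np.sin(2 * np.pi * (dates.dayofyear / 365.25))
flow = pd.Series(np.maximum(seasonal + rng.gamma(2.0, 5.0, len(dates)) - 5, 0.5),
                 index=dates)                 # synthetic daily streamflow

# Q100: 100-year flood from annual maxima (Gumbel / EV1 fit)
annual_max = flow.groupby(flow.index.year).max()
loc, scale = stats.gumbel_r.fit(annual_max)
q100 = stats.gumbel_r.ppf(1 - 1 / 100, loc=loc, scale=scale)

# 7Q10: 10-year low flow from annual minima of the 7-day rolling mean (log-normal fit)
seven_day = flow.rolling(window=7).mean()
annual_min7 = seven_day.groupby(seven_day.index.year).min().dropna()
shape, loc2, scale2 = stats.lognorm.fit(annual_min7)
q7_10 = stats.lognorm.ppf(1 / 10, shape, loc=loc2, scale=scale2)  # 10% non-exceedance

print(f"Q100 = {q100:.1f}, 7Q10 = {q7_10:.1f} (same units as the flow series)")
```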
Control of molecular rotation with an optical centrifuge
NASA Astrophysics Data System (ADS)
Korobenko, Aleksey
2017-04-01
The main purpose of this work is an experimental study of the applicability of an optical centrifuge - a novel tool that uses non-resonant broadband laser radiation to excite molecular rotation - to produce and control molecules in extremely high rotational states, so-called molecular "super rotors", and to study their optical, magnetic, acoustic, hydrodynamic, and quantum mechanical properties.
Rock Your Classroom!: Use Subwoofers to Teach Electricity and Science
ERIC Educational Resources Information Center
Karns, Robert J.
2007-01-01
It may not seem like school is the best place to crank up the bass for glass-cracking extreme energy, but subwoofers just might make better teaching and learning tools than they do music makers. Faced with the fact that electricity is invisible, electrical technology students encounter significant challenges in the classroom when abstract concepts…
ERIC Educational Resources Information Center
Sikkens, Elga; van San, Marion; Sieckelinck, Stijn; Boeije, Hennie; de Winter, Micha
2017-01-01
Social media are useful facilitators when recruiting hidden populations for research. In our research on youth and radicalization, we were able to find and contact young people with extreme ideals through Facebook. In this article, we discuss our experiences using Facebook as a tool for finding respondents who do not trust researchers. Facebook…
ERIC Educational Resources Information Center
Batallan, Graciela; Dente, Liliana; Ritta, Loreley
2017-01-01
This article aims to open up a debate on methodological aspects of ethnographic research, arguing for the legitimacy of the information produced in a research "taller" or workshop using a participatory methodology and video production as a methodological tool. Based on the theoretical foundations and analysis of a "taller"…
USDA-ARS?s Scientific Manuscript database
Valencia peanuts (Arachis hypogaea L. ssp. fastigiata) are able to complete seed development in an environment where extreme temperature variation and water deficit are common and growing season is short. Valencia seed can command a premium in food products as consumers like special properties like...
ERIC Educational Resources Information Center
Onwuegbuzie, Anthony J.; Frels, Rebecca K.
2015-01-01
Although focus group discussions (FGDs) represent a popular data collection tool for researchers, they contain an extremely serious flaw: FGD researchers have ultimate power over all decisions made at every stage of the research process--from the conceptualization of the research, to the planning of the research study, to the implementation of the…
Draft Genome Sequence of a Bacillus Bacterium from the Atacama Desert Wetlands Metagenome
Vilo, Claudia; Galetovic, Alexandra; Araya, Jorge E.; Dong, Qunfeng
2015-01-01
We report here the draft genome sequence of a Bacillus bacterium isolated from the microflora of Nostoc colonies growing in the Andean wetlands of northern Chile. We consider this genome sequence a molecular tool for exploring microbial relationships and adaptation strategies to the extreme conditions prevailing in the Atacama Desert. PMID:26294639
Teaching Mental Abacus Calculation to Students with Mental Retardation
ERIC Educational Resources Information Center
Shen, Hong
2006-01-01
The abacus is a calculating tool that has been used in Asia for thousands of years. Mental abacus calculation is a skill in which an abacus image in the mind is used without the actual physical manipulation of the abacus. Using this method, people can perform extremely rapid and accurate mental calculations. Research indicates that abacus training…
ERIC Educational Resources Information Center
Cornelius, Cawood; Vest, Terri
2009-01-01
Looking for a "habanero" (extremely hot) lesson to engage first-year Spanish language students in an in-depth study of Spanish-speaking countries? This article offers an overview of how the authors used 21st-century tools to get students excited about the not-so-new assignment of reporting on the people and culture of another country. Chances are,…
Bringing Technology to the Resource Manager ... and Not the Reverse
Daniel L. Schmoldt
1992-01-01
Many natural resource managers envision their jobs as pressed between the resources that they have a mandate to manage and the technological aides that are essential tools to conduct those management activities. On the one hand, managers are straining to understand an extremely complex array of natural systems and the management pressures placed on those systems. Then...
How to assess extreme weather impacts - case European transport network
NASA Astrophysics Data System (ADS)
Leviäkangas, P.
2010-09-01
Assessing the impacts of climate change and preparing for them is a process, one that we must understand and learn to apply. EWENT (Extreme Weather impacts on European Networks of Transport) will be a test bench for one prospective approach. It has the following main components: 1) identifying what is "extreme", 2) assessing the change in probabilities, 3) constructing causal impact models, 4) finding appropriate methods of pricing and costing, 5) finding alternative strategy options, and 6) assessing the efficiency of those options. This process in fact follows the steps of a standardized risk management process. Each step is challenging, but if the EWENT project succeeds in assessing the impacts of extreme weather on European transport networks, it will offer one possible benchmark for carrying out similar analyses in other regions and at the country level. The EWENT approach could be particularly useful for weather and climate information service providers, offering transport authorities and financiers tools to assess weather risks and then manage those risks rationally. The EWENT project is financed by the European Commission and involves met-service organisations and transport research institutes from different parts of Europe. The presentation will explain the EWENT approach in detail and present the findings of the first work packages.
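As a loose illustration of steps 2-4 of the process listed above (changed event probabilities priced into impact costs), the sketch below compares expected annual costs under a baseline and a future scenario. The event classes, frequencies, and unit costs are invented for illustration and are not taken from EWENT.

```python
# Hypothetical expected-annual-cost comparison for one transport corridor.
# Frequencies are events per year; unit costs are EUR per event (illustrative only).
baseline = {"heavy_snow": (2.0, 150_000), "flooding": (0.5, 400_000), "heat": (1.0, 60_000)}
future   = {"heavy_snow": (1.2, 150_000), "flooding": (1.1, 400_000), "heat": (2.5, 60_000)}

def expected_annual_cost(scenario):
    """Sum of (event frequency) x (cost per event) over all event classes."""
    return sum(freq * cost for freq, cost in scenario.values())

delta = expected_annual_cost(future) - expected_annual_cost(baseline)
print(f"Change in expected annual cost: {delta:+,.0f} EUR/year")
```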
Trend of annual temperature and frequency of extreme events in the MATOPIBA region of Brazil
NASA Astrophysics Data System (ADS)
Salvador, Mozar de A.; de Brito, J. I. B.
2017-06-01
During the 1980s, a new agricultural frontier arose in Brazil, occupying parts of the states of Maranhão, Tocantins, Piauí, and Bahia. This frontier is now known as the MATOPIBA region. The region has undergone intense transformations in its social and environmental characteristics, with the emergence of extensive areas of intensive agriculture and large herds. The purpose of this research was to study the variability of temperature in the MATOPIBA region using the extreme climate indices of the ClimAp tool. Data from 11 weather stations were analyzed for annual air temperature (maximum and minimum) over the period 1970 to 2012. To detect trends in the series, we used linear regression analysis and the Kendall tau test. The annual analysis of maximum and minimum temperatures and of the temperature extreme indices showed a strong positive trend in practically every series (p value less than 0.05). These results indicate that the region has undergone significant warming in the last three decades. The extreme indices also showed a significant positive trend at most of the analyzed stations, indicating a higher frequency of warm days during the year.
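The two trend tests mentioned above (linear regression and the Kendall tau test) can be sketched as follows on a synthetic annual temperature series; the data and the implied warming rate are placeholders, not ClimAp output.

```python
import numpy as np
from scipy import stats

# Hypothetical annual-mean maximum temperature series, 1970-2012 (synthetic data)
years = np.arange(1970, 2013)
rng = np.random.default_rng(1)
tmax = 32.0 + 0.03 * (years - 1970) + rng.normal(0, 0.4, years.size)

slope, intercept, r, p_lin, stderr = stats.linregress(years, tmax)  # linear trend
tau, p_tau = stats.kendalltau(years, tmax)                          # Kendall tau trend test

print(f"trend = {slope * 10:.2f} degC/decade (p = {p_lin:.3f}); tau = {tau:.2f} (p = {p_tau:.3f})")
```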
DiMaio, F; Chiu, W
2016-01-01
Electron cryo-microscopy (cryoEM) has advanced dramatically to become a viable tool for high-resolution structural biology research. The ultimate outcome of a cryoEM study is an atomic model of a macromolecule or its complex with interacting partners. This chapter describes a variety of algorithms and software to build a de novo model based on the cryoEM 3D density map, to optimize the model with the best stereochemistry restraints and finally to validate the model with proper protocols. The full process of atomic structure determination from a cryoEM map is described. The tools outlined in this chapter should prove extremely valuable in revealing atomic interactions guided by cryoEM data. © 2016 Elsevier Inc. All rights reserved.
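The specific modeling and validation software discussed in the chapter is not reproduced here, but one widely used validation quantity, the real-space correlation between an experimental map and a map computed from the model, can be sketched with plain NumPy. The two voxel grids below are random placeholders standing in for real maps.

```python
import numpy as np

def real_space_cc(exp_map, model_map, mask=None):
    """Pearson correlation between two density maps sampled on the same voxel grid."""
    a, b = exp_map.ravel(), model_map.ravel()
    if mask is not None:
        keep = mask.ravel().astype(bool)
        a, b = a[keep], b[keep]
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float(np.mean(a * b))

# Placeholder 3D grids standing in for an experimental map and a model-derived map
rng = np.random.default_rng(2)
exp_map = rng.normal(size=(64, 64, 64))
model_map = 0.8 * exp_map + 0.2 * rng.normal(size=exp_map.shape)
print(f"map-model CC ~ {real_space_cc(exp_map, model_map):.2f}")
```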
WhatsApp in Stroke Systems: Current Use and Regulatory Concerns
Calleja-Castillo, Juan M.; Gonzalez-Calderon, Gina
2018-01-01
Smartphone use is extremely common. Applications such as WhatsApp have billions of users and physicians are no exception. Stroke Medicine is a field where instant communication among fairly large groups is essential. In developing countries, economic limitations preclude the possibility of acquiring proper communication platforms. Thus, WhatsApp has been used as an organizational tool, for sharing clinical data, and for real time guidance of clinical care decisions. It has evolved into a cheap, accessible tool for telemedicine. Nevertheless, regulatory and privacy issues must be addressed. Some countries have implemented legislation to address this issue, while others lag behind. In this article, we present an overview on the different roles WhatsApp has acquired as a clinical tool in stroke systems and the potential privacy concerns of its use. PMID:29904369
Challenges Facing Design and Analysis Tools
NASA Technical Reports Server (NTRS)
Knight, Norman F., Jr.; Broduer, Steve (Technical Monitor)
2001-01-01
The design and analysis of future aerospace systems will strongly rely on advanced engineering analysis tools used in combination with risk mitigation procedures. The implications of such a trend place increased demands on these tools to assess off-nominal conditions, residual strength, damage propagation, and extreme loading conditions in order to understand and quantify how these effects influence mission success. Advances in computer hardware such as CPU processing speed, memory, secondary storage, and visualization provide significant resources for the engineer to exploit in engineering design. The challenges facing design and analysis tools fall into three primary areas. The first area involves mechanics needs such as constitutive modeling, contact and penetration simulation, crack growth prediction, damage initiation and progression prediction, transient dynamics and deployment simulations, and solution algorithms. The second area involves computational needs such as fast, robust solvers; adaptivity for model and solution strategies; control processes for concurrent, distributed computing for uncertainty assessments; and immersive technology. Traditional finite element codes still require fast direct solvers, which, when coupled with current CPU power, enable new insight through high-fidelity modeling. The third area involves decision making by the analyst: the integration and interrogation of vast amounts of information, some global in character, while local details are critical and often drive the design. The proposed presentation will describe and illustrate these areas using composite structures, energy-absorbing structures, and inflatable space structures. While certain engineering approximations within the finite element model may be adequate for global response prediction, they generally are inadequate in a design setting or when local response prediction is critical. Pitfalls to be avoided and trends for emerging analysis tools will be described.
Evaluation of Phosphorus Site Assessment Tools: Lessons from the USA.
Sharpley, Andrew; Kleinman, Peter; Baffaut, Claire; Beegle, Doug; Bolster, Carl; Collick, Amy; Easton, Zachary; Lory, John; Nelson, Nathan; Osmond, Deanna; Radcliffe, David; Veith, Tamie; Weld, Jennifer
2017-11-01
Critical source area identification through phosphorus (P) site assessment is a fundamental part of modern nutrient management planning in the United States, yet there has been only sparse testing of the many versions of the P Index that now exist. Each P site assessment tool was developed to be applicable across a range of field conditions found in a given geographic area, making evaluation extremely difficult. In general, evaluation with in-field monitoring data has been limited, focusing primarily on corroborating manure and fertilizer "source" factors. Thus, a multiregional effort (Chesapeake Bay, Heartland, and Southern States) was undertaken to evaluate P Indices using a combination of limited field data, as well as output from simulation models (i.e., Agricultural Policy Environmental eXtender, Annual P Loss Estimator, Soil and Water Assessment Tool [SWAT], and Texas Best Management Practice Evaluation Tool [TBET]) to compare against P Index ratings. These comparisons show promise for advancing the weighting and formulation of qualitative P Index components but require careful vetting of the simulation models. Differences among regional conclusions highlight model strengths and weaknesses. For example, the Southern States region found that, although models could simulate the effects of nutrient management on P runoff, they often more accurately predicted hydrology than total P loads. Furthermore, SWAT and TBET overpredicted particulate P and underpredicted dissolved P, resulting in correct total P predictions but for the wrong reasons. Experience in the United States supports expanded regional approaches to P site assessment, assuming closely coordinated efforts that engage science, policy, and implementation communities, but limited scientific validity exists for uniform national P site assessment tools at the present time. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
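One simple way to corroborate a P Index against simulated loads, in the spirit of the comparisons described above, is a rank correlation between field-level index ratings and model-predicted P losses. The sketch below uses made-up field values, not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical field-level values: qualitative P Index ratings vs. simulated total P loss (kg/ha)
p_index_rating = np.array([12, 35, 48, 8, 60, 27, 41, 15])
simulated_p_loss = np.array([0.4, 1.1, 1.8, 0.2, 2.5, 0.9, 1.3, 0.5])

rho, p = stats.spearmanr(p_index_rating, simulated_p_loss)
print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
```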
GlycReSoft: A Software Package for Automated Recognition of Glycans from LC/MS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maxwell, Evan; Tan, Yan; Tan, Yuxiang
2012-09-26
Glycosylation modifies the physicochemical properties and protein binding functions of glycoconjugates. These modifications are biosynthesized in the endoplasmic reticulum and Golgi apparatus by a series of enzymatic transformations that are under complex control. As a result, mature glycans on a given site are heterogeneous mixtures of glycoforms. This gives rise to a spectrum of adhesive properties that strongly influences interactions with binding partners and the resulting biological effects. In order to understand the roles glycosylation plays in normal and disease processes, efficient structural analysis tools are necessary. In the field of glycomics, liquid chromatography/mass spectrometry (LC/MS) is used to profile the glycans present in a given sample. This technology enables comparison of glycan compositions and abundances among different biological samples, i.e., normal versus disease, normal versus mutant, etc. Manual analysis of glycan profiling LC/MS data is extremely time-consuming, and efficient software tools are needed to eliminate this bottleneck. In this work, we have developed a tool to computationally model LC/MS data to enable efficient profiling of glycans. Using LC/MS data deconvoluted by Decon2LS/DeconTools, we built a list of unique neutral masses corresponding to candidate glycan compositions, summarized over their various charge states, adducts, and range of elution times. Our work aims to provide confident identification of true compounds in complex data sets that are not amenable to manual interpretation. This capability is an essential part of glycomics workflows. We demonstrate this tool, GlycReSoft, using an LC/MS dataset on tissue-derived heparan sulfate oligosaccharides. The software, code, and a test data set are publicly archived under an open source license.
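As an illustration of the neutral-mass grouping step described above, the sketch below collapses deconvoluted LC/MS features into unique neutral masses within an assumed ppm tolerance. The feature tuples and tolerance are invented, and this is not GlycReSoft's actual algorithm.

```python
# Collapse deconvoluted LC/MS features into unique neutral masses (illustrative sketch).
# Each feature: (neutral_mass_Da, abundance, retention_time_min); values are hypothetical.
features = [
    (1216.423, 5.0e5, 21.3), (1216.425, 2.1e5, 21.9),   # same composition, different charge states
    (1540.512, 8.7e4, 25.0), (1540.515, 1.2e5, 25.4),
    (2078.731, 3.3e5, 30.1),
]

PPM_TOL = 20.0  # assumed mass tolerance

groups = []  # each group: [representative_mass, total_abundance, (rt_min, rt_max)]
for mass, abundance, rt in sorted(features):
    for g in groups:
        if abs(mass - g[0]) / g[0] * 1e6 <= PPM_TOL:
            g[1] += abundance
            g[2] = (min(g[2][0], rt), max(g[2][1], rt))
            break
    else:
        groups.append([mass, abundance, (rt, rt)])

for mass, total, (rt_lo, rt_hi) in groups:
    print(f"{mass:.3f} Da  abundance={total:.2e}  RT {rt_lo:.1f}-{rt_hi:.1f} min")
```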
NASA Astrophysics Data System (ADS)
Becker, A.; Burroughs, R.
2014-12-01
This presentation discusses a new method to assess vulnerability and resilience strategies for stakeholders of coastal-dependent transportation infrastructure, such as seaports. Much coastal infrastructure faces increasing risk from extreme events resulting from sea level rise and tropical storms. As seen after Hurricane Sandy, natural disasters result in economic costs, damage to the environment, and negative consequences for residents' quality of life. In the coming decades, tough decisions will need to be made about investment measures to protect critical infrastructure. Coastal communities will need to weigh the costs and benefits of a new storm barrier, for example, against those of retrofitting, elevating, or simply doing nothing. These decisions require understanding the priorities and concerns of stakeholders. For ports, these include shippers, insurers, tenants, and the ultimate consumers of port cargo on a local and global scale, all of whom have a stake in addressing port vulnerabilities. Decision-makers in exposed coastal areas need tools to understand stakeholders' concerns and their perceptions of potential resilience strategies. For ports, they need answers to: 1) How will stakeholders be affected? 2) What strategies could be implemented to build resilience? 3) How effectively would the strategies mitigate stakeholder concerns? 4) What level of time and investment would strategies require? 5) Which stakeholders could or should take responsibility? Our stakeholder-based method provides answers to questions 1-3 and forms the basis for further work to address 4 and 5. Together with an expert group, we developed a pilot study for stakeholders of Rhode Island's critical energy port, the Port of Providence. Our method uses a plausible extreme storm scenario with localized visualizations and a portfolio of potential resilience strategies. We tailor a multi-criteria decision analysis tool and, through a series of workshops, use the storm scenario, resilience strategies, and decision tool to elicit the perceptions and priorities of port stakeholders. Results provide new knowledge to help decision-makers allocate investments of time, money, and staff resources. We intend for our method to be used in other port communities around Rhode Island and in other coastal states.
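A minimal sketch of the kind of weighted-sum scoring that a multi-criteria decision analysis tool might apply to candidate resilience strategies is shown below; the criteria, weights, and scores are placeholders, not the values elicited in the workshops.

```python
import numpy as np

# Hypothetical stakeholder-elicited scores (1-5) for each strategy on each criterion,
# and criterion weights summing to 1. Values are placeholders, not study results.
criteria = ["risk reduction", "cost", "feasibility", "environmental impact"]
weights = np.array([0.4, 0.3, 0.2, 0.1])

strategies = {
    "storm barrier":      [5, 1, 2, 2],
    "elevate key assets": [4, 3, 4, 4],
    "do nothing":         [1, 5, 5, 3],
}

for name, scores in strategies.items():
    print(f"{name:20s} weighted score = {np.dot(weights, scores):.2f}")
```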