Sample records for extremely sensitive tool

  1. Estimation of resist sensitivity for extreme ultraviolet lithography using an electron beam

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oyama, Tomoko Gowa, E-mail: ohyama.tomoko@qst.go.jp; Oshima, Akihiro; Tagawa, Seiichi, E-mail: tagawa@sanken.osaka-u.ac.jp

    2016-08-15

    It is a challenge to obtain sufficient extreme ultraviolet (EUV) exposure time for fundamental research on developing a new class of high-sensitivity resists for extreme ultraviolet lithography (EUVL), because EUV exposure tools are few and very expensive. In this paper, we introduce an easy method for predicting EUV resist sensitivity by using conventional electron beam (EB) sources. If the chemical reactions induced by the two ionizing sources (EB and EUV) are the same, the absorbed energies corresponding to each required exposure dose (sensitivity) for EB and EUV should be almost equivalent. Based on this theory, we calculated the resist sensitivities for the EUV/soft X-ray region. The estimated sensitivities were found to be comparable to the experimentally obtained sensitivities. It was concluded that EB is a very useful exposure tool that accelerates the development of new resists and sensitivity enhancement processes for 13.5 nm EUVL and 6.x nm beyond-EUV lithography (BEUVL).
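
    The absorbed-energy-equivalence idea can be sketched numerically. This is only a minimal illustration of the reasoning, not the paper's actual calculation; the absorption coefficient, film thickness, and target energy density below are made-up values:

```python
import math

def absorbed_energy_density(incident_dose_mj_cm2, alpha_per_um, thickness_um):
    """Mean absorbed energy density (mJ/cm^3) in a resist film,
    using the Beer-Lambert absorbed fraction 1 - exp(-alpha * t)."""
    absorbed_fraction = 1.0 - math.exp(-alpha_per_um * thickness_um)
    thickness_cm = thickness_um * 1e-4
    return incident_dose_mj_cm2 * absorbed_fraction / thickness_cm

def equivalent_euv_dose(target_energy_density_mj_cm3, alpha_per_um, thickness_um):
    """Invert the relation above: the EUV incident dose (mJ/cm^2) that
    deposits the same absorbed energy density the EB exposure required."""
    absorbed_fraction = 1.0 - math.exp(-alpha_per_um * thickness_um)
    thickness_cm = thickness_um * 1e-4
    return target_energy_density_mj_cm3 * thickness_cm / absorbed_fraction

# Hypothetical example: a resist needing ~1000 mJ/cm^3 absorbed energy,
# alpha = 4 /um, 50 nm film -> required EUV dose ~= 0.028 mJ/cm^2.
dose = equivalent_euv_dose(1000.0, alpha_per_um=4.0, thickness_um=0.05)
```

    The point of the method is that the left-hand quantity (absorbed energy density at the known EB sensitivity) is measurable with an inexpensive EB tool, so the EUV dose falls out without EUV exposure time.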

  2. Accuracy of a disability instrument to identify workers likely to develop upper extremity musculoskeletal disorders.

    PubMed

    Stover, Bert; Silverstein, Barbara; Wickizer, Thomas; Martin, Diane P; Kaufman, Joel

    2007-06-01

    Work-related upper extremity musculoskeletal disorders (MSDs) result in substantial disability and expense. Identifying workers or jobs at high risk can trigger intervention before workers are injured or the condition worsens. We investigated a disability instrument, the QuickDASH, as a workplace screening tool to identify workers at high risk of developing upper extremity MSDs. Subjects included workers reporting recurring upper extremity MSD symptoms in the past 7 days (n = 559). The QuickDASH was reasonably accurate at baseline, with sensitivity of 73% for MSD diagnosis and 96% for symptom severity. Specificity was 56% for diagnosis and 53% for symptom severity. At 1-year follow-up, sensitivity and specificity for MSD diagnosis were 72% and 54%, respectively, as predicted by the baseline QuickDASH score. For symptom severity, sensitivity and specificity were 86% and 52%. An a priori target of 70% sensitivity and 50% specificity was met for symptom severity, work pace and quality, and MSD diagnosis. The QuickDASH may be useful for identifying jobs or workers at increased risk of upper extremity MSDs, and may provide an efficient health surveillance screening tool for targeting early workplace intervention to prevent upper extremity MSD problems.
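
    The screening-accuracy figures quoted above come from a standard 2x2 confusion table. A short sketch; the counts below are illustrative values chosen to reproduce the baseline diagnosis figures, not the study's raw data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN): fraction of true cases the screen flags.
    Specificity = TN / (TN + FP): fraction of non-cases it correctly passes."""
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative counts only (73% sensitivity, 56% specificity).
sens, spec = sensitivity_specificity(tp=73, fn=27, tn=56, fp=44)
```

    The a priori target in the abstract (sensitivity >= 70%, specificity >= 50%) is simply a pair of thresholds on these two ratios.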

  3. [Ultrasound examination for lower extremity deep vein thrombosis].

    PubMed

    Toyota, Kosaku

    2014-09-01

    Surgery is known to be a major risk factor for vein thrombosis. Progression from lower extremity deep vein thrombosis (DVT) to pulmonary embolism can lead to a catastrophic outcome, although its incidence is low. The ability to rule in or rule out DVT is becoming essential for anesthesiologists. The non-invasive technique of ultrasonography is a sensitive and specific tool for the assessment of lower extremity DVT. This article introduces the basics and practical methods of ultrasound examination for lower extremity DVT.

  4. Air pollution as it affects orchids at the New York Botanical Garden

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adderley, L.

    A general discussion of the effects of air pollution on orchids is presented, along with ameliorative measures. One orchid, Dendrobium Phalaenopsis, is suggested as an air pollution bioassay tool, in that it is extremely sensitive to air pollution.

  5. Association between frontal plane knee control and lower extremity injuries: a prospective study on young team sport athletes

    PubMed Central

    Pasanen, Kati; Krosshaug, Tron; Vasankari, Tommi; Kannus, Pekka; Heinonen, Ari; Kujala, Urho M; Avela, Janne; Perttunen, Jarmo; Parkkari, Jari

    2018-01-01

    Background/aim Poor frontal plane knee control can manifest as increased dynamic knee valgus during athletic tasks. The purpose of this study was to investigate the association between frontal plane knee control and the risk of acute lower extremity injuries. In addition, we wanted to study if the single-leg squat (SLS) test can be used as a screening tool to identify athletes with an increased injury risk. Methods A total of 306 basketball and floorball players participated in the baseline SLS test and a 12-month injury registration follow-up. Acute lower extremity time-loss injuries were registered. Frontal plane knee projection angles (FPKPA) during the SLS were calculated using a two-dimensional video analysis. Results Athletes displaying a high FPKPA were 2.7 times more likely to sustain a lower extremity injury (adjusted OR 2.67, 95% CI 1.23 to 5.83) and 2.4 times more likely to sustain an ankle injury (OR 2.37, 95% CI 1.13 to 4.98). There was no statistically significant association between FPKPA and knee injury (OR 1.49, 95% CI 0.56 to 3.98). The receiver operating characteristic curve analyses indicated poor combined sensitivity and specificity when FPKPA was used as a screening test for lower extremity injuries (area under the curve of 0.59) and ankle injuries (area under the curve of 0.58). Conclusions Athletes displaying a large FPKPA in the SLS test had an elevated risk of acute lower extremity and ankle injuries. However, the SLS test is not sensitive and specific enough to be used as a screening tool for future injury risk. PMID:29387448
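
    Two of the statistics in this abstract, the odds ratio and the ROC area under the curve, can be computed from first principles. A hedged sketch with made-up numbers (not the study's data):

```python
def odds_ratio(a, b, c, d):
    """OR from a 2x2 table: a = high-FPKPA injured, b = high-FPKPA uninjured,
    c = low-FPKPA injured, d = low-FPKPA uninjured."""
    return (a * d) / (b * c)

def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly chosen injured
    athlete's score exceeds a randomly chosen uninjured athlete's score
    (ties count as 0.5). An AUC near 0.5, as reported here, means the
    screen barely separates the groups."""
    wins = sum((p > n) + 0.5 * (p == n) for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))
```

    This is why an elevated odds ratio and a useless screening test can coexist: the OR measures group-level association, while the AUC measures individual-level discrimination.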

  6. Direct determination of trace phthalate esters in alcoholic spirits by spray-inlet microwave plasma torch ionization tandem mass spectrometry.

    PubMed

    Miao, Meng; Zhao, Gaosheng; Xu, Li; Dong, Junguo; Cheng, Ping

    2018-03-01

    A direct analytical method based on spray-inlet microwave plasma torch tandem mass spectrometry was applied to simultaneously determine 4 phthalate esters (PAEs), namely benzyl butyl phthalate, diethyl phthalate, dipentyl phthalate, and dodecyl phthalate, with extremely high sensitivity in spirits without sample treatment. Among the 4 brands of spirit products, 3 kinds of PAE compounds were directly determined at very low concentrations, from 1.30 to 114 ng·g⁻¹. Compared with other online and off-line methods, the spray-inlet microwave plasma torch tandem mass spectrometry technique is extremely simple, rapid, sensitive, and highly efficient, providing an ideal screening tool for PAEs in spirits. Copyright © 2017 John Wiley & Sons, Ltd.

  7. Scattering Tools for Nanostructure Phonon Engineering

    DTIC Science & Technology

    2013-09-25

    Existing experimental techniques for the characterization of phonons in nanomaterials pose a serious characterization challenge: methods such as Raman scattering are sensitive only to phonon modes with wavevectors of extremely small magnitude. Fundamentally, the wavevectors that can be probed by Raman scattering are limited by the small momentum of photons in the visible spectrum. Our work addresses this limitation.

  8. A vulnerability tool for adapting water and aquatic resources to climate change and extremes on the Shoshone National Forest, Wyoming

    NASA Astrophysics Data System (ADS)

    Rice, J.; Joyce, L. A.; Armel, B.; Bevenger, G.; Zubic, R.

    2011-12-01

    Climate change introduces a significant challenge for land managers and decision makers managing the natural resources that provide many benefits from forests. These benefits include water for urban and agricultural uses, wildlife habitat, erosion and climate control, aquifer recharge, stream flow regulation, water temperature regulation, and cultural services such as outdoor recreation and aesthetic enjoyment. The Forest Service has responded to this challenge by developing a national strategy for responding to climate change (the National Roadmap for Responding to Climate Change, July 2010). In concert with this national strategy, the Forest Service's Westwide Climate Initiative has conducted 4 case studies on individual Forests in the western U.S. to develop climate adaptation tools. Western National Forests are particularly vulnerable to climate change, as they have high-mountain topography, diversity in climate and vegetation, large areas of water-limited ecosystems, and increasing urbanization. Information about the vulnerability and capacity of resources to adapt to climate change and extremes is lacking. There is an urgent need to provide customized tools and synthesized local-scale information about the impacts on resources from future climate change and extremes, as well as to develop science-based adaptation options and strategies in National Forest management and planning. The case study on the Shoshone National Forest has aligned its objectives with management needs by developing a climate extreme vulnerability tool that guides the development of adaptation options. The vulnerability tool determines the likely degree to which native Yellowstone cutthroat trout and water availability are susceptible to, or unable to cope with, the adverse effects of climate change extremes. We spatially categorize vulnerability for water and native trout resources using exposure, sensitivity, and adaptive capacity indicators based on minimum and maximum climate and GIS data.
    Results show that the vulnerability of water availability may increase in areas that have less storage and become more dominated by rain instead of snow. Native trout habitat was found to improve in some areas with warmer temperatures, suggesting that future refugia habitat may need to be a focus of conservation efforts. The climate extreme vulnerability tool provides Forest Service resource managers with science-based information that guides adaptation strategy development, prioritizes conservation projects, guides monitoring efforts, and helps promote more resilient ecosystems undergoing the effects of climate change.

  9. [The informative value of the functional step test for the purpose of computed optical topography in the children presenting with the functional disorders of the musculoskeletal system].

    PubMed

    Trukhmanov, I M; Suslova, G A; Ponomarenko, G N

    This paper is devoted to the characterization of the informative value of the functional step test with the application of heel cushions in children, for the purpose of the differential diagnostics of anatomic and functional differences in the length of the lower extremities. A total of 85 schoolchildren with different lengths of the lower extremities were examined. A comparative evaluation of the results of the clinical and instrumental examinations was undertaken. The data obtained with the help of the functional step test give evidence of its very high sensitivity, specificity, and clinical significance as a tool for the examination of children with different lengths of the lower extremities. It is concluded that the test is one of the most informative predictors of the effectiveness of rehabilitation in children with different lengths of the lower extremities.

  10. Assessing the Impact of Climate Change on Extreme Streamflow and Reservoir Operation for Nuuanu Watershed, Oahu, Hawaii

    NASA Astrophysics Data System (ADS)

    Leta, O. T.; El-Kadi, A. I.; Dulaiova, H.

    2016-12-01

    Extreme events, such as flooding and drought, are expected to occur at increased frequencies worldwide as climate change influences the water cycle. This is particularly critical for tropical islands, where local freshwater resources are very sensitive to climate. This study examined the impact of climate change on extreme streamflow, reservoir water volume, and outflow for the Nuuanu watershed, using the Soil and Water Assessment Tool (SWAT) model. Based on the sensitive parameters screened by the Latin Hypercube-One-factor-At-a-Time (LH-OAT) method, SWAT was calibrated and validated against daily streamflow using the SWAT Calibration and Uncertainty Program (SWAT-CUP) at three streamflow gauging stations. Results showed that SWAT adequately reproduced the observed daily streamflow hydrographs at all stations. This was verified with Nash-Sutcliffe Efficiency values in the acceptable range of 0.58 to 0.88, with more than 90% of observations bracketed within the 95% model prediction uncertainty interval for both calibration and validation periods, signifying the potential applicability of SWAT for future prediction. The climate change impact on extreme flows, reservoir water volume, and outflow was assessed under the Representative Concentration Pathway 4.5 and 8.5 scenarios. We found wide changes in extreme peak and low flows, ranging from -44% to 20% and -50% to -2%, respectively, compared to baseline. Consequently, the amount of water stored in Nuuanu reservoir is projected to decrease by up to 27%, while the corresponding outflow rates are expected to decrease by up to 37% relative to the baseline. In addition, the stored water and extreme flows are highly sensitive to rainfall change compared to temperature and solar radiation changes. It is concluded that these changes in extreme low and peak flows can have serious consequences, such as flooding and drought, with detrimental effects on riparian ecological functioning.
This study's results are expected to aid in reservoir operation as well as in identifying appropriate climate change adaptation strategies.
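
    The Nash-Sutcliffe Efficiency used to judge the SWAT fit compares model error against the variance of the observations: NSE = 1 is a perfect fit, NSE = 0 is no better than always predicting the observed mean. A minimal sketch (the series below are toy data, not the Nuuanu gauging records):

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    err = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - err / var
```

    Values of 0.58 to 0.88, as reported here, mean the model explains well over half of the observed daily variance at every station.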

  11. Brunnstrom Recovery Stage and Motricity Index for the Evaluation of Upper Extremity in Stroke: Analysis for Correlation and Responsiveness

    ERIC Educational Resources Information Center

    Safaz, Ismail; Yılmaz, Bilge; Yaşar, Evren; Alaca, Rıdvan

    2009-01-01

    The aim of this study was to find out first whether Brunnstrom recovery stage (BRS) and motricity index (MI) were correlated with each other and second to observe whether the two assessment tools were sensitive to changes regarding the rehabilitation outcome. Forty-six stroke patients who were admitted to the Stroke Rehabilitation Unit at our…

  12. Glycine and GABAA Ultra-Sensitive Ethanol Receptors as Novel Tools for Alcohol and Brain Research

    PubMed Central

    Naito, Anna; Muchhala, Karan H.; Asatryan, Liana; Trudell, James R.; Homanics, Gregg E.; Perkins, Daya I.; Alkana, Ronald L.

    2014-01-01

    A critical obstacle to developing effective medications to prevent and/or treat alcohol use disorders is the lack of specific knowledge regarding the plethora of molecular targets and mechanisms underlying alcohol (ethanol) action in the brain. To identify the role of individual receptor subunits in ethanol-induced behaviors, we developed a novel class of ultra-sensitive ethanol receptors (USERs) that allow activation of a single receptor subunit population sensitized to extremely low ethanol concentrations. USERs were created by mutating as few as four residues in the extracellular loop 2 region of glycine receptors (GlyRs) or γ-aminobutyric acid type A receptors (GABAARs), which are implicated in causing many behavioral effects linked to ethanol abuse. USERs, expressed in Xenopus oocytes and tested using two-electrode voltage clamp, demonstrated an increase in ethanol sensitivity of 100-fold over wild-type receptors by significantly decreasing the threshold and increasing the magnitude of ethanol response, without altering general receptor properties including sensitivity to the neurosteroid, allopregnanolone. These profound changes in ethanol sensitivity were observed across multiple subunits of GlyRs and GABAARs. Collectively, our studies set the stage for using USER technology in genetically engineered animals as a unique tool to increase understanding of the neurobiological basis of the behavioral effects of ethanol. PMID:25245406

  13. AgroClimate: Simulating and Monitoring the Risk of Extreme Weather Events from a Crop Phenology Perspective

    NASA Astrophysics Data System (ADS)

    Fraisse, C.; Pequeno, D.; Staub, C. G.; Perry, C.

    2016-12-01

    Climate variability, particularly the occurrence of extreme weather conditions such as dry spells and heat stress during sensitive crop developmental phases, can substantially increase the prospect of reduced crop yields. Yield losses or crop failure risk due to stressful weather conditions vary mainly with stress severity, exposure timing, and duration. The magnitude of stress effects is also crop specific, differing in terms of thresholds and adaptation to environmental conditions. To help producers in the Southeast USA mitigate and monitor the risk of crop losses due to extreme weather events, we developed a web-based tool that evaluates the risk of extreme weather events during the season, taking into account the crop development stages. Producers can enter their plans for the upcoming season in a given field (e.g. crop, variety, planting date, acreage), optionally select a specific El Niño Southern Oscillation (ENSO) phase, and will be presented with the probabilities (ranging from 0-100%) of extreme weather events occurring during sensitive phases of the growing season for the selected conditions. The phenology components of the DSSAT models CERES-Maize, CROPGRO-Soybean, CROPGRO-Cotton, and N-Wheat have been translated from FORTRAN to standalone versions in the R language. These models were tested in collaboration with Extension faculty and producers during the 2016 season, and their usefulness for risk mitigation and monitoring was evaluated. A companion AgroClimate app was also developed to help producers track and monitor phenological development during the cropping season.

  14. Ultra-sensitive flow measurement in individual nanopores through pressure-driven particle translocation.

    PubMed

    Gadaleta, Alessandro; Biance, Anne-Laure; Siria, Alessandro; Bocquet, Lyderic

    2015-05-07

    A challenge for the development of nanofluidics is to develop new instrumentation tools able to probe the extremely small mass transport across individual nanochannels. Such tools are a prerequisite for the fundamental exploration of the breakdown of continuum transport in nanometric confinement. In this letter, we propose a novel method for the measurement of the hydrodynamic permeability of nanometric pores, by diverting the classical technique of Coulter counting to characterize a pressure-driven flow across an individual nanopore. Both the analysis of the translocation rate and the detailed statistics of the dwell time of nanoparticles flowing across a single nanopore allow us to evaluate the permeability of the system. We reach a sensitivity for the water flow down to a few femtoliters per second, which is more than two orders of magnitude better than state-of-the-art alternative methods.
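
    In its simplest form, the translocation-rate part of this idea reduces to dividing the particle event rate by the particle number concentration. The one-liner below is only an idealized sketch of that relation (it assumes particles passively follow the flow and are detected with unit efficiency, which the paper's full statistical analysis does not require):

```python
def flow_rate_from_translocations(events_per_second, particles_per_m3):
    """Volumetric flow Q (m^3/s) inferred from the rate at which
    suspended particles translocate the nanopore: Q ~= rate / concentration."""
    return events_per_second / particles_per_m3

# 1 event/s at a number concentration of 1e18 particles/m^3 corresponds
# to 1e-18 m^3/s = 1 fL/s, the sensitivity scale quoted in the abstract.
q = flow_rate_from_translocations(1.0, 1e18)
```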

  15. Stochastic sensitivity measure for mistuned high-performance turbines

    NASA Technical Reports Server (NTRS)

    Murthy, Durbha V.; Pierre, Christophe

    1992-01-01

    A stochastic measure of sensitivity is developed in order to predict the effects of small random blade mistuning on the dynamic aeroelastic response of turbomachinery blade assemblies. This sensitivity measure is based solely on the nominal system design (i.e., on tuned-system information), which makes it extremely easy and inexpensive to calculate. The measure has the potential to become a valuable design tool that will enable designers to evaluate mistuning effects at a preliminary design stage and thus assess the need for a full mistuned-rotor analysis. The predictive capability of the sensitivity measure is illustrated by examining the effects of mistuning on the aeroelastic modes of the first stage of the oxidizer turbopump in the Space Shuttle Main Engine. Results from a full analysis of mistuned systems confirm that the simple stochastic sensitivity measure consistently predicts the drastic changes due to mistuning and the localization of aeroelastic vibration to a few blades.

  16. Digital vibration threshold testing and ergonomic stressors in automobile manufacturing workers: a cross-sectional assessment.

    PubMed

    Gold, J E; Punnett, L; Cherniack, M; Wegman, D H

    2005-01-01

    Upper extremity musculoskeletal disorders (UEMSDs) comprise a large proportion of work-related illnesses in the USA. Physical risk factors including manual force and segmental vibration have been associated with UEMSDs. Reduced sensitivity to vibration in the fingertips (a function of nerve integrity) has been found in those exposed to segmental vibration or hand force, and in office workers. The objective of this study was to determine whether an association exists between digital vibration thresholds (VTs) and exposure to ergonomic stressors in automobile manufacturing. Interviews and physical examinations were conducted in a cross-sectional survey of workers (n = 1174). In multivariable robust regression modelling, associations with workers' estimates of ergonomic stressors, stratified on tool use, were determined. VTs were separately associated with hand force, with vibration as felt through the floor (whole-body vibration), and with an index of multiple exposures in both tool users and non-tool users. Additional associations with contact stress and awkward upper extremity postures were found in tool users. Segmental vibration was not associated with VTs. Further epidemiologic and laboratory studies are needed to confirm these associations. The association with self-reported whole-body vibration exposure suggests a possible sympathetic nervous system effect, which remains to be explored.

  17. The Exploring Nature of Language Learning Strategies (LLSs) and Their Relationship with Various Variables with Focus on Personality Traits in the Current Studies of Second/Foreign Language Learning

    ERIC Educational Resources Information Center

    Fazeli, Seyed Hossein

    2011-01-01

    Since Language Learning Strategies (LLSs) have the potential to be "an extremely powerful learning tool" (O'Malley, Chamot, Stewner-Manzanares, Russo & Kupper, 1985a, p. 43), and the use of LLSs helps learners retrieve and store material and facilitates their learning (Gardner & MacIntyre, 1992), they are sensitive to the learning context and to…

  18. Construction of a Fiber Optic Gradient Hydrophone Using a Michelson Configuration.

    DTIC Science & Technology

    1986-03-27

    Interferometer configurations considered include Michelson, Fabry-Perot, intermode, and Sagnac interferometers. Of these, the first two categories show the most promise for hydrophone applications. The Fabry-Perot design is an excellent tool for precision length measurements but is extremely sensitive to … Pa was measured. Using the demodulation technique in Mills [Ref. 13: pp. 94-95], one can make a comparison to the USRD type G63 standard pressure …

  19. EUV via hole pattern fidelity enhancement through novel resist and post-litho plasma treatment

    NASA Astrophysics Data System (ADS)

    Yaegashi, Hidetami; Koike, Kyohei; Fonseca, Carlos; Yamashita, Fumiko; Kaushik, Kumar; Morikita, Shinya; Ito, Kiyohito; Yoshimura, Shota; Timoshkov, Vadim; Maslow, Mark; Jee, Tae Kwon; Reijnen, Liesbeth; Choi, Peter; Feng, Mu; Spence, Chris; Schoofs, Stijn

    2018-03-01

    Extreme ultraviolet (EUV) lithography is a potential solution for sustainable scaling, and its adoption in high-volume manufacturing (HVM) is becoming more and more realistic. The technology has a wide capability to mitigate various technical problems of 193-i multi-patterning (LELELE) for via-hole patterning, which induces local pattern fidelity errors such as CDU, CER, and pattern placement error. While EUV is thus a desirable scaling-driving tool, a specific technical issue, the RLS (Resolution-LER-Sensitivity) triangle, obviously remains. In this work, we examined sensitizing the hole patterning (a lower-dose approach) by utilizing a hole-pattern restoration technique named "CD-Healing" as a post-litho treatment.

  20. The use of copula functions for predictive analysis of correlations between extreme storm tides

    NASA Astrophysics Data System (ADS)

    Domino, Krzysztof; Błachowicz, Tomasz; Ciupak, Maurycy

    2014-11-01

    In this paper we present a method for the quantitative description of weakly predictable extreme hydrological events at an inland sea. We investigated correlations between variations at individual measuring points using combined statistical methods. As the main tool for this analysis we used a two-dimensional copula function sensitive to correlated extreme effects. Additionally, a newly proposed methodology based on Detrended Fluctuation Analysis (DFA) and Anomalous Diffusion (AD) was used for the prediction of negative and positive auto-correlations and the associated optimum choice of copula functions. As a practical example we analysed maximum storm tide data recorded at five spatially separated places on the Baltic Sea. For the analysis we used Gumbel, Clayton, and Frank copula functions and introduced the reversed Clayton copula. The application of our research model is associated with modelling the risk of high storm tides and possible storm flooding.
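
    A "reversed" Clayton copula is, under the usual convention, the survival copula of Clayton, which moves Clayton's lower-tail dependence into the upper tail, where joint extreme storm tides live. A sketch of both bivariate CDFs; this is a reading of the standard construction, not code from the paper, and it assumes theta > 0 with u, v in (0, 1]:

```python
def clayton(u, v, theta):
    """Bivariate Clayton copula CDF for theta > 0:
    C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta).
    Exhibits lower-tail dependence (joint small values cluster)."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def reversed_clayton(u, v, theta):
    """Survival ('reversed') Clayton copula:
    C_rev(u, v) = u + v - 1 + C(1 - u, 1 - v).
    Flips the dependence to the upper tail (joint extremes cluster).
    Requires u, v strictly below 1 to avoid 0^-theta."""
    return u + v - 1.0 + clayton(1.0 - u, 1.0 - v, theta)
```

    Fitting theta to paired storm-tide maxima at two stations would then quantify how strongly their extremes co-occur.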

  1. Repeatability of Non-Contrast-Enhanced Lower-Extremity Angiography Using the Flow-Spoiled Fresh Blood Imaging.

    PubMed

    Zhang, Yuyang; Xing, Zhen; She, Dejun; Huang, Nan; Cao, Dairong

    The aim of this study was to prospectively evaluate the repeatability of non-contrast-enhanced lower-extremity magnetic resonance angiography using flow-spoiled fresh blood imaging (FS-FBI). Forty-three healthy volunteers and 15 patients with lower-extremity arterial stenosis were recruited and examined by FS-FBI. Digital subtraction angiography was performed within a week after FS-FBI in the patient group. Repeatability was assessed by the following parameters: grading of image quality, diameter and area of the major arteries, and grading of stenosis of the lower-extremity arteries. Two experienced radiologists blinded to patient data independently evaluated the FS-FBI and digital subtraction angiography images. Intraclass correlation coefficients (ICCs), sensitivity, and specificity were used for statistical analysis. The grading of image quality of most data was satisfactory. The ICCs for the first and second measures were 0.792 and 0.884 in the femoral segment and 0.803 and 0.796 in the tibiofibular segment for the healthy volunteer group, and 0.873 and 1.000 in the femoral segment and 0.737 and 0.737 in the tibiofibular segment for the patient group. Intraobserver and interobserver agreement on the diameter and area of arteries was excellent, with ICCs mostly greater than 0.75 in the volunteer group. For stenosis grading, intraobserver and interobserver ICCs ranged from 0.784 to 0.862 and from 0.778 to 0.854, respectively. Flow-spoiled fresh blood imaging yielded a mean sensitivity and specificity for detecting arterial stenosis or occlusion of at least 90% and 80% for the femoral segment and 86.7% and 93.3% for the tibiofibular segment. Lower-extremity angiography with FS-FBI is a reliable and reproducible screening tool for lower-extremity atherosclerotic disease, especially for patients with impaired renal function.

  2. High-Sensitivity Nuclear Magnetic Resonance at Giga-Pascal Pressures: A New Tool for Probing Electronic and Chemical Properties of Condensed Matter under Extreme Conditions

    PubMed Central

    Meier, Thomas; Haase, Jürgen

    2014-01-01

    Nuclear Magnetic Resonance (NMR) is one of the most important techniques for the study of condensed matter systems, their chemical structure, and their electronic properties. The application of high pressure enables one to synthesize new materials, but the response of known materials to high pressure is a very useful tool for studying their electronic structure and developing theories. For example, high-pressure synthesis might be at the origin of life; and understanding the behavior of small molecules under extreme pressure will tell us more about fundamental processes in our universe. It is no wonder that there has always been great interest in having NMR available at high pressures. Unfortunately, the desired pressures are often well into the Giga-Pascal (GPa) range and require special anvil cell devices where only very small, secluded volumes are available. This has restricted the use of NMR almost entirely in the past, and only recently, a new approach to high-sensitivity GPa NMR, which has a resonating micro-coil inside the sample chamber, was put forward. This approach enables us to achieve high sensitivity with experiments that bring the power of NMR to Giga-Pascal pressure condensed matter research. First applications, the detection of a topological electronic transition in ordinary aluminum metal and the closing of the pseudo-gap in high-temperature superconductivity, show the power of such an approach. Meanwhile, the range of achievable pressures was increased tremendously with a new generation of anvil cells (up to 10.1 GPa), that fit standard-bore NMR magnets. This approach might become a new, important tool for the investigation of many condensed matter systems, in chemistry, geochemistry, and in physics, since we can now watch structural changes with the eyes of a very versatile probe. PMID:25350694

  4. Molecular Tools for the Detection of Nitrogen Cycling Archaea

    PubMed Central

    Rusch, Antje

    2013-01-01

    Archaea are widespread in extreme and temperate environments, and cultured representatives cover a broad spectrum of metabolic capacities, which sets them up for potentially major roles in the biogeochemistry of their ecosystems. The detection, characterization, and quantification of archaeal functions in mixed communities require Archaea-specific primers or probes for the corresponding metabolic genes. Five pairs of degenerate primers were designed to target archaeal genes encoding key enzymes of nitrogen cycling: nitrite reductases NirA and NirB, nitrous oxide reductase (NosZ), nitrogenase reductase (NifH), and nitrate reductases NapA/NarG. Sensitivity towards their archaeal target gene, phylogenetic specificity, and gene specificity were evaluated in silico and in vitro. Owing to their moderate sensitivity/coverage, the novel nirB-targeted primers are suitable for pure culture studies only. The nirA-targeted primers showed sufficient sensitivity and phylogenetic specificity, but poor gene specificity. The primers designed for amplification of archaeal nosZ performed well in all 3 criteria; their discrimination against bacterial homologs appears to be weakened when Archaea are strongly outnumbered by bacteria in a mixed community. The novel nifH-targeted primers showed high sensitivity and gene specificity, but failed to discriminate against bacterial homologs. Despite limitations, 4 of the new primer pairs are suitable tools in several molecular methods applied in archaeal ecology. PMID:23365509

  5. Impact of design-parameters on the optical performance of a high-power adaptive mirror

    NASA Astrophysics Data System (ADS)

    Koek, Wouter D.; Nijkerk, David; Smeltink, Jeroen A.; van den Dool, Teun C.; van Zwet, Erwin J.; van Baars, Gregor E.

    2017-02-01

    TNO is developing a High Power Adaptive Mirror (HPAM) to be used in the CO2 laser beam path of an Extreme Ultraviolet (EUV) light source for next-generation lithography. In this paper we report on a methodology, and the associated simulation tools, developed to assess the performance and sensitivities of this deformable mirror. Our analyses show that, given the current limited insight into the process window of EUV generation, the HPAM module should have an actuator pitch of <= 4 mm. Furthermore, we have modelled the sensitivity of performance with respect to dimpling and actuator noise. For example, for a deformable mirror with an actuator pitch of 4 mm, if the associated performance impact is to be limited to less than 5%, the actuator noise should be smaller than 45 nm (rms). Our tools assist in the detailed design process by assessing the performance impact of various design choices, including, for example, those that affect the shape and spectral content of the influence function.
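    The pitch/performance tradeoff described above can be illustrated with a toy fitting-error calculation: approximate a smooth target shape by least squares over a grid of Gaussian influence functions and compare the residual for two pitches. Everything below (the quadratic target, the Gaussian influence shape, the 40 mm aperture) is an illustrative assumption, not TNO's actual HPAM model:

```python
import numpy as np

def fitting_residual(pitch_mm, width_factor=1.0, aperture_mm=40.0, n_grid=81):
    """Least-squares fit of a low-order target shape with Gaussian
    influence functions on a square actuator grid; returns the residual
    rms as a fraction of the target rms. All values are illustrative."""
    x = np.linspace(-aperture_mm / 2, aperture_mm / 2, n_grid)
    X, Y = np.meshgrid(x, x)
    # Illustrative target: a defocus-like quadratic shape (arbitrary units).
    target = (X**2 + Y**2) / (aperture_mm / 2) ** 2
    # Actuator centers on a square grid with the given pitch.
    acts = np.arange(-aperture_mm / 2, aperture_mm / 2 + 1e-9, pitch_mm)
    sigma = width_factor * pitch_mm  # influence-function width ~ pitch
    cols = []
    for ax in acts:
        for ay in acts:
            cols.append(np.exp(-((X - ax) ** 2 + (Y - ay) ** 2)
                               / (2 * sigma**2)).ravel())
    A = np.stack(cols, axis=1)
    coeffs, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)
    resid = target.ravel() - A @ coeffs
    return np.sqrt(np.mean(resid**2)) / np.sqrt(np.mean(target.ravel() ** 2))

# A finer pitch gives more actuators and a smaller fitting residual.
print(f"relative residual, 4 mm pitch: {fitting_residual(4.0):.4f}")
print(f"relative residual, 8 mm pitch: {fitting_residual(8.0):.4f}")
```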

  6. Fabrication of dense wavelength division multiplexing filters with large useful area

    NASA Astrophysics Data System (ADS)

    Lee, Cheng-Chung; Chen, Sheng-Hui; Hsu, Jin-Cherng; Kuo, Chien-Cheng

    2006-08-01

    Dense Wavelength Division Multiplexers (DWDM), a kind of narrow band-pass filter, are extremely sensitive to the optical thickness error in each constituent layer. Achieving a large useful coating area is therefore extremely difficult because of the uniformity problem, and enlarging it requires improvements in both design and fabrication. In this study, we discuss how the tooling factors at different positions and for different materials are related to the optical performance of the design. 100 GHz DWDM filters were fabricated by E-gun evaporation with ion-assisted deposition (IAD). To improve the coating uniformity, an analysis technique called the shaping tooling factor (STF) was used to analyze the deviation of the optical thickness in the different materials so as to enlarge the useful coating area. In addition, a technique of etching the deposited layers with oxygen ions was introduced. When these techniques were applied in the fabrication of 100 GHz DWDM filters, the uniformity was better than +/-0.002% over an area 72 mm in diameter and better than +/-0.0006% over 20 mm in diameter.
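    The sensitivity of such filters to thickness error can be made concrete with a back-of-the-envelope sketch: the center wavelength of a narrow band-pass filter shifts roughly in proportion to the relative optical-thickness error, dlambda/lambda ~ d(nd)/(nd). The 1550 nm band and 193.1 THz grid anchor below are illustrative assumptions, not values from the paper:

```python
# Center-wavelength shift of a narrow band-pass filter scales with the
# relative optical-thickness error: dlambda/lambda ~ d(nd)/(nd).
# Band and grid values below are illustrative assumptions.

C = 299_792_458.0  # speed of light, m/s

def wavelength_shift_nm(center_nm, uniformity):
    """Center-wavelength shift for a given relative thickness error."""
    return center_nm * uniformity

center = 1550.0  # nm, ITU C-band region
# 100 GHz of optical frequency near 193.1 THz corresponds to ~0.8 nm:
spacing_nm = (C / 193.1e12 - C / (193.1e12 + 100e9)) * 1e9

shift_72mm = wavelength_shift_nm(center, 2e-5)   # +/-0.002% over 72 mm
shift_20mm = wavelength_shift_nm(center, 6e-6)   # +/-0.0006% over 20 mm

print(f"100 GHz channel spacing ~ {spacing_nm:.3f} nm")
print(f"shift over 72 mm area: +/-{shift_72mm * 1000:.1f} pm")
print(f"shift over 20 mm area: +/-{shift_20mm * 1000:.1f} pm")
```

    The reported uniformities thus keep the center-wavelength wander to tens of picometres, small against the ~0.8 nm channel spacing.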

  7. Sensitivity enhancement of chemically amplified resists and performance study using extreme ultraviolet interference lithography

    NASA Astrophysics Data System (ADS)

    Buitrago, Elizabeth; Nagahara, Seiji; Yildirim, Oktay; Nakagawa, Hisashi; Tagawa, Seiichi; Meeuwissen, Marieke; Nagai, Tomoki; Naruoka, Takehiko; Verspaget, Coen; Hoefnagels, Rik; Rispens, Gijsbert; Shiraishi, Gosuke; Terashita, Yuichi; Minekawa, Yukie; Yoshihara, Kosuke; Oshima, Akihiro; Vockenhuber, Michaela; Ekinci, Yasin

    2016-07-01

    Extreme ultraviolet lithography (EUVL, λ=13.5 nm) is the most promising candidate to manufacture electronic devices for future technology nodes in the semiconductor industry. Nonetheless, EUVL still faces many technological challenges as it moves toward high-volume manufacturing (HVM). A key bottleneck from the tool design and performance point of view has been the development of an efficient, high-power EUV light source for high throughput production. Consequently, there has been extensive research on different methodologies to enhance EUV resist sensitivity. Resist performance is measured in terms of its ultimate printing resolution, line width roughness (LWR), sensitivity [S or best energy (BE)], and exposure latitude (EL). However, there are well-known fundamental trade-off relationships (line width roughness, resolution and sensitivity trade-off) among these parameters for chemically amplified resists (CARs). We present early proof-of-principle results for a multiexposure lithography process that has the potential for high sensitivity enhancement without compromising other important performance characteristics by the use of a "Photosensitized Chemically Amplified Resist™" (PSCAR™). With this method, we seek to increase the sensitivity by combining a first EUV pattern exposure with a second UV-flood exposure (λ=365 nm) and the use of a PSCAR. In addition, we have evaluated over 50 different state-of-the-art EUV CARs. Among these, we have identified several promising candidates that simultaneously meet sensitivity, LWR, and EL high-performance requirements with the aim of resolving line space (L/S) features for the 7- and 5-nm logic node [16- and 13-nm half-pitch (HP), respectively] for HVM. Several CARs were additionally found to be well resolved down to 12- and 11-nm HP with minimal pattern collapse and bridging, a remarkable feat for CARs. 
Finally, the performance of two negative tone state-of-the-art alternative resist platforms previously investigated was compared to the CAR performance at and below 16-nm HP resolution, demonstrating the need for alternative resist solutions at 13-nm resolution and below. EUV interference lithography (IL) has provided and continues to provide a simple yet powerful platform for academic and industrial research, enabling the characterization and development of resist materials before commercial EUV exposure tools become available. Our experiments have been performed at the EUV-IL set-up in the Swiss Light Source (SLS) synchrotron facility located at the Paul Scherrer Institute (PSI).

  8. The utility of the KJOC score in professional baseball in the United States.

    PubMed

    Franz, Justin O; McCulloch, Patrick C; Kneip, Chris J; Noble, Philip C; Lintner, David M

    2013-09-01

    The Kerlan-Jobe Orthopaedic Clinic (KJOC) Shoulder and Elbow questionnaire has been shown by previous studies to be more sensitive than other validated subjective measurement tools in the detection of upper extremity dysfunction in overhead-throwing athletes. The primary objective was to establish normative data for KJOC scores in professional baseball players in the United States. The secondary objectives were to evaluate the effect of player age, playing position, professional competition level, history of injury, history of surgery, and time point of administration on the KJOC score. Cross-sectional study; Level of evidence, 3. From 2011 to 2012, a total of 203 major league and minor league baseball players within the Houston Astros professional baseball organization completed the KJOC questionnaire. The questionnaire was administered at 3 time points: spring training 2011, end of season 2011, and spring training 2012. The KJOC scores were analyzed for significant differences based on player age, injury history, surgery history, fielding position, competition level, self-reported playing status, and time point of KJOC administration. The average KJOC score among healthy players with no history of injury was 97.1 for major league players and 96.8 for minor league players. The time point of administration did not significantly affect the final KJOC score (P = .224), and KJOC outcomes did not vary with player age (r = -0.012; P = .867). Significantly lower average KJOC scores were reported by players with a history of upper extremity injury (86.7; P < .001) and upper extremity surgery (75.4; P < .0001). The KJOC results did vary with playing position (P = .0313), with the lowest average scores being reported by pitchers (90.9) and infielders (91.3). This study establishes a quantitative baseline for the future evaluation of professional baseball players with the KJOC score. 
Age and time of administration had no significant effect on the outcome of the KJOC score. Missed practices or games within the previous year because of injury were the most significant demographic predictors of lower KJOC scores. The KJOC score was shown to be a sensitive measurement tool for detecting subtle changes in the upper extremity performance of the professional baseball population studied.

  9. Pyridoxylamine reactivity kinetics as an amine based nucleophile for screening electrophilic dermal sensitizers

    PubMed Central

    Chipinda, Itai; Mbiya, Wilbes; Adigun, Risikat Ajibola; Morakinyo, Moshood K.; Law, Brandon F.; Simoyi, Reuben H.; Siegel, Paul D.

    2015-01-01

    Chemical allergens bind directly, or after metabolic or abiotic activation, to endogenous proteins to become allergenic. Assessment of this initial binding has been suggested as a target for developing assays to screen chemicals for their allergenic potential. Recently we reported a nitrobenzenethiol (NBT) based method for screening thiol-reactive skin sensitizers; however, amine-selective sensitizers are not detected by this assay. In the present study we describe an amine-based (pyridoxylamine (PDA)) kinetic assay to complement the NBT assay for identification of amine-selective and non-selective skin sensitizers. UV-Vis spectrophotometry and fluorescence were used to measure PDA reactivity for 57 chemicals, including anhydrides, aldehydes, and quinones, with reaction rates ranging from 116 M^-1 s^-1 for extreme sensitizers down to 6.2 x 10^-6 M^-1 s^-1 for weak ones. No reactivity towards PDA was observed with the thiol-selective sensitizers, non-sensitizers, or prohaptens. The PDA rate constants correlated significantly with the respective murine local lymph node assay (LLNA) threshold EC3 values (R^2 = 0.76). PDA thus serves as a simple, inexpensive amine-based probe that shows promise as a preliminary screening tool for electrophilic, amine-selective skin sensitizers. PMID:24333919
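    As a sketch of how such second-order rate constants are typically extracted (not the authors' exact procedure): with PDA in large excess the electrophile decays pseudo-first-order, so a linear fit of ln(absorbance) versus time gives k_obs, and k2 = k_obs / [PDA]. All numbers below are synthetic:

```python
import math

# Pseudo-first-order kinetics sketch: with the nucleophile (PDA) in large
# excess, electrophile decay is exponential with k_obs = k2 * [PDA].
# Concentrations and "data" below are synthetic, for illustration only.

pda_conc = 0.01  # M, excess PDA
k2_true = 5.0    # M^-1 s^-1, assumed second-order rate constant
times = [0, 30, 60, 120, 240, 480]  # s
absorbance = [math.exp(-k2_true * pda_conc * t) for t in times]

# Linear least-squares fit of ln(A) vs t gives slope = -k_obs.
n = len(times)
x_mean = sum(times) / n
y = [math.log(a) for a in absorbance]
y_mean = sum(y) / n
slope = sum((t - x_mean) * (yi - y_mean) for t, yi in zip(times, y)) / \
        sum((t - x_mean) ** 2 for t in times)
k_obs = -slope
k2_fit = k_obs / pda_conc
print(f"k_obs = {k_obs:.4f} s^-1, k2 = {k2_fit:.2f} M^-1 s^-1")
```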

  10. Repeatability of Non–Contrast-Enhanced Lower-Extremity Angiography Using the Flow-Spoiled Fresh Blood Imaging

    PubMed Central

    Zhang, Yuyang; Xing, Zhen; She, Dejun; Huang, Nan; Cao, Dairong

    2018-01-01

    Purpose The aim of this study was to prospectively evaluate the repeatability of non–contrast-enhanced lower-extremity magnetic resonance angiography using flow-spoiled fresh blood imaging (FS-FBI). Methods Forty-three healthy volunteers and 15 patients with lower-extremity arterial stenosis were recruited and examined by FS-FBI. Digital subtraction angiography was performed within a week after FS-FBI in the patient group. Repeatability was assessed by the following parameters: grading of image quality, diameter and area of the major arteries, and grading of stenosis of the lower-extremity arteries. Two experienced radiologists, blinded to patient data, independently evaluated the FS-FBI and digital subtraction angiography images. Intraclass correlation coefficients (ICCs), sensitivity, and specificity were used for statistical analysis. Results The grading of image quality was satisfactory for most data. The ICCs for the first and second measures were 0.792 and 0.884 in the femoral segment and 0.803 and 0.796 in the tibiofibular segment for the healthy volunteer group, and 0.873 and 1.000 in the femoral segment and 0.737 and 0.737 in the tibiofibular segment for the patient group. Intraobserver and interobserver agreement on the diameter and area of the arteries was excellent, with ICCs mostly greater than 0.75 in the volunteer group. For the stenosis grading analysis, intraobserver and interobserver ICCs ranged from 0.784 to 0.862 and from 0.778 to 0.854, respectively. Flow-spoiled fresh blood imaging yielded a mean sensitivity and specificity for detecting arterial stenosis or occlusion of at least 90% and 80% for the femoral segment and 86.7% and 93.3% for the tibiofibular segment. Conclusions Lower-extremity angiography with FS-FBI is a reliable and reproducible screening tool for lower-extremity atherosclerotic disease, especially for patients with impaired renal function. PMID:28787351
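    For reference, the sensitivity and specificity quoted above reduce to simple ratios of a 2x2 confusion matrix. The counts below are hypothetical, chosen only so that the ratios reproduce the reported tibiofibular figures (the study's raw counts are not given here):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 15 stenotic and 15 normal segments.
sens, spec = sensitivity_specificity(tp=13, fn=2, tn=14, fp=1)
print(f"sensitivity = {sens:.1%}, specificity = {spec:.1%}")
```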

  11. A Statistical Tool for Examining Heat Waves and Other Extreme Phenomena Arising from Multiple Factors

    NASA Astrophysics Data System (ADS)

    Cooley, D. S.; Castillo, F.; Thibaud, E.

    2017-12-01

    A 2015 heatwave in Pakistan is blamed for over a thousand deaths. This event consisted of several days of very high temperatures and unusually high humidity for the region, yet none of these days exceeded the "extreme danger" threshold of the heat index. The heat index is a univariate function of temperature and humidity that is applied universally at all locations, regardless of local climate. Understanding extremes that arise from multiple factors is challenging. In this paper we present a tool for examining bivariate extreme behavior. The tool, developed in the statistical software R, draws isolines of equal exceedance probability; these isolines can be understood as bivariate "return levels". The tool is based on a dependence framework specific to extremes, is semiparametric, and is able to extrapolate isolines beyond the range of the data. We illustrate the tool using the Pakistan heat wave data and other bivariate data.
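    A minimal empirical analogue of such an isoline can be sketched as follows (in Python rather than R, on synthetic data): estimate the joint exceedance probability P(T > x, H > y) directly from samples and trace the contour at a target probability. Unlike the authors' semiparametric extremes framework, this purely empirical version cannot extrapolate beyond the range of the data; the station-like numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic correlated "temperature"/"humidity" samples; a stand-in for
# real station records, which we do not have here.
z = rng.multivariate_normal([38.0, 40.0], [[9.0, 6.0], [6.0, 16.0]],
                            size=5000)
temp, hum = z[:, 0], z[:, 1]

def joint_exceedance(x, y):
    """Empirical P(T > x and H > y)."""
    return np.mean((temp > x) & (hum > y))

def isoline(p, x_grid):
    """For each x, the smallest grid y with P(T>x, H>y) <= p."""
    y_grid = np.linspace(hum.min(), hum.max(), 400)
    ys = []
    for x in x_grid:
        probs = np.array([joint_exceedance(x, y) for y in y_grid])
        ys.append(y_grid[np.argmax(probs <= p)])  # first y at/below target
    return np.array(ys)

xs = np.linspace(38, 44, 7)
ys = isoline(0.01, xs)  # (temperature, humidity) pairs at ~1% exceedance
print(list(zip(np.round(xs, 1), np.round(ys, 1))))
```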

  12. Cognitive Screening in Brain Tumors: Short but Sensitive Enough?

    PubMed Central

    Robinson, Gail A.; Biggs, Vivien; Walker, David G.

    2015-01-01

    Cognitive deficits in brain tumors are generally thought to be relatively mild and non-specific, although recent evidence challenges this notion. One possibility is that cognitive screening tools are being used to assess cognitive functions but their sensitivity to detect cognitive impairment may be limited. For improved sensitivity to recognize mild and/or focal cognitive deficits in brain tumors, neuropsychological evaluation tailored to detect specific impairments has been thought crucial. This study investigates the sensitivity of a cognitive screening tool, the Montreal Cognitive Assessment (MoCA), compared to a brief but tailored cognitive assessment (CA) for identifying cognitive deficits in an unselected primary brain tumor sample (i.e., low/high-grade gliomas, meningiomas). Performance is compared on broad measures of impairment: (a) number of patients impaired on the global screening measure or in any cognitive domain; and (b) number of cognitive domains impaired and specific analyses of MoCA-Intact and MoCA-Impaired patients on specific cognitive tests. The MoCA-Impaired group obtained lower naming and word fluency scores than the MoCA-Intact group, but otherwise performed comparably on cognitive tests. Overall, based on our results from patients with brain tumor, the MoCA has extremely poor sensitivity for detecting cognitive impairments and a brief but tailored CA is necessary. These findings will be discussed in relation to broader issues for clinical management and planning, as well as specific considerations for neuropsychological assessment of brain tumor patients. PMID:25815273

  13. Sensitivity Observing System Experiment (SOSE)-a new effective NWP-based tool in designing the global observing system

    NASA Astrophysics Data System (ADS)

    Marseille, Gert-Jan; Stoffelen, Ad; Barkmeijer, Jan

    2008-03-01

    Lacking an established methodology to test the potential impact of prospective extensions to the global observing system (GOS) in real atmospheric cases, we developed such a method, called the Sensitivity Observing System Experiment (SOSE). Since the GOS is non-uniform, it is of interest, for example, to investigate the benefit of complementary observing systems filling its gaps. In a SOSE, adjoint sensitivity structures are used to define a pseudo-true atmospheric state for the simulation of the prospective observing system. Next, the synthetic observations are used together with real observations from the existing GOS in a state-of-the-art Numerical Weather Prediction (NWP) model to assess the potential added value of the new observing system. Unlike full observing system simulation experiments (OSSE), SOSE can be applied to real extreme events that were badly forecast operationally, and it only requires the simulation of the new instrument. As such, SOSE is an effective tool, for example, to define observation requirements for extensions to the GOS; these requirements may serve as input for the design of an operational network of prospective observing systems. In a companion paper we use SOSE to simulate potential future spaceborne Doppler Wind Lidar (DWL) scenarios and assess their capability to sample meteorologically sensitive areas not well captured by the current GOS, in particular over the Northern Hemisphere oceans.

  14. Muver, a computational framework for accurately calling accumulated mutations.

    PubMed

    Burkholder, Adam B; Lujan, Scott A; Lavender, Christopher A; Grimm, Sara A; Kunkel, Thomas A; Fargo, David C

    2018-05-09

    Identification of mutations from next-generation sequencing data typically requires a balance between sensitivity and accuracy. This is particularly true of DNA insertions and deletions (indels), which can impart significant phenotypic consequences on cells but are harder to call than substitution mutations in whole-genome mutation accumulation experiments. To overcome these difficulties, we present muver, a computational framework that integrates established bioinformatics tools with novel analytical methods to generate mutation calls with the extremely low false positive rates and high sensitivity required for accurate mutation rate determination and comparison. Muver uses statistical comparison of ancestral and descendant allelic frequencies to identify variant loci and assigns genotypes with models that include per-sample assessments of sequencing errors by mutation type and repeat context. Muver identifies maximally parsimonious mutation pathways that connect these genotypes, differentiating potential allelic conversion events and delineating ambiguities in mutation location, type, and size. Benchmarking with a human gold-standard father-son pair demonstrates muver's sensitivity and low false positive rates. In DNA mismatch repair (MMR) deficient Saccharomyces cerevisiae, muver detects multi-base deletions in homopolymers longer than the replicative polymerase footprint at rates greater than predicted for sequential single-base deletions, implying a novel multi-repeat-unit slippage mechanism. Benchmarking results demonstrate the high accuracy and sensitivity achieved with muver, particularly for indels, relative to available tools. Applied to an MMR-deficient Saccharomyces cerevisiae system, muver mutation calls facilitate mechanistic insights into DNA replication fidelity.
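    The core idea of comparing ancestral and descendant allelic frequencies can be sketched with a simple two-proportion z-test. Muver's actual genotype models are considerably richer (per-sample error rates by mutation type and repeat context), so this is only an illustrative stand-in with invented read counts:

```python
import math

def two_proportion_z(alt1, depth1, alt2, depth2):
    """z-statistic and two-sided p-value comparing the alternative-allele
    frequency at a locus between ancestral and descendant samples."""
    p1, p2 = alt1 / depth1, alt2 / depth2
    pooled = (alt1 + alt2) / (depth1 + depth2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / depth1 + 1 / depth2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Ancestral: 1 alt read of 100 (sequencing error); descendant: 48 of 100
# (heterozygous-like) -- a locus that would be flagged as a new variant.
z, p = two_proportion_z(1, 100, 48, 100)
print(f"z = {z:.2f}, p = {p:.3g}")
```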

  15. Bayes Factors Unmask Highly Variable Information Content, Bias, and Extreme Influence in Phylogenomic Analyses.

    PubMed

    Brown, Jeremy M; Thomson, Robert C

    2017-07-01

    As the application of genomic data in phylogenetics has become routine, a number of cases have arisen where alternative data sets strongly support conflicting conclusions. This sensitivity to analytical decisions has prevented firm resolution of some of the most recalcitrant nodes in the tree of life. To better understand the causes and nature of this sensitivity, we analyzed several phylogenomic data sets using an alternative measure of topological support (the Bayes factor) that both demonstrates and averts several limitations of more frequently employed support measures (such as Markov chain Monte Carlo estimates of posterior probabilities). Bayes factors reveal important, previously hidden, differences across six "phylogenomic" data sets collected to resolve the phylogenetic placement of turtles within Amniota. These data sets vary substantially in their support for well-established amniote relationships, particularly in the proportion of genes that contain extreme amounts of information as well as the proportion that strongly reject these uncontroversial relationships. All six data sets contain little information to resolve the phylogenetic placement of turtles relative to other amniotes. Bayes factors also reveal that a very small number of extremely influential genes (less than 1% of genes in a data set) can fundamentally change significant phylogenetic conclusions. In one example, these genes are shown to contain previously unrecognized paralogs. This study demonstrates both that the resolution of difficult phylogenomic problems remains sensitive to seemingly minor analysis details and that Bayes factors are a valuable tool for identifying and solving these challenges. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
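    As a reminder of what a Bayes factor measures, here is a minimal toy computation, unrelated to the phylogenomic models in the paper: the marginal likelihood of binomial data under a free-parameter model versus a fixed-parameter null, reported on the 2 ln BF scale often interpreted via the Kass-Raftery thresholds (2 ln BF > 10 is "very strong"):

```python
import math

def log_marginal_beta_binomial(k, n, a=1.0, b=1.0):
    """log P(data | M1): binomial likelihood with a Beta(a, b) prior on
    theta (combinatorial term omitted; it cancels in the Bayes factor)."""
    return (math.lgamma(a + b) - math.lgamma(a) - math.lgamma(b)
            + math.lgamma(k + a) + math.lgamma(n - k + b)
            - math.lgamma(n + a + b))

def log_bayes_factor(k, n):
    """log BF of M1 (theta free) over M0 (theta fixed at 0.5)."""
    log_m0 = n * math.log(0.5)
    return log_marginal_beta_binomial(k, n) - log_m0

print(f"balanced data (50/100): 2lnBF = {2 * log_bayes_factor(50, 100):.2f}")
print(f"skewed data   (90/100): 2lnBF = {2 * log_bayes_factor(90, 100):.2f}")
```

    Balanced data yields a negative 2 ln BF (the simpler null is favored); strongly skewed data yields very strong support for the free-parameter model.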

  16. Using Machine Learning to Predict MCNP Bias

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grechanuk, Pavel Aleksandrovi

    For many real-world applications in radiation transport where simulations are compared to experimental measurements, as in nuclear criticality safety, the bias (simulated minus experimental k_eff) in the calculation is an extremely important quantity used for code validation. The objective of this project is to accurately predict the bias of MCNP6 [1] criticality calculations using machine learning (ML) algorithms, with the intention of creating a tool that can complement current nuclear criticality safety methods. In the latest release of MCNP6, the Whisper tool is available to criticality safety analysts and includes a large catalogue of experimental benchmarks, sensitivity profiles, and nuclear data covariance matrices. This data, coming from 1100+ benchmark cases, is used in this study of ML algorithms for criticality safety bias predictions.

  17. Molecular Imprinting Technology in Quartz Crystal Microbalance (QCM) Sensors.

    PubMed

    Emir Diltemiz, Sibel; Keçili, Rüstem; Ersöz, Arzu; Say, Rıdvan

    2017-02-24

    Molecularly imprinted polymers (MIPs), acting as artificial antibodies, have received considerable scientific attention in recent years in the field of (bio)sensors, since they have unique features that distinguish them from natural antibodies, such as robustness, multiple binding sites, low cost, facile preparation, and high stability under extreme operating conditions (high pH and temperature values, etc.). The Quartz Crystal Microbalance (QCM), on the other hand, is an analytical tool based on the measurement of small mass changes on a sensor surface. QCM sensors are practical and convenient monitoring tools because of their specificity, sensitivity, high accuracy, stability, and reproducibility, and they are highly suitable for converting recognition events at MIP-based molecular memories into a sensor signal. The combination of a QCM with MIPs as synthetic receptors therefore enhances sensitivity: the imprinted polymer provides multiple binding sites whose molecular memory of the target's size, three-dimensional shape, and chemical functionality directs recognition of the compound to be detected. This review aims to highlight and summarize recent progress and studies in the field of (bio)sensor systems based on QCMs combined with molecular imprinting technology.
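    The mass-to-frequency conversion underlying a QCM is the Sauerbrey relation, delta_f = -(2 f0^2 / (A sqrt(rho_q mu_q))) delta_m. A minimal sketch using the standard quartz constants; the 5 MHz crystal, 1 cm^2 electrode, and 10 ng load are illustrative assumptions:

```python
import math

# Sauerbrey relation for a rigid, thin mass load on a quartz crystal.
RHO_Q = 2648.0   # kg/m^3, density of quartz
MU_Q = 2.947e10  # Pa, shear modulus of AT-cut quartz

def sauerbrey_shift(f0_hz, delta_m_kg, area_m2):
    """Resonance frequency shift (Hz) for a bound mass delta_m."""
    return -2.0 * f0_hz**2 * delta_m_kg / (area_m2 * math.sqrt(RHO_Q * MU_Q))

f0 = 5e6      # 5 MHz fundamental (illustrative)
area = 1e-4   # 1 cm^2 electrode (illustrative)
dm = 10e-12   # 10 ng of bound analyte (illustrative)
print(f"delta_f = {sauerbrey_shift(f0, dm, area):.3f} Hz")
```

    For a 5 MHz crystal this works out to roughly 17.7 ng/cm^2 per Hz, which is why sub-Hz frequency stability translates into nanogram-scale mass sensitivity.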

  18. EUV lithography for 30nm half pitch and beyond: exploring resolution, sensitivity, and LWR tradeoffs

    NASA Astrophysics Data System (ADS)

    Putna, E. Steve; Younkin, Todd R.; Chandhok, Manish; Frasure, Kent

    2009-03-01

    The International Technology Roadmap for Semiconductors (ITRS) denotes Extreme Ultraviolet (EUV) lithography as a leading technology option for realizing the 32nm half-pitch node and beyond. Readiness of EUV materials is currently one high risk area according to assessments made at the 2008 EUVL Symposium. The main development issue regarding EUV resist has been how to simultaneously achieve high sensitivity, high resolution, and low line width roughness (LWR). This paper describes the strategy and current status of EUV resist development at Intel Corporation. Data is presented utilizing Intel's Micro-Exposure Tool (MET) examining the feasibility of establishing a resist process that simultaneously exhibits <=30nm half-pitch (HP) L/S resolution at <=10mJ/cm2 with <=4nm LWR.

  19. EUV lithography for 22nm half pitch and beyond: exploring resolution, LWR, and sensitivity tradeoffs

    NASA Astrophysics Data System (ADS)

    Putna, E. Steve; Younkin, Todd R.; Caudillo, Roman; Chandhok, Manish

    2010-04-01

    The International Technology Roadmap for Semiconductors (ITRS) denotes Extreme Ultraviolet (EUV) lithography as a leading technology option for realizing the 22nm half pitch node and beyond. Readiness of EUV materials is currently one high risk area according to recent assessments made at the 2009 EUVL Symposium. The main development issue regarding EUV resist has been how to simultaneously achieve high sensitivity, high resolution, and low line width roughness (LWR). This paper describes the strategy and current status of EUV resist development at Intel Corporation. Data collected utilizing Intel's Micro-Exposure Tool (MET) is presented in order to examine the feasibility of establishing a resist process that simultaneously exhibits <=22nm half-pitch (HP) L/S resolution at <= 12.5mJ/cm2 with <= 4nm LWR.

  20. weather@home 2: validation of an improved global-regional climate modelling system

    NASA Astrophysics Data System (ADS)

    Guillod, Benoit P.; Jones, Richard G.; Bowery, Andy; Haustein, Karsten; Massey, Neil R.; Mitchell, Daniel M.; Otto, Friederike E. L.; Sparrow, Sarah N.; Uhe, Peter; Wallom, David C. H.; Wilson, Simon; Allen, Myles R.

    2017-05-01

    Extreme weather events can have large impacts on society and, in many regions, are expected to change in frequency and intensity with climate change. Owing to the relatively short observational record, climate models are useful tools, as they allow for the generation of a larger sample of extreme events, the attribution of recent events to anthropogenic climate change, and the projection of changes in such events into the future. The modelling system known as weather@home, consisting of a global climate model (GCM) with a nested regional climate model (RCM) and driven by sea surface temperatures, allows one to generate a very large ensemble with the help of volunteer distributed computing, making it a key tool for understanding many aspects of extreme events. Here, a new version of the weather@home system (weather@home 2) with a higher-resolution RCM over Europe is documented and a broad validation of its climate is performed. The new model includes a more recent land-surface scheme in both GCM and RCM, in which subgrid-scale land-surface heterogeneity is newly represented using tiles, and an increase in RCM resolution from 50 to 25 km. The GCM performs similarly to the previous version, with some improvements in the representation of mean climate. The European RCM temperature biases are overall reduced, in particular the warm bias over eastern Europe, but large biases remain. Precipitation is improved over the Alps in summer, with mixed changes in other regions and seasons. The model is shown to represent the main classes of regional extreme events reasonably well and shows good sensitivity to its drivers. In particular, given the improvements in this version of the weather@home system, it is likely that more reliable statements can be made with regard to impacts, especially at more localized scales.

  1. Numerical tools to predict the environmental loads for offshore structures under extreme weather conditions

    NASA Astrophysics Data System (ADS)

    Wu, Yanling

    2018-05-01

    In this paper, extreme waves were generated using the open-source computational fluid dynamics (CFD) tools OpenFOAM and Waves2FOAM, with linear and nonlinear NewWave input, and used to conduct numerical simulations of the wave impact process. Numerical tools based on first-order (with and without stretching) and second-order NewWave are investigated, and simulations predicting the force loading on an offshore platform under extreme weather conditions are implemented and compared.
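    The linear NewWave input mentioned above is, in essence, the autocorrelation of the sea surface scaled to a chosen crest height: eta(t) = alpha * sum_n S(w_n) dw cos(w_n t) / sigma^2. A minimal sketch, using an assumed Pierson-Moskowitz-type spectrum and invented sea-state parameters rather than the paper's actual wave conditions:

```python
import math

def spectrum(w, hs=10.0, tp=14.0):
    """Pierson-Moskowitz-type spectrum parameterized by Hs (m) and Tp (s);
    the sea state is an illustrative assumption."""
    wp = 2 * math.pi / tp
    return (5 / 16) * hs**2 * wp**4 / w**5 * math.exp(-1.25 * (wp / w) ** 4)

ws = [0.05 + 0.01 * i for i in range(200)]  # angular frequencies, rad/s
dw = 0.01
sigma2 = sum(spectrum(w) * dw for w in ws)  # surface-elevation variance

def newwave_eta(t, alpha):
    """Linear NewWave surface elevation (m) at time t (s) around a
    focused crest of height alpha (m)."""
    return alpha * sum(spectrum(w) * dw * math.cos(w * t) for w in ws) / sigma2

alpha = 15.0  # crest elevation, m (illustrative extreme event)
for t in (0.0, 5.0, 10.0):
    print(f"eta({t:4.1f} s) = {newwave_eta(t, alpha):6.2f} m")
```

    At t = 0 the profile equals the prescribed crest height alpha by construction, and it decays away from the focus time as the spectrum decorrelates.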

  2. 1,4-Dioxane Remediation by Extreme Soil Vapor Extraction (XSVE). Screening-Level Feasibility Assessment and Design Tool in Support of 1,4-Dioxane Remediation by Extreme Soil Vapor Extraction (XSVE) ESTCP Project ER 201326

    DTIC Science & Technology

    2017-10-01

    USER GUIDE: 1,4-Dioxane Remediation by Extreme Soil Vapor Extraction (XSVE). Screening-Level Feasibility Assessment and Design Tool in Support of 1,4-Dioxane Remediation by Extreme Soil Vapor Extraction (XSVE). ESTCP Project ER-201326, October 2017. Rob Hinchee, Integrated Science...Technology, Inc., 1509 Coastal Highway, Panacea, FL 32346. Period of performance: 8/8/2013 - 8/8/2018.

  3. Accelerator mass spectrometry in the biomedical sciences: applications in low-exposure biomedical and environmental dosimetry

    NASA Astrophysics Data System (ADS)

    Felton, J. S.; Turteltaub, K. W.; Vogel, J. S.; Balhorn, R.; Gledhill, B. L.; Southon, J. R.; Caffee, M. W.; Finkel, R. C.; Nelson, D. E.; Proctor, I. D.; Davis, J. C.

    1990-12-01

    We are utilizing accelerator mass spectrometry as a sensitive detector for tracking the disposition of radioisotopically labeled molecules in the biomedical sciences. These applications have shown the effectiveness of AMS as a tool to quantify biologically important molecules at extremely low levels. For example, AMS is being used to determine the amount of carcinogen covalently bound to animal DNA (DNA adduct) at levels relevant to human exposure. Detection sensitivities are 1 carcinogen molecule bound in 10^11 to 10^12 DNA bases, depending on the specific activity of the radiolabeled carcinogen. Studies have been undertaken in our laboratory utilizing heterocyclic amine food-borne carcinogens and 2,3,7,8-tetrachlorodibenzo-p-dioxin (TCDD), a potent environmental carcinogen, to study the metabolism of carcinogens at low doses. In addition, AMS is being used to detect the presence of rare proteins (mutant forms of protamine) in human sperm. Approximately 1 per 10^6 sperm analyzed contains the rare form of the protamine. Protamine isolated from this small number of cells is being analyzed by AMS, following 14C labeling. Thus, AMS can be used to verify the identity of an extremely small amount of biological material. Furthermore, an additional improvement of 2 orders of magnitude in the sensitivity of biomedical tracer studies is suggested by preliminary work with bacterial hosts depleted in radiocarbon. Other problems in the life sciences where detection sensitivity or sample sizes are limitations should also benefit from AMS. Studies are underway to measure the molecular targeting of cancer chemotherapeutics in human tissue and to pursue applications for receptor biology. We are also applying other candidate isotopes, such as 3H (double labeling with 14C) and 41Ca (bone absorption), to problems in biology. The detection of 36Cl and 26Al has applications for determining human neutron exposure and understanding neurological toxicity, respectively. 
The results described here with 14C-labeled molecules coupled with new isotope applications clearly show AMS technology to be an important new tool for the biomedical sciences community.

  4. Regularised extreme learning machine with misclassification cost and rejection cost for gene expression data classification.

    PubMed

    Lu, Huijuan; Wei, Shasha; Zhou, Zili; Miao, Yanzi; Lu, Yi

    2015-01-01

    The main purpose of traditional classification algorithms in bioinformatics applications is to achieve better classification accuracy; however, such algorithms cannot meet the requirement of minimising the average misclassification cost. In this paper, a new algorithm, the cost-sensitive regularised extreme learning machine (CS-RELM), is proposed, which uses probability estimation and misclassification costs to reconstruct the classification results. By improving the classification accuracy on the group of small samples with higher misclassification cost, the new CS-RELM can minimise the overall classification cost. A 'rejection cost' was integrated into the CS-RELM algorithm to further reduce the average misclassification cost. Using the Colon Tumour dataset and the SRBCT (Small Round Blue Cells Tumour) dataset, CS-RELM was compared with other cost-sensitive algorithms such as the extreme learning machine (ELM), cost-sensitive extreme learning machine, regularised extreme learning machine, and cost-sensitive support vector machine (SVM). The experimental results show that CS-RELM with an embedded rejection cost reduced the average misclassification cost and made more credible classification decisions than the others.
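    The rejection mechanism can be illustrated independently of the ELM itself: given class posteriors and a misclassification cost matrix, choose the class with the lowest expected cost, and reject (defer the sample to an expert) when even that best cost exceeds the rejection cost. This is Chow's rule; the posteriors and costs below are illustrative, not from the paper:

```python
def decide(posteriors, cost_matrix, rejection_cost):
    """Pick the class minimizing expected misclassification cost; reject
    when even the best class costs more than rejecting.
    cost_matrix[i][j] = cost of predicting j when the truth is i."""
    n = len(posteriors)
    expected = [sum(posteriors[i] * cost_matrix[i][j] for i in range(n))
                for j in range(n)]
    best = min(range(n), key=lambda j: expected[j])
    return "reject" if expected[best] > rejection_cost else best

# Illustrative costs: calling a tumour (class 0) "normal" costs 10x the
# reverse error, as in cost-sensitive gene-expression classification.
costs = [[0, 10],
         [1, 0]]
print(decide([0.9, 0.1], costs, rejection_cost=0.3))  # confident: class 0
print(decide([0.5, 0.5], costs, rejection_cost=0.3))  # ambiguous: reject
```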

  5. Sensitivity of UK butterflies to local climatic extremes: which life stages are most at risk?

    PubMed

    McDermott Long, Osgur; Warren, Rachel; Price, Jeff; Brereton, Tom M; Botham, Marc S; Franco, Aldina M A

    2017-01-01

    There is growing recognition of the importance of extreme climatic events (ECEs) in determining changes in species populations. In fact, it is often the extent of climate variability that determines a population's ability to persist at a given site. This study examined the impact of ECEs on the resident UK butterfly species (n = 41) over a 37-year period. The study investigated the sensitivity of butterflies to four extremes (drought, extreme precipitation, extreme heat and extreme cold), identified at the site level, across each species' life stages. Variations in the vulnerability of butterflies at the site level were also compared based on three life-history traits (voltinism, habitat requirement and range). This is the first study to examine the effects of ECEs at the site level across all life stages of a butterfly, identifying sensitive life stages and unravelling the role life-history traits play in species sensitivity to ECEs. Butterfly population changes were found to be primarily driven by temperature extremes. Extreme heat was detrimental during overwintering periods and beneficial during adult periods; extreme cold had the opposite impacts on both of these life stages. Previously undocumented detrimental effects were identified for extreme precipitation during the pupal life stage for univoltine species. Generalists were found to have significantly more negative associations with ECEs than specialists. With future projections of warmer, wetter winters and more severe weather events, UK butterflies could come under severe pressure given the findings of this study. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.

  6. Biotechnical use of polymerase chain reaction for microbiological analysis of biological samples.

    PubMed

    Lantz, P G; Abu al-Soud, W; Knutsson, R; Hahn-Hägerdal, B; Rådström, P

    2000-01-01

    Since its introduction in the mid-1980s, polymerase chain reaction (PCR) technology has been recognised as a rapid, sensitive and specific molecular diagnostic tool for the analysis of micro-organisms in clinical, environmental and food samples. Although this technique can be extremely effective with pure solutions of nucleic acids, its sensitivity may be reduced dramatically when applied directly to biological samples. This review describes PCR technology as a microbial detection method, PCR inhibitors in biological samples, and various sample preparation techniques that can be used to facilitate PCR detection by separating the micro-organisms from PCR inhibitors and/or by concentrating the micro-organisms to detectable concentrations. Parts of this review are updated and based on a doctoral thesis by Lantz [1] and on a review discussing methods to overcome PCR inhibition in foods [2].

  7. Recent Advances in Bacteria Identification by Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry Using Nanomaterials as Affinity Probes

    PubMed Central

    Chiu, Tai-Chia

    2014-01-01

    Identifying trace amounts of bacteria rapidly, accurately, selectively, and with high sensitivity is important to ensuring the safety of food and diagnosing infectious bacterial diseases. Microbial diseases constitute the major cause of death in many developing and developed countries of the world. The early detection of pathogenic bacteria is crucial in preventing, treating, and containing the spread of infections, and there is an urgent requirement for sensitive, specific, and accurate diagnostic tests. Matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS) is an extremely selective and sensitive analytical tool that can be used to characterize different species of pathogenic bacteria. Various functionalized or unmodified nanomaterials can be used as affinity probes to capture and concentrate microorganisms. Recent developments in bacterial detection using nanomaterials-assisted MALDI-MS approaches are highlighted in this article. A comprehensive table listing MALDI-MS approaches for identifying pathogenic bacteria, categorized by the nanomaterials used, is provided. PMID:24786089

  9. The Extreme Ultraviolet Explorer

    NASA Technical Reports Server (NTRS)

    Malina, R. F.; Bowyer, S.; Lampton, M.; Finley, D.; Paresce, F.; Penegor, G.; Heetderks, H.

    1982-01-01

    The Extreme Ultraviolet Explorer Mission is described. The purpose of this mission is to search the celestial sphere for astronomical sources of extreme ultraviolet (EUV) radiation (100 to 1000 A). The search will be accomplished with the use of three EUV telescopes, each sensitive to a different band within the EUV range. A fourth telescope will perform a higher-sensitivity search of a limited sample of the sky in a single EUV band. In six months, the entire sky will be scanned at a sensitivity level comparable to existing surveys in other, more traditional astronomical bandpasses.

  10. An evaluation of selected in silico models for the assessment ...

    EPA Pesticide Factsheets

    Skin sensitization remains an important endpoint for consumers, manufacturers and regulators. Although the development of alternative approaches to assess skin sensitization potential has been extremely active over many years, the implementation of regulations such as REACH and the EU Cosmetics Directive has provided a much stronger impetus to turn this research into practical tools for decision making. Thus there has been considerable focus on the development, evaluation, and integration of alternative approaches for skin sensitization hazard and risk assessment. This includes in silico approaches such as (Q)SARs and expert systems. This study aimed to evaluate the predictive performance of a selection of in silico models and then to explore whether combining those models led to an improvement in accuracy. A dataset of 473 substances that had been tested in the local lymph node assay (LLNA) was compiled, comprising 295 sensitizers and 178 non-sensitizers. Four freely available models were identified: two statistical models, VEGA and the MultiCASE A33 model for skin sensitization (MCASE A33) from the Danish National Food Institute, and two mechanistic models, Toxtree's Skin Sensitization Reaction domains (Toxtree SS Rxn domains) and the OASIS v1.3 protein binding alerts for skin sensitization from the OECD Toolbox (OASIS). VEGA and MCASE A33 aim to predict sensitization as a binary score whereas the mechanistic models identified reaction domains or structura
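The evaluation-and-combination step can be sketched generically; the labels below are toy data, not the LLNA dataset, and majority voting is just one simple way to combine binary predictions:

```python
def confusion_stats(y_true, y_pred):
    """Sensitivity, specificity and accuracy for binary labels
    (1 = sensitizer, 0 = non-sensitizer)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return {"sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "accuracy": (tp + tn) / len(y_true)}

def majority_vote(*predictions):
    """Combine several models' binary predictions per substance."""
    return [1 if sum(votes) > len(votes) / 2 else 0
            for votes in zip(*predictions)]

# Illustrative toy labels only -- not the study's data.
truth   = [1, 1, 1, 0, 0, 0]
model_a = [1, 1, 0, 0, 1, 0]
model_b = [1, 0, 1, 0, 0, 0]
model_c = [1, 1, 1, 1, 0, 0]
combined = majority_vote(model_a, model_b, model_c)
stats = confusion_stats(truth, combined)
```

In this toy case the combined vote outperforms each individual model, which mirrors the question the study poses about combining in silico tools.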

  11. Uncertainty Quantification for Ice Sheet Science and Sea Level Projections

    NASA Astrophysics Data System (ADS)

    Boening, C.; Schlegel, N.; Limonadi, D.; Schodlok, M.; Seroussi, H. L.; Larour, E. Y.; Watkins, M. M.

    2017-12-01

    In order to better quantify uncertainties in global mean sea level rise projections and in particular upper bounds, we aim at systematically evaluating the contributions from ice sheets and potential for extreme sea level rise due to sudden ice mass loss. Here, we take advantage of established uncertainty quantification tools embedded within the Ice Sheet System Model (ISSM) as well as sensitivities to ice/ocean interactions using melt rates and melt potential derived from MITgcm/ECCO2. With the use of these tools, we conduct Monte-Carlo style sampling experiments on forward simulations of the Antarctic ice sheet, by varying internal parameters and boundary conditions of the system over both extreme and credible worst-case ranges. Uncertainty bounds for climate forcing are informed by CMIP5 ensemble precipitation and ice melt estimates for year 2100, and uncertainty bounds for ocean melt rates are derived from a suite of regional sensitivity experiments using MITgcm. Resulting statistics allow us to assess how regional uncertainty in various parameters affect model estimates of century-scale sea level rise projections. The results inform efforts to a) isolate the processes and inputs that are most responsible for determining ice sheet contribution to sea level; b) redefine uncertainty brackets for century-scale projections; and c) provide a prioritized list of measurements, along with quantitative information on spatial and temporal resolution, required for reducing uncertainty in future sea level rise projections. Results indicate that ice sheet mass loss is dependent on the spatial resolution of key boundary conditions - such as bedrock topography and melt rates at the ice-ocean interface. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
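The Monte Carlo sampling strategy can be sketched with a placeholder forward model; the parameter names, ranges, and the linear toy response below are assumptions for illustration, not ISSM physics:

```python
import random
import statistics

def forward_model(basal_melt, precip_anomaly):
    """Placeholder stand-in for an ice-sheet forward simulation:
    returns a sea-level contribution (mm) for one parameter draw.
    The real quantity comes from ISSM; this linear toy is illustrative."""
    return 50.0 + 30.0 * basal_melt - 10.0 * precip_anomaly

def monte_carlo(n_samples, melt_range, precip_range, seed=0):
    """Sample parameters uniformly over their uncertainty ranges and
    summarize the resulting spread of projections."""
    rng = random.Random(seed)
    draws = sorted(forward_model(rng.uniform(*melt_range),
                                 rng.uniform(*precip_range))
                   for _ in range(n_samples))
    return {"median": statistics.median(draws),
            "p95": draws[int(0.95 * n_samples)]}  # upper-bound estimate

result = monte_carlo(10_000, melt_range=(0.0, 2.0), precip_range=(-1.0, 1.0))
```

The gap between the median and the 95th percentile is the kind of statistic used to redefine uncertainty brackets and identify which inputs dominate the upper bound.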

  12. A Sensitivity-Based Approach to Quantifying the Costs of Weather and Climate Impacts: A Case Study of the Southeastern Pennsylvania Transportation Authority Adaptation Pilot Project

    NASA Astrophysics Data System (ADS)

    Casola, J.; Johanson, E.; Groth, P.; Snow, C.; Choate, A.

    2012-12-01

    The Southeastern Pennsylvania Transportation Authority (SEPTA), with support from the Federal Transit Administration, has been investigating the agency's vulnerability to weather-related disruption and damage as a way to inform an overall adaptation strategy for climate variability and change. Exploiting daily rail service records maintained by SEPTA and observations from nearby weather stations, we have developed a methodology for quantifying the sensitivity of SEPTA's Manayunk/Norristown rail line to various weather events (e.g., snow storms, heat waves, heavy rainfall and flooding, tropical storms). For each type of event, sensitivity is equated to the frequency and extent of service disruptions associated with the event, and includes the identification of thresholds beyond which impacts are observed. In addition, we have estimated the monetary costs associated with repair and replacement of infrastructure following these events. Our results have facilitated discussions with SEPTA operational staff, who have outlined the institutional aspects of their preparation and response processes for these weather events. We envision the methodology as being useful for resource and infrastructure managers across the public and private sectors, and potentially scalable to smaller or larger operations. The method has several advantages: 1) the quantification of sensitivity, and the coupling of that sensitivity to cost information, provides credible input to SEPTA decision-makers as they establish the priorities and level of investment associated with their adaptation actions for addressing extreme weather; 2) the method provides a conceptual foundation for estimating the magnitude, frequency, and costs of potential future impacts at a local scale, especially with regard to heat waves; 3) the sensitivity information serves as an excellent discussion tool, enabling further research and information gathering about institutional relationships and procedures. These relationships and procedures are critical to the effectiveness of preparation for and responses to extreme weather events, but are often not explicitly documented.
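The core sensitivity quantification, disruption frequency conditional on a weather threshold, can be sketched as follows; the snowfall records and the 10 cm threshold are invented for illustration:

```python
def disruption_rate(weather, disrupted, threshold):
    """Fraction of days with a service disruption, above vs. below a
    weather threshold (e.g. daily snowfall in cm).

    weather: per-day values; disrupted: per-day 0/1 flags.
    """
    above = [d for w, d in zip(weather, disrupted) if w >= threshold]
    below = [d for w, d in zip(weather, disrupted) if w < threshold]
    rate = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return {"above": rate(above), "below": rate(below)}

# Illustrative toy records: daily snowfall (cm) and disruption flags.
snow      = [0, 0, 2, 5, 12, 20, 1, 0, 15, 3]
disrupted = [0, 0, 0, 0, 1,  1,  0, 0, 1,  0]
rates = disruption_rate(snow, disrupted, threshold=10)
```

Sweeping the threshold and looking for the point where the "above" rate jumps is one simple way to identify the impact thresholds the abstract describes.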

  13. Extreme ultraviolet patterned mask inspection performance of advanced projection electron microscope system for 11nm half-pitch generation

    NASA Astrophysics Data System (ADS)

    Hirano, Ryoichi; Iida, Susumu; Amano, Tsuyoshi; Watanabe, Hidehiro; Hatakeyama, Masahiro; Murakami, Takeshi; Suematsu, Kenichi; Terao, Kenji

    2016-03-01

    Novel projection electron microscope optics have been developed and integrated into a new inspection system named EBEYE-V30 ("Model EBEYE" is an EBARA model code), and the resulting system shows promise for application to half-pitch (hp) 16-nm node extreme ultraviolet lithography (EUVL) patterned mask inspection. To improve the system's inspection throughput for hp 11-nm generation defect detection, a new electron-sensitive area image sensor with a high-speed data processing unit, a bright and stable electron source, and an image capture area deflector that operates simultaneously with the mask scanning motion have been developed. A learning system has been applied to the mask inspection tool to meet the requirements of hp 11-nm node EUV patterned mask inspection. Defects are identified by the projection electron microscope system using the "defectivity" derived from the characteristics of the acquired image. The learning system has been developed to reduce the labor and costs associated with adjusting the detection capability to cope with newly defined mask defects. We describe the integration of the developed elements into the inspection tool and the verification of the designed specification. We have also verified the effectiveness of the learning system, which shows enhanced detection capability for the hp 11-nm node.

  14. Bayesian hierarchical modeling for subject-level response classification in peptide microarray immunoassays

    PubMed Central

    Imholte, Gregory; Gottardo, Raphael

    2017-01-01

    The peptide microarray immunoassay simultaneously screens sample serum against thousands of peptides, determining the presence of antibodies bound to array probes. Peptide microarrays tiling immunogenic regions of pathogens (e.g. envelope proteins of a virus) are an important high-throughput tool for querying and mapping antibody binding. Because of the assay's many steps, from probe synthesis to incubation, peptide microarray data can be noisy, with extreme outliers. In addition, subjects may produce different antibody profiles in response to an identical vaccine stimulus or infection, due to variability among subjects' immune systems. We present a robust Bayesian hierarchical model for peptide microarray experiments, pepBayes, to estimate the probability of antibody response for each subject/peptide combination. Heavy-tailed error distributions accommodate outliers and extreme responses, and tailored random effect terms automatically incorporate technical effects prevalent in the assay. We apply our model to two vaccine trial datasets to demonstrate model performance. Our approach enjoys high sensitivity and specificity when detecting vaccine-induced antibody responses. A simulation study shows that an adaptive thresholding classification method has appropriate false discovery rate control with high sensitivity, and receiver operating characteristics generated on vaccine trial data suggest that pepBayes clearly separates responses from non-responses. PMID:27061097
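One common form of adaptive thresholding on posterior probabilities is the Bayesian FDR rule: call responses in decreasing order of posterior probability while the running mean of (1 - p) over the calls stays below the target. A generic sketch, not the pepBayes implementation itself:

```python
def bayesian_fdr_calls(posterior_probs, fdr_target=0.05):
    """Call responses by descending posterior probability while the
    estimated false discovery rate (mean of 1 - p over the calls)
    stays at or below the target. Returns indices of called items."""
    order = sorted(range(len(posterior_probs)),
                   key=lambda i: posterior_probs[i], reverse=True)
    calls, cum_fdr = [], 0.0
    for i in order:
        cum_fdr += 1.0 - posterior_probs[i]
        if cum_fdr / (len(calls) + 1) > fdr_target:
            break
        calls.append(i)
    return calls

# Posterior response probabilities for five subject/peptide pairs (toy values).
probs = [0.99, 0.97, 0.95, 0.60, 0.10]
called = bayesian_fdr_calls(probs, fdr_target=0.05)
```

With these toy values only the three high-probability pairs are called: adding the 0.60 pair would push the estimated FDR above 5%.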

  15. Fast assessment of planar chromatographic layers quality using pulse thermovision method.

    PubMed

    Suszyński, Zbigniew; Świta, Robert; Loś, Joanna; Zarzycka, Magdalena B; Kaleniecka, Aleksandra; Zarzycki, Paweł K

    2014-12-19

    The main goal of this paper is to demonstrate the capability of the pulse thermovision (thermal-wave) methodology for sensitive detection of photothermal non-uniformities within light-scattering and semi-transparent planar stationary phases. Successful visualization of stationary-phase defects required signal-processing protocols based on wavelet filtration, correlation analysis and k-means 3D segmentation. Such a post-processing approach allows extremely sensitive detection of thickness and structural changes within commercially available planar chromatographic layers. In particular, a number of TLC and HPTLC stationary phases, including silica, cellulose, aluminum oxide, polyamide and octadecylsilane, with adsorbent layers ranging from 100 to 250 μm, were investigated. The presented detection protocol can be used as an efficient tool for fast screening of the overall heterogeneity of any layered material. Moreover, the described procedure is very fast (a few seconds including acquisition and data processing) and may be applied to online control of fabrication processes. Beyond planar chromatographic plates, this protocol can be used for the assessment of other planar separation tools, such as paper-based analytical devices or micro total analysis systems consisting of organic and non-organic layers. Copyright © 2014 Elsevier B.V. All rights reserved.
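The k-means segmentation step in such a pipeline can be sketched in miniature; this 1-D toy clusters scalar signal values into "layer" and "defect" groups, whereas the actual protocol operates on wavelet-filtered thermogram images:

```python
def kmeans_1d(values, k, iters=50):
    """Tiny k-means on scalar signal values; returns per-value labels.
    Real thermogram segmentation would run on multi-channel pixels."""
    # Spread initial centers across the sorted value range.
    centers = sorted(values)[:: max(1, len(values) // k)][:k]
    labels = [0] * len(values)
    for _ in range(iters):
        # Assignment step: nearest center.
        labels = [min(range(k), key=lambda c: abs(v - centers[c]))
                  for v in values]
        # Update step: center = mean of its members.
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centers[c] = sum(members) / len(members)
    return labels

# Toy thermal signal: a uniform layer (~1.0) with a defect region (~3.0).
signal = [1.0, 1.1, 0.9, 3.0, 3.2, 1.0, 2.9, 1.05]
labels = kmeans_1d(signal, k=2)
```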

  16. Molecular Imprinting Technology in Quartz Crystal Microbalance (QCM) Sensors

    PubMed Central

    Emir Diltemiz, Sibel; Keçili, Rüstem; Ersöz, Arzu; Say, Rıdvan

    2017-01-01

    Molecularly imprinted polymers (MIPs), as artificial antibodies, have received considerable scientific attention in recent years in the field of (bio)sensors, since they have unique features that distinguish them from natural antibodies, such as robustness, multiple binding sites, low cost, facile preparation and high stability under extreme operating conditions (e.g., high pH and temperature). The Quartz Crystal Microbalance (QCM), in turn, is an analytical tool based on the measurement of small mass changes on the sensor surface. QCM sensors are practical and convenient monitoring tools because of their specificity, sensitivity, high accuracy, stability and reproducibility. QCM devices are highly suitable for converting the recognition process achieved using MIP-based memories into a sensor signal. Therefore, the combination of a QCM and MIPs as synthetic receptors enhances sensitivity through multiplexed binding sites whose size, 3D shape and chemical functionality act as molecular memories of the sensor system toward the target compound to be detected. This review aims to highlight and summarize recent progress and studies in the field of (bio)sensor systems based on QCMs combined with molecular imprinting technology. PMID:28245588
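The mass sensitivity underlying QCM sensing is classically described by the Sauerbrey equation, delta_f = -2 f0^2 delta_m / (A sqrt(rho_q mu_q)). A small sketch with the standard AT-cut quartz constants:

```python
import math

# Sauerbrey equation constants for AT-cut quartz.
RHO_Q = 2.648       # quartz density, g/cm^3
MU_Q = 2.947e11     # quartz shear modulus, g/(cm*s^2)

def sauerbrey_shift(f0_hz, delta_mass_g, area_cm2):
    """Frequency shift (Hz) for a thin, rigid mass load on a QCM.
    Valid for small loads; viscoelastic films need other models."""
    return (-2.0 * f0_hz**2 * delta_mass_g
            / (area_cm2 * math.sqrt(RHO_Q * MU_Q)))

# A 5 MHz crystal loses roughly 57 Hz per microgram/cm^2 of bound mass,
# which is what makes QCM readout of MIP binding events so sensitive.
shift = sauerbrey_shift(5e6, 1e-6, 1.0)
```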

  17. Sensitivity of Rainfall Extremes Under Warming Climate in Urban India

    NASA Astrophysics Data System (ADS)

    Ali, H.; Mishra, V.

    2017-12-01

    Extreme rainfall events in urban India have halted transportation, damaged infrastructure, and affected human lives. Rainfall extremes are projected to increase under the future climate. We evaluated the relationship (scaling) between rainfall extremes at different temporal resolutions (daily, 3-hourly, and 30 minutes), daily dewpoint temperature (DPT) and daily air temperature at 850 hPa (T850) for 23 urban areas in India. Daily rainfall extremes obtained from the Global Surface Summary of the Day (GSOD) data showed positive regression slopes for most of the cities, with a median of 14%/K for the period 1979-2013 for both DPT and T850, which is higher than the Clausius-Clapeyron (C-C) rate (~7%/K). Moreover, sub-daily rainfall extremes are even more sensitive to both DPT and T850. For instance, 3-hourly rainfall extremes obtained from the Tropical Rainfall Measuring Mission (TRMM 3B42 V7) showed regression slopes of more than 16%/K against DPT and T850 for the period 1998-2015. Half-hourly rainfall extremes from the Integrated Multi-satellitE Retrievals for GPM (IMERG) of the Global Precipitation Measurement (GPM) mission also showed higher sensitivity to changes in DPT and T850. The super-C-C scaling of rainfall extremes with DPT and T850 can be attributed to the convective nature of precipitation in India. Our results show that urban India may witness non-stationary rainfall extremes, which, in turn, will affect stormwater designs and the frequency and magnitude of urban flooding.
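Scaling rates like these are typically estimated from a log-linear fit of extreme intensity against temperature, with the slope expressed in %/K. A sketch on synthetic data built to follow the C-C rate (the plain regression below is generic, not the authors' exact binning method):

```python
import math

def scaling_rate(temps_k, extremes_mm):
    """Estimate precipitation-temperature scaling (%/K) by ordinary
    least squares of ln(intensity) against temperature: the slope of
    the log-linear fit, times 100, is the percent change per kelvin."""
    n = len(temps_k)
    xs, ys = temps_k, [math.log(p) for p in extremes_mm]
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return 100.0 * slope

# Synthetic check: intensities constructed to grow 7%/K (the C-C rate).
temps = [288.0, 290.0, 292.0, 294.0]
precip = [20.0 * math.exp(0.07 * (t - 288.0)) for t in temps]
rate = scaling_rate(temps, precip)
```

A fitted rate well above ~7%/K on observed extremes is the "super-C-C" signature the abstract reports.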

  18. Upper extremity outcome measures for collagen VI-related myopathy and LAMA2-related muscular dystrophy

    PubMed Central

    Bendixen, Roxanna M.; Butrum, Jocelyn; Jain, Mina S.; Parks, Rebecca; Hodsdon, Bonnie; Nichols, Carmel; Hsia, Michelle; Nelson, Leslie; Keller, Katherine C.; McGuire, Michelle; Elliott, Jeffrey S.; Linton, Melody M.; Arveson, Irene C.; Tounkara, Fatou; Vasavada, Ruhi; Harnett, Elizabeth; Punjabi, Monal; Donkervoort, Sandra; Dastgir, Jahannaz; Leach, Meganne E.; Rutkowski, Anne; Waite, Melissa; Collins, James; Bönnemann, Carsten G.; Meilleur, Katherine G.

    2017-01-01

    Purpose: Congenital muscular dystrophy (CMD) comprises a rare group of genetic muscle diseases that present at birth or early during infancy. Two common subtypes of CMD are collagen VI-related muscular dystrophy (COL6-RD) and laminin alpha 2-related dystrophy (LAMA2-RD). Traditional outcome measures in CMD include gross motor and mobility assessments, yet significant motor declines underscore the need for valid upper extremity (UE) motor assessments as a clinical endpoint. This study validated a battery of UE measures in these two CMD subtypes for future clinical trials. Methods: For this cross-sectional study, 42 participants were assessed over the same 2–5 day period at the National Institutes of Health Clinical Center (CC). All UE measures were correlated with the Motor Function Measure 32 (MFM32). The battery of UE assessments included the Jebsen Taylor Hand Function Test, the Quality of Upper Extremity Skills Test (QUEST), hand-held dynamometry, goniometry, and the MyoSet Tools. Spearman's rho was used for correlations with the MFM32; Pearson correlation was used to relate the Jebsen, QUEST, hand-held dynamometry, goniometry and the MyoSet Tools to one another. Correlations were considered significant at the 0.01 level (2-tailed). Results: Significant correlations were found between both the MFM32 and its Dimension 3 (distal motor function) and the Jebsen, QUEST, MyoGrip and MyoPinch, elbow flexion/extension ROM and myometry. Additional correlations between the assessments are reported. Conclusions: The Jebsen, the Grasp and Dissociated Movements domains of the QUEST, and the MyoGrip and MyoPinch tools, as well as elbow ROM and myometry, were determined to be valid and feasible in this population, provided variation in test items, and assessed a range of difficulty in CMD. Moving forward, it will be of utmost importance to determine whether these UE measures are reproducible and sensitive to change over time. PMID:28087121

  19. An Agent-Based Modeling Approach to Integrate Tsunami Science, Human Behavior, and Unplanned Network Disruptions for Nearfield Tsunami Evacuation

    NASA Astrophysics Data System (ADS)

    Cox, D. T.; Wang, H.; Cramer, L.; Mostafizi, A.; Park, H.

    2016-12-01

    A 2015 heatwave in Pakistan is blamed for over a thousand deaths. This event consisted of several days of very high temperatures and unusually high humidity for this region. However, none of these days exceeded the threshold for "extreme danger" in terms of the heat index. The heat index is a single index combining temperature and humidity that is applied universally at all locations regardless of local climate. Understanding extremes that arise from multiple factors is challenging. In this paper we present a tool for examining bivariate extreme behavior. The tool, developed in the statistical software R, draws isolines of equal exceedance probability. These isolines can be understood as bivariate "return levels". The tool is based on a dependence framework specific to extremes, is semiparametric, and is able to extrapolate isolines beyond the range of the data. We illustrate this tool using the Pakistan heat wave data and other bivariate data.
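The basic quantity behind such isolines is the joint exceedance probability P(T > t, H > h); an isoline connects (t, h) pairs with equal probability. A sketch of the empirical version on toy data (the actual tool fits an extreme-value dependence model so it can extrapolate beyond the observations):

```python
def joint_exceedance(temps, humidities, t_thresh, h_thresh):
    """Empirical probability that temperature AND humidity both
    exceed their thresholds on the same day."""
    hits = sum(1 for t, h in zip(temps, humidities)
               if t > t_thresh and h > h_thresh)
    return hits / len(temps)

# Toy daily summer records (deg C, % relative humidity) -- illustrative only.
temps = [38, 41, 44, 45, 39, 43, 46, 40]
hums  = [30, 55, 60, 65, 35, 58, 70, 40]
p = joint_exceedance(temps, hums, t_thresh=42, h_thresh=50)
```

A day can be unexceptional on each margin yet rare jointly, which is why a univariate index like the heat index can miss compound events.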

  20. EUV lithography for 22nm half pitch and beyond: exploring resolution, LWR, and sensitivity tradeoffs

    NASA Astrophysics Data System (ADS)

    Putna, E. Steve; Younkin, Todd R.; Leeson, Michael; Caudillo, Roman; Bacuita, Terence; Shah, Uday; Chandhok, Manish

    2011-04-01

    The International Technology Roadmap for Semiconductors (ITRS) denotes Extreme Ultraviolet (EUV) lithography as a leading technology option for realizing the 22nm half-pitch node and beyond. According to recent assessments made at the 2010 EUVL Symposium, the readiness of EUV materials remains one of the top risk items for EUV adoption. The main development issue for EUV resists has been how to simultaneously achieve high resolution, high sensitivity, and low line width roughness (LWR). This paper describes our strategy, the current status of EUV materials, and the integrated post-development LWR reduction efforts made at Intel Corporation. Data collected utilizing Intel's Micro-Exposure Tool (MET) are presented in order to examine the feasibility of establishing a resist process that simultaneously exhibits <=22nm half-pitch (HP) L/S resolution at <=11.3mJ/cm2 with <=3nm LWR.
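The resolution-LWR-sensitivity (RLS) tradeoff discussed here is often summarized with the so-called Z-factor, Z = resolution^3 x LWR^2 x dose, where lower is better; the abstract does not use this metric explicitly, so the sketch below is illustrative:

```python
def z_factor(resolution_nm, lwr_nm, dose_mj_cm2):
    """RLS tradeoff metric: Z = resolution^3 * LWR^2 * dose.
    A lower Z means a better simultaneous resolution/LWR/sensitivity
    compromise; units here are nm^5 * mJ/cm^2."""
    return resolution_nm**3 * lwr_nm**2 * dose_mj_cm2

# The feasibility target stated in the abstract: 22 nm HP lines/spaces
# at <=11.3 mJ/cm2 with <=3 nm LWR.
z = z_factor(22, 3, 11.3)
```

Because the three terms multiply, improving any one of resolution, LWR, or dose at the expense of the others leaves Z unchanged, which captures why the tradeoff is hard to escape.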

  1. Electrical impedance myography in the diagnosis of radiculopathy.

    PubMed

    Spieker, Andrew J; Narayanaswami, Pushpa; Fleming, Laura; Keel, John C; Muzin, Stefan C; Rutkove, Seward B

    2013-11-01

    We sought to determine whether electrical impedance myography (EIM) could serve as a diagnostic procedure for evaluation of radiculopathy. Twenty-seven patients with clinically and radiologically diagnosed cervical or lumbosacral radiculopathy who met a "gold standard" definition underwent EIM and standard needle electromyography (EMG) of multiple upper or lower extremity muscles. EIM reactance values revealed consistent reductions in the radiculopathy-affected myotomal muscles as compared with those on the unaffected side; the degree of asymmetry was associated strongly with the degree of EMG abnormality (P < 0.001). EIM had a sensitivity of 64.5% and a specificity of 77.0%; in comparison, EMG had a sensitivity of 79.7% but a specificity of 69.7%. These findings support the potential for EIM to serve as a new non-invasive tool to assist in diagnosis of radiculopathy; however, further refinement of the technique is needed for this specific application. Copyright © 2013 Wiley Periodicals, Inc.

  2. Pushing precipitation to the extremes in distributed experiments: Recommendations for simulating wet and dry years

    USGS Publications Warehouse

    Knapp, Alan K.; Avolio, Meghan L.; Beier, Claus; Carroll, Charles J.W.; Collins, Scott L.; Dukes, Jeffrey S.; Fraser, Lauchlan H.; Griffin-Nolan, Robert J.; Hoover, David L.; Jentsch, Anke; Loik, Michael E.; Phillips, Richard P.; Post, Alison K.; Sala, Osvaldo E.; Slette, Ingrid J.; Yahdjian, Laura; Smith, Melinda D.

    2017-01-01

    Intensification of the global hydrological cycle, ranging from larger individual precipitation events to more extreme multiyear droughts, has the potential to cause widespread alterations in ecosystem structure and function. With evidence that the incidence of extreme precipitation years (defined statistically from historical precipitation records) is increasing, there is a clear need to identify ecosystems that are most vulnerable to these changes and understand why some ecosystems are more sensitive to extremes than others. To date, opportunistic studies of naturally occurring extreme precipitation years, combined with results from a relatively small number of experiments, have provided limited mechanistic understanding of differences in ecosystem sensitivity, suggesting that new approaches are needed. Coordinated distributed experiments (CDEs) arrayed across multiple ecosystem types and focused on water can enhance our understanding of differential ecosystem sensitivity to precipitation extremes, but there are many design challenges to overcome (e.g., cost, comparability, standardization). Here, we evaluate contemporary experimental approaches for manipulating precipitation under field conditions to inform the design of ‘Drought-Net’, a relatively low-cost CDE that simulates extreme precipitation years. A common method for imposing both dry and wet years is to alter each ambient precipitation event. We endorse this approach for imposing extreme precipitation years because it simultaneously alters other precipitation characteristics (i.e., event size) consistent with natural precipitation patterns. However, we do not advocate applying identical treatment levels at all sites – a common approach to standardization in CDEs. This is because precipitation variability varies >fivefold globally resulting in a wide range of ecosystem-specific thresholds for defining extreme precipitation years. 
For CDEs focused on precipitation extremes, treatments should be based on each site's past climatic characteristics. This approach, though not often used by ecologists, allows ecological responses to be directly compared across disparate ecosystems and climates, facilitating process-level understanding of ecosystem sensitivity to precipitation extremes.
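The site-specific thresholds argued for here are usually derived from percentiles of each site's historical annual precipitation record; a sketch (the 5th/95th percentile choice is illustrative):

```python
def extreme_year_thresholds(annual_precip_mm, low_pct=0.05, high_pct=0.95):
    """Site-specific dry/wet thresholds from a historical record,
    using empirical percentiles (5th/95th here, an illustrative choice)."""
    xs = sorted(annual_precip_mm)
    pick = lambda q: xs[min(len(xs) - 1, int(q * len(xs)))]
    return {"extreme_dry": pick(low_pct), "extreme_wet": pick(high_pct)}

# Two toy sites with very different interannual variability end up with
# very different thresholds -- the reason identical treatment levels
# across sites are discouraged.
mesic = [800, 820, 790, 850, 810, 830, 795, 805, 840, 815]
arid  = [150, 420, 90, 300, 60, 510, 200, 120, 380, 250]
t_mesic = extreme_year_thresholds(mesic)
t_arid = extreme_year_thresholds(arid)
```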

  3. Pushing precipitation to the extremes in distributed experiments: recommendations for simulating wet and dry years.

    PubMed

    Knapp, Alan K; Avolio, Meghan L; Beier, Claus; Carroll, Charles J W; Collins, Scott L; Dukes, Jeffrey S; Fraser, Lauchlan H; Griffin-Nolan, Robert J; Hoover, David L; Jentsch, Anke; Loik, Michael E; Phillips, Richard P; Post, Alison K; Sala, Osvaldo E; Slette, Ingrid J; Yahdjian, Laura; Smith, Melinda D

    2017-05-01

    Intensification of the global hydrological cycle, ranging from larger individual precipitation events to more extreme multiyear droughts, has the potential to cause widespread alterations in ecosystem structure and function. With evidence that the incidence of extreme precipitation years (defined statistically from historical precipitation records) is increasing, there is a clear need to identify ecosystems that are most vulnerable to these changes and understand why some ecosystems are more sensitive to extremes than others. To date, opportunistic studies of naturally occurring extreme precipitation years, combined with results from a relatively small number of experiments, have provided limited mechanistic understanding of differences in ecosystem sensitivity, suggesting that new approaches are needed. Coordinated distributed experiments (CDEs) arrayed across multiple ecosystem types and focused on water can enhance our understanding of differential ecosystem sensitivity to precipitation extremes, but there are many design challenges to overcome (e.g., cost, comparability, standardization). Here, we evaluate contemporary experimental approaches for manipulating precipitation under field conditions to inform the design of 'Drought-Net', a relatively low-cost CDE that simulates extreme precipitation years. A common method for imposing both dry and wet years is to alter each ambient precipitation event. We endorse this approach for imposing extreme precipitation years because it simultaneously alters other precipitation characteristics (i.e., event size) consistent with natural precipitation patterns. However, we do not advocate applying identical treatment levels at all sites - a common approach to standardization in CDEs. This is because precipitation variability varies >fivefold globally resulting in a wide range of ecosystem-specific thresholds for defining extreme precipitation years. 
For CDEs focused on precipitation extremes, treatments should be based on each site's past climatic characteristics. This approach, though not often used by ecologists, allows ecological responses to be directly compared across disparate ecosystems and climates, facilitating process-level understanding of ecosystem sensitivity to precipitation extremes. © 2016 John Wiley & Sons Ltd.
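
As a sketch of how site-specific treatment levels might be derived, the snippet below computes percentile-based dry/wet thresholds from a historical record of annual totals. The 5th/95th percentile cut-offs and the synthetic records are illustrative assumptions, not the Drought-Net protocol:

```python
import numpy as np

def extreme_precip_thresholds(annual_precip_mm, lower_pct=5, upper_pct=95):
    """Site-specific dry/wet thresholds (mm) defining statistically
    extreme precipitation years from a historical record."""
    precip = np.asarray(annual_precip_mm, dtype=float)
    return np.percentile(precip, lower_pct), np.percentile(precip, upper_pct)

# Two hypothetical 100-year records with very different interannual variability:
arid = np.random.default_rng(0).gamma(2.0, 125.0, 100)       # mean ~250 mm, high CV
mesic = np.random.default_rng(1).normal(1000.0, 150.0, 100)  # mean ~1000 mm, low CV

dry_a, wet_a = extreme_precip_thresholds(arid)
dry_m, wet_m = extreme_precip_thresholds(mesic)
```

Relative to its mean, the high-variability site requires far larger departures from ambient precipitation to qualify as extreme, which is why identical absolute treatment levels across sites are discouraged.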

  4. Creation of a simple natural language processing tool to support an imaging utilization quality dashboard.

    PubMed

    Swartz, Jordan; Koziatek, Christian; Theobald, Jason; Smith, Silas; Iturrate, Eduardo

    2017-05-01

    Testing for venous thromboembolism (VTE) is associated with cost and risk to patients (e.g. radiation). To assess the appropriateness of imaging utilization at the provider level, it is important to know that provider's diagnostic yield (percentage of tests positive for the diagnostic entity of interest). However, determining diagnostic yield typically requires either time-consuming, manual review of radiology reports or the use of complex and/or proprietary natural language processing software. The objectives of this study were twofold: 1) to develop and implement a simple, user-configurable, and open-source natural language processing tool to classify radiology reports with high accuracy and 2) to use the results of the tool to design a provider-specific VTE imaging dashboard, consisting of both utilization rate and diagnostic yield. Two physicians reviewed a training set of 400 lower extremity ultrasound (UTZ) and computed tomography pulmonary angiogram (CTPA) reports to understand the language used in VTE-positive and VTE-negative reports. The insights from this review informed the arguments to the five modifiable parameters of the NLP tool. A validation set of 2,000 studies was then independently classified by the reviewers and by the tool; the classifications were compared and the performance of the tool was calculated. The tool was highly accurate in classifying the presence and absence of VTE for both the UTZ (sensitivity 95.7%; 95% CI 91.5-99.8, specificity 100%; 95% CI 100-100) and CTPA reports (sensitivity 97.1%; 95% CI 94.3-99.9, specificity 98.6%; 95% CI 97.8-99.4). The diagnostic yield was then calculated at the individual provider level and the imaging dashboard was created. We have created a novel NLP tool designed for users without a background in computer programming, which has been used to classify venous thromboembolism reports with a high degree of accuracy. 
The tool is open-source and available for download at http://iturrate.com/simpleNLP. Results obtained using this tool can be applied to enhance quality by presenting information about utilization and yield to providers via an imaging dashboard. Copyright © 2017 Elsevier B.V. All rights reserved.
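
The abstract does not disclose the tool's five user-configurable parameters, so the sketch below only illustrates the general approach such a simple classifier might take: match positive findings by keyword and suppress them when a negation phrase appears shortly before. All phrase lists and reports here are invented placeholders:

```python
import re

# Hypothetical phrase lists; the real tool's parameters are not reproduced here.
POSITIVE = [r"deep vein thrombosis", r"pulmonary embol", r"\bdvt\b", r"thrombus"]
NEGATION = [r"no evidence of", r"negative for", r"without"]

def classify_report(text):
    """True if the report reads as VTE-positive under simple keyword rules."""
    text = text.lower()
    for pos in POSITIVE:
        for m in re.finditer(pos, text):
            window = text[max(0, m.start() - 30):m.start()]
            if any(re.search(neg, window) for neg in NEGATION):
                continue  # finding is negated in the 30-character window before it
            return True
    return False

def sensitivity_specificity(pred, truth):
    tp = sum(p and t for p, t in zip(pred, truth))
    tn = sum(not p and not t for p, t in zip(pred, truth))
    fp = sum(p and not t for p, t in zip(pred, truth))
    fn = sum(not p and t for p, t in zip(pred, truth))
    return tp / (tp + fn), tn / (tn + fp)

reports = [
    "Acute deep vein thrombosis in the left femoral vein.",
    "No evidence of deep vein thrombosis bilaterally.",
    "Filling defect consistent with pulmonary embolism.",
    "CTPA negative for acute pulmonary embolism.",
]
truth = [True, False, True, False]
preds = [classify_report(r) for r in reports]
sens, spec = sensitivity_specificity(preds, truth)
```

Comparing such rule-based predictions against physician-labelled reports is exactly how the confidence intervals on sensitivity and specificity in the abstract would be derived.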

  5. Clinical validation of the C-VAT 2.0 assessment tool for gaming disorder: A sensitivity analysis of the proposed DSM-5 criteria and the clinical characteristics of young patients with 'video game addiction'.

    PubMed

    van Rooij, Antonius J; Schoenmakers, Tim M; van de Mheen, Dike

    2017-01-01

    Clinicians struggle with the identification of video gaming problems. To address this issue, a clinical assessment tool (C-VAT 2.0) was developed and tested in a clinical setting. The instrument allows exploration of the validity of the DSM-5 proposal for 'internet gaming disorder'. Using C-VAT 2.0, the current study provides a sensitivity analysis of the proposed DSM-5 criteria in a clinical youth sample (13-23 years old) in treatment for video gaming disorder (N=32). The study also explores the clinical characteristics of these patients. The patients were all male and reported spending extensive amounts of time on video games. At least half of the patients reported playing online games (n=15). Comorbid problems were common (n=22) and included (social) anxiety disorders, PDD NOS, ADHD/ADD, parent-child relationship problems, and various types of depressive mood problems. The sensitivity of the test was good: results further show that the C-VAT correctly identified 91% of the sample at the proposed cut-off score of at least 5 out of 9 of the criteria. As our study did not include healthy, extreme gamers, we could not assess the specificity of the tool: future research should make this a priority. Using the proposed DSM-5 cut-off score, the C-VAT 2.0 shows preliminary validity in a sample of gamers in treatment for gaming disorder, but the discriminating value of the instrument should be studied further. In the meantime, it is crucial that therapists try to avoid false positives by using expert judgment of functional impairment in each case. Copyright © 2015 Elsevier Ltd. All rights reserved.
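
The reported 91% figure is simply the fraction of diagnosed patients meeting the proposed cut-off of at least 5 of the 9 criteria, which can be computed directly from per-patient criterion counts. The counts below are hypothetical, chosen only so that 29 of 32 patients reach the cut-off:

```python
def sensitivity_at_cutoff(criteria_counts, cutoff=5):
    """Fraction of (clinically diagnosed) patients meeting >= cutoff
    of the 9 proposed DSM-5 criteria."""
    return sum(1 for c in criteria_counts if c >= cutoff) / len(criteria_counts)

# Hypothetical criterion counts for 32 patients, all in treatment:
counts = [5] * 10 + [6] * 8 + [7] * 6 + [8] * 3 + [9] * 2 + [4, 3, 2]
sens = sensitivity_at_cutoff(counts)  # 29/32 = 0.90625
```

Because every patient in such a sample already carries the diagnosis, this design can estimate sensitivity but never specificity, as the abstract notes.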

  6. Transportation Resilience Tools from the U.S. Department of Transportation

    NASA Astrophysics Data System (ADS)

    Snow, C.; Rodehorst, B.; Miller, R.; Choate, A.; Hyman, R.; Kafalenos, R.; Beucler, B.

    2014-12-01

    The U.S. Department of Transportation (U.S. DOT) and ICF International have been working to develop tools and resources to help state departments of transportation (DOTs) and metropolitan planning organizations (MPOs) prepare for the impacts of climate change. U.S. DOT recently released a set of climate change and extreme weather tools for state DOTs and MPOs that address key challenges they have faced in increasing their climate change resilience. The tools were developed under the U.S. DOT Gulf Coast Study, Phase 2. The CMIP Climate Data Processing Tool provides an easy way for users to gather and process downscaled climate model data at the local level, and "translates" that data into information relevant to transportation engineers and planners. The Vulnerability Assessment Scoring Tool (VAST) provides a step-by-step approach for users to assess their vulnerability to climate change in a transparent, cost-effective way. The Transportation Climate Change Sensitivity Matrix provides detailed information on how 11 different climate stressors may affect transportation infrastructure and operations. These tools significantly advance the state of the practice for transportation agencies to respond to climate change impacts, and beta-versions have been used successfully by several state DOTs and MPOs. This presentation will focus on these tools, examples of how they can be applied within transportation agencies, and opportunities to apply the lessons learned from the tools—or even the tools themselves—beyond the transportation sector, including as part of the national Climate Resilience Toolkit.

  7. Assessing changes in extreme river flow regulation from non-stationarity in hydrological scaling laws

    NASA Astrophysics Data System (ADS)

    Rodríguez, Estiven; Salazar, Juan Fernando; Villegas, Juan Camilo; Mercado-Bettín, Daniel

    2018-07-01

    Extreme flows are key components of river flow regimes that affect manifold hydrological, geomorphological and ecological processes with societal relevance. One fundamental characteristic of extreme flows in river basins is that they exhibit scaling properties which can be identified through scaling (power) laws. Understanding the physical mechanisms behind such scaling laws is a continuing challenge in hydrology, with potential implications for the prediction of river flow regimes in a changing environment and ungauged basins. After highlighting that the scaling properties are sensitive to environmental change, we develop a physical interpretation of how temporal changes in scaling exponents relate to the capacity of river basins to regulate extreme river flows. Regulation is defined here as the basins' capacity to either dampen high flows or to enhance low flows. Further, we use this framework to infer temporal changes in the regulation capacity of five large basins in tropical South America. Our results indicate that, during the last few decades, the Amazon river basin has been reducing its capacity to enhance low flows, likely as a consequence of pronounced environmental change in its south and south-eastern sub-basins. The proposed framework is widely applicable to different basins, and provides foundations for using scaling laws as empirical tools for inferring temporal changes of hydrological regulation, particularly relevant for identifying and managing hydrological consequences of environmental change.
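
A scaling (power) law Q = c·A^θ between flows and drainage area is usually fitted as a straight line in log-log space; tracking the exponent θ over moving time windows is then one way to detect the non-stationarity the authors exploit. The sketch below uses synthetic basins (the exponent 0.9 and noise level are assumptions for illustration, not values from the paper):

```python
import numpy as np

def fit_scaling_law(areas_km2, flows_m3s):
    """Fit Q = c * A**theta by least squares in log-log space."""
    theta, logc = np.polyfit(np.log(areas_km2), np.log(flows_m3s), 1)
    return theta, np.exp(logc)

# Synthetic basins obeying Q = 0.05 * A**0.9 with mild lognormal noise:
rng = np.random.default_rng(42)
A = np.logspace(2, 5, 30)                                # 100 to 100,000 km^2
Q = 0.05 * A**0.9 * np.exp(rng.normal(0.0, 0.05, A.size))
theta, c = fit_scaling_law(A, Q)
```

For regulation inference one would fit θ separately to annual maximum and minimum flows in successive time windows and examine its temporal trend.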

  8. Contamination Effects on EUV Optics

    NASA Technical Reports Server (NTRS)

    Tveekrem, J.

    1999-01-01

    During ground-based assembly and upon exposure to the space environment, optical surfaces accumulate both particles and molecular condensibles, inevitably resulting in degradation of optical instrument performance. Currently, this performance degradation (and the resulting end-of-life instrument performance) cannot be predicted with sufficient accuracy using existing software tools. Optical design codes exist to calculate instrument performance, but these codes generally assume uncontaminated optical surfaces. Contamination models exist which predict approximate end-of-life contamination levels, but the optical effects of these contamination levels cannot be quantified without detailed information about the optical constants and scattering properties of the contaminant. The problem is particularly pronounced in the extreme ultraviolet (EUV, 300-1,200 A) and far-ultraviolet (FUV, 1,200-2,000 A) regimes due to a lack of data and a lack of knowledge of the detailed physical and chemical processes involved. Yet it is in precisely these wavelength regimes that accurate predictions are most important, because EUV/FUV instruments are extremely sensitive to contamination.

  9. Revised upper limb module for spinal muscular atrophy: Development of a new module.

    PubMed

    Mazzone, Elena S; Mayhew, Anna; Montes, Jacqueline; Ramsey, Danielle; Fanelli, Lavinia; Young, Sally Dunaway; Salazar, Rachel; De Sanctis, Roberto; Pasternak, Amy; Glanzman, Allan; Coratti, Giorgia; Civitello, Matthew; Forcina, Nicola; Gee, Richard; Duong, Tina; Pane, Marika; Scoto, Mariacristina; Pera, Maria Carmela; Messina, Sonia; Tennekoon, Gihan; Day, John W; Darras, Basil T; De Vivo, Darryl C; Finkel, Richard; Muntoni, Francesco; Mercuri, Eugenio

    2017-06-01

    There is a growing need for a robust clinical measure to assess upper limb motor function in spinal muscular atrophy (SMA), as the available scales lack sensitivity at the extremes of the clinical spectrum. We report the development of the Revised Upper Limb Module (RULM), an assessment specifically designed for upper limb function in SMA patients. An international panel with specific neuromuscular expertise performed a thorough review of scales currently available to assess upper limb function in SMA. This review facilitated a revision of the existing upper limb function scales to make a more robust clinical scale. Multiple revisions of the scale included statistical analysis and captured clinically relevant changes to fulfill requirements by regulators and advocacy groups. The resulting RULM scale shows good reliability and validity, making it a suitable tool to assess upper extremity function in the SMA population for multi-center clinical research. Muscle Nerve 55: 869-874, 2017. © 2016 Wiley Periodicals, Inc.

  10. Decision-support tools for Extreme Weather and Climate Events in the Northeast United States

    NASA Astrophysics Data System (ADS)

    Kumar, S.; Lowery, M.; Whelchel, A.

    2013-12-01

    Decision-support tools were assessed for the 2013 National Climate Assessment technical input document, "Climate Change in the Northeast, A Sourcebook". The assessment included tools designed to generate and deliver actionable information to assist states and highly populated urban and other communities in assessment of climate change vulnerability and risk, quantification of effects, and identification of adaptive strategies in the context of adaptation planning across inter-annual, seasonal and multi-decadal time scales. State-level adaptation planning in the Northeast has generally relied on qualitative vulnerability assessments by expert panels and stakeholders, although some states have undertaken initiatives to develop statewide databases to support vulnerability assessments by urban and local governments, and state agencies. The devastation caused by Superstorm Sandy in October 2012 has raised awareness of the potential for extreme weather events to unprecedented levels and created urgency for action, especially in coastal urban and suburban communities that experienced pronounced impacts - especially in New Jersey, New York and Connecticut. Planning approaches vary, but any adaptation and resiliency planning process must include the following: - Knowledge of the probable change in a climate variable (e.g., precipitation, temperature, sea-level rise) over time or that the climate variable will attain a certain threshold deemed to be significant; - Knowledge of intensity and frequency of climate hazards (past, current or future events or conditions with potential to cause harm) and their relationship with climate variables; - Assessment of climate vulnerabilities (sensitive resources, infrastructure or populations exposed to climate-related hazards); - Assessment of relative risks to vulnerable resources; - Identification and prioritization of adaptive strategies to address risks. 
Many organizations are developing decision-support tools to assist in the urban planning process by addressing some of these needs. In this paper we highlight the decision tools available today, discuss their application in selected case studies, and present a gap analysis with opportunities for innovation and future work.

  11. Observed and predicted sensitivities of extreme surface ozone to meteorological drivers in three US cities

    NASA Astrophysics Data System (ADS)

    Fix, Miranda J.; Cooley, Daniel; Hodzic, Alma; Gilleland, Eric; Russell, Brook T.; Porter, William C.; Pfister, Gabriele G.

    2018-03-01

    We conduct a case study of observed and simulated maximum daily 8-h average (MDA8) ozone (O3) in three US cities for summers during 1996-2005. The purpose of this study is to evaluate the ability of a high resolution atmospheric chemistry model to reproduce observed relationships between meteorology and high or extreme O3. We employ regional coupled chemistry-transport model simulations to make three types of comparisons between simulated and observational data, comparing (1) tails of the O3 response variable, (2) distributions of meteorological predictor variables, and (3) sensitivities of high and extreme O3 to meteorological predictors. This last comparison is made using two methods: quantile regression, for the 0.95 quantile of O3, and tail dependence optimization, which is used to investigate even higher O3 extremes. Across all three locations, we find substantial differences between simulations and observational data in both meteorology and meteorological sensitivities of high and extreme O3.
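
Quantile regression for the 0.95 quantile minimizes the pinball (check) loss rather than squared error. Dedicated implementations exist (e.g., in statsmodels or R's quantreg), but a minimal numpy/scipy sketch on synthetic temperature-ozone data conveys the idea; the 2 ppb per degree slope is an invented illustration, not a result from the paper:

```python
import numpy as np
from scipy.optimize import minimize

def pinball_loss(beta, x, y, q):
    """Mean pinball loss of the line y ~ beta[0] + beta[1] * x at quantile q."""
    r = y - (beta[0] + beta[1] * x)
    return np.mean(np.where(r >= 0, q * r, (q - 1.0) * r))

def quantile_slope(x, y, q=0.95):
    """Slope of the q-th conditional quantile of y given x."""
    slope, intercept = np.polyfit(x, y, 1)  # least-squares warm start
    res = minimize(pinball_loss, [intercept, slope], args=(x, y, q),
                   method="Nelder-Mead", options={"maxiter": 5000})
    return res.x[1]

# Synthetic MDA8 ozone rising with temperature at 2 ppb per deg C:
rng = np.random.default_rng(0)
temp = rng.uniform(15.0, 35.0, 1000)
o3 = 40.0 + 2.0 * temp + rng.normal(0.0, 5.0, 1000)
slope95 = quantile_slope(temp, o3)
```

With homoscedastic noise the 0.95-quantile slope equals the mean-regression slope; diverging slopes across quantiles are precisely what makes comparing high-O3 sensitivities between model and observations informative.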

  12. PROSPECTIVE FUNCTIONAL PERFORMANCE TESTING AND RELATIONSHIP TO LOWER EXTREMITY INJURY INCIDENCE IN ADOLESCENT SPORTS PARTICIPANTS

    PubMed Central

    DePhillipo, Nick; Kimura, Iris; Kocher, Morgan; Hetzler, Ronald

    2017-01-01

    Background Due to the high number of adolescent athletes and subsequent lower extremity injuries, improvements of injury prevention strategies with emphasis on clinic-based and practical assessments are warranted. Purpose The purpose of this study was to prospectively investigate if a battery of functional performance tests (FPT) could be used as a preseason-screening tool to identify adolescent athletes at risk for sports-related acute lower extremity injury via comparison of injured and uninjured subjects. Methods One hundred adolescent volleyball, basketball and soccer athletes (female, n=62; male, n=38; mean age = 14.4 ± 1.6) participated. The FPT assessment included: triple hop for distance, star excursion balance test, double leg lowering maneuver, drop jump video test, and multi-stage fitness test. Composite scores were calculated using a derived equation. Subjects were monitored throughout their designated sport season(s), which consisted of a six-month surveillance period. The school's certified athletic trainer (ATC) recorded all injuries. Subjects were categorized into groups according to sex and injury incidence (acute lower extremity injury vs. uninjured) for analysis. Results Mean FPT composite scores were significantly lower for the injured compared to the uninjured groups in both sexes (males: 19.06 ± 3.59 vs. 21.90 ± 2.44; females: 19.48 ± 3.35 vs. 22.10 ± 3.06 injured and uninjured, respectively)(p < .05). The receiver operating characteristic analysis determined the cut-off score at ≤ 20 for both genders (sensitivity=.71, specificity=.81, for males; sensitivity=.67, specificity=.69, for females)(p<.05) for acute noncontact lower extremity injuries. Significant positive correlations were found between the FPT composite score and the multi-stage fitness test in male subjects (r=.474, p=.003), suggesting a relationship between functional performance, aerobic capacity, and potential injury risk. 
Conclusion A comprehensive assessment of functional performance tests may be beneficial to identify high-injury risk adolescents prior to athletic participation. PMID:28515975
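
The screening logic reduces to a 2x2 table at the chosen cut-off: athletes scoring at or below 20 are flagged as at risk. The scores and outcomes below are invented to illustrate the computation, not the study's data:

```python
def screen_performance(scores, injured, cutoff=20):
    """Sensitivity/specificity of flagging composite FPT scores <= cutoff."""
    tp = sum(1 for s, i in zip(scores, injured) if s <= cutoff and i)
    fn = sum(1 for s, i in zip(scores, injured) if s > cutoff and i)
    tn = sum(1 for s, i in zip(scores, injured) if s > cutoff and not i)
    fp = sum(1 for s, i in zip(scores, injured) if s <= cutoff and not i)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical composite scores and season outcomes for eight athletes:
scores = [18, 22, 19, 25, 20, 23, 21, 17]
injured = [True, False, True, False, False, False, True, False]
sens, spec = screen_performance(scores, injured)
```

Sweeping the cut-off over all observed scores and plotting sensitivity against 1 - specificity reproduces the ROC curve from which the study's ≤ 20 threshold was chosen.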

  13. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knio, Omar M.

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.

  14. Setting up a proper power spectral density (PSD) and autocorrelation analysis for material and process characterization

    NASA Astrophysics Data System (ADS)

    Rutigliani, Vito; Lorusso, Gian Francesco; De Simone, Danilo; Lazzarino, Frederic; Rispens, Gijsbert; Papavieros, George; Gogolides, Evangelos; Constantoudis, Vassilios; Mack, Chris A.

    2018-03-01

    Power spectral density (PSD) analysis is playing an increasingly critical role in the understanding of line-edge roughness (LER) and linewidth roughness (LWR) in a variety of applications across the industry. It is an essential step to get an unbiased LWR estimate, as well as an extremely useful tool for process and material characterization. However, the PSD estimate can be affected by both random and systematic artifacts caused by image acquisition and measurement settings, which could irremediably alter its information content. In this paper, we report on the impact of various setting parameters (smoothing image processing filters, pixel size, and SEM noise levels) on the PSD estimate. We also discuss the use of the PSD analysis tool in a variety of cases. Looking beyond the basic roughness estimate, we use PSD and autocorrelation analysis to characterize resist blur [1], as well as low and high frequency roughness contents, and we apply this technique to guide the EUV material stack selection. Our results clearly indicate that, if properly used, PSD methodology is a very sensitive tool for investigating material and process variations.
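
As a minimal sketch of the estimate itself, the one-sided periodogram of a detected edge is just a scaled FFT. Real LER/LWR workflows additionally average many lines and subtract the SEM noise floor, both omitted here; the white-roughness edge and pixel size are illustrative assumptions:

```python
import numpy as np

def edge_psd(edge_nm, pixel_nm):
    """One-sided periodogram PSD of a line edge (length assumed even).

    Returns spatial frequencies (1/nm) and PSD values (nm^3), normalised
    so that integrating the PSD recovers the roughness variance.
    """
    x = np.asarray(edge_nm, dtype=float)
    x = x - x.mean()                        # remove the mean edge position
    n = x.size
    X = np.fft.rfft(x)
    psd = (np.abs(X) ** 2) * pixel_nm / n
    psd[1:-1] *= 2.0                        # fold in negative frequencies
    return np.fft.rfftfreq(n, d=pixel_nm), psd

# Parseval check on a white-roughness edge with sigma = 1.5 nm:
rng = np.random.default_rng(7)
edge = rng.normal(0.0, 1.5, 4096)
f, p = edge_psd(edge, pixel_nm=1.0)
variance = p.sum() * (f[1] - f[0])          # integral of the PSD
```

Smoothing filters, finite pixel size and SEM noise all distort exactly this spectrum, which is why the abstract stresses the acquisition and measurement settings.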

  15. Development of enzyme linked immunosorbent assay (ELISA) for the detection of root-knot nematode Meloidogyne incognita.

    PubMed

    Kapur-Ghai, J; Kaur, M; Goel, P

    2014-09-01

    Root-knot nematodes (Meloidogyne incognita) are obligate, sedentary plant endoparasites that are extremely polyphagous in nature and cause severe economic losses in agriculture. Hence, it is essential to control the parasite at an early stage. For any control strategy to be effective, an early and accurate diagnosis is of paramount importance. Immunoassays have the inherent advantages of sensitivity and specificity, and have the potential to identify and quantify these plant-parasitic nematodes. Hence, in the present studies, an enzyme-linked immunosorbent assay (ELISA) has been developed for the detection of M. incognita antigens. First, an indirect ELISA was developed for detection and titration of anti-M. incognita antibodies. Results indicated an antiserum titre as high as 320 K. Finally, a competitive inhibition ELISA was developed employing these anti-M. incognita antibodies for detection of M. incognita antigens. The sensitivity of the ELISA was 10 fg. The competitive inhibition ELISA developed in the present studies has the potential of being used as an easy, rapid, specific and sensitive diagnostic tool for the detection of M. incognita infection.
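
An endpoint titre like the reported 320 K is conventionally the reciprocal of the highest dilution whose signal still exceeds a cut-off, often a multiple of the negative control. The dilution series, OD readings and the 2x cut-off rule below are illustrative assumptions, not the study's protocol:

```python
def endpoint_titre(dilutions, od_values, od_negative, factor=2.0):
    """Highest reciprocal dilution with OD above factor * negative control."""
    cutoff = factor * od_negative
    positive = [d for d, od in zip(dilutions, od_values) if od > cutoff]
    return max(positive) if positive else None

# Hypothetical two-fold series read against a 0.05 OD negative control:
dilutions = [10_000, 20_000, 40_000, 80_000, 160_000, 320_000, 640_000]
ods = [1.90, 1.55, 1.10, 0.72, 0.35, 0.12, 0.07]
titre = endpoint_titre(dilutions, ods, od_negative=0.05)  # 320_000
```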

  16. Pediatric post-thrombotic syndrome in children: Toward the development of a new diagnostic and evaluative measurement tool.

    PubMed

    Avila, M L; Brandão, L R; Williams, S; Ward, L C; Montoya, M I; Stinson, J; Kiss, A; Lara-Corrales, I; Feldman, B M

    2016-08-01

    Our goal was to conduct the item generation and piloting phases of a new discriminative and evaluative tool for pediatric post-thrombotic syndrome. We followed a formative model for the development of the tool, focusing on the signs/symptoms (items) that define post-thrombotic syndrome. For item generation, pediatric thrombosis experts and subjects diagnosed with extremity post-thrombotic syndrome during childhood nominated items. In the piloting phase, items were cross-sectionally measured in children with limb deep vein thrombosis to examine item performance. Twenty-three experts and 16 subjects listed 34 items, which were then measured in 140 subjects with previous diagnosis of limb deep vein thrombosis (70 upper extremity and 70 lower extremity). The items with strongest correlation with post-thrombotic syndrome severity and largest area under the curve were pain (in older children), paresthesia, and swollen limb for the upper extremity group, and pain (in older children), tired limb, heaviness, tightness and paresthesia for the lower extremity group. The diagnostic properties of the items and their correlations with post-thrombotic syndrome severity varied according to the assessed venous territory. The information gathered in this study will help experts decide which item should be considered for inclusion in the new tool. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Sensitivity to change of mobility measures in musculoskeletal conditions on lower extremities in outpatient rehabilitation settings.

    PubMed

    Navarro-Pujalte, Esther; Gacto-Sánchez, Mariano; Montilla-Herrador, Joaquina; Escolar-Reina, Pilar; Ángeles Franco-Sierra, María; Medina-Mirapeix, Francesc

    2018-01-12

    Prospective longitudinal study. To examine the sensitivity of the Mobility Activities Measure for lower extremities and to compare it to the sensitivity of the Physical Functioning Scale (PF-10) and the Patient-Specific Functional Scale (PSFS) at week 4 and week 8 post-hospitalization in outpatient rehabilitation settings. Mobility Activities Measure is a set of short mobility measures to track outpatient rehabilitation progress: its scales have shown good properties but its sensitivity to change has not been reported. Patients with musculoskeletal conditions were recruited at admission in three outpatient rehabilitation settings in Spain. Data were collected at admission, week 4 and week 8 from an initial sample of 236 patients (mean age ± SD = 36.7 ± 11.1). Mobility Activities Measure scales for lower extremity; PF-10; and PSFS. All the Mobility Activities Measure scales were sensitive to both positive and negative changes (the Standardized Response Means (SRMs) ranged between 1.05 and 1.53 at week 4, and between 0.63 and 1.47 at week 8). The summary measure encompassing the three Mobility Activities Measure scales detected a higher proportion of participants who had improved beyond the minimal detectable change (MDC) than detected by the PSFS and the PF-10 both at week 4 (86.64% vs. 69.81% and 42.23%, respectively) and week 8 (71.14% vs. 55.65% and 60.81%, respectively). The three Mobility Activities Measure scales assessing the lower extremity can be used across outpatient rehabilitation settings to provide consistent and sensitive measures of changes in patients' mobility. Implications for rehabilitation All the scales of the Mobility Activities Measure for the lower extremity were sensitive to both positive and negative change across the follow-up periods. 
Overall, the summary measure encompassing the three Mobility Activities Measure scales for the lower extremity appeared more sensitive to positive changes than the Physical Functioning Scale, especially during the first four weeks of treatment. The summary measure also detected a higher percentage of participants with positive change that exceeded the minimal detectable change than the Patient-Specific Functional Scale and the Physical Functioning Scale at the first follow-up period. By demonstrating their consistency and sensitivity to change, the three Mobility Activities Measures scales can now be considered in order to track patients' functional progress. Mobility Activities Measure can be therefore used in patients with musculoskeletal conditions across outpatient rehabilitation settings to provide estimates of change in mobility activities focusing on the lower extremity.
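
The SRM values quoted are the mean within-patient change divided by the standard deviation of those change scores. A minimal sketch with invented scores for five patients:

```python
import statistics

def standardized_response_mean(baseline, follow_up):
    """SRM = mean change / sample SD of the change scores."""
    changes = [f - b for b, f in zip(baseline, follow_up)]
    return statistics.mean(changes) / statistics.stdev(changes)

# Hypothetical mobility scores at admission and at week 4:
srm = standardized_response_mean([10, 12, 11, 9, 13], [14, 15, 15, 12, 18])
```

By the usual convention, SRM values around 0.8 and above (as reported here) are read as high sensitivity to change.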

  18. MysiRNA: improving siRNA efficacy prediction using a machine-learning model combining multi-tools and whole stacking energy (ΔG).

    PubMed

    Mysara, Mohamed; Elhefnawi, Mahmoud; Garibaldi, Jonathan M

    2012-06-01

    The investigation of small interfering RNA (siRNA) and its posttranscriptional gene-regulation has become an extremely important research topic, both for fundamental reasons and for potential longer-term therapeutic benefits. Several factors affect the functionality of siRNA, including positional preferences, target accessibility and other thermodynamic features. State-of-the-art tools aim to optimize the selection of target siRNAs by identifying those that may have high experimental inhibition. Such tools implement artificial neural network models, such as Biopredsi and ThermoComposition21, and linear regression models, such as DSIR, i-Score and Scales, among others. However, all these models have limitations in performance. In this work, a new neural-network-trained siRNA scoring/efficacy prediction model was developed, based on combining two existing scoring algorithms (ThermoComposition21 and i-Score) together with the whole stacking energy (ΔG) in a multi-layer artificial neural network. These three parameters were chosen after a comparative combinatorial study between five well-known tools. Our developed model, 'MysiRNA', was trained on 2431 siRNA records and tested using three further datasets. MysiRNA was compared with 11 alternative existing scoring tools in an evaluation study to assess the predicted and experimental siRNA efficiency, where it achieved the highest performance both in terms of correlation coefficient (R(2)=0.600) and receiver operating characteristics analysis (AUC=0.808), improving the prediction accuracy by up to 18% with respect to sensitivity and specificity of the best available tools. MysiRNA is a novel, freely accessible model capable of predicting siRNA inhibition efficiency with improved specificity and sensitivity. This multiclassifier approach could help improve the performance of prediction in several bioinformatics areas. 
MysiRNA model, part of MysiRNA-Designer package [1], is expected to play a key role in siRNA selection and evaluation. Copyright © 2012 Elsevier Inc. All rights reserved.
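
The reported AUC can be computed without drawing an ROC curve via its Mann-Whitney interpretation: the probability that a randomly chosen functional siRNA outscores a randomly chosen non-functional one. The score lists below are invented for illustration:

```python
def auc_mann_whitney(scores_pos, scores_neg):
    """AUC = P(random positive outscores random negative); ties count 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            wins += 1.0 if p > n else 0.5 if p == n else 0.0
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted efficacies for functional vs. non-functional siRNAs:
auc = auc_mann_whitney([0.9, 0.8, 0.7], [0.4, 0.6, 0.8])  # 7.5 / 9
```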

  19. High resolution modelling of extreme precipitation events in urban areas

    NASA Astrophysics Data System (ADS)

    Siemerink, Martijn; Volp, Nicolette; Schuurmans, Wytze; Deckers, Dave

    2015-04-01

    The present day society needs to adjust to the effects of climate change. More extreme weather conditions are expected, which can lead to longer periods of drought, but also to more extreme precipitation events. Urban water systems are not designed for such extreme events. Most sewer systems are not able to drain the excessive storm water, causing urban flooding. This leads to high economic damage. In order to take appropriate measures against extreme urban storms, detailed knowledge about the behaviour of the urban water system above and below the streets is required. To investigate the behaviour of urban water systems during extreme precipitation events new assessment tools are necessary. These tools should provide a detailed and integral description of the flow in the full domain of overland runoff, sewer flow, surface water flow and groundwater flow. We developed a new assessment tool, called 3Di, which provides detailed insight in the urban water system. This tool is based on a new numerical methodology that can accurately deal with the interaction between overland runoff, sewer flow and surface water flow. A one-dimensional model for the sewer system and open channel flow is fully coupled to a two-dimensional depth-averaged model that simulates the overland flow. The tool uses a subgrid-based approach in order to take high resolution information of the sewer system and of the terrain into account [1, 2]. The combination of using the high resolution information and the subgrid based approach results in an accurate and efficient modelling tool. It is now possible to simulate entire urban water systems using extreme high resolution (0.5m x 0.5m) terrain data in combination with a detailed sewer and surface water network representation. The new tool has been tested in several Dutch cities, such as Rotterdam, Amsterdam and The Hague. We will present the results of an extreme precipitation event in the city of Schiedam (The Netherlands). 
This city deals with significant soil consolidation and the low-lying areas are prone to urban flooding. The simulation results are compared with measurements in the sewer network. References: [1] Stelling, G.S., 2012. Quadtree flood simulations with subgrid digital elevation models. Water Management 165 (WM1): 1329-1354. [2] Casulli, V. and Stelling, G.S., 2013. A semi-implicit numerical model for urban drainage systems. International Journal for Numerical Methods in Fluids 73: 600-614. DOI: 10.1002/fld.3817

  20. On the nonlinearity of spatial scales in extreme weather attribution statements

    NASA Astrophysics Data System (ADS)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; Wehner, Michael; Shiogama, Hideo; Wolski, Piotr; Ciavarella, Andrew; Christidis, Nikolaos

    2018-04-01

    In the context of ongoing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have changed by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used, duration, spatial extent, and geographic location of the event—some of these factors often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. This is a procedure simple enough to approximate timely estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.
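
Attribution statements of this kind are typically expressed as a probability ratio PR = P1/P0 between exceedance probabilities in all-forcings and natural-forcings ensembles. The counts below are invented to illustrate how the statement can change with the spatial scale of the event definition:

```python
def probability_ratio(exceed_actual, n_actual, exceed_natural, n_natural):
    """PR = P1 / P0 for an event threshold in two model ensembles."""
    return (exceed_actual / n_actual) / (exceed_natural / n_natural)

# Hypothetical exceedance counts for the same heat threshold at two
# spatial scales of the event definition:
pr_local = probability_ratio(120, 1000, 40, 1000)  # PR = 3.0
pr_broad = probability_ratio(90, 1000, 10, 1000)   # PR = 9.0
```

How nonlinearly PR depends on the spatial scale of the event definition is exactly the relationship the study quantifies.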

  1. On the nonlinearity of spatial scales in extreme weather attribution statements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah

    In the context of continuing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been altered by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used and by the duration, spatial extent, and geographic location of the event, some of which are often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. The procedure is simple enough to provide timely approximate estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.

  2. On the nonlinearity of spatial scales in extreme weather attribution statements

    DOE PAGES

    Angélil, Oliver; Stone, Daíthí; Perkins-Kirkpatrick, Sarah; ...

    2017-06-17

    In the context of continuing climate change, extreme weather events are drawing increasing attention from the public and news media. A question often asked is how the likelihood of extremes might have been altered by anthropogenic greenhouse-gas emissions. Answers to the question are strongly influenced by the model used and by the duration, spatial extent, and geographic location of the event, some of which are often overlooked. Using output from four global climate models, we provide attribution statements characterised by a change in probability of occurrence due to anthropogenic greenhouse-gas emissions, for rainfall and temperature extremes occurring at seven discretised spatial scales and three temporal scales. An understanding of the sensitivity of attribution statements to a range of spatial and temporal scales of extremes allows for the scaling of attribution statements, rendering them relevant to other extremes having similar but non-identical characteristics. The procedure is simple enough to provide timely approximate estimates of the anthropogenic contribution to the event probability. Furthermore, since real extremes do not have well-defined physical borders, scaling can help quantify uncertainty around attribution results due to uncertainty around the event definition. Results suggest that the sensitivity of attribution statements to spatial scale is similar across models and that the sensitivity of attribution statements to the model used is often greater than the sensitivity to a doubling or halving of the spatial scale of the event. The use of a range of spatial scales allows us to identify a nonlinear relationship between the spatial scale of the event studied and the attribution statement.

  3. Electron-hole pairs generated in ZrO2 nanoparticle resist upon exposure to extreme ultraviolet radiation

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro

    2018-02-01

    Metal oxide nanoparticle resists have attracted much attention as the next-generation resist used for the high-volume production of semiconductor devices. However, the sensitization mechanism of the metal oxide nanoparticle resists is unknown. Understanding the sensitization mechanism is important for the efficient development of resist materials. In this study, the energy deposition in a zirconium oxide (ZrO2) nanoparticle resist was investigated. The numbers of electron-hole pairs generated in a ZrO2 core and a methacrylic acid (MAA) ligand shell upon exposure to 1 mJ cm⁻² (exposure dose) extreme ultraviolet (EUV) radiation were theoretically estimated to be 0.16 at most and 0.04-0.17 cm² mJ⁻¹, respectively. By comparing the calculated distribution of electron-hole pairs with the line-and-space patterns of the ZrO2 nanoparticle resist fabricated by an EUV exposure tool, the number of electron-hole pairs required for the solubility change of the resist films was estimated to be 1.3-2.2 per NP, where NP denotes a nanoparticle consisting of a metal oxide core with a ligand shell. In the material design of metal oxide nanoparticle resists, it is important to efficiently use the electron-hole pairs generated in the metal oxide core for the chemical change of ligand molecules.

  4. Rapid and Sensitive Quantification of Vibrio cholerae and Vibrio mimicus Cells in Water Samples by Use of Catalyzed Reporter Deposition Fluorescence In Situ Hybridization Combined with Solid-Phase Cytometry

    PubMed Central

    Schauer, Sonja; Sommer, Regina; Farnleitner, Andreas H.

    2012-01-01

    A new protocol for rapid, specific, and sensitive cell-based quantification of Vibrio cholerae/Vibrio mimicus in water samples was developed. The protocol is based on catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH) in combination with solid-phase cytometry. For pure cultures, we were able to quantify down to 6 V. cholerae cells on one membrane with a relative precision of 39% and down to 12 cells with a relative precision of 17% after hybridization with the horseradish peroxidase (HRP)-labeled probe Vchomim1276 (specific for V. cholerae and V. mimicus) and signal amplification. The corresponding position of the probe on the 16S rRNA is highly accessible even when labeled with HRP. For the first time, we were also able to successfully quantify V. cholerae/V. mimicus via solid-phase cytometry in extremely turbid environmental water samples collected in Austria. Cell numbers ranged from 4.5 × 10¹ cells ml⁻¹ in the large saline lake Neusiedler See to 5.6 × 10⁴ cells ml⁻¹ in an extremely turbid shallow soda lake situated nearby. We therefore suggest CARD-FISH in combination with solid-phase cytometry as a powerful tool to quantify V. cholerae/V. mimicus in ecological studies as well as for risk assessment and monitoring programs. PMID:22885749

  5. An efficient diagnosis system for Parkinson's disease using kernel-based extreme learning machine with subtractive clustering features weighting approach.

    PubMed

    Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Zhao, Xue-Hua

    2014-01-01

    A novel hybrid method named SCFW-KELM, which integrates effective subtractive clustering features weighting and a fast kernel-based extreme learning machine (KELM) classifier, has been introduced for the diagnosis of PD. In the proposed method, SCFW is used as a data preprocessing tool that decreases the variance of features in the PD dataset, in order to further improve the diagnostic accuracy of the KELM classifier. The impact of the type of kernel function on the performance of KELM has been investigated in detail. The efficiency and effectiveness of the proposed method have been rigorously evaluated against the PD dataset in terms of classification accuracy, sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (AUC), f-measure, and kappa statistic. Experimental results have demonstrated that the proposed SCFW-KELM significantly outperforms SVM-based, KNN-based, and ELM-based approaches and other methods in the literature, and achieved the highest classification results reported so far via a 10-fold cross-validation scheme, with a classification accuracy of 99.49%, sensitivity of 100%, specificity of 99.39%, AUC of 99.69%, f-measure of 0.9964, and kappa value of 0.9867. Promisingly, the proposed method might serve as a powerful new candidate for the diagnosis of PD with excellent performance.
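    The KELM classifier at the core of this method reduces to a single regularized kernel least-squares solve rather than iterative training. A toy sketch with an RBF kernel; the data, C, and gamma values here are illustrative, not those of the study:

```python
import numpy as np

# Minimal sketch of a kernel-based extreme learning machine (KELM):
# output weights come from solving (I/C + K) alpha = y in closed form.
def rbf_kernel(A, B, gamma=1.0):
    # Pairwise squared distances, then the Gaussian (RBF) kernel.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kelm_fit(X, y, C=10.0, gamma=1.0):
    K = rbf_kernel(X, X, gamma)
    # Ridge-regularized solve; C controls the regularization strength.
    return np.linalg.solve(np.eye(len(X)) / C + K, y)

def kelm_predict(X_train, alpha, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha

# Toy two-class problem with labels in {-1, +1}; sign of output = class.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
alpha = kelm_fit(X, y)
pred = np.sign(kelm_predict(X, alpha, X))
print((pred == y).mean())  # training accuracy on the toy data
```

The closed-form solve is what makes KELM "fast" relative to iteratively trained classifiers; the paper's contribution is pairing it with SCFW preprocessing.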

  6. An Efficient Diagnosis System for Parkinson's Disease Using Kernel-Based Extreme Learning Machine with Subtractive Clustering Features Weighting Approach

    PubMed Central

    Ma, Chao; Ouyang, Jihong; Chen, Hui-Ling; Zhao, Xue-Hua

    2014-01-01

    A novel hybrid method named SCFW-KELM, which integrates effective subtractive clustering features weighting and a fast kernel-based extreme learning machine (KELM) classifier, has been introduced for the diagnosis of PD. In the proposed method, SCFW is used as a data preprocessing tool that decreases the variance of features in the PD dataset, in order to further improve the diagnostic accuracy of the KELM classifier. The impact of the type of kernel function on the performance of KELM has been investigated in detail. The efficiency and effectiveness of the proposed method have been rigorously evaluated against the PD dataset in terms of classification accuracy, sensitivity, specificity, area under the receiver operating characteristic (ROC) curve (AUC), f-measure, and kappa statistic. Experimental results have demonstrated that the proposed SCFW-KELM significantly outperforms SVM-based, KNN-based, and ELM-based approaches and other methods in the literature, and achieved the highest classification results reported so far via a 10-fold cross-validation scheme, with a classification accuracy of 99.49%, sensitivity of 100%, specificity of 99.39%, AUC of 99.69%, f-measure of 0.9964, and kappa value of 0.9867. Promisingly, the proposed method might serve as a powerful new candidate for the diagnosis of PD with excellent performance. PMID:25484912

  7. Trends and sensitivities of low streamflow extremes to discharge timing and magnitude in pacific northwest mountain streams

    USDA-ARS?s Scientific Manuscript database

    Historical streamflow data from the Pacific Northwest indicate that the precipitation amount has been the dominant control on the magnitude of low streamflow extremes compared to the air temperature-affected timing of snowmelt runoff. The relative sensitivities of low streamflow to precipitation and...

  8. A Normative Data Set for the Clinical Assessment of Achromatic and Chromatic Contrast Sensitivity Using a qCSF Approach.

    PubMed

    Kim, Yeon Jin; Reynaud, Alexandre; Hess, Robert F; Mullen, Kathy T

    2017-07-01

    The measurement of achromatic sensitivity has been an important tool for monitoring subtle changes in vision as the result of disease or response to therapy. In this study, we aimed to provide a normative data set for achromatic and chromatic contrast sensitivity functions within a common cone contrast space using an abbreviated measurement approach suitable for clinical practice. In addition, we aimed to provide comparisons of achromatic and chromatic binocular summation across spatial frequency. We estimated monocular cone contrast sensitivity functions (CCSFs) using a quick Contrast Sensitivity Function (qCSF) approach for achromatic as well as isoluminant, L/M cone opponent, and S cone opponent stimuli in a healthy population of 51 subjects. We determined the binocular CCSFs for achromatic and chromatic vision to evaluate the degree of binocular summation across spatial frequency for these three different mechanisms in a subset of 20 subjects. Each data set shows consistent contrast sensitivity across the population. They highlight the extremely high cone contrast sensitivity of L/M cone opponency compared with the S-cone and achromatic responses. We also find that the two chromatic sensitivities are correlated across the healthy population. In addition, binocular summation for all mechanisms depends strongly on stimulus spatial frequency. This study, using an approach well suited to the clinic, is the first to provide a comparative normative data set for the chromatic and achromatic contrast sensitivity functions, yielding quantitative comparisons of achromatic, L/M cone opponent, and S cone opponent chromatic sensitivities as a function of spatial frequency.

  9. Heterogeneous Sensitivity of Tropical Precipitation Extremes during Growth and Mature Phases of Atmospheric Warming

    NASA Astrophysics Data System (ADS)

    Parhi, P.; Giannini, A.; Lall, U.; Gentine, P.

    2016-12-01

    Assessing and managing risks posed by climate variability and change is challenging in the tropics, from both a socio-economic and a scientific perspective. Most of the vulnerable countries with a limited climate adaptation capability are in the tropics. However, climate projections, particularly of extreme precipitation, are highly uncertain there. The CMIP5 (Coupled Model Intercomparison Project Phase 5) inter-model range of extreme precipitation sensitivity to global temperature under climate change is much larger in the tropics than in the extra-tropics, ranging from nearly 0% to greater than 30% across models (O'Gorman 2012). The uncertainty is also large in historical gauge or satellite based observational records. These large uncertainties in the sensitivity of tropical precipitation extremes highlight the need to better understand how tropical precipitation extremes respond to warming. We hypothesize that one of the factors explaining the large uncertainty is differing sensitivities during different phases of warming. We consider the 'growth' and 'mature' phases of warming under the climate variability case, typically associated with an El Niño event. In the remote tropics (away from the tropical Pacific Ocean), the response of precipitation extremes during the two phases can proceed through different pathways: i) a direct and fast-changing radiative forcing in an atmospheric column, acting top-down due to tropospheric warming, and/or ii) an indirect effect via changes in surface temperatures, acting bottom-up through surface water and energy fluxes. We also speculate that the insights gained here might be useful in interpreting the large sensitivity under climate change scenarios, since the physical mechanisms during the two warming phases under the climate variability case have some correspondence with increasing and stabilized greenhouse-gas emission scenarios.

  10. Understanding environmental DNA detection probabilities: A case study using a stream-dwelling char Salvelinus fontinalis

    USGS Publications Warehouse

    Wilcox, Taylor M; Mckelvey, Kevin S.; Young, Michael K.; Sepulveda, Adam; Shepard, Bradley B.; Jane, Stephen F; Whiteley, Andrew R.; Lowe, Winsor H.; Schwartz, Michael K.

    2016-01-01

    Environmental DNA sampling (eDNA) has emerged as a powerful tool for detecting aquatic animals. Previous research suggests that eDNA methods are substantially more sensitive than traditional sampling. However, the factors influencing eDNA detection and the resulting sampling costs are still not well understood. Here we use multiple experiments to derive independent estimates of eDNA production rates and downstream persistence from brook trout (Salvelinus fontinalis) in streams. We use these estimates to parameterize models comparing the false negative detection rates of eDNA sampling and traditional backpack electrofishing. We find that, using the protocols in this study, eDNA had reasonable detection probabilities at extremely low animal densities (e.g., probability of detection 0.18 at densities of one fish per stream kilometer) and very high detection probabilities at population-level densities (e.g., probability of detection > 0.99 at densities of ≥ 3 fish per 100 m). This is substantially more sensitive than traditional electrofishing for determining the presence of brook trout and may translate into important cost savings when animals are rare. Our findings are consistent with a growing body of literature showing that eDNA sampling is a powerful tool for the detection of aquatic species, particularly those that are rare and difficult to sample using traditional methods.
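    A per-sample detection probability like the 0.18 quoted above compounds quickly across replicate samples, which is the kind of false-negative arithmetic such comparisons rest on. A hedged sketch under the simplifying assumption of independent replicates (the replicate count is illustrative, not the study's design):

```python
# If one eDNA sample detects the target with probability p, then k
# independent replicate samples all miss it with probability (1 - p)**k.
def detection_prob(p_single, k):
    """Probability of at least one detection in k independent samples."""
    return 1.0 - (1.0 - p_single) ** k

# At the low-density per-sample probability of 0.18, three replicates
# raise the overall chance of detecting the species:
p3 = detection_prob(0.18, 3)
print(round(p3, 3))  # 0.449
```

Real samples are unlikely to be fully independent (shared water, shared DNA plume), so this is an upper bound on the gain from replication.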

  11. Combined use of the postpartum depression screening scale (PDSS) and Edinburgh postnatal depression scale (EPDS) to identify antenatal depression among Chinese pregnant women with obstetric complications.

    PubMed

    Zhao, Ying; Kane, Irene; Wang, Jing; Shen, Beibei; Luo, Jianfeng; Shi, Shenxun

    2015-03-30

    The purpose of the present study was to evaluate antenatal depression screening employing two scales, the Postpartum Depression Screening Scale (PDSS) and the Edinburgh Postnatal Depression Scale (EPDS), for the population of Chinese pregnant women with obstetric complications. A convenience sample of 842 Chinese pregnant women with complications participated in this study. The PDSS total score correlated strongly with the EPDS total score (r=0.652, p<0.001). Each tool performed extremely well for detecting major and major/minor depression, with the PDSS showing a better psychometric performance than the EPDS (p<0.01). For combined use, the recommended EPDS cut-off score was 8/9 for major depression, at which sensitivity (71.6%) and specificity (87.6%) were best, and the recommended PDSS cut-off score was 79/80 for major depression, with its best sensitivity (86.4%) and specificity (100%). The study concluded that the EPDS and PDSS appear to be reliable assessments for major and minor depression among Chinese pregnant women with obstetric complications. Combined use of these tools should consider lower cut-off scores to reduce misdiagnosis and improve screening validity. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
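    A cut-off such as "8/9" trades sensitivity against specificity: raising it misses more true cases, lowering it flags more healthy screenees. A small sketch of that trade-off; the scores and diagnosis labels below are invented for illustration, not the study's data:

```python
# Sensitivity/specificity of a screening cutoff on a continuous score.
# An "8/9" cutoff means scores of 9 or above screen positive (score > 8).
def sens_spec_at_cutoff(scores, depressed, cutoff):
    tp = sum(s > cutoff and d for s, d in zip(scores, depressed))
    fn = sum(s <= cutoff and d for s, d in zip(scores, depressed))
    tn = sum(s <= cutoff and not d for s, d in zip(scores, depressed))
    fp = sum(s > cutoff and not d for s, d in zip(scores, depressed))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical screening scores and reference diagnoses (1 = depressed):
scores    = [3, 5, 7, 8, 9, 10, 12, 14, 6, 11]
depressed = [0, 0, 0, 1, 1, 1, 1, 1, 0, 0]
sens, spec = sens_spec_at_cutoff(scores, depressed, 8)
print(sens, spec)  # 0.8 0.8
```

Sweeping the cutoff over all observed scores and picking the best balance is how studies like this one arrive at recommended cut-off values.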

  12. Plant pathogen nanodiagnostic techniques: forthcoming changes?

    PubMed Central

    Khiyami, Mohammad A.; Almoammar, Hassan; Awad, Yasser M.; Alghuthaymi, Mousa A.; Abd-Elsalam, Kamel A.

    2014-01-01

    Plant diseases are among the major factors limiting crop productivity. A first step towards managing a plant disease under greenhouse and field conditions is to correctly identify the pathogen. Current technologies, such as quantitative polymerase chain reaction (Q-PCR), require a relatively large amount of target tissue and rely on multiple assays to accurately identify distinct plant pathogens. The common disadvantage of the traditional diagnostic methods is that they are time consuming and lack high sensitivity. Consequently, developing low-cost methods to improve the accuracy and rapidity of plant pathogen diagnosis is needed. Nanotechnology, nanoparticles, and quantum dots (QDs) have emerged as essential tools for fast detection of a particular biological marker with extreme accuracy. Biosensors, QDs, nanostructured platforms, nanoimaging, and nanopore DNA sequencing tools have the potential to raise the sensitivity, specificity, and speed of pathogen detection, facilitate high-throughput analysis, and be used for high-quality monitoring and crop protection. Furthermore, nanodiagnostic kit equipment can easily and quickly detect potentially serious plant pathogens, allowing experts to help farmers in the prevention of epidemic diseases. The current review deals with the application of nanotechnology for quicker, more cost-effective, and more precise diagnostic procedures for plant diseases. Such an accurate technology may help to design a proper integrated disease management system which may modify crop environments to adversely affect crop pathogens. PMID:26740775

  13. Recent Advances in Biosensing With Photonic Crystal Surfaces: A Review

    PubMed Central

    Cunningham, B.T.; Zhang, M.; Zhuo, Y.; Kwon, L.; Race, C.

    2016-01-01

    Photonic crystal surfaces that are designed to function as wavelength-selective optical resonators have become a widely adopted platform for label-free biosensing, and for enhancement of the output of photon-emitting tags used throughout life science research and in vitro diagnostics. While some applications, such as analysis of drug-protein interactions, require extremely high resolution and the ability to accurately correct for measurement artifacts, others require sensitivity that is high enough for detection of disease biomarkers in serum with concentrations less than 1 pg/ml. As the analysis of cells becomes increasingly important for studying the behavior of stem cells, cancer cells, and biofilms under a variety of conditions, approaches that enable high resolution imaging of live cells without cytotoxic stains or photobleachable fluorescent dyes are providing new tools to biologists who seek to observe individual cells over extended time periods. This paper will review several recent advances in photonic crystal biosensor detection instrumentation and device structures that are being applied towards direct detection of small molecules in the context of high throughput drug screening, photonic crystal fluorescence enhancement as utilized for high sensitivity multiplexed cancer biomarker detection, and label-free high resolution imaging of cells and individual nanoparticles as a new tool for life science research and single-molecule diagnostics. PMID:27642265

  14. Building Flexible User Interfaces for Solving PDEs

    NASA Astrophysics Data System (ADS)

    Logg, Anders; Wells, Garth N.

    2010-09-01

    FEniCS is a collection of software tools for the automated solution of differential equations by finite element methods. In this note, we describe how FEniCS can be used to solve a simple nonlinear model problem with varying levels of automation. At one extreme, FEniCS provides tools for the fully automated and adaptive solution of nonlinear partial differential equations. At the other extreme, FEniCS provides a range of tools that allow the computational scientist to experiment with novel solution algorithms.

  15. Vulnerability of global food production to extreme climatic events.

    PubMed

    Yeni, F; Alpas, H

    2017-06-01

    It is known that the frequency, intensity, and duration of extreme climatic events have been changing substantially. The ultimate goal of this study was to identify current vulnerabilities of global primary food production to extreme climatic events, and to discuss potential entry points for adaptation planning by means of an explorative vulnerability analysis. Outcomes of this analysis were demonstrated as a composite index in which the performances of 118 countries in maintaining the safety of food production against climate change were compared and ranked. In order to better interpret the results, cluster analysis was used as a tool to group the countries based on their vulnerability index (VI) scores. Results suggested that one sixth of the countries analyzed were subject to a high level of exposure (0.45-1), and one third to a high to very high level of sensitivity (0.41-1) and a low to moderate level of adaptive capacity (0-0.59). Proper adaptation strategies for reducing the microbial and chemical contamination of food products, soil, and waters on the field were proposed. Finally, availability of data on food safety management systems and occurrence of foodborne outbreaks with global coverage were proposed as key factors for improving the robustness of future vulnerability assessments. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Simulating the Response of Urban Water Quality to Climate and Land Use Change in Partially Urbanized Basins

    NASA Astrophysics Data System (ADS)

    Sun, N.; Yearsley, J. R.; Nijssen, B.; Lettenmaier, D. P.

    2014-12-01

    Urban stream quality is particularly susceptible to extreme precipitation events and land use change. Although the projected effects of extreme events and land use change on hydrology have been reasonably well studied, the impacts on urban water quality have not been widely examined, due in part to the scale mismatch between global climate models and the spatial scales required to represent urban hydrology and water quality signals. Here we describe a grid-based modeling system that integrates the Distributed Hydrology Soil Vegetation Model (DHSVM) with an urban water quality module adapted from EPA's Storm Water Management Model (SWMM) and the Soil and Water Assessment Tool (SWAT). Using the model system, we evaluate, for four partially urbanized catchments within the Puget Sound basin, urban water quality under current climate conditions, and projected potential changes in urban water quality associated with future changes in climate and land use. We examine in particular total suspended solids, total nitrogen, total phosphorus, and coliform bacteria, with catchment representations at a 150-meter spatial resolution and a sub-daily time step. We report long-term streamflow and water quality predictions in response to extreme precipitation events of varying magnitudes in the four partially urbanized catchments. Our simulations show that urban water quality is highly sensitive to both climatic and land use change.

  17. Trends and sensitivities of low streamflow extremes to discharge timing and magnitude in Pacific Northwest mountain streams

    Treesearch

    Patrick R. Kormos; Charlie Luce; Seth J. Wenger; Wouter R. Berghuijs

    2016-01-01

    Path analyses of historical streamflow data from the Pacific Northwest indicate that the precipitation amount has been the dominant control on the magnitude of low streamflow extremes compared to the air temperature-affected timing of snowmelt runoff. The relative sensitivities of low streamflow to precipitation and temperature changes have important...

  18. Diagnostic accuracy of ultrasound in upper and lower extremity long bone fractures of emergency department trauma patients.

    PubMed

    Frouzan, Arash; Masoumi, Kambiz; Delirroyfard, Ali; Mazdaie, Behnaz; Bagherzadegan, Elnaz

    2017-08-01

    Long bone fractures are common injuries caused by trauma. Some studies have demonstrated that ultrasound has a high sensitivity and specificity in the diagnosis of upper and lower extremity long bone fractures. The aim of this study was to determine the accuracy of ultrasound compared with plain radiography in the diagnosis of upper and lower extremity long bone fractures in trauma patients. This cross-sectional study assessed 100 patients admitted to the emergency department of Imam Khomeini Hospital, Ahvaz, Iran with trauma to the upper and lower extremities, from September 2014 through October 2015. In all patients, first ultrasound and then standard plain radiography of the upper and lower limb was performed. Data were analyzed by SPSS version 21 to determine the specificity and sensitivity. The mean ages of patients with upper and lower limb trauma were 31.43±12.32 years and 29.63±5.89 years, respectively. Radius fractures were the most frequent upper limb fractures (27%). Sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of ultrasound compared with plain radiography in the diagnosis of upper extremity long bone fractures were 95.3%, 87.7%, 87.2% and 96.2%, respectively, and the highest accuracy was observed in left arm fractures (100%). Tibia and fibula fractures were the most frequent lower limb fractures (89.2%). Sensitivity, specificity, PPV and NPV of ultrasound compared with plain radiography in the diagnosis of lower extremity long bone fractures were 98.6%, 83%, 65.4% and 87.1%, respectively, and the highest accuracy was observed in men, lower ages and femoral fractures. The results of this study showed that ultrasound has a high accuracy compared with plain radiography in the diagnosis of upper and lower extremity long bone fractures.
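    The four accuracy measures reported in studies like this all follow from one 2x2 table of index-test results against the reference standard. A sketch of that bookkeeping; the counts below are invented for illustration, not the study's data:

```python
# Diagnostic accuracy metrics from a 2x2 confusion table, with the
# reference standard (here, plain radiography) defining truth.
def diagnostic_metrics(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),  # fracture cases the test flags
        "specificity": tn / (tn + fp),  # non-fracture cases it clears
        "ppv": tp / (tp + fp),          # positive results that are true
        "npv": tn / (tn + fn),          # negative results that are true
    }

# Hypothetical counts: 41 true positives, 6 false positives,
# 2 false negatives, 51 true negatives.
m = diagnostic_metrics(tp=41, fp=6, fn=2, tn=51)
print({k: round(v, 3) for k, v in m.items()})
```

Note that sensitivity and specificity are properties of the test alone, while PPV and NPV also depend on how common fractures are in the sampled population.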

  19. Diagnostic accuracy of history, physical examination, and bedside ultrasound for diagnosis of extremity fractures in the emergency department: a systematic review.

    PubMed

    Joshi, Nikita; Lira, Alena; Mehta, Ninfa; Paladino, Lorenzo; Sinert, Richard

    2013-01-01

    Understanding the accuracy of history, physical examination, and ultrasonography (US) for diagnosing extremity fractures, compared with radiography, has potential benefits of decreasing radiation exposure, costs, and pain and improving emergency department (ED) resource management and triage time. The authors performed two electronic searches using the PubMed and EMBASE databases for studies published between 1965 and 2012: one included any patient presenting with extremity injuries suspicious for fracture who underwent history and physical examination, and a separate search covered US performed by an emergency physician (EP) with subsequent radiography. The primary outcome was the operating characteristics of ED history, physical examination, and US in diagnosing radiologically proven extremity fractures. The methodologic quality of the studies was assessed using the quality assessment of studies of diagnostic accuracy tool (QUADAS-2). Nine studies met the inclusion criteria for history and physical examination, while eight studies met the inclusion criteria for US. There was significant heterogeneity in the studies that prevented data pooling. Data were organized into subgroups based on anatomic fracture locations, but heterogeneity within the subgroups also prevented data pooling. The prevalence of fracture varied among the studies from 22% to 70%. Upper extremity physical examination tests have positive likelihood ratios (LRs) ranging from 1.2 to infinity and negative LRs ranging from 0 to 0.8. US sensitivities varied between 85% and 100%, specificities varied between 73% and 100%, positive LRs varied between 3.2 and 56.1, and negative LRs varied between 0 and 0.2. Compared with radiography, EP US is an accurate diagnostic test to rule in or rule out extremity fractures. The diagnostic accuracy of history and physical examination is inconclusive. Future research is needed to understand the accuracy of ED US when combined with history and physical examination for upper and lower extremity fractures. © 2013 by the Society for Academic Emergency Medicine.
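    The likelihood ratios quoted in the review derive directly from sensitivity and specificity, which is why an LR can reach infinity when a test's specificity hits 100%. A small sketch, using the low end of the US ranges reported above as an example input:

```python
# Positive and negative likelihood ratios from sensitivity/specificity.
# LR+ diverges as specificity approaches 1.0 (division by zero here),
# which is why a perfectly specific test has an "infinite" positive LR.
def likelihood_ratios(sens, spec):
    lr_pos = sens / (1 - spec)   # how much a positive result raises odds
    lr_neg = (1 - sens) / spec   # how much a negative result lowers odds
    return lr_pos, lr_neg

# Example: the low end of the reported US ranges (sens 85%, spec 73%).
lr_pos, lr_neg = likelihood_ratios(0.85, 0.73)
print(round(lr_pos, 2), round(lr_neg, 2))
```

An LR+ around 3 modestly raises the post-test odds of fracture, while an LR- near 0.2 substantially lowers them, matching the review's characterization of EP US as useful both to rule in and to rule out fractures.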

  20. Pattern Inspection of EUV Masks Using DUV Light

    NASA Astrophysics Data System (ADS)

    Liang, Ted; Tejnil, Edita; Stivers, Alan R.

    2002-12-01

    Inspection of extreme ultraviolet (EUV) lithography masks requires reflected light, and this poses special challenges for inspection tool suppliers as well as for mask makers. Inspection must detect all the printable defects in the absorber pattern as well as printable process-related defects. Progress has been made under the NIST ATP project on "Intelligent Mask Inspection Systems for Next Generation Lithography" in assessing the factors that impact inspection tool sensitivity. We report in this paper the inspection of EUV masks with programmed absorber defects using 257 nm light. All the materials of interest for masks are highly absorptive to EUV light as compared to deep ultraviolet (DUV) light. Residues and contamination from the mask fabrication process and handling are prone to be printable. Therefore, it is critical to understand their EUV printability and optical inspectability. Process-related defects may include residual buffer layer such as oxide, organic contaminants, and possible over-etch to the multilayer surface. Both simulation and experimental results will be presented in this paper.

  1. Sensitivity of Hydrologic Extremes to Spatial Resolution of Meteorological Forcings: A Case Study of the Conterminous United States

    NASA Astrophysics Data System (ADS)

    Kao, S. C.; Naz, B. S.; Gangrade, S.; Ashfaq, M.; Rastogi, D.

    2016-12-01

    The magnitude and frequency of hydroclimate extremes are projected to increase in the conterminous United States (CONUS), with significant implications for future water resource planning and flood risk management. Nevertheless, apart from changes in the natural environment, the choice of model spatial resolution can also artificially influence the features of simulated extremes. To better understand how the spatial resolution of meteorological forcings may affect hydroclimate projections, we test the runoff sensitivity using the Variable Infiltration Capacity (VIC) model that was calibrated for each CONUS 8-digit hydrologic unit (HUC8) at 1/24° (~4 km) grid resolution. The 1980-2012 gridded Daymet and PRISM meteorological observations are used to conduct the 1/24° resolution control simulation. Comparative simulations are achieved by smoothing the 1/24° forcing into 1/12° and 1/8° resolutions, which are then used to drive the VIC model for the CONUS. In addition, we also test how the simulated high and low runoff conditions react to changes in precipitation (±10%) and temperature (+1 °C). The results are further analyzed for various types of hydroclimate extremes across different watersheds in the CONUS. This work helps us understand the sensitivity of simulated runoff to different spatial resolutions of climate forcings, as well as its sensitivity to different watershed sizes and characteristics of extreme events under future climate conditions.
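The coarsening step in this record (degrading a 1/24° forcing grid to 1/12° or 1/8°) can be sketched as a block average over an integer factor; the abstract does not specify the authors' exact smoothing procedure, so this is only an illustrative assumption:

```python
import numpy as np

def coarsen(field, factor):
    """Block-average a 2-D gridded field by an integer factor, e.g.
    factor=3 maps a 1/24-degree grid onto a 1/8-degree grid.
    Assumes both grid dimensions are divisible by the factor."""
    ny, nx = field.shape
    blocks = field.reshape(ny // factor, factor, nx // factor, factor)
    return blocks.mean(axis=(1, 3))  # average within each factor x factor block

fine = np.arange(36.0).reshape(6, 6)  # toy 1/24-degree precipitation field
coarse = coarsen(fine, 3)             # 2 x 2 field at 1/8-degree resolution
```

Block averaging conserves the domain mean, so differences in simulated extremes between resolutions come from the loss of spatial variance, not of total forcing.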

  2. Economic Evidence on the Health Impacts of Climate Change in Europe

    PubMed Central

    Hutton, Guy; Menne, Bettina

    2014-01-01

    BACKGROUND: In responding to the health impacts of climate change, economic evidence and tools inform decision makers of the efficiency of alternative health policies and interventions. In a time when sweeping budget cuts are affecting all tiers of government, economic evidence on health protection from climate change spending enables comparison with other public spending. METHODS: The review included 53 countries of the World Health Organization (WHO) European Region. Literature was obtained using a Medline and Internet search of key terms in published reports and peer-reviewed literature, and from institutions working on health and climate change. Articles were included if they provided economic estimation of the health impacts of climate change or adaptation measures to protect health from climate change in the WHO European Region. Economic studies are classified under health impact cost, health adaptation cost, and health economic evaluation (comparing both costs and impacts). RESULTS: A total of 40 relevant studies from Europe were identified, covering the health damage or adaptation costs related to the health effects of climate change and response measures to climate-sensitive diseases. No economic evaluation studies were identified of response measures specific to the impacts of climate change. Existing studies vary in terms of the economic outcomes measured and the methods for evaluation of health benefits. The lack of robust health impact data underlying economic studies significantly affects the availability and precision of economic studies. CONCLUSIONS: Economic evidence in European countries on the costs of and response to climate-sensitive diseases is extremely limited and fragmented.
Further studies are urgently needed that examine health impacts and the costs and efficiency of alternative responses to climate-sensitive health conditions, in particular extreme weather events (other than heat) and potential emerging diseases and other conditions threatening Europe. PMID:25452694

  3. Economic evidence on the health impacts of climate change in europe.

    PubMed

    Hutton, Guy; Menne, Bettina

    2014-01-01

    In responding to the health impacts of climate change, economic evidence and tools inform decision makers of the efficiency of alternative health policies and interventions. In a time when sweeping budget cuts are affecting all tiers of government, economic evidence on health protection from climate change spending enables comparison with other public spending. The review included 53 countries of the World Health Organization (WHO) European Region. Literature was obtained using a Medline and Internet search of key terms in published reports and peer-reviewed literature, and from institutions working on health and climate change. Articles were included if they provided economic estimation of the health impacts of climate change or adaptation measures to protect health from climate change in the WHO European Region. Economic studies are classified under health impact cost, health adaptation cost, and health economic evaluation (comparing both costs and impacts). A total of 40 relevant studies from Europe were identified, covering the health damage or adaptation costs related to the health effects of climate change and response measures to climate-sensitive diseases. No economic evaluation studies were identified of response measures specific to the impacts of climate change. Existing studies vary in terms of the economic outcomes measured and the methods for evaluation of health benefits. The lack of robust health impact data underlying economic studies significantly affects the availability and precision of economic studies. Economic evidence in European countries on the costs of and response to climate-sensitive diseases is extremely limited and fragmented. Further studies are urgently needed that examine health impacts and the costs and efficiency of alternative responses to climate-sensitive health conditions, in particular extreme weather events (other than heat) and potential emerging diseases and other conditions threatening Europe.

  4. Neural mechanisms underlying sensitivity to reverse-phi motion in the fly

    PubMed Central

    Leonhardt, Aljoscha; Meier, Matthias; Serbe, Etienne; Eichner, Hubert; Borst, Alexander

    2017-01-01

    Optical illusions provide powerful tools for mapping the algorithms and circuits that underlie visual processing, revealing structure through atypical function. Of particular note in the study of motion detection has been the reverse-phi illusion. When contrast reversals accompany discrete movement, detected direction tends to invert. This occurs across a wide range of organisms, spanning humans and invertebrates. Here, we map an algorithmic account of the phenomenon onto neural circuitry in the fruit fly Drosophila melanogaster. Through targeted silencing experiments in tethered walking flies as well as electrophysiology and calcium imaging, we demonstrate that ON- or OFF-selective local motion detector cells T4 and T5 are sensitive to certain interactions between ON and OFF. A biologically plausible detector model accounts for subtle features of this particular form of illusory motion reversal, like the re-inversion of turning responses occurring at extreme stimulus velocities. In light of comparable circuit architecture in the mammalian retina, we suggest that similar mechanisms may apply even to human psychophysics. PMID:29261684

  5. Neural mechanisms underlying sensitivity to reverse-phi motion in the fly.

    PubMed

    Leonhardt, Aljoscha; Meier, Matthias; Serbe, Etienne; Eichner, Hubert; Borst, Alexander

    2017-01-01

    Optical illusions provide powerful tools for mapping the algorithms and circuits that underlie visual processing, revealing structure through atypical function. Of particular note in the study of motion detection has been the reverse-phi illusion. When contrast reversals accompany discrete movement, detected direction tends to invert. This occurs across a wide range of organisms, spanning humans and invertebrates. Here, we map an algorithmic account of the phenomenon onto neural circuitry in the fruit fly Drosophila melanogaster. Through targeted silencing experiments in tethered walking flies as well as electrophysiology and calcium imaging, we demonstrate that ON- or OFF-selective local motion detector cells T4 and T5 are sensitive to certain interactions between ON and OFF. A biologically plausible detector model accounts for subtle features of this particular form of illusory motion reversal, like the re-inversion of turning responses occurring at extreme stimulus velocities. In light of comparable circuit architecture in the mammalian retina, we suggest that similar mechanisms may apply even to human psychophysics.
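The algorithmic account referenced in these two records is commonly modeled with a Hassenstein-Reichardt correlator (HRC), whose output inverts sign when one frame's contrast is reversed. A minimal two-frame sketch (a simplified textbook correlator, not the authors' detailed T4/T5 detector model):

```python
def hrc_response(frame0, frame1, i, j):
    """Two-frame Hassenstein-Reichardt correlator spanning pixels i and j
    (j to the right of i): the delayed left input correlated with the
    current right input, minus the mirror-symmetric term.
    Positive output signals rightward motion."""
    return frame0[i] * frame1[j] - frame1[i] * frame0[j]

# A bright bar steps rightward from pixel 0 to pixel 1
phi = hrc_response([1.0, 0.0], [0.0, 1.0], 0, 1)          # same contrast
reverse_phi = hrc_response([1.0, 0.0], [0.0, -1.0], 0, 1)  # contrast inverted
```

With matched contrast the correlator reports rightward motion (positive); inverting the contrast of the displaced bar flips the product's sign, reproducing the illusory direction reversal.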

  6. Enhanced optical coupling and Raman scattering via microscopic interface engineering

    NASA Astrophysics Data System (ADS)

    Thompson, Jonathan V.; Hokr, Brett H.; Kim, Wihan; Ballmann, Charles W.; Applegate, Brian E.; Jo, Javier A.; Yamilov, Alexey; Cao, Hui; Scully, Marlan O.; Yakovlev, Vladislav V.

    2017-11-01

    Spontaneous Raman scattering is an extremely powerful tool for the remote detection and identification of various chemical materials. However, when those materials are contained within strongly scattering or turbid media, as is the case in many biological and security related systems, the sensitivity and range of Raman signal generation and detection are severely limited. Here, we demonstrate that through microscopic engineering of the optical interface, the optical coupling of light into a turbid material can be substantially enhanced. This improved coupling facilitates the enhancement of the Raman scattering signal generated by molecules within the medium. In particular, we detect at least two orders of magnitude more spontaneous Raman scattering from a sample when the pump laser light is focused into a microscopic hole in the surface of the sample. Because this approach enhances both the interaction time and interaction region of the laser light within the material, its use will greatly improve the range and sensitivity of many spectroscopic techniques, including Raman scattering and fluorescence emission detection, inside highly scattering environments.

  7. The Stanford-U.S. Geological Survey SHRIMP ion microprobe--a tool for micro-scale chemical and isotopic analysis

    USGS Publications Warehouse

    Bacon, Charles R.; Grove, Marty; Vazquez, Jorge A.; Coble, Matthew A.

    2012-01-01

    Answers to many questions in Earth science require chemical analysis of minute volumes of minerals, volcanic glass, or biological materials. Secondary Ion Mass Spectrometry (SIMS) is an extremely sensitive analytical method in which a 5–30 micrometer diameter "primary" beam of charged particles (ions) is focused on a region of a solid specimen to sputter secondary ions from 1–5 nanograms of the sample under high vacuum. The elemental abundances and isotopic ratios of these secondary ions are determined with a mass spectrometer. These results can be used for geochronology to determine the age of a region within a crystal thousands to billions of years old or to precisely measure trace abundances of chemical elements at concentrations as low as parts per billion. A partnership of the U.S. Geological Survey and the Stanford University School of Earth Sciences operates a large SIMS instrument, the Sensitive High-Resolution Ion Microprobe with Reverse Geometry (SHRIMP–RG) on the Stanford campus.

  8. Current trends in nanobiosensor technology

    PubMed Central

    Wu, Diana; Langer, Robert S

    2014-01-01

    The development of tools and processes used to fabricate, measure, and image nanoscale objects has led to a wide range of work devoted to producing sensors that interact with extremely small numbers (or an extremely small concentration) of analyte molecules. These advances are particularly exciting in the context of biosensing, where the demands for low-concentration detection and high specificity are great. Nanoscale biosensors, or nanobiosensors, provide researchers with an unprecedented level of sensitivity, often down to the single-molecule level. The use of biomolecule-functionalized surfaces can dramatically boost the specificity of the detection system, but can also yield reproducibility problems and increased complexity. Several nanobiosensor architectures based on mechanical devices, optical resonators, functionalized nanoparticles, nanowires, nanotubes, and nanofibers have been demonstrated in the lab. As nanobiosensor technology becomes more refined and reliable, it is likely to eventually make its way from the lab to the clinic, where future lab-on-a-chip devices incorporating arrays of nanobiosensors could be used for rapid screening of a wide variety of analytes at low cost using small samples of patient material. PMID:21391305

  9. Effects of ocean acidification increase embryonic sensitivity to thermal extremes in Atlantic cod, Gadus morhua.

    PubMed

    Dahlke, Flemming T; Leo, Elettra; Mark, Felix C; Pörtner, Hans-Otto; Bickmeyer, Ulf; Frickenhaus, Stephan; Storch, Daniela

    2017-04-01

    Thermal tolerance windows serve as a powerful tool for estimating the vulnerability of marine species and their life stages to increasing temperature means and extremes. However, it remains uncertain to which extent additional drivers, such as ocean acidification, modify organismal responses to temperature. This study investigated the effects of CO2-driven ocean acidification on embryonic thermal sensitivity and performance in Atlantic cod, Gadus morhua, from the Kattegat. Fertilized eggs were exposed to factorial combinations of two PCO2 conditions (400 μatm vs. 1100 μatm) and five temperature treatments (0, 3, 6, 9 and 12 °C), allowing identification of both lower and upper thermal tolerance thresholds. We quantified hatching success, oxygen consumption (MO2) and mitochondrial functioning of embryos as well as larval morphometrics at hatch and the abundance of acid-base-relevant ionocytes on the yolk sac epithelium of newly hatched larvae. Hatching success was high under ambient spawning conditions (3-6 °C), but decreased towards both cold and warm temperature extremes. Elevated PCO2 caused a significant decrease in hatching success, particularly at cold (3 and 0 °C) and warm (12 °C) temperatures. Warming imposed limitations on MO2 and mitochondrial capacities. Elevated PCO2 stimulated MO2 at cold and intermediate temperatures, but exacerbated warming-induced constraints on MO2, indicating a synergistic interaction with temperature. Mitochondrial functioning was not affected by PCO2. Increased MO2 in response to elevated PCO2 was paralleled by reduced larval size at hatch. Finally, ionocyte abundance decreased with increasing temperature, but did not differ between PCO2 treatments. Our results demonstrate increased thermal sensitivity of cod embryos under future PCO2 conditions and suggest that acclimation to elevated PCO2 requires reallocation of limited resources at the expense of embryonic growth. We conclude that ocean acidification constrains the thermal performance window of embryos, which has important implications for the susceptibility of cod to projected climate change. © 2016 John Wiley & Sons Ltd.

  10. Understanding neuromotor strategy during functional upper extremity tasks using symbolic dynamics.

    PubMed

    Nathan, Dominic E; Guastello, Stephen J; Prost, Robert W; Jeutter, Dean C

    2012-01-01

    The ability to model and quantify brain activation patterns that pertain to natural neuromotor strategy of the upper extremities during functional task performance is critical to the development of therapeutic interventions such as neuroprosthetic devices. The mechanisms of information flow, activation sequence and patterns, and the interaction between anatomical regions of the brain that are specific to movement planning, intention, and execution of voluntary upper extremity motor tasks were investigated here. This paper presents a novel method using symbolic dynamics (orbital decomposition) and the nonlinear dynamic tools of entropy, self-organization, and chaos to describe the underlying structure of activation shifts in regions of the brain that are involved with the cognitive aspects of functional upper extremity task performance. Several questions were addressed: (a) How is it possible to distinguish deterministic or causal patterns of activity in brain fMRI from those that are really random or non-contributory to the neuromotor control process? (b) Can the complexity of activation patterns over time be quantified? (c) What are the optimal ways of organizing fMRI data to preserve patterns of activation and activation levels, and to extract meaningful temporal patterns as they evolve over time? Analysis was performed using data from a custom-developed time-resolved fMRI paradigm involving human subjects (N=18) who performed functional upper extremity motor tasks with varying time delays between the onset of intention and the onset of actual movement. The results indicate that there is structure in the data that can be quantified through entropy and dimensional complexity metrics and statistical inference; furthermore, orbital decomposition is sensitive in capturing the transition of states that correlate with the cognitive aspects of functional task performance.
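Symbolic-dynamics analyses of the kind described here start from a discretized sequence of activation states. As a hedged illustration (not the orbital decomposition algorithm itself, which tracks recurring orbits of states rather than single symbols), the Shannon entropy of a toy state sequence can be computed as:

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy in bits of a discrete symbol sequence; a basic
    building block of symbolic-dynamics complexity measures."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy sequence of dominant-activation states over three brain regions A/B/C
h = shannon_entropy("AABABCACBB")
```

Higher entropy indicates less predictable switching between activation states; a constant sequence has entropy zero.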

  11. Nonstationary Extreme Value Analysis in a Changing Climate: A Software Package

    NASA Astrophysics Data System (ADS)

    Cheng, L.; AghaKouchak, A.; Gilleland, E.

    2013-12-01

    Numerous studies show that climatic extremes have increased substantially in the second half of the 20th century. For this reason, analysis of extremes under a nonstationary assumption has received a great deal of attention. This paper presents a software package developed for estimation of return levels, return periods, and risks of climatic extremes in a changing climate. This MATLAB software package offers tools for analysis of climate extremes under both stationary and nonstationary assumptions. The Nonstationary Extreme Value Analysis package (hereafter, NEVA) provides an efficient and generalized framework for analyzing extremes using Bayesian inference. NEVA estimates the extreme value parameters using a Differential Evolution Markov Chain (DE-MC), which combines the genetic algorithm Differential Evolution (DE) for global optimization over the real parameter space with the Markov Chain Monte Carlo (MCMC) approach, and has the advantages of simplicity, speed of calculation, and convergence over conventional MCMC. NEVA also provides confidence intervals and uncertainty bounds for estimated return levels based on the sampled parameters. NEVA integrates extreme value design concepts, data analysis tools, optimization, and visualization, explicitly designed to facilitate the analysis of extremes in the geosciences. The generalized input and output files of this software package make it attractive for users across different fields. Both the stationary and nonstationary components of the package are validated for a number of case studies using empirical return levels. The results show that NEVA reliably describes extremes and their return levels.
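Under the stationary assumption, the return levels NEVA reports are quantiles of a fitted generalized extreme value (GEV) distribution. A minimal sketch of the return-level formula in the standard parameterisation, with illustrative, hypothetical parameter values rather than output of NEVA's Bayesian DE-MC sampler:

```python
import math

def gev_return_level(T, loc, scale, shape):
    """T-year return level of a GEV(loc, scale, shape) distribution:
    the (1 - 1/T) quantile, i.e. the value exceeded on average once
    every T years."""
    y = -math.log(1.0 - 1.0 / T)  # -log of the non-exceedance probability
    if abs(shape) < 1e-12:        # Gumbel limit as shape -> 0
        return loc - scale * math.log(y)
    return loc + (scale / shape) * (y ** (-shape) - 1.0)

# Hypothetical GEV parameters for an annual-maximum series (illustrative only)
z100 = gev_return_level(100, loc=30.0, scale=5.0, shape=0.1)
z010 = gev_return_level(10, loc=30.0, scale=5.0, shape=0.1)
```

A Bayesian treatment such as NEVA's repeats this evaluation over every sampled parameter set, which is how the package obtains uncertainty bounds on the return level rather than a single point estimate.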

  12. Optical modeling of waveguide coupled TES detectors towards the SAFARI instrument for SPICA

    NASA Astrophysics Data System (ADS)

    Trappe, N.; Bracken, C.; Doherty, S.; Gao, J. R.; Glowacka, D.; Goldie, D.; Griffin, D.; Hijmering, R.; Jackson, B.; Khosropanah, P.; Mauskopf, P.; Morozov, D.; Murphy, A.; O'Sullivan, C.; Ridder, M.; Withington, S.

    2012-09-01

    The next generation of space missions targeting far-infrared wavelengths will require large-format arrays of extremely sensitive detectors. Transition Edge Sensor (TES) array technology is being developed for future far-infrared (FIR) space applications such as the SAFARI instrument for SPICA, where low noise and high sensitivity are required to achieve ambitious science goals. In this paper we describe a modal analysis of multi-moded horn antennas feeding integrating cavities housing TES detectors with superconducting film absorbers. In high-sensitivity TES detector technology the ability to control the electromagnetic and thermo-mechanical environment of the detector is critical. Simulating and understanding the optical behaviour of such detectors at far-IR wavelengths is difficult and requires further development of existing analysis tools. The proposed modal approach offers a computationally efficient technique to describe the partially coherent response of the full pixel in terms of optical efficiency and power leakage between pixels. Initial work carried out as part of an ESA technical research project on optical analysis is described, and a prototype SAFARI pixel design is analyzed in which the optical coupling between the incoming field and the pixel, containing horn, cavity with an air gap, and thin absorber layer, are all included in the model to allow a comprehensive optical characterization. The modal approach described is based on the mode-matching technique, in which the horn and cavity are described in the traditional way while a technique to include the absorber was developed. Radiation leakage between pixels is also included, making this a powerful analysis tool.

  13. Holistic view to integrated climate change assessment and extreme weather adaptation in the Lake Victoria Basin East Africa

    NASA Astrophysics Data System (ADS)

    Mutua, F.; Koike, T.

    2013-12-01

    Extreme weather events have been the leading cause of disasters and damage all over the world. The primary ingredient in these disasters, especially floods, is rainfall, which over the years, despite advances in modeling, computing power, and the use of new data and technologies, has proven difficult to predict. Also, recent climate projections show a pattern consistent with an increase in the intensity and frequency of extreme events in the East African region. We propose a holistic integrated approach to climate change assessment and extreme event adaptation through coupling of analysis techniques, tools, and data. The Lake Victoria Basin (LVB) in East Africa supports over three million livelihoods and is a valuable resource to five East African countries as a source of water and means of transport. However, with a mesoscale weather regime driven by land and lake dynamics, extreme mesoscale events have been prevalent and the region has been on the receiving end during anomalously wet years. This has resulted in loss of lives, displacements, and food insecurity. In the LVB, the effects of climate change are increasingly being recognized as a significant contributor to poverty, through its linkage to agriculture, food security, and water resources. Of particular importance are the likely impacts of climate change on the frequency and intensity of extreme events. To tackle this aspect, this study adopted an integrated regional, mesoscale, and basin-scale approach to climate change assessment. We investigated the projected changes in mean climate over East Africa, diagnosed the signals of climate change in the atmosphere, and transferred this understanding to the mesoscale and basin scale. Changes in rainfall were analyzed and, consistent with the IPCC AR4 report, the three selected General Circulation Models (GCMs) project a wetter East Africa with intermittent dry periods in June-August. 
Extreme events in the region are projected to increase, with the number of wet days exceeding the 90th percentile of 1981-2000 likely to increase by 20-40% across the whole region. We also focused on short-term weather forecasting as a step towards adapting to a changing climate. This involved dynamic downscaling of global weather forecasts to high resolution with a special focus on extreme events. By utilizing complex model dynamics, the system was able to reproduce the mesoscale dynamics well, simulating the land/lake breeze and diurnal pattern, but was inadequate in some aspects. The quantitative prediction of rainfall was inaccurate, with overestimation and misplacement, though with reasonable occurrence. To address these shortcomings we investigated the value added by assimilating Advanced Microwave Scanning Radiometer (AMSR-E) brightness temperature during the event. By assimilating 23 GHz (sensitive to water) and 89 GHz (sensitive to cloud) brightness temperatures, the predictability of an extreme rain event was investigated. Assimilation through the Cloud Microphysics Data Assimilation (CMDAS) scheme into the weather prediction model considerably improved the spatial distribution of this event.
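The extreme-rainfall index used above (number of wet days exceeding the 90th percentile of a 1981-2000 baseline) can be sketched as follows; the >1 mm wet-day cutoff is a common convention assumed here, not stated in the abstract:

```python
import numpy as np

def wet_day_exceedances(rainfall, baseline):
    """Count days in `rainfall` exceeding the 90th percentile of wet days
    (assumed > 1 mm) in a baseline record; returns (count, threshold)."""
    wet = baseline[baseline > 1.0]             # wet-day subset of the baseline
    threshold = float(np.percentile(wet, 90))  # 90th-percentile wet-day total
    return int(np.sum(rainfall > threshold)), threshold

# Toy daily rainfall totals in mm; real use would span 1981-2000 gridded data
baseline = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
rainfall = np.array([5.0, 9.5, 12.0])
count, threshold = wet_day_exceedances(rainfall, baseline)
```

Comparing this count between a projection period and the baseline itself gives the percentage change in extreme wet days reported in the study.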

  14. Mercury monohalides: suitability for electron electric dipole moment searches.

    PubMed

    Prasannaa, V S; Vutha, A C; Abe, M; Das, B P

    2015-05-08

    Heavy polar diatomic molecules are the primary tools for searching for the T-violating permanent electric dipole moment of the electron (eEDM). Valence electrons in some molecules experience extremely large effective electric fields due to relativistic interactions. These large effective electric fields are crucial to the success of polar-molecule-based eEDM search experiments. Here we report the results of relativistic ab initio calculations of the effective electric fields in a series of molecules that are highly sensitive to an eEDM, the mercury monohalides (HgF, HgCl, HgBr, and HgI). We study the influence of the halide anions on the effective electric field E_eff, and identify HgBr and HgI as attractive candidates for future electric dipole moment search experiments.

  15. Solving Component Structural Dynamic Failures Due to Extremely High Frequency Structural Response on the Space Shuttle Program

    NASA Technical Reports Server (NTRS)

    Frady, Greg; Nesman, Thomas; Zoladz, Thomas; Szabo, Roland

    2010-01-01

    For many years, the capability to determine the root cause of component failures has been limited by the analytical tools and the state of the art of data acquisition systems. With this limited capability, many anomalies were resolved by adding material to the design to increase robustness, without the ability to determine whether the design solution was satisfactory until after a series of expensive test programs was complete. The risk of failure and of multiple design, test, and redesign cycles was high. During the Space Shuttle Program, many crack investigations in high-energy-density turbomachines, like the SSME turbopumps, and in high-energy flows in the main propulsion system led to the discovery of numerous root-cause failures and anomalies due to the coexistence of acoustic forcing functions, structural natural modes, and a high-energy excitation, such as an edge tone or shedding flow. These investigations led the technical community to understand many of the primary contributors to extremely high frequency, high-cycle-fatigue fluid-structure interaction anomalies. These contributors have been identified using advanced analysis tools and verified during component ground tests, systems tests, and flight. The structural dynamics and fluid dynamics communities have developed a special sensitivity to fluid-structure interaction problems and have been able to solve these problems in a time-effective manner to meet the budget and schedule deadlines of operational vehicle programs such as the Space Shuttle Program.

  16. A Fiducial Approach to Extremes and Multiple Comparisons

    ERIC Educational Resources Information Center

    Wandler, Damian V.

    2010-01-01

    Generalized fiducial inference is a powerful tool for many difficult problems. Based on an extension of R. A. Fisher's work, we used generalized fiducial inference for two extreme value problems and a multiple comparison procedure. The first extreme value problem is dealing with the generalized Pareto distribution. The generalized Pareto…

  17. A Tool for Rating the Resilience of Critical Infrastructures in Extreme Fires

    DTIC Science & Technology

    2014-05-01

    provide a tool for NRC to help the Canadian industry to develop extreme fire protection materials and technologies for critical infrastructures. Future...supported by the Canadian Safety and Security Program (CSSP) which is led by Defence Research and Development Canada’s Centre for Security Science, in...in oil refinery and chemical industry facilities. The only available standard in North America that addresses the transportation infrastructure is

  18. Telescience - Concepts and contributions to the Extreme Ultraviolet Explorer mission

    NASA Technical Reports Server (NTRS)

    Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.

    1987-01-01

    It is shown how the contradictory goals of low-cost and fast data turnaround characterizing the Extreme Ultraviolet Explorer (EUVE) mission can be achieved via the early use of telescience style transparent tools and simulations. The use of transparent tools reduces the parallel development of capability while ensuring that valuable prelaunch experience is not lost in the operations phase. Efforts made to upgrade the 'EUVE electronics' simulator are described.

  19. Brittle materials at high-loading rates: an open area of research

    NASA Astrophysics Data System (ADS)

    Forquin, Pascal

    2017-01-01

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as: blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures and so on. With all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (few tens to several tens of thousands per second) leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining pressure sensitivity that justify expending greater research efforts to understand these complex features. Currently, the most popular dynamic testing techniques used for this are based on the use of split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods do have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'.

  1. Brittle materials at high-loading rates: an open area of research.

    PubMed

    Forquin, Pascal

    2017-01-28

    Brittle materials are extensively used in many civil and military applications involving high-strain-rate loadings such as blasting or percussive drilling of rocks, ballistic impact against ceramic armour or transparent windshields, plastic explosives used to damage or destroy concrete structures, soft or hard impacts against concrete structures and so on. In all of these applications, brittle materials are subjected to intense loadings characterized by medium to extremely high strain rates (a few tens to several tens of thousands per second) leading to extreme and/or specific damage modes such as multiple fragmentation, dynamic cracking, pore collapse, shearing, mode II fracturing and/or microplasticity mechanisms in the material. Additionally, brittle materials exhibit complex features such as a strong strain-rate sensitivity and confining pressure sensitivity that justify greater research efforts to understand them. Currently, the most popular dynamic testing techniques used for this are based on the use of split Hopkinson pressure bar methodologies and/or plate-impact testing methods. However, these methods do have some critical limitations and drawbacks when used to investigate the behaviour of brittle materials at high loading rates. The present theme issue of Philosophical Transactions A provides an overview of the latest experimental methods and numerical tools that are currently being developed to investigate the behaviour of brittle materials at high loading rates. This article is part of the themed issue 'Experimental testing and modelling of brittle materials at high strain rates'. © 2016 The Author(s).

  2. Integration of Stable Droplet Formation on a CD Microfluidic Device for Extreme Point of Care Applications

    NASA Astrophysics Data System (ADS)

    Ganesh, Shruthi Vatsyayani

    With the advent of microfluidic technologies for molecular diagnostics, a lot of emphasis has been placed on developing diagnostic tools for resource-poor regions in the form of extreme point-of-care devices. To ensure commercial viability of such a device, there is a need to develop an accurate sample-to-answer system that is robust, portable, and isolated, yet highly sensitive and cost effective. This need has been a driving force for research involving the integration of different microsystems, such as droplet microfluidics and compact-disc (CD) microfluidics, along with sample preparation and detection modules on a single platform. This work attempts to develop a proof-of-concept prototype of one such device using existing CD microfluidics tools to generate stable droplets for point-of-care (POC) diagnostics. Apart from using a fairly new technique for droplet generation and stabilization, the work aims to develop this method with a focus on diagnostics for rural healthcare. The motivation for this work is first described, with an emphasis on the current need for diagnostic testing in rural healthcare and the general guidelines prescribed by the WHO for such a sample-to-answer system. Furthermore, a background on CD and droplet microfluidics is presented to clarify the merits and demerits of each system and the need for integrating the two. This phase of the thesis also includes different methods employed/demonstrated to generate droplets on a spinning platform. An overview of the detection platforms is also presented to highlight the challenges involved in building an extreme point-of-care device. In the third phase of the thesis, the general manufacturing techniques and materials used to accomplish this work are presented. Lastly, design trials for droplet generation are presented. The shortcomings of these trials are addressed by investigating design modifications and agarose-based droplet generation to ensure a more robust sample processing method. This method is further characterized and compared with a non-agarose-based system, and the results are analyzed. In conclusion, future prospects of this work are discussed in relation to extreme POC applications.

  3. Body temperature and cold sensation during and following exercise under temperate room conditions in cold-sensitive young trained females.

    PubMed

    Fujii, Naoto; Aoki-Murakami, Erii; Tsuji, Bun; Kenny, Glen P; Nagashima, Kei; Kondo, Narihiko; Nishiyasu, Takeshi

    2017-11-01

    We evaluated cold sensation at rest and in response to exercise-induced changes in core and skin temperatures in cold-sensitive exercise-trained females. Fifty-eight trained young females were screened by a questionnaire, selecting cold-sensitive (Cold-sensitive, n = 7) and non-cold-sensitive (Control, n = 7) individuals. Participants rested in a room at 29.5°C for ~100 min, after which ambient temperature was reduced to 23.5°C, where they remained resting for 60 min. Participants then performed 30 min of moderate-intensity cycling (50% peak oxygen uptake) followed by a 60-min recovery. Core and mean skin temperatures and cold sensation over the whole body and extremities (fingers and toes) were assessed throughout. Resting core temperature was lower in the Cold-sensitive relative to the Control group (36.4 ± 0.3 vs. 36.7 ± 0.2°C). Core temperature increased to similar levels at end-exercise (~37.2°C) and gradually returned to near pre-exercise resting levels at the end of recovery (>36.6°C). Whole-body cold sensation was greater in the Cold-sensitive relative to the Control group only during rest at a room temperature of 23.5°C, without a difference in mean skin temperature between groups. In contrast, cold sensation of the extremities was greater in the Cold-sensitive group prior to, during and following exercise, albeit this was not paralleled by differences in mean extremity skin temperature. We show that young trained females who are sensitive to cold exhibit augmented whole-body cold sensation at rest under temperate ambient conditions; however, this response is diminished during and following exercise. In contrast, cold sensation of the extremities is augmented at rest and persists during and following exercise. © 2017 The Authors. Physiological Reports published by Wiley Periodicals, Inc. on behalf of The Physiological Society and the American Physiological Society.

  4. North Atlantic storm driving of extreme wave heights in the North Sea

    NASA Astrophysics Data System (ADS)

    Bell, R. J.; Gray, S. L.; Jones, O. P.

    2017-04-01

    The relationship between storms and extreme ocean waves in the North Sea is assessed using a long-period wave data set and storms identified in the Interim ECMWF Re-Analysis (ERA-Interim). An ensemble sensitivity analysis is used to provide information on the spatial and temporal forcing from mean sea-level pressure and surface wind associated with extreme ocean wave height responses. Extreme ocean waves in the central North Sea arise due to intense extratropical cyclone winds from either the cold conveyor belt (northerly-wind events) or the warm conveyor belt (southerly-wind events). The largest wave heights are associated with northerly-wind events which tend to have stronger wind speeds and occur as the cold conveyor belt wraps rearward round the cyclone to the cold side of the warm front. The northerly-wind events provide a larger fetch to the central North Sea to aid wave growth. Southerly-wind events are associated with the warm conveyor belts of intense extratropical cyclones that develop in the left upper tropospheric jet exit region. Ensemble sensitivity analysis can provide early warning of extreme wave events by demonstrating a relationship between wave height and high pressure to the west of the British Isles for northerly-wind events 48 h prior. Southerly-wind extreme events demonstrate sensitivity to low pressure to the west of the British Isles 36 h prior.
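
    The ensemble sensitivity technique referenced above regresses a scalar response (here, wave height) onto each grid-point field value across ensemble members. A minimal numpy sketch on synthetic data; all values are invented for illustration (the grid, the driving grid-point index, and the 0.25 m/hPa coefficient are assumptions, not results from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
n_members, n_points = 50, 6  # ensemble size, number of MSLP grid points

# Synthetic ensemble: MSLP anomalies (hPa) at 6 grid points, and a wave
# height response (m) driven mainly by grid point 2 plus noise.
mslp = rng.normal(0.0, 4.0, size=(n_members, n_points))
wave_height = 3.0 + 0.25 * mslp[:, 2] + rng.normal(0.0, 0.1, n_members)

def ensemble_sensitivity(response, field):
    """Linear ensemble sensitivity dJ/dx_i = cov(J, x_i) / var(x_i)."""
    J = response - response.mean()
    X = field - field.mean(axis=0)
    cov = (X * J[:, None]).mean(axis=0)
    var = (X ** 2).mean(axis=0)
    return cov / var

sens = ensemble_sensitivity(wave_height, mslp)
most_sensitive = int(np.argmax(np.abs(sens)))
```

    Applied to reanalysis fields at a lead time of, say, 48 h, the grid points with the largest regression coefficients are the candidate early-warning predictors described in the abstract.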

  5. Family, friend, and media factors are associated with patterns of weight-control behavior among adolescent girls.

    PubMed

    Balantekin, Katherine N; Birch, Leann L; Savage, Jennifer S

    2018-04-01

    To examine the associations of family, friend, and media factors with weight-control group membership at age 15, separately and in a combined model. Subjects included 166 15-year-old girls. Latent class analysis identified four patterns of weight-control behaviors: non-dieters, lifestyle, dieters, and extreme dieters. Family (family functioning, priority of family meals, maternal/paternal weight-teasing, and mother's/father's dieting), friend (weight-teasing and dieting), and media variables (media sensitivity and weekly TV time) were included as predictors of weight-control group membership. Family functioning and priority of family meals predicted membership in the extreme dieters group, and maternal weight-teasing predicted membership in both the dieters and extreme dieters groups. Friends' dieting and weight-teasing predicted membership in both the dieters and extreme dieters groups. Media sensitivity was significantly associated with membership in the lifestyle, dieters, and extreme dieters groups. In a combined-influence model with family, friend, and media factors included, the following remained significantly associated with weight-control group membership: family functioning, friends' dieting, and media sensitivity. Family, friends, and the media are three sources of sociocultural influence that play a role in adolescent girls' patterns of weight-control behaviors; family functioning was a protective factor, whereas friends' dieting and media sensitivity were risk factors. These findings emphasize the need for multidimensional interventions addressing risk factors for dieting and use of unhealthy weight-control behaviors at the family, peer, and community (e.g., media) levels.

  6. Sensitivity Analysis of Weather Variables on Offsite Consequence Analysis Tools in South Korea and the United States.

    PubMed

    Kim, Min-Uk; Moon, Kyong Whan; Sohn, Jong-Ryeul; Byeon, Sang-Hoon

    2018-05-18

    We studied the weather variables to which consequence analyses of chemical leaks are sensitive, from the user side of offsite consequence analysis (OCA) tools. We used the OCA tools Korea Offsite Risk Assessment (KORA) and Areal Location of Hazardous Atmospheres (ALOHA) in South Korea and the United States, respectively. The chemicals used for this analysis were 28% ammonia (NH₃), 35% hydrogen chloride (HCl), 50% hydrofluoric acid (HF), and 69% nitric acid (HNO₃). The accident scenarios were based on leakage accidents in storage tanks. The weather variables were air temperature, wind speed, humidity, and atmospheric stability. Sensitivity analysis was performed using the Statistical Package for the Social Sciences (SPSS) program for dummy regression analysis. Sensitivity analysis showed that impact distance was not sensitive to humidity. Impact distance was most sensitive to atmospheric stability, and was also more sensitive to air temperature than wind speed, according to both the KORA and ALOHA tools. Moreover, impact distance was more sensitive to the weather variables in rural conditions than in urban conditions, with the ALOHA tool being more influenced by weather variables than the KORA tool. Therefore, if using the ALOHA tool instead of the KORA tool in rural conditions, users should be careful not to cause any differences in impact distance due to input errors of weather variables, with the most sensitive one being atmospheric stability.
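
    The dummy regression the abstract describes encodes categorical predictors (here, atmospheric stability class) as indicator variables alongside continuous ones. A hedged numpy sketch of that idea on synthetic data; the coefficients, variable ranges, and the stand-in of plain least squares for SPSS are all assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
temp = rng.uniform(0.0, 35.0, n)   # air temperature, degrees C
wind = rng.uniform(0.5, 10.0, n)   # wind speed, m/s
stab = rng.integers(0, 3, n)       # stability class: 0 (reference), 1, 2

# Synthetic impact distance (m): by construction, most sensitive to
# stability class, then temperature, then wind speed.
dist = 500.0 + 8.0 * temp - 10.0 * wind + 300.0 * stab + rng.normal(0, 20, n)

# Dummy-code stability (class 0 as the reference level) and fit OLS.
X = np.column_stack([
    np.ones(n), temp, wind,
    (stab == 1).astype(float), (stab == 2).astype(float),
])
beta, *_ = np.linalg.lstsq(X, dist, rcond=None)
# beta ~ [intercept, temp effect, wind effect, class-1 shift, class-2 shift]
```

    Comparing the fitted effects (per unit, or per category jump) is what ranks atmospheric stability above air temperature and wind speed in the study.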

  7. Tools in Support of Planning for Weather and Climate Extremes

    NASA Astrophysics Data System (ADS)

    Done, J.; Bruyere, C. L.; Hauser, R.; Holland, G. J.; Tye, M. R.

    2016-12-01

    A major limitation to planning for weather and climate extremes is the lack of maintained and readily available tools that can provide robust and well-communicated predictions and advice on their impacts. The National Center for Atmospheric Research is facilitating a collaborative international program to develop and support such tools within its Capacity Center for Climate and Weather Extremes, aimed at improving community resilience planning and reducing weather and climate impacts. A Global Risk, Resilience and Impacts Toolbox is in development and will provide: a portable web-based interface to process work requests from a variety of users and locations; a sophisticated framework that enables specialized community tools to access a comprehensive database (public and private) of geo-located hazard, vulnerability, exposure, and loss data; a community development toolkit that enables and encourages community tool developments geared towards specific user management and planning needs; and comprehensive community support facilitated by NCAR utilizing tutorials and a help desk. A number of applications are in development, built on the latest climate science and in collaboration with private industry and local and state governments. Example applications will be described, including a hurricane damage tool in collaboration with the reinsurance sector and a weather management tool for the construction industry. These examples will serve as starting points to discuss the broader potential of the toolbox.

  8. Challenges in Modelling of Lightning-Induced Delamination; Effect of Temperature-Dependent Interfacial Properties

    NASA Technical Reports Server (NTRS)

    Naghipour, P.; Pineda, E. J.; Arnold, S.

    2014-01-01

    Lightning is a major cause of damage in laminated composite aerospace structures during flight. Due to the dielectric nature of carbon fiber-reinforced polymers (CFRPs), the high energy induced by a lightning strike transforms into extreme, localized surface temperature accompanied by a high-pressure shockwave, resulting in extensive damage. It is crucial to develop a numerical tool capable of predicting the damage induced by a lightning strike to supplement extremely expensive lightning experiments. Delamination is one of the most significant failure modes resulting from a lightning strike. It can extend well beyond the visible damage zone and requires sophisticated techniques and equipment to detect. A popular technique used to model delamination is the cohesive zone approach. Since the loading induced by a lightning strike event is assumed to consist of extreme localized heating, the cohesive zone formulation should additionally account for temperature effects. However, the sensitivity to this dependency remains unknown. Therefore, the major focus of this work is to investigate the importance of this dependency by defining various temperature-dependency profiles for the cohesive zone properties and analyzing the corresponding delamination area. Thus, a detailed numerical model consisting of multidirectional composite plies with temperature-dependent cohesive elements in between is subjected to lightning loading (an excessive amount of heat and pressure), and delamination/damage expansion is studied under specified conditions.
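
    A temperature-dependent cohesive law of the kind discussed can be sketched as a bilinear traction-separation curve whose peak strength degrades with temperature. All property values below are illustrative placeholders, not calibrated CFRP data, and the linear knock-down is only one assumed shape among the dependency profiles such a study would vary:

```python
def cohesive_traction(delta, T, sigma0=60e6, delta0=1e-6, delta_f=1e-5,
                      T_ref=20.0, T_max=300.0):
    """Bilinear traction-separation law (Pa) for opening delta (m),
    with peak strength knocked down linearly with temperature T (C)
    between T_ref and T_max. Illustrative numbers only."""
    knock = min(1.0, max(0.0, 1.0 - (T - T_ref) / (T_max - T_ref)))
    s_peak = sigma0 * knock
    if delta <= 0.0 or delta >= delta_f:
        return 0.0                                    # fully open / closed
    if delta <= delta0:
        return s_peak * delta / delta0                # elastic ramp-up
    return s_peak * (delta_f - delta) / (delta_f - delta0)  # softening
```

    In a finite-element setting each cohesive integration point would evaluate such a law at its local temperature, so the hot zone under the strike attachment point fails at a lower traction than the surrounding material.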

  9. Diagnostic accuracy of ultrasound in upper and lower extremity long bone fractures of emergency department trauma patients

    PubMed Central

    Frouzan, Arash; Masoumi, Kambiz; Delirroyfard, Ali; Mazdaie, Behnaz; Bagherzadegan, Elnaz

    2017-01-01

    Background Long bone fractures are common injuries caused by trauma. Some studies have demonstrated that ultrasound has a high sensitivity and specificity in the diagnosis of upper and lower extremity long bone fractures. Objective The aim of this study was to determine the accuracy of ultrasound compared with plain radiography in the diagnosis of upper and lower extremity long bone fractures in trauma patients. Methods This cross-sectional study assessed 100 patients admitted to the emergency department of Imam Khomeini Hospital, Ahvaz, Iran with trauma to the upper and lower extremities, from September 2014 through October 2015. In all patients, first ultrasound and then standard plain radiography of the upper and lower limb was performed. Data were analyzed by SPSS version 21 to determine the specificity and sensitivity. Results The mean ages of patients with upper and lower limb trauma were 31.43±12.32 years and 29.63±5.89 years, respectively. Radius fractures were the most frequent (27%). Sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV) of ultrasound compared with plain radiography in the diagnosis of upper extremity long bone fractures were 95.3%, 87.7%, 87.2% and 96.2%, respectively, and the highest accuracy was observed in left arm fractures (100%). Tibia and fibula fractures were the most frequent types (89.2%). Sensitivity, specificity, PPV and NPV of ultrasound compared with plain radiography in the diagnosis of lower extremity long bone fractures were 98.6%, 83%, 65.4% and 87.1%, respectively, and the highest accuracy was observed in men, lower ages and femoral fractures. Conclusion The results of this study showed that ultrasound has a high accuracy, compared with plain radiography, in the diagnosis of upper and lower extremity long bone fractures. PMID:28979747
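
    The reported accuracy metrics all follow from a 2x2 confusion matrix against the radiography reference standard. A small sketch; the per-cell counts below are invented to roughly reproduce the reported upper-extremity sensitivity and specificity (the study's actual counts are not given in the abstract), and PPV/NPV additionally depend on fracture prevalence in the sample:

```python
def diagnostic_accuracy(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion matrix."""
    sens = tp / (tp + fn)   # true positive rate
    spec = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)    # precision: P(fracture | positive ultrasound)
    npv = tn / (tn + fn)    # P(no fracture | negative ultrasound)
    return sens, spec, ppv, npv

# Invented illustrative counts, chosen so that sens ~ 95.3%, spec ~ 87.7%.
sens, spec, ppv, npv = diagnostic_accuracy(tp=41, fp=7, fn=2, tn=50)
```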

  10. Microstructure-Sensitive Extreme Value Probabilities for High Cycle Fatigue of Ni-Base Superalloy IN100 (Preprint)

    DTIC Science & Technology

    2009-03-01

    transition fatigue regimes; however, microplasticity (i.e., heterogeneous plasticity at the scale of microstructure) is relevant to understanding fatigue...and Socie [57] considered the effect of microplasticity...considers the local stress state as affected by intergranular interactions and microplasticity. For the calculations given below, the volumes over which

  11. Projected Changes in Hydrological Extremes in a Cold Region Watershed: Sensitivity of Results to Statistical Methods of Analysis

    NASA Astrophysics Data System (ADS)

    Dibike, Y. B.; Eum, H. I.; Prowse, T. D.

    2017-12-01

    Flows originating from alpine-dominated cold region watersheds typically experience extended winter low flows followed by spring snowmelt and summer rainfall-driven high flows. In a warmer climate, there will be a temperature-induced shift in precipitation from snow towards rain, as well as changes in snowmelt timing affecting the frequency of extreme high and low flow events, which could significantly alter ecosystem services. This study examines the potential changes in the frequency and severity of hydrologic extremes in the Athabasca River watershed in Alberta, Canada based on the Variable Infiltration Capacity (VIC) hydrologic model and selected, statistically downscaled climate change scenario data from the latest Coupled Model Intercomparison Project (CMIP5). The sensitivity of these projected changes is also examined by applying different extreme flow analysis methods. The hydrological model projections show an overall increase in mean annual streamflow in the watershed and a corresponding shift of the freshet timing to an earlier period. Most of the streams are projected to experience increases during the winter and spring seasons and decreases during the summer and early fall seasons, with an overall projected increase in extreme high flows, especially for low-frequency events. While the middle and lower parts of the watershed are characterised by projected increases in extreme high flows, the high-elevation alpine region is mainly characterised by corresponding decreases in extreme low flow events. However, the magnitude of projected changes in extreme flows varies over a wide range, especially for low-frequency events, depending on the climate scenario and period of analysis, and sometimes in a nonlinear way. Nonetheless, the sensitivity of the projected changes to the statistical method of analysis is found to be relatively small compared to the inter-model variability.
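
    One of the statistical choices such a sensitivity comparison varies is the extreme-value distribution and its estimator. A minimal sketch of a Gumbel (EV1) fit to annual maxima by the method of moments; the flow series is synthetic and the Gumbel/moments pairing is just one common choice among several (GEV, L-moments, etc.):

```python
import math
import random

random.seed(2)
# Synthetic annual-maximum flows (m^3/s), a stand-in for hydrologic
# model output; invented values for illustration only.
ams = [random.gauss(900.0, 150.0) for _ in range(60)]

def gumbel_quantile(sample, T):
    """T-year event from a Gumbel (EV1) fit by the method of moments."""
    n = len(sample)
    mean = sum(sample) / n
    var = sum((x - mean) ** 2 for x in sample) / (n - 1)
    beta = math.sqrt(6.0 * var) / math.pi   # scale parameter
    mu = mean - 0.5772 * beta               # location (Euler constant)
    p = 1.0 - 1.0 / T                       # annual non-exceedance prob.
    return mu - beta * math.log(-math.log(p))

q10, q100 = gumbel_quantile(ams, 10), gumbel_quantile(ams, 100)
```

    Running the same series through different distribution/estimator pairs and comparing the resulting 100-year quantiles is the kind of method-sensitivity check the abstract reports to be small relative to inter-model spread.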

  12. Static tool influence function for fabrication simulation of hexagonal mirror segments for extremely large telescopes.

    PubMed

    Kim, Dae Wook; Kim, Sug-Whan

    2005-02-07

    We present a novel simulation technique that offers efficient mass fabrication strategies for 2 m class hexagonal mirror segments of extremely large telescopes. As the first of two studies in series, we establish the theoretical basis of the tool influence function (TIF) for precessing-tool polishing simulation of non-rotating workpieces. These theoretical TIFs were then used to confirm the reproducibility of the material removal footprints (measured TIFs) of the bulged precessing tooling reported elsewhere. This is followed by a reverse-computation technique that traces, employing the simplex search method, the real polishing pressure from the empirical TIF. The technical details, together with the results and implications described here, provide the theoretical material removal tool essential to the successful polishing simulation that will be reported in the second study.
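
    TIF models of this kind are commonly built on Preston's law, dz/dt = k·p·v (removal rate proportional to pressure times relative surface speed). A minimal static-TIF sketch with an assumed Hertz-like pressure cap; all constants are invented, and the model is illustrative rather than the paper's formulation. Note the zero removal on the spin axis (where v = 0), which is one motivation for precessing the tool:

```python
import math

def static_tif(r, k=1e-13, p0=2e4, a=5e-3, omega=50.0, dwell=10.0):
    """Removal depth (m) at distance r (m) from the tool spin axis,
    via Preston's law dz/dt = k * p * v, with an assumed pressure cap
    p(r) = p0*sqrt(1 - (r/a)^2) over a contact spot of radius a and
    surface speed v = omega * r. Illustrative constants only."""
    if r >= a:
        return 0.0
    p = p0 * math.sqrt(1.0 - (r / a) ** 2)
    return k * p * (omega * r) * dwell

profile = [static_tif(i * 1e-3) for i in range(6)]  # radial footprint
```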

  13. Long-term Changes in Extreme Air Pollution Meteorology and the Implications for Air Quality.

    PubMed

    Hou, Pei; Wu, Shiliang

    2016-03-31

    Extreme air pollution meteorological events, such as heat waves, temperature inversions and atmospheric stagnation episodes, can significantly affect air quality. Based on observational data, we have analyzed the long-term evolution of extreme air pollution meteorology on the global scale and its potential impacts on air quality, especially high pollution episodes. We have identified significant increasing trends in the occurrences of extreme air pollution meteorological events in the past six decades, especially over continental regions. Statistical analysis combining air quality data and meteorological data further indicates strong sensitivities of air quality (including both average air pollutant concentrations and high pollution episodes) to extreme meteorological events. For example, we find that in the United States the probability of severe ozone pollution during heat waves can be up to seven times the average summertime probability, while temperature inversions in wintertime can enhance the probability of severe particulate matter pollution by more than a factor of two. We have also identified significant seasonal and spatial variations in the sensitivity of air quality to extreme air pollution meteorology.
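
    The "seven times the average probability" statement is a conditional-probability enhancement factor. A tiny sketch with invented counts (not the study's data) showing how such a factor is computed from day-level records:

```python
def enhancement(n_hw_exceed, n_hw, n_exceed, n_days):
    """P(exceedance | heat-wave day) divided by P(exceedance overall)."""
    return (n_hw_exceed / n_hw) / (n_exceed / n_days)

# Illustrative counts only: 92 summer days, 10 of them heat-wave days
# with 7 ozone exceedances, and 10 exceedances over the whole summer.
factor = enhancement(7, 10, 10, 92)
```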

  14. Lymphoscintigraphic findings in chylous reflux in a lower extremity.

    PubMed

    Berenji, Gholam R; Iker, Emily; Glass, Edwin C

    2007-09-01

    Lymphoscintigraphy is a useful and safe tool for the diagnostic evaluation of a swollen extremity. Unilateral leg swelling with cutaneous chylous vesicles is a common manifestation of chylous reflux. The authors present a case of chylous reflux in an 11-year-old boy who presented with swelling and skin lesions of the left lower extremity.

  15. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.
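
    Conservative remapping of the kind TempestRemap performs can be illustrated in one dimension: each destination cell receives the overlap-weighted mean of the source cells it intersects, so the integral of the field is preserved. This is a first-order sketch of the concept, not TempestRemap's actual API or its arbitrary-order unstructured-grid algorithm:

```python
def conservative_remap(src_edges, src_vals, dst_edges):
    """First-order conservative remap in 1D. Each destination cell gets
    the overlap-length-weighted average of the source cell values."""
    out = []
    for lo, hi in zip(dst_edges[:-1], dst_edges[1:]):
        acc = 0.0
        for (a, b), v in zip(zip(src_edges[:-1], src_edges[1:]), src_vals):
            overlap = max(0.0, min(hi, b) - max(lo, a))
            acc += v * overlap
        out.append(acc / (hi - lo))
    return out

src = [0.0, 1.0, 2.0, 3.0]   # three unit-width source cells
vals = [1.0, 3.0, 5.0]
dst = [0.0, 1.5, 3.0]        # two coarser destination cells
remapped = conservative_remap(src, vals, dst)
```

    The defining property is conservation: the cell-width-weighted sums of `vals` and `remapped` are identical.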

  16. Extreme Events and Energy Providers: Science and Innovation

    NASA Astrophysics Data System (ADS)

    Yiou, P.; Vautard, R.

    2012-04-01

    Most socio-economic regulations related to resilience to climate extremes, from infrastructure or network design to insurance premiums, are based on a present-day climate with an assumption of stationarity. Climate extremes (heat waves, cold spells, droughts, storms and wind stilling) affect, in particular, energy production, supply, demand and security in several ways. While national, European and international projects have generated vast amounts of climate projections for the 21st century, their practical use in long-term planning remains limited. Estimating probabilistic diagnostics of energy-relevant variables from those multi-model projections will help the energy sector to elaborate medium- to long-term plans and will allow the assessment of climate risks associated with those plans. The project "Extreme Events for Energy Providers" (E3P) aims at filling a gap between climate science and its practical use in the energy sector, creating in turn favourable conditions for new business opportunities. The value chain ranges from addressing research questions directly related to energy-significant climate extremes to providing innovative tools for information and decision making (including methodologies, best practices and software) and climate science training for the energy sector, with a focus on extreme events. Those tools will integrate the scientific knowledge developed by scientific communities and translate it into a usable probabilistic framework. The project will deliver projection tools assessing the probabilities of future energy-relevant climate extremes at a range of spatial scales varying from pan-European to local. The E3P project is funded by the Knowledge and Innovation Community (KIC Climate). We will present the mechanisms of interaction between academic partners, SMEs and industrial partners in this project. Those mechanisms are elementary building blocks of a climate service.

  17. Comparison of the MoCA and BEARNI tests for detection of cognitive impairment in in-patients with alcohol use disorders.

    PubMed

    Pelletier, Stéphanie; Alarcon, Régis; Ewert, Valérie; Forest, Margot; Nalpas, Bertrand; Perney, Pascal

    2018-06-01

    Screening for cognitive impairment is a major challenge in alcoholics seeking treatment, since cognitive dysfunction may impair the overall efficacy of rehabilitation programs and consequently increase the relapse rate. We compared the performance of two screening tools: the MoCA (Montreal Cognitive Assessment), which is widely used in patients with neurological diseases and already used in patients with alcohol use disorder (AUD), and the BEARNI (Brief Evaluation of Alcohol-Related Neuropsychological Impairments), a recent test specifically developed for the alcoholic population. We compared the sensitivity and specificity of the MoCA and the BEARNI in a sample of AUD patients with and without cognitive impairment assessed by a battery of neuropsychological tests. Ninety patients were included: 67 men and 23 women aged 48.9 ± 9.6 years. According to the neuropsychological tests, 51.1% of patients had no cognitive impairment, while impairment was mild in 31.1% and moderate to severe in 17.8%. The BEARNI sensitivity was extremely high (1.0), since all patients with cognitive impairment were identified, but its specificity was very low (0.04). The MoCA had a lower sensitivity (0.79) than the BEARNI, but its specificity was significantly better (0.65). A detailed analysis of the BEARNI scores showed a discrepancy between the qualitative and quantitative interpretation of the test which could, at least in part, explain its low specificity. Both the MoCA and the BEARNI are screening tools that identified alcoholic patients with cognitive impairment. However, in routine use, the MoCA appears to be more appropriate given the low specificity of the BEARNI. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Sensitivity of subject-specific models to errors in musculo-skeletal geometry.

    PubMed

    Carbone, V; van der Krogt, M M; Koopman, H F J M; Verdonschot, N

    2012-09-21

    Subject-specific musculo-skeletal models of the lower extremity are an important tool for investigating various biomechanical problems, for instance the results of surgery such as joint replacements and tendon transfers. The aim of this study was to assess the potential effects of errors in musculo-skeletal geometry on subject-specific model results. We performed an extensive sensitivity analysis to quantify the effect of the perturbation of origin, insertion and via points of each of the 56 musculo-tendon parts contained in the model. We used two metrics, namely a Local Sensitivity Index (LSI) and an Overall Sensitivity Index (OSI), to distinguish the effect of the perturbation on the predicted force produced by only the perturbed musculo-tendon parts and by all the remaining musculo-tendon parts, respectively, during a simulated gait cycle. Results indicated that, for each musculo-tendon part, only two points show a significant sensitivity: its origin, or pseudo-origin, point and its insertion, or pseudo-insertion, point. The most sensitive points belong to those musculo-tendon parts that act as prime movers in the walking movement (insertion point of the Achilles Tendon: LSI=15.56%, OSI=7.17%; origin points of the Rectus Femoris: LSI=13.89%, OSI=2.44%) and as hip stabilizers (insertion points of the Gluteus Medius Anterior: LSI=17.92%, OSI=2.79%; insertion point of the Gluteus Minimus: LSI=21.71%, OSI=2.41%). The proposed priority list provides quantitative information to improve the predictive accuracy of subject-specific musculo-skeletal models. Copyright © 2012 Elsevier Ltd. All rights reserved.
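
    The LSI/OSI idea — perturb one attachment point, then measure the percent change in the force of the perturbed musculo-tendon part (local) versus the other parts (overall) — can be illustrated with a toy two-muscle model. The min-sum-of-squares load sharing below is a common static-optimization assumption, and the moment arms and perturbation are invented; none of this is the paper's actual model:

```python
def muscle_forces(a1, a2, M=100.0):
    """Share a joint moment M (N*m) between two muscles with moment
    arms a1, a2 (m), minimizing F1^2 + F2^2 subject to
    F1*a1 + F2*a2 = M, which gives F_i = M * a_i / (a1^2 + a2^2)."""
    d = a1 ** 2 + a2 ** 2
    return M * a1 / d, M * a2 / d

f1, f2 = muscle_forces(0.05, 0.04)
g1, g2 = muscle_forces(0.055, 0.04)   # perturb muscle 1's arm by 10%
lsi = abs(g1 - f1) / f1 * 100.0       # local: effect on perturbed muscle
osi = abs(g2 - f2) / f2 * 100.0       # overall: knock-on effect on other
```

    Even in this toy case the perturbation of one geometry parameter redistributes force to the unperturbed muscle, which is exactly the coupling the OSI metric is designed to capture.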

  19. Suitability of the isolated chicken eye test for classification of extreme pH detergents and cleaning products.

    PubMed

    Cazelle, Elodie; Eskes, Chantra; Hermann, Martina; Jones, Penny; McNamee, Pauline; Prinsen, Menk; Taylor, Hannah; Wijnands, Marcel V W

    2015-04-01

    A.I.S.E. investigated the suitability of the regulatory-adopted ICE in vitro test method (OECD TG 438), with or without histopathology, to identify detergent and cleaning formulations having extreme pH that require classification as EU CLP/UN GHS Category 1. To this aim, 18 extreme pH detergent and cleaning formulations were tested, covering both alkaline and acidic extreme pHs. The ICE standard test method following OECD Test Guideline 438 showed good concordance with in vivo classification (83%) and good and balanced specificity and sensitivity values (83%), which are in line with the performance of currently adopted in vitro test guidelines, confirming its suitability to identify Category 1 extreme pH detergent and cleaning products. In contrast to previous findings obtained with non-extreme pH formulations, the use of histopathology did not improve the sensitivity of the assay whilst it strongly decreased its specificity for the extreme pH formulations. Furthermore, use of non-testing prediction rules for classification showed poor concordance values (33% for the extreme pH rule and 61% for the EU CLP additivity approach) with high rates of over-prediction (100% for the extreme pH rule and 50% for the additivity approach), indicating that these non-testing prediction rules are not suitable to predict Category 1 hazards of extreme pH detergent and cleaning formulations. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Using Weather Types to Understand and Communicate Weather and Climate Impacts

    NASA Astrophysics Data System (ADS)

    Prein, A. F.; Hale, B.; Holland, G. J.; Bruyere, C. L.; Done, J.; Mearns, L.

    2017-12-01

    A common challenge in atmospheric research is the translation of scientific advancements and breakthroughs into decision-relevant and actionable information. This challenge is central to the mission of NCAR's Capacity Center for Climate and Weather Extremes (C3WE, www.c3we.ucar.edu). C3WE advances our understanding of weather and climate impacts and integrates these advances with distributed information technology to create tools that promote a global culture of resilience to weather and climate extremes. Here we will present an interactive web-based tool that connects historic U.S. losses and fatalities from extreme weather and climate events to 12 large-scale weather types. Weather types are dominant weather situations, such as winter high-pressure systems over the U.S. leading to very cold temperatures, or summertime moist, humid air masses over the central U.S. leading to severe thunderstorms. Each weather type has a specific, quantified fingerprint of economic losses and fatalities in a region. Weather types therefore enable a direct connection from an observed or forecasted weather situation to loss of life and property. The presented tool allows the user to explore these connections, raise awareness of existing vulnerabilities, and build resilience to weather and climate extremes.
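
    Weather typing is commonly done by clustering daily pressure fields so that each day is assigned to its nearest dominant pattern. A self-contained k-means sketch on synthetic two-regime data; the clustering method and all values are assumptions for illustration (the tool's 12 types are derived from real large-scale fields, by a method the abstract does not specify):

```python
import numpy as np

rng = np.random.default_rng(4)
# Two synthetic MSLP regimes (hPa) sampled at 5 grid points, 80 days
# each: a high-pressure and a low-pressure pattern plus noise.
high = rng.normal(1030.0, 2.0, size=(80, 5))
low = rng.normal(1000.0, 2.0, size=(80, 5))
days = np.vstack([high, low])

def kmeans(X, k=2, iters=20, seed=0):
    """Plain k-means: assign each day to its nearest centroid pattern,
    then move each centroid to the mean of its assigned days."""
    r = np.random.default_rng(seed)
    centers = X[r.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = np.argmin(dists, axis=1)
        new_centers = []
        for j in range(k):
            pts = X[labels == j]
            new_centers.append(pts.mean(axis=0) if len(pts) else centers[j])
        centers = np.array(new_centers)
    return labels, centers

labels, centers = kmeans(days)  # recovered "weather types"
```

    Once days carry type labels, per-type loss and fatality statistics give the "fingerprint" the tool exposes.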

  1. A probabilistic storm transposition approach for estimating exceedance probabilities of extreme precipitation depths

    NASA Astrophysics Data System (ADS)

    Foufoula-Georgiou, E.

    1989-05-01

    A storm transposition approach is investigated as a possible tool for assessing the frequency of extreme precipitation depths, that is, depths with return periods much greater than 100 years. This paper focuses on estimating the annual exceedance probability of extreme average precipitation depths over a catchment. The probabilistic storm transposition methodology is presented, and the several conceptual and methodological difficulties arising in this approach are identified. The method is implemented and partially evaluated by means of a semihypothetical example involving extreme midwestern storms and two hypothetical catchments (of 100 and 1000 mi2 (~260 and 2600 km2)) located in central Iowa. The results point to the need for further research to fully explore the potential of this approach as a tool for assessing the probabilities of rare storms, and eventually floods, a necessary element of risk-based analysis and design of large hydraulic structures.
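The transposition idea can be caricatured with a Monte Carlo sketch: relocate catalogued extreme storms at random, count how often the (area-reduced) depth over the catchment exceeds a threshold, and convert that per-storm probability into an annual exceedance probability. All depths, rates, and factors below are invented for illustration; the paper's actual methodology is considerably more elaborate.

```python
import random

def exceedance_probability(storm_depths, transpose_fraction, annual_rate,
                           threshold, n=100_000, seed=1):
    """Crude Monte Carlo sketch of probabilistic storm transposition.

    storm_depths: catalog of observed storm-center depths (mm)
    transpose_fraction: chance a randomly relocated storm covers the catchment
    annual_rate: expected number of extreme storms per year in the region
    threshold: catchment-average depth of interest (mm)
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        depth = rng.choice(storm_depths)
        areal = rng.uniform(0.5, 1.0)   # stand-in areal-reduction factor
        if rng.random() < transpose_fraction and depth * areal > threshold:
            hits += 1
    p_event = hits / n                        # P(one storm exceeds threshold)
    return 1 - (1 - p_event) ** annual_rate   # annual exceedance probability

p = exceedance_probability([120, 150, 200, 310], 0.1, 2.0, 180.0)
```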

  2. Bottom Extreme-Ultraviolet-Sensitive Coating for Evaluation of the Absorption Coefficient of Ultrathin Film

    NASA Astrophysics Data System (ADS)

    Hijikata, Hayato; Kozawa, Takahiro; Tagawa, Seiichi; Takei, Satoshi

    2009-06-01

    A bottom extreme-ultraviolet-sensitive coating (BESC) for evaluation of the absorption coefficients of ultrathin films such as extreme ultraviolet (EUV) resists was developed. This coating consists of a polymer, crosslinker, acid generator, and acid-responsive chromic dye and is formed by a conventional spin-coating method. By heating the film after spin-coating, a crosslinking reaction is induced and the coating becomes insoluble. A typical resist solution can be spin-coated on a substrate covered with the coating film. The evaluation of the linear absorption coefficients of polymer films was demonstrated by measuring the EUV absorption of BESC substrates on which various polymers were spin-coated.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilcox, Kevin R.; Shi, Zheng; Gherardi, Laureano A.

    Climatic changes are altering Earth's hydrological cycle, resulting in altered precipitation amounts, increased interannual variability of precipitation, and more frequent extreme precipitation events. These trends will likely continue into the future, having substantial impacts on net primary productivity (NPP) and associated ecosystem services such as food production and carbon sequestration. Frequently, experimental manipulations of precipitation have linked altered precipitation regimes to changes in NPP. Yet, findings have been diverse, and substantial uncertainty still surrounds generalities describing patterns of ecosystem sensitivity to altered precipitation. Additionally, we do not know whether previously observed correlations between NPP and precipitation remain accurate when precipitation changes become extreme. We synthesized results from 83 case studies of experimental precipitation manipulations in grasslands worldwide. Here, we used meta-analytical techniques to search for generalities and asymmetries of aboveground NPP (ANPP) and belowground NPP (BNPP) responses to both the direction and magnitude of precipitation change. Sensitivity (i.e., productivity response standardized by the amount of precipitation change) of BNPP was similar under precipitation additions and reductions, but ANPP was more sensitive to precipitation additions than reductions; this was especially evident in drier ecosystems. Additionally, overall relationships between the magnitude of productivity responses and the magnitude of precipitation change were saturating in form. The saturating form of this relationship was likely driven by ANPP responses to very extreme precipitation increases, although there were limited studies imposing extreme precipitation change, and there was considerable variation among experiments. Finally, this highlights the importance of incorporating gradients of manipulations, ranging from extreme drought to extreme precipitation increases, into future climate change experiments. Additionally, policy and land management decisions related to global change scenarios should consider how ANPP and BNPP responses may differ, and that ecosystem responses to extreme events might not be predicted from relationships found under moderate environmental changes.
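The sensitivity metric used in the synthesis (productivity response standardized by the size of the precipitation change) is simple to state in code. The numbers below are invented to illustrate the ANPP asymmetry described above, not taken from the 83 studies:

```python
def ppt_sensitivity(delta_npp, delta_ppt):
    """Productivity response standardized by the magnitude of the
    precipitation change (e.g., g m^-2 per mm)."""
    return delta_npp / abs(delta_ppt)

# Hypothetical plot-level responses (illustrative only)
wet = ppt_sensitivity(delta_npp=60.0, delta_ppt=+200.0)   # addition experiment
dry = ppt_sensitivity(delta_npp=-40.0, delta_ppt=-200.0)  # reduction experiment
asymmetric = abs(wet) > abs(dry)   # ANPP more sensitive to additions than reductions
```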

  4. Detection and Attribution of Simulated Climatic Extreme Events and Impacts: High Sensitivity to Bias Correction

    NASA Astrophysics Data System (ADS)

    Sippel, S.; Otto, F. E. L.; Forkel, M.; Allen, M. R.; Guillod, B. P.; Heimann, M.; Reichstein, M.; Seneviratne, S. I.; Kirsten, T.; Mahecha, M. D.

    2015-12-01

    Understanding, quantifying, and attributing the impacts of climatic extreme events and variability is crucial for societal adaptation in a changing climate. However, climate model simulations generated for this purpose typically exhibit pronounced biases in their output that hinder any straightforward assessment of impacts. To overcome this issue, various bias correction strategies, most of which have been criticized for physical inconsistency and non-preservation of the multivariate correlation structure, are routinely used to alleviate climate model deficiencies. We assess how biases and their correction affect the quantification and attribution of simulated extremes and variability in i) climatological variables and ii) impacts on ecosystem functioning as simulated by a terrestrial biosphere model. Our study demonstrates that assessments of simulated climatic extreme events and impacts in the terrestrial biosphere are highly sensitive to bias correction schemes, with major implications for the detection and attribution of these events. We introduce a novel ensemble-based resampling scheme based on a large regional climate model ensemble generated by the distributed weather@home setup[1], which fully preserves the physical consistency and multivariate correlation structure of the model output. We use extreme value statistics to show that this procedure considerably improves the representation of climatic extremes and variability. Subsequently, biosphere-atmosphere carbon fluxes are simulated using a terrestrial ecosystem model (LPJ-GSI) to further demonstrate the sensitivity of ecosystem impacts to the methodology of bias correcting climate model output. We find that uncertainties arising from bias correction schemes are comparable in magnitude to model structural and parameter uncertainties. The present study constitutes a first attempt to alleviate climate model biases in a physically consistent way and demonstrates that this yields improved simulations of climate extremes and associated impacts. [1] http://www.climateprediction.net/weatherathome/
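For context, a standard univariate bias-correction scheme of the kind the abstract describes as routinely used (and criticizes for breaking multivariate consistency) is empirical quantile mapping. A minimal NumPy sketch on synthetic data with a constant warm bias:

```python
import numpy as np

def quantile_map(model, obs, model_new):
    """Empirical quantile mapping: map each new model value to the observed
    value at the same empirical quantile of the calibration period."""
    model_sorted = np.sort(model)
    obs_sorted = np.sort(obs)
    # Quantile of each new value within the calibration-period model distribution
    q = np.searchsorted(model_sorted, model_new, side="right") / len(model_sorted)
    q = np.clip(q, 0.0, 1.0)
    return np.quantile(obs_sorted, q)

rng = np.random.default_rng(0)
obs = rng.normal(15.0, 3.0, 1000)   # synthetic "observed" temperatures
model = obs + 2.0                   # model output with a +2 degree warm bias
corrected = quantile_map(model, obs, model_new=model[:100])
```

After correction the +2 degree offset is removed, since the mapped values inherit the observed distribution.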

  6. Modeling Effects of Annealing on Coal Char Reactivity to O2 and CO2, Based on Preparation Conditions

    DOE PAGES

    Holland, Troy; Bhat, Sham; Marcy, Peter; ...

    2017-08-25

    Oxy-fired coal combustion is a promising potential carbon capture technology. Predictive computational fluid dynamics (CFD) simulations are valuable tools in evaluating and deploying oxyfuel and other carbon capture technologies, either as retrofit technologies or for new construction. However, accurate predictive combustor simulations require physically realistic submodels with low computational requirements. A recent sensitivity analysis of a detailed char conversion model (Char Conversion Kinetics (CCK)) found thermal annealing to be an extremely sensitive submodel. In the present work, further analysis of the previous annealing model revealed significant disagreement with numerous datasets from experiments performed after that annealing model was developed. The annealing model was accordingly extended to reflect experimentally observed reactivity loss due to the thermal annealing of a variety of coals under diverse char preparation conditions. The model extension was informed by a Bayesian calibration analysis. In addition, since oxyfuel conditions include extraordinarily high levels of CO2, the development of a first-ever CO2 reactivity-loss model due to annealing is presented.

  7. A simple pendulum borehole tiltmeter based on a triaxial optical-fibre displacement sensor

    NASA Astrophysics Data System (ADS)

    Chawah, P.; Chéry, J.; Boudin, F.; Cattoen, M.; Seat, H. C.; Plantier, G.; Lizion, F.; Sourice, A.; Bernard, P.; Brunet, C.; Boyer, D.; Gaffet, S.

    2015-11-01

    Sensitive instruments like strainmeters and tiltmeters are necessary for measuring slowly varying, low-amplitude Earth deformations. Laser and fibre interferometers are particularly suitable for interrogating such instruments owing to their extreme precision and accuracy. In this paper, a practical design of a simple pendulum borehole tiltmeter based on laser fibre interferometric displacement sensors is presented. A prototype instrument has been constructed using welded borosilicate, with a pendulum length of 0.85 m resulting in a main resonance frequency of 0.6 Hz. By implementing three coplanar extrinsic fibre Fabry-Perot interferometric probes and appropriate signal filtering, our instrument provides tilt measurements that are insensitive to parasitic deformations caused by temperature and pressure variations. This prototype has been installed in an underground facility (Rustrel, France), where results show accurate measurements of Earth strains derived from Earth and ocean tides, local hydrologic effects, as well as local and remote earthquakes. The large dynamic range and high sensitivity of this tiltmeter render it an invaluable tool for numerous geophysical applications such as transient fault motion, volcanic strain, and reservoir monitoring.
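As a rough consistency check on the quoted numbers, the idealized simple-pendulum formula f = (1/2π)·√(g/L) for the 0.85 m length gives a value of the same order as the stated 0.6 Hz; the instrument's exact resonance also depends on mechanical details not captured by this textbook estimate.

```python
import math

# Idealized simple-pendulum resonance estimate for the quoted 0.85 m length.
# This is only an order-of-magnitude check, not the instrument's design formula.
g = 9.81        # gravitational acceleration, m/s^2
length = 0.85   # pendulum length, m
f0 = math.sqrt(g / length) / (2 * math.pi)   # ≈ 0.54 Hz, close to the quoted 0.6 Hz
```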

  8. Shuttle Imaging Radar (SIR-B) investigations of the Canadian shield - Initial Report

    NASA Technical Reports Server (NTRS)

    Lowman, Paul D., Jr.; Harris, Jeff; Masuoka, Penny M.; Singhroy, Vernon H.; Slaney, Vernon Roy

    1987-01-01

    Two of the 43 Shuttle Imaging Radar (SIR-B) experiments carried out from the 41-G shuttle mission in 1984 involved a 2600-km swath across the Canadian Shield, with the objectives of studying the structure of province boundaries and developing techniques for the geologic use of orbital radar. Despite degraded single incidence angle imagery resulting from system problems, valuable experience has been obtained with data over a test site near Bancroft, Ontario. It has been found that even subdued glaciated topography can be effectively imaged, variations in backscatter being caused by variations in local incidence angle rather than shadowing. It has been demonstrated that small incidence angles are more sensitive to topography than large angles. Backscatter is extremely sensitive to look direction, topographic features nearly normal to the illumination being highlighted, and those nearly parallel to it being suppressed. It is concluded that orbital radar can provide a valuable tool for geologic studies of the Canadian Shield and similar areas, if suitable look angles and at least two look directions can be utilized for each area.

  9. Trimethylation enhancement using diazomethane (TrEnDi): rapid on-column quaternization of peptide amino groups via reaction with diazomethane significantly enhances sensitivity in mass spectrometry analyses via a fixed, permanent positive charge.

    PubMed

    Wasslen, Karl V; Tan, Le Hoa; Manthorpe, Jeffrey M; Smith, Jeffrey C

    2014-04-01

    Defining cellular processes relies heavily on elucidating the temporal dynamics of proteins. To this end, mass spectrometry (MS) is an extremely valuable tool; different MS-based quantitative proteomics strategies have emerged to map protein dynamics over the course of stimuli. Herein, we disclose our novel MS-based quantitative proteomics strategy with unique analytical characteristics. By passing ethereal diazomethane over peptides on strong cation exchange resin within a microfluidic device, peptides react to contain fixed, permanent positive charges. Modified peptides display improved ionization characteristics and dissociate via tandem mass spectrometry (MS(2)) to form strong a2 fragment ion peaks. Process optimization and determination of reactive functional groups enabled a priori prediction of MS(2) fragmentation patterns for modified peptides. The strategy was tested on digested bovine serum albumin (BSA) and successfully quantified a peptide that was not observable prior to modification. Our method ionizes peptides regardless of proton affinity, thus decreasing ion suppression and permitting predictable multiple reaction monitoring (MRM)-based quantitation with improved sensitivity.

  10. Micro-electromechanical sensors in the analytical field.

    PubMed

    Zougagh, Mohammed; Ríos, Angel

    2009-07-01

    Micro- and nano-electromechanical systems (MEMS and NEMS) for use as sensors represent one of the most exciting new fields in analytical chemistry today. These systems are advantageous over currently available non-miniaturized sensors, such as quartz crystal microbalances, thickness shear mode resonators, and flexural plate wave oscillators, because of their high sensitivity, low cost and easy integration into automated systems. In this article, we present and discuss the evolution in the use of MEMS and NEMS, which are basically cantilever-type sensors, as good analytical tools for a wide variety of applications. We discuss the analytical features and the practical potential of micro(nano)-cantilever sensors, which combine the synergetic advantages of selectivity, provided by their functionalization, and the high sensitivity, which is attributed largely to the extremely small size of the sensing element. An insight is given into the different types of functionalization and detection strategies and a critical discussion is presented on the existing state of the art concerning the applications reported for mechanical microsensors. New developments and the possibilities for routine work in the near future are also covered.

  11. Compensatory Limb Use and Behavioral Assessment of Motor Skill Learning Following Sensorimotor Cortex Injury in a Mouse Model of Ischemic Stroke

    PubMed Central

    Kerr, Abigail L.; Tennant, Kelly A.

    2014-01-01

    Mouse models have become increasingly popular in the field of behavioral neuroscience, and specifically in studies of experimental stroke. As models advance, it is important to develop sensitive behavioral measures specific to the mouse. The present protocol describes a skilled motor task for use in mouse models of stroke. The Pasta Matrix Reaching Task functions as a versatile and sensitive behavioral assay that permits experimenters to collect accurate outcome data and manipulate limb use to mimic human clinical phenomena including compensatory strategies (i.e., learned non-use) and focused rehabilitative training. When combined with neuroanatomical tools, this task also permits researchers to explore the mechanisms that support behavioral recovery of function (or lack thereof) following stroke. The task is both simple and affordable to set up and conduct, offering a variety of training and testing options for numerous research questions concerning functional outcome following injury. Though the task has been applied to mouse models of stroke, it may also be beneficial in studies of functional outcome in other upper extremity injury models. PMID:25045916

  12. Nano-imprint gold grating as refractive index sensor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumari, Sudha; Mohapatra, Saswat; Moirangthem, Rakesh S.

    Large-scale fabrication of plasmonic nanostructures has been a challenging task due to time-consuming processes and the requirement of expensive nanofabrication tools such as electron beam lithography systems, focused ion beam systems, and extreme UV photolithography systems. Here, we present a cost-effective fabrication technique, so-called soft nanoimprinting, to fabricate nanostructures over a larger sample area. In our fabrication process, a commercially available optical DVD disc was used as a template, which was imprinted on a polymer glass substrate to prepare a 1D polymer nano-grating. A homemade nanoimprinting setup was used in this fabrication process. Further, a label-free refractive index sensor was developed by utilizing the properties of surface plasmon resonance (SPR) of a gold-coated 1D polymer nano-grating. Refractive index sensing was tested by exposing different solutions of a glycerol-water mixture on the surface of the gold nano-grating. The calculated bulk refractive index sensitivity was found to be 751 nm/RIU. We believe that our proposed SPR sensor could be a promising candidate for developing a low-cost, high-sensitivity refractive index sensor on a large scale.

  13. Clinical utility of the mini-mental status examination when assessing decision-making capacity.

    PubMed

    Pachet, Arlin; Astner, Kevin; Brown, Lenora

    2010-03-01

    The main objectives of this study were to examine the relationship between cognitive deficits, as measured by the Mini-Mental Status Examination (MMSE), and decision-making capacity and to determine whether the sensitivity and specificity of the MMSE varied based upon the patient population assessed. Using a sample size of 152 patients and varying cutoff scores, the MMSE demonstrated extremely poor sensitivity. In contrast, the MMSE had excellent specificity when scores of 19 or less were obtained. In our sample, not one patient, regardless of diagnosis, was deemed to have capacity if their MMSE score was below 20. However, reliance on the MMSE for scores above 19 would too frequently lead to misclassification and incorrect assumptions about a patient's decision-making abilities. Although a score below 20 consistently yielded findings of incapability in our sample, it remains our opinion that the MMSE should not be used as a stand-alone tool to make determinations related to capacity, especially when considering the complexities associated with capacity evaluations and the vital areas, such as executive functioning and individual values and beliefs, which are omitted by the MMSE.

  14. Capture, Learning, and Classification of Upper Extremity Movement Primitives in Healthy Controls and Stroke Patients

    PubMed Central

    Guerra, Jorge; Uddin, Jasim; Nilsen, Dawn; McInerney, James; Fadoo, Ammarah; Omofuma, Isirame B.; Hughes, Shatif; Agrawal, Sunil; Allen, Peter; Schambra, Heidi M.

    2017-01-01

    There currently exist no practical tools to identify functional movements in the upper extremities (UEs). This absence has limited the precise therapeutic dosing of patients recovering from stroke. In this proof-of-principle study, we aimed to develop an accurate approach for classifying UE functional movement primitives, which comprise functional movements. Data were generated from inertial measurement units (IMUs) placed on upper body segments of older healthy individuals and chronic stroke patients. Subjects performed activities commonly trained during rehabilitation after stroke. Data processing involved the use of a sliding window to obtain statistical descriptors, and resulting features were processed by a Hidden Markov Model (HMM). The likelihoods of the states, resulting from the HMM, were segmented by a second sliding window and their averages were calculated. The final predictions were mapped to human functional movement primitives using a Logistic Regression algorithm. Algorithm performance was assessed with a leave-one-out analysis, which determined its sensitivity, specificity, and positive and negative predictive values for all classified primitives. In healthy control and stroke participants, our approach identified functional movement primitives embedded in training activities with, on average, 80% precision. This approach may support functional movement dosing in stroke rehabilitation. PMID:28813877
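The first processing step described above (a sliding window reduced to statistical descriptors before the HMM stage) can be sketched as follows; the window size, step, and toy signal are invented for illustration, not the study's settings:

```python
import numpy as np

def sliding_window_features(signal, win, step):
    """Statistical descriptors (mean, std, min, max) over a sliding window,
    of the kind commonly fed to sequence classifiers such as an HMM."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        feats.append([w.mean(), w.std(), w.min(), w.max()])
    return np.array(feats)

# Toy 1-D trace standing in for one IMU channel
t = np.linspace(0, 4 * np.pi, 400)
accel = np.sin(t)
X = sliding_window_features(accel, win=50, step=25)   # one feature row per window
```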

  15. An Urban Resilience to Extreme Weather Events Framework for Development of Post Event Learning and Transformative Adaptation in Cities

    NASA Astrophysics Data System (ADS)

    Solecki, W. D.; Friedman, E. S.; Breitzer, R.

    2016-12-01

    Increasingly frequent extreme weather events are becoming an immediate priority for urban coastal practitioners and stakeholders, adding complexity to decisions concerning risk management for the short-term actions and long-term needs of city climate stakeholders. The conflict between the prioritization of short- versus long-term events by decision-makers creates a disconnect between climate science and its applications. The Consortium for Climate Risk in the Urban Northeast (CCRUN), a NOAA RISA team, is developing a set of mechanisms to help bridge this gap. The mechanisms are designed to promote the application of climate science on extreme weather events and their aftermath. It is in the post-event policy window where significant opportunities for science-policy linkages exist. In particular, CCRUN is interested in producing actionable and useful information for city managers to use in decision-making processes surrounding extreme weather events and climate change. These processes include a sector-specific needs-assessment survey instrument and two tools for urban coastal practitioners and stakeholders. The tools focus on post-event learning and on connections between resilience and transformative adaptation. Elements of the two tools are presented. Post-extreme-event learning supports urban coastal practitioners and decision-makers concerned with maximizing opportunities for knowledge transfer and assimilation, and policy initiation and development, following an extreme weather event. For the urban U.S. Northeast, post-event learning helps coastal stakeholders build the capacity to adapt to extreme weather events, and inform and develop their planning capacity through analysis of past actions and steps taken in response to Hurricane Sandy. Connecting resilience with transformative adaptation is intended to promote resilience in urban Northeast coastal settings to the long-term negative consequences of extreme weather events. This is done through a knowledge co-production engagement process that links innovative and flexible adaptation pathways that can address requirements for short-term action and long-term needs.

  16. Single Phase Dual-energy CT Angiography: One-stop-shop Tool for Evaluating Aneurysmal Subarachnoid Hemorrhage.

    PubMed

    Ni, Qian Qian; Tang, Chun Xiang; Zhao, Yan E; Zhou, Chang Sheng; Chen, Guo Zhong; Lu, Guang Ming; Zhang, Long Jiang

    2016-05-25

    Aneurysmal subarachnoid hemorrhages have an extremely high case-fatality rate in clinical practice, so early and rapid identification of ruptured intracranial aneurysms is especially important. Here we evaluate the clinical value of single-phase contrast-enhanced dual-energy CT angiography (DE-CTA) as a one-stop-shop tool for detecting aneurysmal subarachnoid hemorrhage. One hundred and five patients who underwent true non-enhanced CT (TNCT), contrast-enhanced DE-CTA, and digital subtraction angiography (DSA) were included. Image quality and detectability of intracranial hemorrhage were evaluated and compared between virtual non-enhanced CT (VNCT) images reconstructed from DE-CTA and TNCT. There was no statistical difference in image quality (P > 0.05) between VNCT and TNCT. The agreement of VNCT and TNCT in detecting intracranial hemorrhage reached 98.1% on a per-patient basis. With DSA as the reference standard, sensitivity and specificity on a per-patient basis were 98.3% and 97.9% for DE-CTA in intracranial aneurysm detection. The effective dose of DE-CTA was reduced by 75.0% compared with conventional digital subtraction CTA. Thus, single-phase contrast-enhanced DE-CTA is an optimal and reliable one-stop-shop tool for detecting intracranial hemorrhage with VNCT and intracranial aneurysms with DE-CTA, with a substantial radiation dose reduction compared with conventional digital subtraction CTA.

  17. Magnetorheological finishing (MRF) of potassium dihydrogen phosphate (KDP) crystals: nonaqueous fluids development, optical finish, and laser damage performance at 1064 nm and 532 nm

    NASA Astrophysics Data System (ADS)

    Menapace, J. A.; Ehrmann, P. R.; Bickel, R. C.

    2009-10-01

    Over the past year we have been working on specialized MR fluids for polishing KDP crystals. KDP is an extremely difficult material to polish conventionally due to its water solubility, low hardness, and temperature sensitivity. Today, KDP crystals are finished using single-point diamond turning (SPDT) tools and nonaqueous lubricants/coolants. KDP optics fabricated using SPDT, however, are limited in surface correction by tool/method characteristics, with surface quality driven by microroughness from machine pitch, speed, force, and diamond tool character. MRF polishing offers a means to circumvent many of these issues: it is deterministic, which makes the technique practical for surface and transmitted-wavefront correction, is low force, and is temperature independent. What is lacking is a usable nonaqueous MR fluid that is chemically and physically compatible with KDP, which can be used for polishing and subsequently cleaned from the optical surface. In this study, we present the fluid parameters important in the design and development of nonaqueous MR fluid formulations capable of polishing KDP and how these parameters affect MRF polishing. We also discuss requirements peculiar to successful KDP polishing and how they affect optical figure/finish and laser damage performance at 1064 nm and 532 nm.

  19. Secure FAST: Security Enhancement in the NATO Time Sensitive Targeting Tool

    DTIC Science & Technology

    2010-11-01

    ... and collaboration tool, designed to aid in the tracking and prosecuting of Time Sensitive Targets. The FAST tool provides user-level authentication and authorisation in terms of security. It uses operating-system-level security but does not provide application-level security for ...

  20. Volumetric in vivo imaging of microvascular perfusion within the intact cochlea in mice using ultra-high sensitive optical microangiography.

    PubMed

    Subhash, Hrebesh M; Davila, Viviana; Sun, Hai; Nguyen-Huynh, Anh T; Shi, Xiaorui; Nuttall, Alfred L; Wang, Ruikang K

    2011-02-01

    Studying the inner ear microvascular dynamics is extremely important to understand the cochlear function and to further advance the diagnosis, prevention, and treatment of many otologic disorders. However, there is currently no effective imaging tool available that is able to access the blood flow within the intact cochlea. In this paper, we report the use of an ultrahigh sensitive optical micro-angiography (UHS-OMAG) imaging system to image 3-D microvascular perfusion within the intact cochlea in living mice. The UHS-OMAG image system used in this study is based on spectral domain optical coherence tomography, which uses a broadband light source centered at 1300 nm with an imaging rate of 47,000 A-scans/s, capable of acquiring high-resolution B scans at 300 frames/s. The technique is sensitive enough to image very slow blood flow velocities, such as those found in capillary networks. The 3-D imaging acquisition time for a whole cochlea is ∼4.1 s. We demonstrate that volumetric reconstruction of microvascular flow obtained by UHS-OMAG provides a comprehensive perfusion map of several regions of the cochlea, including the otic capsule, the stria vascularis of the apical and middle turns and the radiating arterioles that emanate from the modiolus.

  1. Prediction of protein-protein interactions from amino acid sequences with ensemble extreme learning machines and principal component analysis.

    PubMed

    You, Zhu-Hong; Lei, Ying-Ke; Zhu, Lin; Xia, Junfeng; Wang, Bing

    2013-01-01

    Protein-protein interactions (PPIs) play crucial roles in the execution of various cellular processes and form the basis of biological mechanisms. Although large amounts of PPI data for different species have been generated by high-throughput experimental techniques, the PPI pairs obtained with experimental methods cover only a fraction of the complete PPI networks; furthermore, the experimental methods for identifying PPIs are both time-consuming and expensive. Hence, it is urgent and challenging to develop automated computational methods to efficiently and accurately predict PPIs. We present here a novel hierarchical PCA-EELM (principal component analysis-ensemble extreme learning machine) model to predict protein-protein interactions using only the information of protein sequences. In the proposed method, 11,188 protein pairs retrieved from the DIP database were encoded into feature vectors by using four kinds of protein sequence information. Focusing on dimension reduction, an effective feature extraction method, PCA, was then employed to construct the most discriminative new feature set. Finally, multiple extreme learning machines were trained and then aggregated into a consensus classifier by majority voting. The ensembling of extreme learning machines removes the dependence of results on initial random weights and improves the prediction performance. When performed on the PPI data of Saccharomyces cerevisiae, the proposed method achieved 87.00% prediction accuracy with 86.15% sensitivity at a precision of 87.59%. Extensive experiments were performed to compare our method with a state-of-the-art technique, the Support Vector Machine (SVM). Experimental results demonstrate that the proposed PCA-EELM outperforms the SVM method under 5-fold cross-validation. Besides, PCA-EELM performs faster than the PCA-SVM-based method. Consequently, the proposed approach can be considered a promising and powerful new tool for predicting PPIs with excellent performance and reduced computation time.
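The PCA-EELM pipeline (PCA for dimension reduction, then an ensemble of extreme learning machines aggregated by majority vote) is easy to sketch, because an ELM trains in closed form via a pseudo-inverse of its random hidden layer. The toy two-class data and all parameter choices below are illustrative assumptions, not the paper's 11,188-pair DIP setup:

```python
import numpy as np

def pca_fit(X, k):
    """Return the mean and top-k principal axes of X."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def elm_train(X, y, hidden, rng):
    """Single extreme learning machine: random hidden layer, least-squares output."""
    W = rng.normal(size=(X.shape[1], hidden))
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)
    beta = np.linalg.pinv(H) @ y   # closed-form output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy two-class data (stand-in for encoded protein-pair feature vectors)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 10)), rng.normal(1, 1, (100, 10))])
y = np.array([-1.0] * 100 + [1.0] * 100)

mu, V = pca_fit(X, k=5)
Z = (X - mu) @ V.T   # dimension-reduced features

# Ensemble of ELMs aggregated by majority vote, which removes the
# dependence of results on any single random initialization
votes = []
for _ in range(9):
    W, b, beta = elm_train(Z, y, hidden=20, rng=rng)
    votes.append(np.sign(elm_predict(Z, W, b, beta)))
pred = np.sign(np.sum(votes, axis=0))
accuracy = (pred == y).mean()
```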

  2. Dynamic balance performance and noncontact lower extremity injury in college football players: an initial study.

    PubMed

    Butler, Robert J; Lehr, Michael E; Fink, Michael L; Kiesel, Kyle B; Plisky, Phillip J

    2013-09-01

    Field-expedient screening tools that can identify individuals at an elevated risk for injury are needed to minimize time loss in American football players. Previous research has suggested that poor dynamic balance may be associated with an elevated risk for injury in athletes; however, this has yet to be examined in college football players. The aim was to determine whether dynamic balance deficits are associated with an elevated risk of injury in collegiate football players. It was hypothesized that football players with lower performance and increased asymmetry in dynamic balance would be at an elevated risk for sustaining a noncontact lower extremity injury. Prospective cohort study. Fifty-nine collegiate American football players volunteered for this study. Demographic information, injury history, and dynamic balance testing performance were collected, and noncontact lower extremity injuries were recorded over the course of the season. Receiver operating characteristic curves were calculated based on performance on the Star Excursion Balance Test (SEBT), including composite score and asymmetry, to determine the population-specific risk cut-off point. Relative risk was then calculated based on these variables, as well as previous injury. A cut-off point of 89.6% composite score on the SEBT optimized the sensitivity (100%) and specificity (71.7%). A college football player who scored below 89.6% was 3.5 times more likely to sustain an injury. Poor performance on the SEBT may be related to an increased risk of sustaining a noncontact lower extremity injury over the course of a competitive American football season. College football players should be screened preseason using the SEBT to identify those at an elevated risk for injury based on dynamic balance performance, so that injury mitigation strategies can be implemented for this specific subgroup of athletes.
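One common way to derive such a risk cut-off from ROC data is Youden's J statistic (sensitivity + specificity - 1). The sketch below applies it to invented score distributions; the values, group sizes, and the exact optimization criterion used in the study are not reproduced here.

```python
import numpy as np

# Hypothetical SEBT composite scores (% of limb length); all numbers are
# invented for illustration, not taken from the study.
rng = np.random.default_rng(1)
injured = rng.normal(85, 4, size=20)   # players later injured
healthy = rng.normal(93, 4, size=40)   # players not injured

scores = np.concatenate([injured, healthy])
labels = np.concatenate([np.ones(20), np.zeros(40)])  # 1 = injured

# Scan candidate cut-off points; a "positive" screen = score below cut-off.
best_j, best_cut = -1.0, None
for cut in np.unique(scores):
    pred = scores < cut
    tp = np.sum(pred & (labels == 1))
    fn = np.sum(~pred & (labels == 1))
    tn = np.sum(~pred & (labels == 0))
    fp = np.sum(pred & (labels == 0))
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    j = sens + spec - 1    # Youden's J balances sensitivity and specificity
    if j > best_j:
        best_j, best_cut = j, cut
```

In practice a screening tool may instead privilege sensitivity over specificity (as the 100%/71.7% result above suggests), which amounts to maximizing J subject to a minimum-sensitivity constraint.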

  3. Bacterial and archaeal resistance to ionizing radiation

    NASA Astrophysics Data System (ADS)

    Confalonieri, F.; Sommer, S.

    2011-01-01

    Organisms living in extreme environments must cope with large fluctuations of temperature, high levels of radiation and/or desiccation, conditions that can induce DNA damage ranging from base modifications to DNA double-strand breaks. The bacterium Deinococcus radiodurans is known for its resistance to extremely high doses of ionizing radiation and for its ability to reconstruct a functional genome from hundreds of radiation-induced chromosomal fragments. Recently, extreme ionizing radiation resistance was also generated by directed evolution of an apparently radiation-sensitive bacterial species, Escherichia coli. Radioresistant organisms are not only found among the Eubacteria but also among the Archaea, which represent the third domain of life. They present a set of particular features that differentiate them from the Eubacteria and eukaryotes. Moreover, Archaea are often isolated from extreme environments where they live under severe conditions of temperature, pressure, pH, salts or toxic compounds that are lethal for the large majority of living organisms. Thus, Archaea offer the opportunity to understand how cells are able to cope with such harsh conditions. Among them, the halophilic archaeon Halobacterium sp. and several Pyrococcus or Thermococcus species, such as Thermococcus gammatolerans, were also shown to display high levels of radiation resistance. The dispersion of radioresistant prokaryotes across the phylogenetic tree suggests that they acquired radioresistance independently.
Different strategies were selected during evolution, including several mechanisms of radiation-byproduct detoxification, subtle modifications of cellular metabolism that help cells recover from radiation-induced injuries, protection of proteins against oxidation, an efficient DNA repair toolbox, an original pathway of DNA double-strand break repair, a condensed nucleoid that may prevent the dispersion of DNA fragments, and specific radiation-induced proteins involved in radioresistance. Here, we compare these mechanisms and discuss hypotheses suggested to contribute to radioresistance in several Archaea and Eubacteria.

  4. Diagnostic Accuracy of Full-Body Linear X-Ray Scanning in Multiple Trauma Patients in Comparison to Computed Tomography.

    PubMed

    Jöres, A P W; Heverhagen, J T; Bonél, H; Exadaktylos, A; Klink, T

    2016-02-01

    The purpose of this study was to evaluate the diagnostic accuracy of full-body linear X-ray scanning (LS) in multiple trauma patients in comparison to 128-multislice computed tomography (MSCT). 106 multiple trauma patients (female: 33; male: 73) were retrospectively included in this study. All patients underwent LS of the whole body, including the extremities, and MSCT covering the neck, thorax, abdomen, and pelvis. The diagnostic accuracy of LS for the detection of fractures of the truncal skeleton and pneumothoraces was evaluated in comparison to MSCT by two observers in consensus. Extremity fractures detected by LS were documented. The overall sensitivity of LS was 49.2%, the specificity was 93.3%, the positive predictive value was 91%, and the negative predictive value was 57.5%. The sensitivity for vertebral fractures was 16.7%, and the specificity was 100%. The sensitivity was 48.7% and the specificity 98.2% for all other fractures. Pneumothoraces were detected in 12 patients by MSCT, but not by LS. 40 extremity fractures were detected by LS, of which 4 were dislocated and only 2 lay within the body region covered by MSCT. The diagnostic accuracy of LS is limited in the evaluation of acute trauma of the truncal skeleton. LS allows fast whole-body X-ray imaging and may be valuable for detecting extremity fractures in trauma patients in addition to MSCT. The overall sensitivity of LS for truncal skeleton injuries in multiple trauma patients was < 50%. The diagnostic reference standard, MSCT, remains the preferred and reliable imaging modality. LS may be valuable for quick detection of extremity fractures. © Georg Thieme Verlag KG Stuttgart · New York.
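The four accuracy figures above follow from standard 2×2 confusion-table formulas. The counts below are hypothetical back-calculated illustrations chosen to land near the reported percentages; they are not the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

# Hypothetical counts that approximately reproduce the reported figures
# (sensitivity ~49.2%, specificity ~93.3%, PPV ~91%, NPV ~57.5%).
m = diagnostic_metrics(tp=60, fp=6, fn=62, tn=84)
```

Note how a test can have a high PPV (few false positives) while its low sensitivity still leaves a weak NPV: many true injuries are missed, so a negative LS result cannot rule injury out, which is why MSCT remains the reference standard.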

  5. Bayesian species delimitation in Pleophylla chafers (Coleoptera) - the importance of prior choice and morphology.

    PubMed

    Eberle, Jonas; Warnock, Rachel C M; Ahrens, Dirk

    2016-05-05

    Defining species units can be challenging, especially during the earliest stages of speciation, when phylogenetic inference and delimitation methods may be compromised by incomplete lineage sorting (ILS) or secondary gene flow. Integrative approaches to taxonomy, which combine molecular and morphological evidence, have the potential to be valuable in such cases. In this study we investigated the South African scarab beetle genus Pleophylla using data collected from 110 individuals of eight putative morphospecies. The dataset included four molecular markers (cox1, 16S, rrnL, ITS1) and morphometric data based on male genital morphology. We applied a suite of molecular and morphological approaches to species delimitation, and implemented a novel Bayesian approach in the software iBPP, which enables continuous morphological trait and molecular data to be combined. Traditional morphology-based species assignments were supported quantitatively by morphometric analyses of the male genitalia (eigenshape analysis, CVA, LDA). While the ITS1-based delineation was also broadly congruent with the morphospecies, the cox1 data resulted in over-splitting (GMYC modelling, haplotype networks, PTP, ABGD). In the most extreme case, morphospecies shared identical haplotypes, which may be attributable to ILS based on statistical tests performed using the software JML. We found the strongest support for putative morphospecies based on phylogenetic evidence using the combined approach implemented in iBPP. However, support for putative species was sensitive to the use of alternative guide trees and alternative combinations of priors on the population size (θ) and root age (τ0) parameters, especially when the analysis was based on molecular or morphological data alone. We demonstrate that continuous morphological trait data can be extremely valuable in assessing competing species-delimitation hypotheses.
In particular, we show that the inclusion of morphological data in an integrative Bayesian framework can improve the resolution of inferred species units. However, we also demonstrate that this approach is extremely sensitive to the choice of guide tree and prior parameters. These parameters should be chosen with caution, if possible on the basis of independent empirical evidence; otherwise, careful sensitivity analyses should be performed to assess the robustness of the results. Young species provide exemplars for investigating the mechanisms of speciation and for assessing the performance of tools used to delimit species on the basis of molecular and/or morphological evidence.

  6. Extreme rainfall, vulnerability and risk: a continental-scale assessment for South America.

    PubMed

    Vörösmarty, Charles J; Bravo de Guenni, Lelys; Wollheim, Wilfred M; Pellerin, Brian; Bjerklie, David; Cardoso, Manoel; D'Almeida, Cassiano; Green, Pamela; Colon, Lilybeth

    2013-11-13

    Extreme weather continues to preoccupy society as a formidable public safety concern bearing huge economic costs. While attention has focused on global climate change and how it could intensify key elements of the water cycle such as precipitation and river discharge, it is the conjunction of geophysical and socioeconomic forces that shapes human sensitivity and risks to weather extremes. We demonstrate here the use of high-resolution geophysical and population datasets together with documentary reports of rainfall-induced damage across South America over a multi-decadal, retrospective time domain (1960-2000). We define and map extreme precipitation hazard, exposure, affected populations, vulnerability and risk, and use these variables to analyse the impact of floods as a water security issue. Geospatial experiments uncover major sources of risk from natural climate variability and population growth, with change in climate extremes bearing a minor role. While rural populations display the greatest relative sensitivity to extreme rainfall, urban settings show the highest rates of increasing risk. In the coming decades, rapid urbanization will make South American cities the focal point of future climate threats but also an opportunity for reducing vulnerability, protecting lives and sustaining economic development through both traditional and ecosystem-based disaster risk management systems.

  7. Asymmetric responses of primary productivity to precipitation extremes: A synthesis of grassland precipitation manipulation experiments

    DOE PAGES

    Wilcox, Kevin R.; Shi, Zheng; Gherardi, Laureano A.; ...

    2017-04-02

    Climatic changes are altering Earth's hydrological cycle, resulting in altered precipitation amounts, increased interannual variability of precipitation, and more frequent extreme precipitation events. These trends will likely continue into the future, having substantial impacts on net primary productivity (NPP) and associated ecosystem services such as food production and carbon sequestration. Frequently, experimental manipulations of precipitation have linked altered precipitation regimes to changes in NPP. Yet, findings have been diverse and substantial uncertainty still surrounds generalities describing patterns of ecosystem sensitivity to altered precipitation. Additionally, we do not know whether previously observed correlations between NPP and precipitation remain accurate when precipitation changes become extreme. We synthesized results from 83 case studies of experimental precipitation manipulations in grasslands worldwide. Here, we used meta-analytical techniques to search for generalities and asymmetries of aboveground NPP (ANPP) and belowground NPP (BNPP) responses to both the direction and magnitude of precipitation change. Sensitivity (i.e., productivity response standardized by the amount of precipitation change) of BNPP was similar under precipitation additions and reductions, but ANPP was more sensitive to precipitation additions than reductions; this was especially evident in drier ecosystems. Additionally, overall relationships between the magnitude of productivity responses and the magnitude of precipitation change were saturating in form. The saturating form of this relationship was likely driven by ANPP responses to very extreme precipitation increases, although there were limited studies imposing extreme precipitation change, and there was considerable variation among experiments.
These findings highlight the importance of incorporating gradients of manipulations, ranging from extreme drought to extreme precipitation increases, into future climate change experiments. Additionally, policy and land management decisions related to global change scenarios should consider how ANPP and BNPP responses may differ, and that ecosystem responses to extreme events might not be predicted from relationships found under moderate environmental changes.
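The sensitivity metric used in this synthesis is simply the productivity response standardized by the size of the precipitation change. The numbers below are invented to illustrate the reported ANPP asymmetry, not data from the 83 experiments.

```python
# Sensitivity as defined in the synthesis: productivity response divided
# by the magnitude of the precipitation change.
def sensitivity(delta_npp, delta_precip):
    """NPP change (g m^-2) per mm of precipitation change."""
    return delta_npp / delta_precip

# Hypothetical drier-site grassland: ANPP responds more strongly per mm
# to water added than to water removed.
anpp_addition = sensitivity(delta_npp=90.0, delta_precip=150.0)     # 150 mm wetter
anpp_reduction = sensitivity(delta_npp=-45.0, delta_precip=-150.0)  # 150 mm drier
asymmetric = anpp_addition > anpp_reduction  # additions move ANPP more per mm
```

Standardizing by the imposed change is what makes experiments with different treatment magnitudes comparable in a meta-analysis, and it is the quantity in which the addition/reduction asymmetry appears.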

  8. Asymmetric responses of primary productivity to precipitation extremes: A synthesis of grassland precipitation manipulation experiments.

    PubMed

    Wilcox, Kevin R; Shi, Zheng; Gherardi, Laureano A; Lemoine, Nathan P; Koerner, Sally E; Hoover, David L; Bork, Edward; Byrne, Kerry M; Cahill, James; Collins, Scott L; Evans, Sarah; Gilgen, Anna K; Holub, Petr; Jiang, Lifen; Knapp, Alan K; LeCain, Daniel; Liang, Junyi; Garcia-Palacios, Pablo; Peñuelas, Josep; Pockman, William T; Smith, Melinda D; Sun, Shanghua; White, Shannon R; Yahdjian, Laura; Zhu, Kai; Luo, Yiqi

    2017-10-01

    Climatic changes are altering Earth's hydrological cycle, resulting in altered precipitation amounts, increased interannual variability of precipitation, and more frequent extreme precipitation events. These trends will likely continue into the future, having substantial impacts on net primary productivity (NPP) and associated ecosystem services such as food production and carbon sequestration. Frequently, experimental manipulations of precipitation have linked altered precipitation regimes to changes in NPP. Yet, findings have been diverse and substantial uncertainty still surrounds generalities describing patterns of ecosystem sensitivity to altered precipitation. Additionally, we do not know whether previously observed correlations between NPP and precipitation remain accurate when precipitation changes become extreme. We synthesized results from 83 case studies of experimental precipitation manipulations in grasslands worldwide. We used meta-analytical techniques to search for generalities and asymmetries of aboveground NPP (ANPP) and belowground NPP (BNPP) responses to both the direction and magnitude of precipitation change. Sensitivity (i.e., productivity response standardized by the amount of precipitation change) of BNPP was similar under precipitation additions and reductions, but ANPP was more sensitive to precipitation additions than reductions; this was especially evident in drier ecosystems. Additionally, overall relationships between the magnitude of productivity responses and the magnitude of precipitation change were saturating in form. The saturating form of this relationship was likely driven by ANPP responses to very extreme precipitation increases, although there were limited studies imposing extreme precipitation change, and there was considerable variation among experiments. 
This highlights the importance of incorporating gradients of manipulations, ranging from extreme drought to extreme precipitation increases, into future climate change experiments. Additionally, policy and land management decisions related to global change scenarios should consider how ANPP and BNPP responses may differ, and that ecosystem responses to extreme events might not be predicted from relationships found under moderate environmental changes. © 2017 John Wiley & Sons Ltd.

  9. Large uncertainties in observed daily precipitation extremes over land

    NASA Astrophysics Data System (ADS)

    Herold, Nicholas; Behrangi, Ali; Alexander, Lisa V.

    2017-01-01

    We explore uncertainties in observed daily precipitation extremes over the terrestrial tropics and subtropics (50°S-50°N) based on five commonly used products: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) dataset, the Global Precipitation Climatology Centre-Full Data Daily (GPCC-FDD) dataset, the Tropical Rainfall Measuring Mission (TRMM) multi-satellite research product (T3B42 v7), the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR), and the Global Precipitation Climatology Project's One-Degree Daily (GPCP-1DD) dataset. We use the precipitation indices R10mm and Rx1day, developed by the Expert Team on Climate Change Detection and Indices, to explore the behavior of "moderate" and "extreme" extremes, respectively. In order to assess the sensitivity of extreme precipitation to different grid sizes we perform our calculations on four common spatial resolutions (0.25° × 0.25°, 1° × 1°, 2.5° × 2.5°, and 3.75° × 2.5°). The impact of the chosen "order of operation" in calculating these indices is also determined. Our results show that moderate extremes are relatively insensitive to product and resolution choice, while extreme extremes can be very sensitive. For example, at 0.25° × 0.25° quasi-global mean Rx1day values vary from 37 mm in PERSIANN-CDR to 62 mm in T3B42. We find that the interproduct spread becomes prominent at resolutions of 1° × 1° and finer, thus establishing a minimum effective resolution at which observational products agree. Without improvements in interproduct spread, these exceedingly large observational uncertainties at high spatial resolution may limit the usefulness of model evaluations. As has been found previously, resolution sensitivity can be largely eliminated by applying an order of operation where indices are calculated prior to regridding. 
However, this approach is not appropriate when true area averages are desired (e.g., for model evaluations).
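The "order of operation" effect described above can be seen on a toy grid: computing Rx1day (wettest day) per fine cell and then regridding preserves local extremes, while regridding the daily fields first smooths them away. The numbers are invented purely for illustration.

```python
import numpy as np

# Toy daily precipitation (mm) for four fine-grid cells over five days.
daily = np.array([
    [0.0, 30.0, 0.0, 2.0, 1.0],   # one intense local event
    [0.0,  0.0, 0.0, 2.0, 1.0],
    [5.0,  0.0, 0.0, 2.0, 1.0],
    [0.0,  0.0, 8.0, 2.0, 1.0],
])

# Order 1: compute the index (Rx1day = wettest day) per fine cell,
# then regrid (here: average the four cells into one coarse cell).
index_then_regrid = daily.max(axis=1).mean()

# Order 2: regrid first (daily mean over cells), then compute the index.
regrid_then_index = daily.mean(axis=0).max()
```

Here order 1 yields 11.25 mm while order 2 yields 7.5 mm: averaging first dilutes the localized 30 mm event, which is why calculating indices before regridding removes most of the resolution sensitivity, at the cost of no longer representing a true area average.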

  10. Soft x-ray microscopy and extreme ultraviolet lithography: Imaging in the 20-50 nm regime (abstract) (invited)

    NASA Astrophysics Data System (ADS)

    Attwood, David

    2002-03-01

    Advances in short wavelength optics, covering the range from 1 to 14 nm, are providing new results and new opportunities. Zone plate lenses [E. Anderson et al., J. Vac. Sci. Technol. B 18, 2970 (2000)] for soft x-ray microscopy [G. Denbeaux, Rev. Sci. Instrum. (these proceedings); W. Chao, Proc. SPIE 4146, 171 (2000)] are now made to high accuracy with outer zone widths of 25 nm, and demonstrated resolution of 23 nm with proper illumination and stability. These permit important advances in the study of protein-specific transport and structure in the life sciences [C. Larabell (private communication); W. Meyer-Ilse et al., J. Microsc. 201, 395 (2001)] and the study of magnetic materials [P. Fischer et al., J. Synchrotron Radiat. 8, 325 (2001)] with elemental sensitivity at the resolution of individual domains. Major corporations (members of the EUV Limited Liability Company are Intel, Motorola, AMD, Micron, Infineon, and IBM) are now preparing the path for the fabrication of future computer chips, in the years 2007 and beyond, using multilayer-coated reflective optics, which achieve reflectivities of 70% in the 11-14 nm region [T. Barbee et al., Appl. Opt. 24, 883 (1985); C. Montcalm et al., Proc. SPIE 3676, 710 (1999)]. These coated optics are to be incorporated in extreme ultraviolet (EUV) print cameras, known as "steppers." Electronic patterns with features in the range of 50-70 nm have been printed. The first alpha tool stepper recently demonstrated all critical technologies [D. Tichenor et al., Proc. SPIE 4343, 19 (2001)] needed for EUV lithography. Preproduction beta tools are targeted for delivery by leading suppliers [ASML, the Netherlands, at the SPIE Microlithography Conference, Santa Clara, CA, March 2001] in 2004, with high-volume production tools available in late 2006 for manufacturing in 2007. New results in these two areas will be discussed in the context of the synergy of science and technology.

  11. Neuropad for the detection of cardiovascular autonomic neuropathy in patients with type 2 diabetes.

    PubMed

    Mendivil, Carlos O; Kattah, William; Orduz, Arturo; Tique, Claudia; Cárdenas, José L; Patiño, Jorge E

    2016-01-01

    Cardiovascular autonomic neuropathy (CAN) is a prevalent and neglected chronic complication of diabetes, with a large impact on morbidity and mortality. Part of the reason why it is not detected and treated opportunely is the complexity of the tests required for its diagnosis. We evaluated the Neuropad®, a test based on sudomotor function, as a screening tool for CAN in adult patients with type 2 diabetes in Bogotá, Colombia. This was a cross-sectional evaluation of Neuropad® for the detection of CAN. Patients were 20-75 years of age and did not suffer from any other type of neuropathy. CAN was diagnosed using the Ewing battery of tests for R-R variability during deep breathing, Valsalva and lying-to-standing maneuvers. Additionally, distal symmetric polyneuropathy (DSP) was diagnosed using a sign-based scale (Michigan Neuropathy Disability Score - NDS) and a symptom-based score (Total Symptom Score - TSS). The primary outcome was the sensitivity and specificity of the Neuropad® for the diagnosis of CAN, and secondary outcomes were the sensitivity and specificity of Neuropad® for DSP. We studied 154 patients (74 men and 80 women). Prevalence of CAN was extremely high (68.0% of study participants), but DSP was also prevalent, particularly according to the sign-based definition (45%). The sensitivity of the Neuropad® for any degree of CAN was 70.1%, being slightly higher for the deep breathing and Valsalva tests than for lying-to-standing. The specificity of the Neuropad® for any type of CAN was only 37.0%, as expected for a screening exam. The negative predictive value was higher for the deep breathing and Valsalva tests (69.4% and 81.6%, respectively). Neuropad® also showed good sensitivity and negative predictive value for DSP. The sensitivity and specificity of Neuropad® were better among men and among patients with diabetes duration above the group median.
The Neuropad® is a simple and inexpensive device that demonstrated adequate performance as a screening tool for cardiovascular autonomic neuropathy in Latin American patients with type 2 diabetes. Copyright © 2016 Elsevier Inc. All rights reserved.

  12. Climatic Extremes and Food Grain Production in India

    NASA Astrophysics Data System (ADS)

    A, A.; Mishra, V.

    2015-12-01

    Climate change is likely to affect food and water security in India. India has witnessed tremendous growth in its food production since the green revolution. However, during recent decades food grain yields have been significantly affected by extreme climate and weather events. Air temperature and associated extreme events (number of hot days and hot nights, heat waves) increased significantly during the last 50 years over the majority of India. More remarkably, a substantial increase in mean and extreme temperatures was observed during the winter season. India has also witnessed extreme flood and drought events that have become more frequent during the past few decades. Extreme rainfall during the non-monsoon season adversely affected food grain yields and resulted in tremendous losses in several parts of the country. Here we evaluate the changes in hydroclimatic extremes and their linkage with food grain production in India. We use observed food grain yield data at the district level for the period 1980-2012. We examine the linkages between food grain yield and crop phenology obtained from high-resolution leaf area index and NDVI datasets from satellites. We use long-term observed data of daily precipitation and maximum and minimum temperatures to evaluate changes in extreme events. We use statistical models to develop relationships between crop yields and mean and extreme temperatures for various crops, to understand the sensitivity of these crops to changing climatic conditions. We find that some of the major crop types and predominant crop-growing areas show significant sensitivity to changes in extreme climatic conditions in India.
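A minimal version of the statistical yield-climate relationship described above is an ordinary least-squares regression of yield anomalies on an extreme-temperature indicator. The data below are synthetic stand-ins; the study's actual inputs are district-level records for 1980-2012.

```python
import numpy as np

# Synthetic district-level data: yield anomaly vs. count of extreme-heat
# days. The true slope (-0.02) and noise level are invented for illustration.
rng = np.random.default_rng(2)
hot_days = rng.uniform(0, 30, size=100)
yield_anom = -0.02 * hot_days + rng.normal(0.0, 0.1, size=100)

# Ordinary least squares: the slope estimates yield sensitivity per
# additional hot day, the kind of relationship such statistical models fit.
A = np.column_stack([hot_days, np.ones_like(hot_days)])
(slope, intercept), *_ = np.linalg.lstsq(A, yield_anom, rcond=None)
```

In practice such models would include multiple predictors (mean temperature, rainfall extremes) per crop and district, but the estimated slope on the extreme-event term is what quantifies "sensitivity to changing climatic conditions."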

  13. Movement Assessment of Children (MAC): validity, reliability, stability and sensitivity to change in typically developing children.

    PubMed

    Chandler, L S; Terhorst, L; Rogers, J C; Holm, M B

    2016-07-01

    The purpose of this study was to establish the validity, reliability, stability and sensitivity to change of the family-centred Movement Assessment of Children (MAC) in typically developing infants/toddlers from 2 months (1 month 16 days) to 2 years (24 months 15 days) of age. Assessment of infant/toddler motor development is critical so that infants and toddlers who are at risk for developmental delay or whose functional motor development is delayed can be monitored and receive therapy to improve their developmental outcomes. Infants/toddlers are thought to be more responsive during the MAC assessment because parents and siblings participate and elicit responses. Two hundred seventy-six children and 405 assessments contributed to the establishment of age-related parameters for typically developing infants and toddlers on the MAC. The MAC assesses three core domains of functional movement (head control, upper extremities and hands, pelvis and lower extremities), and generates a core total score. Four explanatory domains serve to alert examiners to factors that may impact atypical development (general observations, special senses, primitive reflexes/reactions, muscle tone). Construct validity of functional motor development was examined using the relationship between incremental increases in scores and increases in participants' ages. Subsamples were used to establish inter-rater reliability, test-retest reliability, stability and sensitivity to change. Construct validity was established, and inter-rater reliability ICCs for the core items and core total ranged from 0.83 to 0.99. Percent agreement for the explanatory items ranged from 0.72 to 0.96. Stability within age groupings was consistent from baseline to 6 months post-baseline, and sensitivity to change from baseline to 6 months was significant for all core items and the total score. The MAC has proven to be a well-constructed assessment of infant and toddler functional motor development.
It is a family-centred and efficient tool that can be used to assess and follow up infants and toddlers from 2 months to 2 years of age. © 2016 John Wiley & Sons Ltd.

  14. Monitoring of DNA breakage in embryonic stages of the African catfish Clarias gariepinus (Burchell, 1822) after exposure to lead nitrate using alkaline comet assay.

    PubMed

    Osman, Alaa G M; Mekkawy, Imam A; Verreth, Johan; Wuertz, Sven; Kloas, Werner; Kirschbaum, Frank

    2008-12-01

    Increasing lead contamination in Egyptian ecosystems and high lead concentrations in food items have raised concern for human health and stimulated studies monitoring the ecotoxicological impact of lead-caused genotoxicity. In this work, the alkaline comet assay was modified for monitoring DNA strand breakage in sensitive early life stages of the African catfish Clarias gariepinus. Following exposure to 100, 300, and 500 microg/L lead nitrate, DNA strand breakage was quantified in embryos at 30, 48, 96, 144, and 168 h post-fertilization (PFS). For quantitative analysis, four commonly used parameters (tail % DNA, %TDNA; head % DNA, %HDNA; tail length, TL; tail moment, TM) were analyzed in 96 nuclei (in triplicate) at each sampling point. The parameter %TDNA revealed the highest resolution and lowest variation. A strong correlation between lead concentration, time of exposure, and DNA strand breakage was observed. Here, genotoxicity detected by the comet assay preceded the manifest malformations assessed with conventional histology. Qualitative evaluation was carried out using five categories: undamaged (%TDNA ≤ 10%), low damage (10% < %TDNA ≤ 25%), median damage (25% < %TDNA ≤ 50%), high damage (50% < %TDNA ≤ 75%), and extreme damage (%TDNA > 75%), confirming a dose- and time-dependent shift towards increased frequencies of highly and extremely damaged nuclei. The protective capacity provided by a hardened chorion is an interesting finding of this study, as DNA damage in the prehatching stages 30 h-PFS and 48 h-PFS was low in all treatments (qualitative and quantitative analyses). These results clearly show that the comet assay is a sensitive tool for the detection of genotoxicity in vulnerable early life stages of the African catfish, and that it is more sensitive than histological parameters for monitoring genotoxic effects. 2008 Wiley Periodicals, Inc.
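The five qualitative damage bins above are simple threshold rules on %TDNA and can be sketched directly; the sample values below are invented, not measurements from the study.

```python
def damage_category(tdna_percent):
    """Classify a nucleus by tail %DNA using the study's five bins:
    undamaged (<=10), low (<=25), median (<=50), high (<=75), extreme (>75)."""
    if tdna_percent <= 10:
        return "undamaged"
    elif tdna_percent <= 25:
        return "low"
    elif tdna_percent <= 50:
        return "median"
    elif tdna_percent <= 75:
        return "high"
    return "extreme"

# A dose- or time-dependent shift would appear as more nuclei in the top bins.
sample = [4.0, 12.5, 30.0, 60.0, 82.0]   # hypothetical %TDNA readings
categories = [damage_category(p) for p in sample]
```

Counting category frequencies per dose and time point is what reveals the shift toward highly and extremely damaged nuclei reported in the abstract.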

  15. The New York City Operations Support Tool: Supporting Water Supply Operations for Millions in an Era of Changing Patterns in Hydrological Extreme Events

    NASA Astrophysics Data System (ADS)

    Matonse, A. H.; Porter, J. H.; Frei, A.

    2015-12-01

    Providing an average 1.1 billion gallons (~4.2 × 10⁶ cubic meters) of drinking water per day to approximately nine million people in New York City (NYC) and four upstate counties, the NYC water supply is among the world's largest unfiltered systems. In addition to providing a reliable water supply in terms of water quantity and quality, the city has to fulfill other flow objectives to serve downstream communities. At times, such as during extreme hydrological events, water quality issues may restrict water usage for parts of the system. To support a risk-based water supply decision-making process, NYC has developed the Operations Support Tool (OST). OST combines a water supply systems model with reservoir water quality models, near-real-time data ingestion, database management, and ensemble hydrological forecasts. A number of reports have addressed the frequency and intensity of extreme hydrological events across the continental US. In the northeastern US, studies have indicated an increase in the frequency of extremely large precipitation and streamflow events during the most recent decades. During this presentation we describe OST and, using case studies, demonstrate how this tool has been useful in supporting operational decisions. We also want to motivate a discussion about how ongoing changes in patterns of hydrological extreme events elevate the challenge faced by water supply managers, and about the role of the scientific community in integrating nonstationarity approaches into hydrologic forecasting and modeling.

  16. Variable effects of climate on forest growth in relation to climate extremes, disturbance, and forest dynamics.

    PubMed

    Itter, Malcolm S; Finley, Andrew O; D'Amato, Anthony W; Foster, Jane R; Bradford, John B

    2017-06-01

    Changes in the frequency, duration, and severity of climate extremes are forecast to occur under global climate change. The impacts of climate extremes on forest productivity and health remain difficult to predict due to potential interactions with disturbance events and forest dynamics (changes in forest stand composition, density, size and age structure over time). Such interactions may lead to non-linear forest growth responses to climate involving thresholds and lag effects. Understanding how forest dynamics influence growth responses to climate is particularly important given stand structure and composition can be modified through management to increase forest resistance and resilience to climate change. To inform such adaptive management, we develop a hierarchical Bayesian state space model in which climate effects on tree growth are allowed to vary over time and in relation to past climate extremes, disturbance events, and forest dynamics. The model is an important step toward integrating disturbance and forest dynamics into predictions of forest growth responses to climate extremes. We apply the model to a dendrochronology data set from forest stands of varying composition, structure, and development stage in northeastern Minnesota that have experienced extreme climate years and forest tent caterpillar defoliation events. Mean forest growth was most sensitive to water balance variables representing climatic water deficit. Forest growth responses to water deficit were partitioned into responses driven by climatic threshold exceedances and interactions with insect defoliation. Forest growth was both resistant and resilient to climate extremes with the majority of forest growth responses occurring after multiple climatic threshold exceedances across seasons and years. Interactions between climate and disturbance were observed in a subset of years with insect defoliation increasing forest growth sensitivity to water availability. Forest growth was particularly sensitive to climate extremes during periods of high stem density following major regeneration events when average inter-tree competition was high. Results suggest the resistance and resilience of forest growth to climate extremes can be increased through management steps such as thinning to reduce competition during early stages of stand development and small-group selection harvests to maintain forest structures characteristic of older, mature stands. © 2017 by the Ecological Society of America.

  17. Variable effects of climate on forest growth in relation to climate extremes, disturbance, and forest dynamics

    USGS Publications Warehouse

    Itter, Malcolm S.; Finley, Andrew O.; D'Amato, Anthony W.; Foster, Jane R.; Bradford, John B.

    2017-01-01

    Changes in the frequency, duration, and severity of climate extremes are forecast to occur under global climate change. The impacts of climate extremes on forest productivity and health remain difficult to predict due to potential interactions with disturbance events and forest dynamics—changes in forest stand composition, density, size and age structure over time. Such interactions may lead to non-linear forest growth responses to climate involving thresholds and lag effects. Understanding how forest dynamics influence growth responses to climate is particularly important given stand structure and composition can be modified through management to increase forest resistance and resilience to climate change. To inform such adaptive management, we develop a hierarchical Bayesian state space model in which climate effects on tree growth are allowed to vary over time and in relation to past climate extremes, disturbance events, and forest dynamics. The model is an important step toward integrating disturbance and forest dynamics into predictions of forest growth responses to climate extremes. We apply the model to a dendrochronology data set from forest stands of varying composition, structure, and development stage in northeastern Minnesota that have experienced extreme climate years and forest tent caterpillar defoliation events. Mean forest growth was most sensitive to water balance variables representing climatic water deficit. Forest growth responses to water deficit were partitioned into responses driven by climatic threshold exceedances and interactions with insect defoliation. Forest growth was both resistant and resilient to climate extremes with the majority of forest growth responses occurring after multiple climatic threshold exceedances across seasons and years. Interactions between climate and disturbance were observed in a subset of years with insect defoliation increasing forest growth sensitivity to water availability. Forest growth was particularly sensitive to climate extremes during periods of high stem density following major regeneration events when average inter-tree competition was high. Results suggest the resistance and resilience of forest growth to climate extremes can be increased through management steps such as thinning to reduce competition during early stages of stand development and small-group selection harvests to maintain forest structures characteristic of older, mature stands.
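
    As a toy illustration of the time-varying climate-effect idea (a sketch only, not the authors' hierarchical Bayesian model; the covariate, noise levels, and coefficient path are all invented), a scalar growth index can be regressed on a standardized water-deficit series with a coefficient that follows a random walk, tracked here by a scalar Kalman filter:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 300
deficit = rng.normal(0.0, 1.0, T)                     # standardized climatic water deficit
beta_true = -0.5 + np.cumsum(rng.normal(0, 0.02, T))  # slowly drifting climate effect
growth = beta_true * deficit + rng.normal(0, 0.1, T)  # detrended growth index

# Scalar Kalman filter: the state is the time-varying coefficient beta_t,
# assumed to follow a random walk with variance Q; observation noise variance R.
Q, R = 0.02**2, 0.1**2
beta, P = 0.0, 1.0
beta_hat = np.empty(T)
for t in range(T):
    P += Q                                  # predict: random-walk drift of the coefficient
    H = deficit[t]                          # this year's covariate value
    K = P * H / (H * P * H + R)             # Kalman gain
    beta += K * (growth[t] - H * beta)      # update with observed growth
    P *= 1.0 - K * H
    beta_hat[t] = beta
```

    After a short burn-in the filtered path follows the drifting coefficient, which is the same qualitative behavior the hierarchical model captures with full posterior uncertainty.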

  18. Assessing functional mobility in survivors of lower-extremity sarcoma: reliability and validity of a new assessment tool.

    PubMed

    Marchese, Victoria G; Rai, Shesh N; Carlson, Claire A; Hinds, Pamela S; Spearing, Elena M; Zhang, Lijun; Callaway, Lulie; Neel, Michael D; Rao, Bhaskar N; Ginsberg, Jill P

    2007-08-01

    Reliability and validity of a new tool, Functional Mobility Assessment (FMA), were examined in patients with lower-extremity sarcoma. FMA requires the patients to physically perform the functional mobility measures, unlike patient self-report or clinician administered measures. A sample of 114 subjects participated, 20 healthy volunteers and 94 patients with lower-extremity sarcoma after amputation, limb-sparing, or rotationplasty surgery. Reliability of the FMA was examined by three raters testing 20 healthy volunteers and 23 subjects with lower-extremity sarcoma. Concurrent validity was examined using data from 94 subjects with lower-extremity sarcoma who completed the FMA, Musculoskeletal Tumor Society (MSTS), Short-Form 36 (SF-36v2), and Toronto Extremity Salvage Scale (TESS) scores. Construct validity was measured by the ability of the FMA to discriminate between subjects with and without functional mobility deficits. FMA demonstrated excellent reliability (ICC [2,1] >or=0.97). Moderate correlations were found between FMA and SF-36v2 (r = 0.60, P < 0.01), FMA and MSTS (r = 0.68, P < 0.01), and FMA and TESS (r = 0.62, P < 0.01). The patients with lower-extremity sarcoma scored lower on the FMA as compared to healthy controls (P < 0.01). The FMA is a reliable and valid functional outcome measure for patients with lower-extremity sarcoma. This study supports the ability of the FMA to discriminate between patients with varying functional abilities and supports the need to include measures of objective functional mobility in examination of patients with lower-extremity sarcoma.

  19. The Effects of Tools of the Mind on Math and Reading Scores in Kindergarten

    ERIC Educational Resources Information Center

    Mackay, Patricia E.

    2013-01-01

    Although a limited body of research has supported the positive impact of the Tools of the Mind curriculum on the development of self-regulation, research supporting a direct relationship between Tools and academic achievement is extremely limited. The purpose of this study is to evaluate the effectiveness of the Tools of the Mind curriculum…

  20. Cell Culture on MEMS Platforms: A Review

    PubMed Central

    Ni, Ming; Tong, Wen Hao; Choudhury, Deepak; Rahim, Nur Aida Abdul; Iliescu, Ciprian; Yu, Hanry

    2009-01-01

    Microfabricated systems provide an excellent platform for the culture of cells, and are an extremely useful tool for the investigation of cellular responses to various stimuli. Advantages offered over traditional methods include cost-effectiveness, controllability, low volume, high resolution, and sensitivity. Both biocompatible and bio-incompatible materials have been developed for use in these applications. Biocompatible materials such as PMMA or PLGA can be used directly for cell culture. However, for bio-incompatible materials such as silicon or PDMS, additional steps need to be taken to render these materials more suitable for cell adhesion and maintenance. This review describes multiple surface modification strategies to improve the biocompatibility of MEMS materials. Basic concepts of cell-biomaterial interactions, such as protein adsorption and cell adhesion are covered. Finally, the applications of these MEMS materials in Tissue Engineering are presented. PMID:20054478

  1. Assays for the determination of the activity of DNA nucleases based on the fluorometric properties of the YOYO dye.

    PubMed

    Fernández-Sierra, Mónica; Quiñones, Edwin

    2015-03-15

    Here we characterize the fluorescence of the YOYO dye as a tool for studying DNA-protein interactions in real time and present two continuous YOYO-based assays for sensitively monitoring the kinetics of DNA digestion by λ-exonuclease and the endonuclease EcoRV. The described assays rely on the different fluorescence intensities between single- and double-stranded DNA-YOYO complexes, allowing straightforward determination of nuclease activity and quantitative determination of reaction products. The assays were also employed to assess the effect of single-stranded DNA-binding proteins on the λ-exonuclease reaction kinetics, showing that the extreme thermostable single-stranded DNA-binding protein (ET-SSB) significantly reduced the reaction rate, while the recombination protein A (RecA) displayed no effect. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Spectrally resolved single-shot wavefront sensing of broadband high-harmonic sources

    NASA Astrophysics Data System (ADS)

    Freisem, L.; Jansen, G. S. M.; Rudolf, D.; Eikema, K. S. E.; Witte, S.

    2018-03-01

    Wavefront sensors are an important tool to characterize coherent beams of extreme ultraviolet radiation. However, conventional Hartmann-type sensors do not allow for independent wavefront characterization of different spectral components that may be present in a beam, which limits their applicability for intrinsically broadband high-harmonic generation (HHG) sources. Here we introduce a wavefront sensor that measures the wavefronts of all the harmonics in an HHG beam in a single camera exposure. By replacing the mask apertures with transmission gratings at different orientations, we simultaneously detect harmonic wavefronts and spectra, and obtain sensitivity to spatiotemporal structure such as pulse front tilt as well. We demonstrate the capabilities of the sensor through a parallel measurement of the wavefronts of 9 harmonics in a wavelength range between 25 and 49 nm, with up to λ/32 precision.

  3. Sensitivity of the orbiting JEM-EUSO mission to large-scale anisotropies

    NASA Astrophysics Data System (ADS)

    Weiler, Thomas; Anchordoqui, Luis; Denton, Peter

    2013-04-01

    Uniform sky coverage and very large apertures are advantages of future extreme-energy, space-based cosmic-ray observatories. In this talk we will quantify the advantage of an all-sky/4π observatory such as JEM-EUSO over the one to two steradian coverage of a ground-based observatory such as Auger. We exploit the availability of spherical harmonics in the case of 4π coverage. The resulting Y_lm coefficients will likely become a standard analysis tool for near-future, space-based, cosmic-ray astronomy. We demonstrate the use of Y_lm's with extractions of simulated dipole and quadrupole anisotropies. (A dipole anisotropy is expected if a single source-region such as Cen A dominates the sky, while a quadrupole moment is expected if a 2D source region such as the Supergalactic Plane dominates the sky.)
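
    For uniform full-sky exposure, the dipole extraction described here reduces to a first-moment estimate: if the flux is proportional to 1 + α cos θ about the dipole axis, then E[cos θ] = α/3. A toy Monte Carlo (the amplitude and event count are invented) illustrates the estimator:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha_true = 0.5     # injected dipole amplitude (hypothetical)
N = 100_000          # simulated events; uniform full-sky exposure assumed

# Rejection-sample u = cos(theta) from the dipole density f(u) = (1 + alpha*u)/2 on [-1, 1]
u = rng.uniform(-1.0, 1.0, 4 * N)
accept = rng.uniform(0.0, 1.0, 4 * N) < (1.0 + alpha_true * u) / (1.0 + alpha_true)
u = u[accept][:N]

# With uniform exposure the l = 1 harmonic coefficient is proportional to the
# first moment of cos(theta), so the amplitude estimator is simply:
alpha_hat = 3.0 * u.mean()
```

    Partial-sky (ground-based) coverage breaks this simple relation, which is exactly the advantage of full-sky coverage quantified in the talk.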

  4. Logit-normal mixed model for Indian monsoon precipitation

    NASA Astrophysics Data System (ADS)

    Dietz, L. R.; Chatterjee, S.

    2014-09-01

    Describing the nature and variability of Indian monsoon precipitation is a topic of much debate in the current literature. We suggest the use of a generalized linear mixed model (GLMM), specifically, the logit-normal mixed model, to describe the underlying structure of this complex climatic event. Four GLMM algorithms are described and simulations are performed to vet these algorithms before applying them to the Indian precipitation data. The logit-normal model was applied to light, moderate, and extreme rainfall. Findings indicated that physical constructs were preserved by the models, and random effects were significant in many cases. We also found GLMM estimation methods were sensitive to tuning parameters and assumptions and therefore, recommend use of multiple methods in applications. This work provides a novel use of GLMM and promotes its addition to the gamut of tools for analysis in studying climate phenomena.
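
    The structure of the logit-normal mixed model can be sketched with a small simulation (not the authors' code; the intercept, random-effect spread, and sample sizes are invented): daily rain occurrence is Bernoulli with a station-level Gaussian random effect on the logit scale, and the marginal wet-day rate is the random-effect distribution pushed through the inverse logit, not the inverse logit of the fixed intercept alone.

```python
import numpy as np

def expit(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(2)
n_stations, n_days = 500, 200
beta0 = -1.0      # fixed intercept on the logit scale (hypothetical)
sigma_u = 0.8     # between-station random-effect SD (hypothetical)

u = rng.normal(0.0, sigma_u, n_stations)              # station random effects
p = expit(beta0 + u)                                  # per-station wet-day probability
rain = rng.random((n_stations, n_days)) < p[:, None]  # simulated daily rain indicators

# Marginal (population-level) wet-day rate: E_u[expit(beta0 + u)],
# approximated by Monte Carlo over the random-effect distribution.
marginal = expit(beta0 + rng.normal(0.0, sigma_u, 200_000)).mean()
observed = rain.mean()
```

    Because the inverse logit is nonlinear, `marginal` differs from `expit(beta0)`; ignoring the random effect biases the implied rain probability, one reason the abstract stresses the significance of the random effects.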

  5. Using seismic and tilt measurements simultaneously to forecast eruptions of silicic volcanoes

    NASA Astrophysics Data System (ADS)

    Neuberg, Jurgen; Collinson, Amy; Mothes, Patricia

    2016-04-01

    Independent interpretations of seismic swarms and tilt measurements on active silicic volcanoes have been successfully used to assess their eruption potential. Swarms of low-frequency seismic events have been associated with brittle failure or stick-slip motion of magma during ascent and have been used to estimate qualitatively the magma ascent rate, which typically accelerates before lava dome collapses. Tilt signals are extremely sensitive indicators of volcano deformation and have often been modelled and interpreted as inflation or deflation of a shallow magma reservoir. Here we show that tilt in many cases does not represent inflation or deflation but is directly linked to magma ascent rate. This talk aims to combine these two independent observations, seismicity and deformation, to design and implement a forecasting tool that can be deployed in volcano observatories on an operational level.

  6. A mechanism of extreme growth and reliable signaling in sexually selected ornaments and weapons.

    PubMed

    Emlen, Douglas J; Warren, Ian A; Johns, Annika; Dworkin, Ian; Lavine, Laura Corley

    2012-08-17

    Many male animals wield ornaments or weapons of exaggerated proportions. We propose that increased cellular sensitivity to signaling through the insulin/insulin-like growth factor (IGF) pathway may be responsible for the extreme growth of these structures. We document how rhinoceros beetle horns, a sexually selected weapon, are more sensitive to nutrition and more responsive to perturbation of the insulin/IGF pathway than other body structures. We then illustrate how enhanced sensitivity to insulin/IGF signaling in a growing ornament or weapon would cause heightened condition sensitivity and increased variability in expression among individuals--critical properties of reliable signals of male quality. The possibility that reliable signaling arises as a by-product of the growth mechanism may explain why trait exaggeration has evolved so many different times in the context of sexual selection.

  7. Integration of modern statistical tools for the analysis of climate extremes into the web-GIS “CLIMATE”

    NASA Astrophysics Data System (ADS)

    Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.

    2017-11-01

    The frequency of occurrence and magnitude of precipitation and temperature extreme events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrences, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” to include a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on a combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing the spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression, and a copula approach for the detailed analysis of various climate extreme events. Specifically, the very promising copula approach makes it possible to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.

  8. Reliability, validity, and sensitivity to change of the lower extremity functional scale in individuals affected by stroke.

    PubMed

    Verheijde, Joseph L; White, Fred; Tompkins, James; Dahl, Peder; Hentz, Joseph G; Lebec, Michael T; Cornwall, Mark

    2013-12-01

    To investigate reliability, validity, and sensitivity to change of the Lower Extremity Functional Scale (LEFS) in individuals affected by stroke. The secondary objective was to test the validity and sensitivity of a single-item linear analog scale (LAS) of function. Prospective cohort reliability and validation study. A single rehabilitation department in an academic medical center. Forty-three individuals receiving neurorehabilitation for lower extremity dysfunction after stroke were studied. Their ages ranged from 32 to 95 years, with a mean of 70 years; 77% were men. Test-retest reliability was assessed by calculating the classical intraclass correlation coefficient and the Bland-Altman limits of agreement. Validity was assessed by calculating the Pearson correlation coefficient between the instruments. Sensitivity to change was assessed by comparing baseline scores with end of treatment scores. Measurements were taken at baseline, after 1-3 days, and at 4 and 8 weeks. The LEFS, Short-Form-36 Physical Function Scale, Berg Balance Scale, Six-Minute Walk Test, Five-Meter Walk Test, Timed Up-and-Go test, and the LAS of function were used. The test-retest reliability of the LEFS was found to be excellent (ICC = 0.96). Against the 6 other measures of function studied, the LEFS showed moderate to high validity (r = 0.40-0.71). Regarding sensitivity to change, mean LEFS scores increased by 1.2 SD from baseline to study end, and mean LAS scores by 1.1 SD. LEFS exhibits good reliability, validity, and sensitivity to change in patients with lower extremity impairments secondary to stroke. Therefore, the LEFS can be a clinically efficient outcome measure in the rehabilitation of patients with subacute stroke. The LAS is shown to be a time-saving and reasonable option to track changes in a patient's functional status. Copyright © 2013 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.
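
    For reference, the ICC(2,1) reported here (two-way random effects, absolute agreement, single measurement) is computed from the two-way ANOVA mean squares. A minimal sketch with invented test-retest scores, not the study's data:

```python
import numpy as np

def icc_2_1(x):
    """ICC(2,1): two-way random effects, absolute agreement, single measurement."""
    n, k = x.shape
    grand = x.mean()
    ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # between-subjects
    ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # between-occasions
    ss_e = ((x - x.mean(axis=1, keepdims=True)
               - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
    ms_e = ss_e / ((n - 1) * (k - 1))                            # residual
    return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

# Hypothetical scores (rows = patients, columns = test and retest occasions)
scores = np.array([[34.0, 36.0], [52.0, 50.0], [61.0, 63.0],
                   [45.0, 44.0], [70.0, 71.0], [28.0, 27.0]])
icc = icc_2_1(scores)
```

    With between-patient spread much larger than the test-retest differences, the ICC comes out near 1, matching the pattern behind the excellent reliability reported above.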

  9. Loss of ecosystem productivity with repeated drought: a multi-year experiment to assess the role of drought legacy effects

    NASA Astrophysics Data System (ADS)

    Smith, M. D.; Knapp, A.; Hoover, D. L.; Avolio, M. L.; Felton, A. J.; Slette, I.; Wilcox, K.

    2017-12-01

    Climate extremes, such as drought, are increasing in frequency and intensity, and the ecological consequences of these extreme events can be substantial and widespread. Yet, little is known about the factors that determine recovery of ecosystem function post-drought. Such knowledge is particularly important because post-drought recovery periods can be protracted depending on drought legacy effects (e.g., loss of key plant populations, altered community structure and/or biogeochemical processes). These drought legacies may alter ecosystem function for many years post-drought and may impact future sensitivity to climate extremes. With forecasts of more frequent drought, there is an imperative to understand whether and how post-drought legacies will affect ecosystem response to future drought events. To address this knowledge gap, we experimentally imposed over an eight year period two extreme growing season droughts, each two years in duration followed by a two-year recovery period, in a central US grassland. We found that aboveground net primary productivity (ANPP) declined dramatically with the first drought and was accompanied by a large shift in plant species composition (loss of C3 forbs and increase in C4 grasses). This drought legacy - shift in plant composition - persisted two years post-drought. Yet, despite this legacy, ANPP recovered fully. However, we expected that previously-droughted grassland would be less sensitive to a second extreme drought due to the shift in plant composition. Contrary to this expectation, previously droughted grassland experienced a greater loss in ANPP than grassland that had not experienced drought. Furthermore, previously droughted grassland did not fully recover after the second drought. Thus, the legacy of drought - a shift in plant community composition - increased ecosystem sensitivity to a future extreme drought event.

  10. Interactions of Mean Climate Change and Climate Variability on Food Security Extremes

    NASA Technical Reports Server (NTRS)

    Ruane, Alexander C.; McDermid, Sonali; Mavromatis, Theodoros; Hudson, Nicholas; Morales, Monica; Simmons, John; Prabodha, Agalawatte; Ahmad, Ashfaq; Ahmad, Shakeel; Ahuja, Laj R.

    2015-01-01

    Recognizing that climate change will affect agricultural systems both through mean changes and through shifts in climate variability and associated extreme events, we present preliminary analyses of climate impacts from a network of 1137 crop modeling sites contributed to the AgMIP Coordinated Climate-Crop Modeling Project (C3MP). At each site sensitivity tests were run according to a common protocol, which enables the fitting of crop model emulators across a range of carbon dioxide, temperature, and water (CTW) changes. C3MP can elucidate several aspects of these changes and quantify crop responses across a wide diversity of farming systems. Here we test the hypothesis that climate change and variability interact in three main ways. First, mean climate changes can affect yields across an entire time period. Second, extreme events (when they do occur) may be more sensitive to climate changes than a year with normal climate. Third, mean climate changes can alter the likelihood of climate extremes, leading to more frequent seasons with anomalies outside of the expected conditions for which management was designed. In this way, shifts in climate variability can result in an increase or reduction of mean yield, as extreme climate events tend to have lower yield than years with normal climate. C3MP maize simulations across 126 farms reveal a clear indication and quantification (as response functions) of mean climate impacts on mean yield and clearly show that mean climate changes will directly affect the variability of yield. Yield reductions from increased climate variability are not as clear as crop models tend to be less sensitive to dangers on the cool and wet extremes of climate variability, likely underestimating losses from water-logging, floods, and frosts.
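
    The crop-model emulators mentioned above fit a smooth response surface to a limited set of sensitivity-test runs and then interpolate untested scenarios. A one-dimensional sketch over temperature offsets (the yield numbers are invented; a real C3MP emulator spans carbon dioxide, temperature, and water jointly):

```python
import numpy as np

# Hypothetical crop-model yields at a handful of warming offsets (deg C)
dT = np.array([-2.0, -1.0, 0.0, 1.0, 2.0, 3.0, 4.0, 6.0, 8.0])
yield_pct = 100.0 + 2.1 * dT - 0.65 * dT**2   # stand-in for simulated yield (% of baseline)

# Emulator: least-squares quadratic in dT, usable at untested offsets
coeffs = np.polyfit(dT, yield_pct, deg=2)
emulate = np.poly1d(coeffs)
```

    The fitted curvature (the negative quadratic term) is what lets an emulator quantify how mean warming shifts both the average yield and the damage done by anomalously hot seasons.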

  11. Decadal-scale sensitivity of Northeast Greenland ice flow to errors in surface mass balance using ISSM

    NASA Astrophysics Data System (ADS)

    Schlegel, N.-J.; Larour, E.; Seroussi, H.; Morlighem, M.; Box, J. E.

    2013-06-01

    The behavior of the Greenland Ice Sheet, which is considered a major contributor to sea level changes, is best understood on century and longer time scales. However, on decadal time scales, its response is less predictable due to the difficulty of modeling surface climate, as well as incomplete understanding of the dynamic processes responsible for ice flow. Therefore, it is imperative to understand how modeling advancements, such as increased spatial resolution or more comprehensive ice flow equations, might improve projections of ice sheet response to climatic trends. Here we examine how a finely resolved climate forcing influences a high-resolution ice stream model that considers longitudinal stresses. We simulate ice flow using a two-dimensional Shelfy-Stream Approximation implemented within the Ice Sheet System Model (ISSM) and use uncertainty quantification tools embedded within the model to calculate the sensitivity of ice flow within the Northeast Greenland Ice Stream to errors in surface mass balance (SMB) forcing. Our results suggest that the model tends to smooth ice velocities even when forced with extreme errors in SMB. Indeed, errors propagate linearly through the model, resulting in discharge uncertainty of 16% or 1.9 Gt/yr. We find that mass flux is most sensitive to local errors but is also affected by errors hundreds of kilometers away; thus, an accurate SMB map of the entire basin is critical for realistic simulation. Furthermore, sensitivity analyses indicate that SMB forcing needs to be provided at a resolution of at least 40 km.

  12. Nonindependence and sensitivity analyses in ecological and evolutionary meta-analyses.

    PubMed

    Noble, Daniel W A; Lagisz, Malgorzata; O'dea, Rose E; Nakagawa, Shinichi

    2017-05-01

    Meta-analysis is an important tool for synthesizing research on a variety of topics in ecology and evolution, including molecular ecology, but can be susceptible to nonindependence. Nonindependence can affect two major interrelated components of a meta-analysis: (i) the calculation of effect size statistics and (ii) the estimation of overall meta-analytic estimates and their uncertainty. While some solutions to nonindependence exist at the statistical analysis stages, there is little advice on what to do when complex analyses are not possible, or when studies with nonindependent experimental designs exist in the data. Here we argue that exploring the effects of procedural decisions in a meta-analysis (e.g. inclusion of different quality data, choice of effect size) and statistical assumptions (e.g. assuming no phylogenetic covariance) using sensitivity analyses are extremely important in assessing the impact of nonindependence. Sensitivity analyses can provide greater confidence in results and highlight important limitations of empirical work (e.g. impact of study design on overall effects). Despite their importance, sensitivity analyses are seldom applied to problems of nonindependence. To encourage better practice for dealing with nonindependence in meta-analytic studies, we present accessible examples demonstrating the impact that ignoring nonindependence can have on meta-analytic estimates. We also provide pragmatic solutions for dealing with nonindependent study designs, and for analysing dependent effect sizes. Additionally, we offer reporting guidelines that will facilitate disclosure of the sources of nonindependence in meta-analyses, leading to greater transparency and more robust conclusions. © 2017 John Wiley & Sons Ltd.
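
    One simple sensitivity analysis in this spirit, leave-one-study-out on a fixed-effect inverse-variance pool, can be sketched as follows (the effect sizes and variances are invented; the last study plays the role of a possibly nonindependent outlier):

```python
import numpy as np

# Hypothetical per-study effect sizes and sampling variances
effects = np.array([0.30, 0.25, 0.40, 0.10, 0.90])   # last study is an outlier
variances = np.array([0.02, 0.03, 0.02, 0.05, 0.04])

def pooled(eff, var):
    """Fixed-effect inverse-variance pooled estimate."""
    w = 1.0 / var
    return (w * eff).sum() / w.sum()

overall = pooled(effects, variances)

# Leave-one-study-out sensitivity analysis: how much does any single
# (possibly nonindependent) study drive the pooled estimate?
loo = np.array([pooled(np.delete(effects, i), np.delete(variances, i))
                for i in range(len(effects))])
spread = loo.max() - loo.min()
```

    A large `spread`, driven here by dropping the outlier, is the kind of red flag the authors argue should be reported rather than discovered only by readers.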

  13. Influences of extreme weather, climate and pesticide use on invertebrates in cereal fields over 42 years.

    PubMed

    Ewald, Julie A; Wheatley, Christopher J; Aebischer, Nicholas J; Moreby, Stephen J; Duffield, Simon J; Crick, Humphrey Q P; Morecroft, Michael B

    2015-11-01

    Cereal fields are central to balancing food production and environmental health in the face of climate change. Within them, invertebrates provide key ecosystem services. Using 42 years of monitoring data collected in southern England, we investigated the sensitivity and resilience of invertebrates in cereal fields to extreme weather events and examined the effect of long-term changes in temperature, rainfall and pesticide use on invertebrate abundance. Of the 26 invertebrate groups examined, eleven proved sensitive to extreme weather events. Average abundance increased in hot/dry years and decreased in cold/wet years for Araneae, Cicadellidae, adult Heteroptera, Thysanoptera, Braconidae, Enicmus and Lathridiidae. The average abundance of Delphacidae, Cryptophagidae and Mycetophilidae increased in both hot/dry and cold/wet years relative to other years. The abundance of all 10 groups usually returned to their long-term trend within a year after the extreme event. For five of them, sensitivity to cold/wet events was lowest (translating into higher abundances) at locations with a westerly aspect. Some long-term trends in invertebrate abundance correlated with temperature and rainfall, indicating that climate change may affect them. However, pesticide use was more important in explaining the trends, suggesting that reduced pesticide use would mitigate the effects of climate change. © 2015 John Wiley & Sons Ltd.

  14. Extreme Weather Events and Interconnected Infrastructures: Toward More Comprehensive Climate Change Planning [Meeting challenges in understanding impacts of extreme weather events on connected infrastructures]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilbanks, Thomas J.; Fernandez, Steven J.; Allen, Melissa R.

    The President's Climate Change Action Plan calls for the development of better science, data, and tools for climate preparedness. Many of the current questions about preparedness for extreme weather events in coming decades are, however, difficult to answer with assets that have been developed by climate science to answer longer-term questions about climate change. Capacities for projecting exposures to climate-related extreme events, along with their implications for interconnected infrastructures, are now emerging.

  15. Extreme Weather Events and Interconnected Infrastructures: Toward More Comprehensive Climate Change Planning [Meeting challenges in understanding impacts of extreme weather events on connected infrastructures]

    DOE PAGES

    Wilbanks, Thomas J.; Fernandez, Steven J.; Allen, Melissa R.

    2015-06-23

    The President's Climate Change Action Plan calls for the development of better science, data, and tools for climate preparedness. Many of the current questions about preparedness for extreme weather events in coming decades are, however, difficult to answer with assets that have been developed by climate science to answer longer-term questions about climate change. Capacities for projecting exposures to climate-related extreme events, along with their implications for interconnected infrastructures, are now emerging.

  16. Easy-To-Use Connector-Assembly Tool

    NASA Technical Reports Server (NTRS)

    Redmon, John W., Jr.; Jankowski, Fred

    1988-01-01

    Tool compensates for user's loss of dexterity under awkward conditions. Has jaws that swivel over 180 degrees so angle adjusts with respect to handles. Oriented and held in position most comfortable and effective for user in given situation. Jaws lined with rubber pads so they conform to irregularly shaped parts and grip firmly but gently. Once tool engages part, it locks on it so user can release handles without losing part. Ratchet mechanism in tool allows user to work handles back and forth in confined space to connect or disconnect part. Quickly positioned, locked, and released. Gives user feel of its grip on part. Frees grasping muscles from work during part of task, giving user greater freedom to move hand. Operates with only one hand, leaving user's other hand free to manipulate wiring or other parts. Also adapts to handling and positioning extremely hot or extremely cold fluid lines, contaminated objects, abrasive or sharp objects, fragile items, and soft objects.

  17. Sensitivity enhancement of chemically amplified resists and performance study using EUV interference lithography

    NASA Astrophysics Data System (ADS)

    Buitrago, Elizabeth; Nagahara, Seiji; Yildirim, Oktay; Nakagawa, Hisashi; Tagawa, Seiichi; Meeuwissen, Marieke; Nagai, Tomoki; Naruoka, Takehiko; Verspaget, Coen; Hoefnagels, Rik; Rispens, Gijsbert; Shiraishi, Gosuke; Terashita, Yuichi; Minekawa, Yukie; Yoshihara, Kosuke; Oshima, Akihiro; Vockenhuber, Michaela; Ekinci, Yasin

    2016-03-01

    Extreme ultraviolet lithography (EUVL, λ = 13.5 nm) is the most promising candidate to manufacture electronic devices for future technology nodes in the semiconductor industry. Nonetheless, EUVL still faces many technological challenges as it moves toward high-volume manufacturing (HVM). A key bottleneck from the tool design and performance point of view has been the development of an efficient, high power EUV light source for high throughput production. Consequently, there has been extensive research on different methodologies to enhance EUV resist sensitivity. Resist performance is measured in terms of its ultimate printing resolution, line width roughness (LWR), sensitivity (S or best energy BE) and exposure latitude (EL). However, there are well-known fundamental trade-off relationships (LRS trade-off) among these parameters for chemically amplified resists (CARs). Here we present early proof-of-principle results for a multi-exposure lithography process that has the potential for high sensitivity enhancement without compromising other important performance characteristics by the use of a Photosensitized Chemically Amplified Resist (PSCAR). With this method, we seek to increase the sensitivity by combining a first EUV pattern exposure with a second UV flood exposure (λ = 365 nm) and the use of a PSCAR. In addition, we have evaluated over 50 different state-of-the-art EUV CARs. Among these, we have identified several promising candidates that simultaneously meet sensitivity, LWR and EL high performance requirements with the aim of resolving line space (L/S) features for the 7 and 5 nm logic node (16 nm and 13 nm half-pitch HP, respectively) for HVM. Several CARs were additionally found to be well resolved down to 12 nm and 11 nm HP with minimal pattern collapse and bridging, a remarkable feat for CARs. Finally, the performance of two negative tone state-of-the-art alternative resist platforms previously investigated was compared to the CAR performance at and below 16 nm HP resolution, demonstrating the need for alternative resist solutions at 13 nm resolution and below. EUV interference lithography (IL) has provided and continues to provide a simple yet powerful platform for academic and industrial research enabling the characterization and development of new resist materials before commercial EUV exposure tools become available. Our experiments have been performed at the EUV-IL set-up in the Swiss Light Source (SLS) synchrotron facility located at the Paul Scherrer Institute (PSI).

  18. "Extreme Programming" in a Bioinformatics Class

    ERIC Educational Resources Information Center

    Kelley, Scott; Alger, Christianna; Deutschman, Douglas

    2009-01-01

    The importance of Bioinformatics tools and methodology in modern biological research underscores the need for robust and effective courses at the college level. This paper describes such a course designed on the principles of cooperative learning based on a computer software industry production model called "Extreme Programming" (EP).…

  19. The importance of range edges for an irruptive species during extreme weather events

    USGS Publications Warehouse

    Bateman, Brooke L.; Pidgeon, Anna M.; Radeloff, Volker C.; Allstadt, Andrew J.; Akçakaya, H. Resit; Thogmartin, Wayne E.; Vavrus, Stephen J.; Heglund, Patricia J.

    2015-01-01

    In a changing climate where more frequent extreme weather may be more common, conservation strategies for weather-sensitive species may require consideration of habitat in the edges of species’ ranges, even though non-core areas may be unoccupied in ‘normal’ years. Our results highlight the conservation importance of range edges in providing refuge from extreme events, such as drought, and climate change.

  20. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities by executing global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and weighted averages of local sensitivity analyses, in addition to handling systems with discontinuous events and providing an intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists with a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080

  1. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities by executing global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method, and weighted averages of local sensitivity analyses, in addition to handling systems with discontinuous events and providing an intuitive graphical user interface. SBML-SAT provides the community of systems biologists with a new tool for the analysis of their SBML models of biochemical and cellular processes.
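
    The partial rank correlation coefficient named above can be sketched in a few lines: rank-transform the sampled parameters and the model output, regress out the other parameters from both, and correlate the residuals. A minimal NumPy sketch under those assumptions (not SBML-SAT's own implementation; ties in the ranks are broken arbitrarily):

```python
import numpy as np

def _rank(a):
    # Rank transform along axis 0 (0..n-1); ties broken arbitrarily.
    return np.argsort(np.argsort(a, axis=0), axis=0).astype(float)

def prcc(X, y):
    """Partial rank correlation coefficient of each column of X with y.

    For parameter j, the linear effect of the remaining parameters is
    regressed out of both rank(x_j) and rank(y); the PRCC is the Pearson
    correlation of the two residual series.
    """
    Xr = _rank(np.asarray(X, dtype=float))
    yr = _rank(np.asarray(y, dtype=float).reshape(-1, 1)).ravel()
    n, k = Xr.shape
    out = np.empty(k)
    for j in range(k):
        others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
        beta_x = np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
        beta_y = np.linalg.lstsq(others, yr, rcond=None)[0]
        out[j] = np.corrcoef(Xr[:, j] - others @ beta_x,
                             yr - others @ beta_y)[0, 1]
    return out
```

    Because ranks are used, the measure captures monotonic rather than only linear parameter-output relations, which is why PRCC is popular for nonlinear biochemical models.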

  2. Extreme rainfall, vulnerability and risk: a continental-scale assessment for South America

    USGS Publications Warehouse

    Vorosmarty, Charles J.; de Guenni, Lelys Bravo; Wollheim, Wilfred M.; Pellerin, Brian A.; Bjerklie, David M.; Cardoso, Manoel; D'Almeida, Cassiano; Colon, Lilybeth

    2013-01-01

    Extreme weather continues to preoccupy society as a formidable public safety concern bearing huge economic costs. While attention has focused on global climate change and how it could intensify key elements of the water cycle such as precipitation and river discharge, it is the conjunction of geophysical and socioeconomic forces that shapes human sensitivity and risks to weather extremes. We demonstrate here the use of high-resolution geophysical and population datasets together with documentary reports of rainfall-induced damage across South America over a multi-decadal, retrospective time domain (1960–2000). We define and map extreme precipitation hazard, exposure, affected populations, vulnerability and risk, and use these variables to analyse the impact of floods as a water security issue. Geospatial experiments uncover major sources of risk from natural climate variability and population growth, with change in climate extremes playing a minor role. While rural populations display the greatest relative sensitivity to extreme rainfall, urban settings show the highest rates of increasing risk. In the coming decades, rapid urbanization will make South American cities the focal point of future climate threats but also an opportunity for reducing vulnerability, protecting lives and sustaining economic development through both traditional and ecosystem-based disaster risk management systems.

  3. Rugged, Low Cost, Environmental Sensors for a Turbulent World

    NASA Astrophysics Data System (ADS)

    Schulz, B.; Sandell, C. T.; Wickert, A. D.

    2017-12-01

    Ongoing scientific research and resource management require a diverse range of high-quality and low-cost sensors to maximize the number and type of measurements that can be obtained. To accomplish this, we have developed a series of diversified sensors for common environmental applications. The TP-DownHole is an ultra-compact temperature and pressure sensor designed for use in CMT (Continuous Multi-channel Tubing) multi-level wells. Its 1 mm water depth resolution, 30 cm altitude resolution, and rugged design make it ideal for both water level measurements and monitoring barometric pressure and associated temperature changes. The TP-DownHole sensor has also been incorporated into a self-contained, fully independent data recorder for extreme and remote environments. This device (the TP-Solo) is based on the TP-DownHole design, but has self-contained power and data storage and is designed to collect data independently for up to 6 months (logging once an hour), creating a specialized tool for extreme-environment data collection. To gather spectral information, we have also developed a very low cost photodiode-based Lux sensor to measure spectral irradiance; while this does not measure the entire solar radiation spectrum, simple modeling to rescale the remainder of the solar spectrum makes it a cost-effective alternative to a thermopile pyranometer. Lastly, we have developed an instrumentation amplifier designed to interface a wide range of sensitive instruments, such as thermopile pyranometers, thermocouples, and many other analog-output sensors, to common data logging systems. These three instruments are the first in a diverse family aimed at giving researchers a set of powerful and low-cost tools for environmental instrumentation.

  4. Variability of temperature sensitivity of extreme precipitation from a regional-to-local impact scale perspective

    NASA Astrophysics Data System (ADS)

    Schroeer, K.; Kirchengast, G.

    2016-12-01

    Relating precipitation intensity to temperature is a popular approach to assessing potential changes in extreme events in a warming climate, motivated by potential increases in extreme rainfall-induced hazards such as flash flooding. However, it has not been addressed whether the temperature-precipitation scaling approach is meaningful at the regional-to-local level, where the risk of climate and weather impacts is managed. Substantial variability in the temperature sensitivity of extreme precipitation has been found, resulting from differing methodological assumptions as well as from the varying climatological settings of the study domains. Two aspects are consistently found. First, temperature sensitivities beyond the expected consistency with the Clausius-Clapeyron (CC) equation are a feature of short-duration, convective, sub-daily to sub-hourly high-percentile rainfall intensities at mid-latitudes. Second, exponential growth ceases or reverses at threshold temperatures that vary from region to region, as moisture supply becomes limited. Analyses of pooled data, or of single or dispersed stations over large areas, make it difficult to estimate the consequences in terms of local climate risk. In this study we test the meaningfulness of the scaling approach from an impact-scale perspective. Temperature sensitivities are assessed using quantile regression on hourly and sub-hourly precipitation data from 189 stations in the Austrian south-eastern Alpine region. The observed scaling rates vary substantially, but distinct regional and seasonal patterns emerge. High sensitivity exceeding CC scaling is seen on the 10-minute scale more than on the hourly scale, in storms shorter than 2 hours, and in shoulder seasons, but it is not necessarily a significant feature of the extremes. To be impact relevant, change rates need to be linked to absolute rainfall amounts. 
We show that high scaling rates occur in lower temperature conditions and thus have smaller effect on absolute precipitation intensities. While reporting of mere percentage numbers can be misleading, scaling studies can add value to process understanding on the local scale, if the factors that influence scaling rates are considered from both a methodological and a physical perspective.
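
    The scaling-rate estimation described above can be illustrated with a simple binning approach: take a high quantile of precipitation intensity per temperature bin and fit an exponential growth rate. A hedged sketch (the study itself used quantile regression; the ~7 % per kelvin CC reference rate is standard, and all data below are synthetic):

```python
import numpy as np

CC_RATE = 0.07  # ~7 % per kelvin Clausius-Clapeyron reference rate

def scaling_rate(temp, precip, q=0.95, nbins=10, min_count=20):
    """Temperature scaling rate of a high precipitation quantile.

    Events are binned by temperature, the q-quantile of intensity is taken
    per bin, and log(intensity) is regressed on mean bin temperature; the
    fitted slope b corresponds to a fractional change of exp(b) - 1 per K.
    """
    edges = np.quantile(temp, np.linspace(0.0, 1.0, nbins + 1))
    t_mid, p_q = [], []
    for lo, hi in zip(edges[:-1], edges[1:]):
        m = (temp >= lo) & (temp <= hi)
        if m.sum() >= min_count:  # skip sparsely populated bins
            t_mid.append(temp[m].mean())
            p_q.append(np.quantile(precip[m], q))
    slope = np.polyfit(t_mid, np.log(p_q), 1)[0]
    return np.exp(slope) - 1.0
```

    Comparing the returned rate against CC_RATE reproduces the basic super- or sub-CC classification discussed in the abstract.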

  5. Climate change, extreme weather events, and US health impacts: what can we say?

    PubMed

    Mills, David M

    2009-01-01

    This review addresses how climate change impacts on a group of extreme weather events could affect US public health. A literature review summarizes arguments for, and evidence of, a climate change signal in select extreme weather event categories, projections for future events, and potential trends in adaptive capacity and vulnerability in the United States. Western US wildfires already exhibit a climate change signal. The variability within hurricane and extreme precipitation/flood data complicates identifying a similar climate change signal. Health impacts of extreme events are not equally distributed and are very sensitive to a subset of exceptional extreme events. Cumulative uncertainty in forecasting the climate-change-driven characteristics of extreme events, and in adaptation, prevents confidently projecting the future health impacts from hurricanes, wildfires, and extreme precipitation/floods in the United States attributable to climate change.

  6. GIS-Based Multi-Criteria Decision Analysis for Forest Fire Risk Mapping

    NASA Astrophysics Data System (ADS)

    Akay, A. E.; Erdoğan, A.

    2017-11-01

    The forested areas along the coastal zone of the Mediterranean region in Turkey are classified as first-degree fire-sensitive areas. Forest fires are a major environmental disaster affecting the sustainability of forest ecosystems; they also cause important economic losses and even threaten human lives. Thus, it is critical to identify forested areas at risk of fire and thereby minimize damage to forest resources by taking the necessary precautionary measures in these areas. The risk of forest fire can be assessed based on various factors such as forest vegetation structure (tree species, crown closure, tree stage), topographic features (slope and aspect), and climatic parameters (temperature, wind). In this study, a GIS-based Multi-Criteria Decision Analysis (MCDA) method was used to generate a forest fire risk map. The study was implemented in the forested areas within the Yayla Forest Enterprise Chief at the Dursunbey Forest Enterprise Directorate, which is classified as a first-degree fire-sensitive area. In the solution process, the "extAhp 2.0" plug-in, which runs the Analytic Hierarchy Process (AHP) method in ArcGIS 10.4.1, was used to categorize the study area under five fire risk classes: extreme risk, high risk, moderate risk, and low risk. The results indicated that 23.81 % of the area was at extreme risk, while 25.81 % was at high risk. The most effective criterion was tree species, followed by tree stage; aspect was the least effective criterion. It was revealed that GIS techniques integrated with MCDA methods are effective tools for quickly estimating forest fire risk at low cost. The integration of these factors into GIS can be very useful for identifying forested areas with high fire risk and for planning forestry management after fire.
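
    The AHP step at the core of this MCDA workflow reduces a pairwise-comparison matrix of criteria to a weight vector. A minimal sketch of Saaty's eigenvector method with a consistency check (illustrative only; the study used the extAhp 2.0 plug-in in ArcGIS, and the random-index table below is Saaty's standard one):

```python
import numpy as np

# Saaty's random consistency index by matrix size.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32, 8: 1.41}

def ahp_weights(pairwise):
    """Criterion weights from an AHP pairwise-comparison matrix.

    Returns the normalized principal eigenvector (Saaty's eigenvector
    method) and the consistency ratio CR; CR < 0.1 is conventionally
    considered acceptable.
    """
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)              # principal eigenvalue lambda_max
    w = np.abs(vecs[:, k].real)
    w /= w.sum()                          # normalize weights to sum to 1
    ci = (vals[k].real - n) / (n - 1) if n > 1 else 0.0
    cr = ci / RI[n] if RI.get(n) else 0.0
    return w, cr
```

    A perfectly consistent matrix (every entry equal to the ratio of the underlying weights) recovers those weights exactly, with CR = 0.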

  7. Multi-window detection for P-wave in electrocardiograms based on bilateral accumulative area.

    PubMed

    Chen, Riqing; Huang, Yingsong; Wu, Jian

    2016-11-01

    P-wave detection is one of the most challenging aspects of electrocardiogram (ECG) analysis due to the P-wave's low amplitude, low frequency, and variable waveforms. This work introduces a novel multi-window detection method for P-wave delineation based on the bilateral accumulative area. The bilateral accumulative area is calculated by summing the areas covered by the P-wave curve within left and right sliding windows. The onset and offset of a positive P-wave correspond to local maxima of the area detector. The position drift and the difference in area variation of local extreme points across different windows are used to combine multi-window and 12-lead synchronous detection, which screens the optimal boundary points from all extreme points over different window widths and adaptively matches the P-wave location. The proposed method was validated with ECG signals from various databases, including the Standard CSE Database, T-Wave Alternans Challenge Database, PTB Diagnostic ECG Database, and the St. Petersburg Institute of Cardiological Technics 12-Lead Arrhythmia Database. The average sensitivity Se was 99.44% with a positive predictivity P+ of 99.37% for P-wave detection. Standard deviations of 3.7 and 4.3 ms were achieved for the onset and offset of P-waves, respectively, in agreement with the accepted tolerances required by the CSE committee. Compared with well-known delineation methods, this method achieves high sensitivity and positive predictivity using a simple calculation process. The experimental results suggest that the bilateral accumulative area could be an effective detection tool for ECG signal analysis.
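
    The bilateral accumulative area can be computed cheaply as two sliding-window sums derived from one cumulative sum. The sketch below is one plausible reading of that detector, not the paper's full multi-window, 12-lead procedure:

```python
import numpy as np

def bilateral_areas(signal, w):
    """Left and right accumulated areas over sliding windows of width w.

    At sample i the left area sums the w samples ending at i and the right
    area the w samples starting at i. For a positive wave, the right-window
    area peaks before the wave apex and the left-window area after it, so
    extrema of the two series bracket the wave.
    """
    c = np.concatenate(([0.0], np.cumsum(signal)))
    n = len(signal)
    left = np.full(n, np.nan)
    right = np.full(n, np.nan)
    sums = c[w:] - c[:-w]           # all length-w window sums
    left[w - 1:] = sums             # left[i]  = sum(signal[i-w+1 : i+1])
    right[:n - w + 1] = sums        # right[i] = sum(signal[i : i+w])
    return left, right
```

    On a synthetic positive bump, the extrema of the two series land on opposite sides of the apex, which is the property the boundary-point screening exploits.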

  8. Spatial extremes modeling applied to extreme precipitation data in the state of Paraná

    NASA Astrophysics Data System (ADS)

    Olinda, R. A.; Blanchet, J.; dos Santos, C. A. C.; Ozaki, V. A.; Ribeiro, P. J., Jr.

    2014-11-01

    Most of the mathematical models developed for rare events are based on probabilistic models for extremes. Although the tools for statistical modeling of univariate and multivariate extremes are well developed, the extension of these tools to model spatial extremes is an area of very active research. A natural approach to such modeling is the theory of spatial extremes and max-stable processes, the infinite-dimensional extension of multivariate extreme value theory, which makes it possible to incorporate the correlation functions existing in geostatistics and thus to verify extremal dependence by means of the extremal coefficient and the madogram. This work describes the application of such processes to modeling the spatial dependence of maximum monthly rainfall in the state of Paraná, based on historical series observed at weather stations. The proposed models consider Euclidean space and a transformation referred to as space weather, which may explain the presence of directional effects resulting from synoptic weather patterns. The method is based on the theorem proposed by de Haan and on the models of Smith and Schlather. The isotropic and anisotropic behavior of these models is also verified via Monte Carlo simulation. Estimates are made by maximum pairwise likelihood, and the models are compared using the Takeuchi Information Criterion. By modeling the dependence of spatial maxima, applied to maximum monthly rainfall data from the state of Paraná, it was possible to identify directional effects resulting from meteorological phenomena, which, in turn, are important for the proper management of risks and environmental disasters in countries whose economies depend heavily on agribusiness.
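
    The madogram mentioned above gives a direct empirical check of extremal dependence between two stations. A sketch of the F-madogram estimator and the pairwise extremal coefficient derived from it (θ = 1 means complete dependence of maxima, θ = 2 independence); the plotting-position ECDF used here is a common convention, not necessarily the study's:

```python
import numpy as np

def fmadogram_theta(z1, z2):
    """Pairwise extremal coefficient via the empirical F-madogram.

    z1, z2: componentwise maxima (e.g. monthly rainfall maxima) recorded
    at two sites over the same periods. The F-madogram is
    nu = 0.5 * E|F(z1) - F(z2)|, and theta = (1 + 2 nu) / (1 - 2 nu).
    """
    def ecdf(a):
        # Plotting-position empirical CDF: rank / (n + 1).
        return (np.argsort(np.argsort(a)) + 1.0) / (len(a) + 1.0)
    nu = 0.5 * np.mean(np.abs(ecdf(z1) - ecdf(z2)))
    return (1.0 + 2.0 * nu) / (1.0 - 2.0 * nu)
```

    Identical series give θ = 1 exactly, while independent series give θ near 2, the two limiting cases of extremal dependence.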

  9. Changes in regional climate extremes as a function of global mean temperature: an interactive plotting framework

    NASA Astrophysics Data System (ADS)

    Wartenburger, Richard; Hirschi, Martin; Donat, Markus G.; Greve, Peter; Pitman, Andy J.; Seneviratne, Sonia I.

    2017-09-01

    This article extends a previous study (Seneviratne et al., 2016) to provide regional analyses of changes in climate extremes as a function of projected changes in global mean temperature. We introduce the DROUGHT-HEAT Regional Climate Atlas, an interactive tool for analysing and displaying a range of well-established climate extremes and water-cycle indices and their changes as a function of global warming. These projections are based on simulations from the fifth phase of the Coupled Model Intercomparison Project (CMIP5). A selection of example results is presented here, but users can visualize specific indices of interest using the online tool. This implementation enables a direct assessment of regional climate changes associated with global mean temperature targets, such as the 2 °C and 1.5 °C limits agreed on in the 2015 Paris Agreement.

  10. Absolute sensitivity calibration of an extreme ultraviolet spectrometer for tokamak measurements

    NASA Astrophysics Data System (ADS)

    Guirlet, R.; Schwob, J. L.; Meyer, O.; Vartanian, S.

    2017-01-01

    An extreme ultraviolet spectrometer installed on the Tore Supra tokamak has been calibrated in absolute units of brightness in the range 10-340 Å by means of a combination of techniques. The range 10-113 Å was absolutely calibrated using an ultrasoft X-ray source emitting six spectral lines in this range. The calibration was transferred to the range 113-182 Å using the spectral line intensity branching-ratio method. The range 182-340 Å was calibrated using radiative-collisional modelling of spectral line intensity ratios. The maximum sensitivity of the spectrometer was found to lie around 100 Å, where the sensitivity is fairly flat over an 80 Å wide interval. The spatial variation of sensitivity along the detector assembly was also measured; the observed trend is related to the decrease in quantum efficiency as the incoming photon trajectories become more grazing.
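
    The branching-ratio transfer step relies on the fact that two lines emitted from a common upper level have a true intensity ratio fixed by atomic physics, independent of plasma conditions. A hedged sketch of the principle only (the numbers in the test are hypothetical; the actual procedure involves additional corrections):

```python
def transfer_sensitivity(s_ref, counts_ref, counts_new, branching_ratio):
    """Transfer an absolute sensitivity calibration to a new wavelength.

    With sensitivity defined by counts = s * I, the reference line gives
    I_ref = counts_ref / s_ref; the branching ratio fixes
    I_new = branching_ratio * I_ref; hence
    s_new = counts_new / I_new = counts_new * s_ref / (r * counts_ref).
    """
    return counts_new * s_ref / (branching_ratio * counts_ref)
```

    Chaining such line pairs extends an absolute calibration across a band where no calibrated source is available, which is how the 113-182 Å range was covered here.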

  11. An evaluation of selected in silico models for the assessment of skin sensitization potential – performance and practical utility considerations (QSAR conference)

    EPA Science Inventory

    Skin sensitization remains an important endpoint for consumers, manufacturers and regulators. Although the development of alternative approaches to assess skin sensitization potential has been extremely active over many years, the implication of regulations such as REACH and the ...

  12. Shot-noise-limited optical Faraday polarimetry with enhanced laser noise cancelling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Jiaming; Department of Physics, Indiana University Purdue University Indianapolis, Indianapolis, Indiana 46202; Luo, Le, E-mail: leluo@iupui.edu

    2014-03-14

    We present a shot-noise-limited measurement of optical Faraday rotations with sub-ten-nanoradian angular sensitivity. This extremely high sensitivity is achieved by using electronic laser noise cancelling and phase-sensitive detection. Specifically, an electronic laser noise canceller with a common-mode rejection ratio of over 100 dB was designed and built for enhanced laser noise cancelling. By measuring the Faraday rotation of ambient air, we demonstrate an angular sensitivity of up to 9.0×10⁻⁹ rad/√Hz, which is limited only by the shot noise of the photocurrent of the detector. To date, this is the highest angular sensitivity ever reported for Faraday polarimeters in the absence of cavity enhancement. The measured Verdet constant of ambient air, 1.93(3)×10⁻⁹ rad/(G cm) at 633 nm wavelength, agrees extremely well with earlier experiments using high-finesse optical cavities. Further, we demonstrate the application of this sensitive technique in materials science by measuring the Faraday effect of an ultrathin iron film.
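
    To put the quoted numbers in perspective, the rotation to be resolved follows θ = V·B·L for a uniform field. A small sketch using the Verdet constant reported above (the field strength and path length in the test are hypothetical, chosen only to illustrate the scale of the signal):

```python
# Verdet constant of ambient air at 633 nm, as reported in the abstract.
V_AIR = 1.93e-9  # rad / (G * cm)

def faraday_rotation(verdet, b_field_gauss, path_cm):
    """Faraday rotation angle theta = V * B * L for a uniform axial field."""
    return verdet * b_field_gauss * path_cm
```

    For a field of order 0.5 G over a 1 m air path, θ is of order 10⁻⁷ rad, which shows why nanoradian-class sensitivity is needed to measure the effect in air without a cavity.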

  13. Assessing Regional Scale Variability in Extreme Value Statistics Under Altered Climate Scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brunsell, Nathaniel; Mechem, David; Ma, Chunsheng

    Recent studies have suggested that low-frequency modes of climate variability can significantly influence regional climate. The climatology associated with extreme events has been shown to be particularly sensitive. This has profound implications for droughts, heat waves, and food production. We propose to examine regional climate simulations conducted over the continental United States by applying a recently developed technique which combines wavelet multi-resolution analysis with information theory metrics. This research is motivated by two fundamental questions concerning the spatial and temporal structure of extreme events: 1) what temporal scales of the extreme value distributions are most sensitive to alteration by low-frequency climate forcings, and 2) what is the nature of the spatial structure of variation in these timescales? The primary objective is to assess to what extent information theory metrics can be useful in characterizing the nature of extreme weather phenomena. Specifically, we hypothesize that (1) changes in the nature of extreme events will impact the temporal probability density functions, and that information theory metrics will be sensitive to these changes, and (2) via a wavelet multi-resolution analysis, we will be able to characterize the relative contribution of different timescales to the stochastic nature of extreme events. In order to address these hypotheses, we propose a unique combination of an established regional climate modeling approach and advanced statistical techniques to assess the effects of low-frequency modes on climate extremes over North America. The behavior of climate extremes in RCM simulations for the 20th century will be compared with statistics calculated from the United States Historical Climatology Network (USHCN) and simulations from the North American Regional Climate Change Assessment Program (NARCCAP). 
This effort will serve to establish the baseline behavior of climate extremes, the validity of an innovative multi-resolution information theory approach, and the ability of the RCM modeling framework to represent the low-frequency modulation of extreme climate events. Once the skill of the modeling and analysis methodology has been established, we will apply the same approach to the AR5 (IPCC Fifth Assessment Report) climate change scenarios in order to assess how climate extremes, and the influence of low-frequency variability on climate extremes, might vary under a changing climate. The research specifically addresses DOE focus area 2, simulation of climate extremes under a changing climate. Specific results will include (1) a better understanding of the spatial and temporal structure of extreme events, (2) a thorough quantification of how extreme values are impacted by low-frequency climate teleconnections, (3) increased knowledge of current regional climate models' ability to ascertain these influences, and (4) a detailed examination of how the distribution of extreme events is likely to change under different climate change scenarios. In addition, this research will assess the ability of the innovative wavelet information theory approach to characterize extreme events. Any and all of these results will greatly enhance society's ability to understand and mitigate the regional ramifications of future global climate change.
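
    One way to combine wavelet multi-resolution analysis with an information-theory metric, in the spirit of the proposal above, is to decompose a series with the Haar wavelet and take the Shannon entropy of the energy distribution across scales. This is a toy stand-in, assuming a Haar basis and dyadic series length; the proposal does not specify its metrics:

```python
import numpy as np

def haar_detail_energies(x, levels):
    """Energy of Haar wavelet detail coefficients at each level."""
    x = np.asarray(x, dtype=float)
    energies = []
    for _ in range(levels):
        d = (x[0::2] - x[1::2]) / np.sqrt(2.0)  # detail coefficients
        x = (x[0::2] + x[1::2]) / np.sqrt(2.0)  # approximation carried down
        energies.append(np.sum(d ** 2))
    return np.array(energies)

def scale_entropy(x, levels):
    """Shannon entropy (bits) of normalized detail energy across scales.

    Low entropy: variability concentrated at one timescale; high entropy:
    variability spread across scales.
    """
    e = haar_detail_energies(x, levels)
    p = e / e.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))
```

    A purely alternating series puts all its energy at the finest scale (entropy 0), while white noise spreads energy across scales, which is the kind of contrast such metrics are meant to detect in extreme-value series.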

  14. Weather and extremes in the last Millennium - a challenge for climate modelling

    NASA Astrophysics Data System (ADS)

    Raible, Christoph C.; Blumer, Sandro R.; Gomez-Navarro, Juan J.; Lehner, Flavio

    2015-04-01

    Changes in the climate mean state are expected to influence society, but the socio-economic sensitivity to extreme events might be even more severe. Whether or not the current frequency and severity of extreme events is a unique characteristic of anthropogenic-driven climate change can be assessed by putting the observed changes in a long-term perspective. In doing so, early instrumental series and proxy archives are a rich source for investigating extreme events, in particular during the last millennium, yet they suffer from spatial and temporal scarcity. Therefore, simulations with coupled general circulation models (GCMs) can fill such gaps and help deepen our process understanding. In this study, an overview of past and current efforts as well as challenges in modelling paleo weather and extreme events is presented. Using simulations of the last millennium, we investigate extreme midlatitude cyclone characteristics, precipitation, and their connection to large-scale atmospheric patterns in the North Atlantic European region. In cold climate states such as the Maunder Minimum, the North Atlantic Oscillation (NAO) is found to be predominantly in its negative phase. In this sense, simulations of different models agree with proxy findings for this period. However, some proxy data available for this period suggest an increase in storminess, which could be interpreted as a positive phase of the NAO - a superficial contradiction. The simulated cyclones are partly reduced over Europe, which is consistent with the aforementioned negative phase of the NAO. However, as the meridional temperature gradient is increased during this period - which constitutes a source of low-level baroclinicity - they also intensify. This example illustrates how model simulations can be used to improve our proxy interpretation and to gain additional process understanding. 
Nevertheless, there are also limitations associated with climate modelling efforts to simulate the last millennium. In particular, these models still struggle to properly simulate atmospheric blocking events, an important dynamical feature for dry conditions during summer. Finally, new and promising ways of improving past climate modelling are briefly introduced. Notably, dynamical downscaling is a powerful tool to bridge the gap between coarsely resolved GCMs and the characteristics of regional climate that are potentially recorded in proxy archives; the representation of extreme events in particular could be improved by dynamical downscaling, as processes are better resolved than in GCMs.

  15. Development of a Wafer Positioning System for the Sandia Extreme Ultraviolet Lithography Tool

    NASA Technical Reports Server (NTRS)

    Wronosky, John B.; Smith, Tony G.; Darnold, Joel R.

    1996-01-01

    A wafer positioning system was recently developed by Sandia National Laboratories for an Extreme Ultraviolet Lithography (EUVL) tool. The system, which utilizes a magnetically levitated fine stage to provide ultra-precise positioning in all six degrees of freedom, incorporates technological improvements resulting from four years of prototype development. This paper describes the design, implementation, and functional capability of the system. Specifics regarding control system electronics, including software and control algorithm structure, as well as performance design goals and test results are presented. Potential system enhancements, some of which are in process, are also discussed.

  16. Comparative sensitizing potencies of fragrances, preservatives, and hair dyes.

    PubMed

    Lidén, Carola; Yazar, Kerem; Johansen, Jeanne D; Karlberg, Ann-Therese; Uter, Wolfgang; White, Ian R

    2016-11-01

    The local lymph node assay (LLNA) is used for assessing sensitizing potential in hazard identification and risk assessment for regulatory purposes. Sensitizing potency on the basis of the LLNA is categorized into extreme (EC3 value of ≤0.2%), strong (>0.2% to ≤2%), and moderate (>2%). To compare the sensitizing potencies of fragrance substances, preservatives, and hair dye substances, which are skin sensitizers that frequently come into contact with the skin of consumers and workers, LLNA results and EC3 values for 72 fragrance substances, 25 preservatives and 107 hair dye substances were obtained from two published compilations of LLNA data and opinions by the Scientific Committee on Consumer Safety and its predecessors. The median EC3 values of fragrances (n = 61), preservatives (n = 19) and hair dyes (n = 59) were 5.9%, 0.9%, and 1.3%, respectively. The majority of sensitizing preservatives and hair dyes are thus strong or extreme sensitizers (EC3 value of ≤2%), and fragrances are mostly moderate sensitizers. Although fragrances are typically moderate sensitizers, they are among the most frequent causes of contact allergy. This indicates that factors other than potency need to be addressed more rigorously in risk assessment and risk management.
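
    The EC3 cut-offs quoted above map directly to a potency category. A trivial helper encoding exactly those thresholds:

```python
def llna_potency(ec3_percent):
    """LLNA potency category from an EC3 value in percent, using the
    cut-offs quoted in the abstract: extreme (EC3 <= 0.2 %),
    strong (> 0.2 % to <= 2 %), moderate (> 2 %)."""
    if ec3_percent <= 0.2:
        return "extreme"
    if ec3_percent <= 2.0:
        return "strong"
    return "moderate"
```

    Applied to the reported medians, fragrances (5.9 %) fall in the moderate band while preservatives (0.9 %) and hair dyes (1.3 %) fall in the strong band, matching the abstract's conclusion.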

  17. Magnetorheological finishing (MRF) of potassium dihydrogen phosphate (KDP) crystals: nonaqueous fluids development, optical finish, and laser damage performance at 1064 nm and 532 nm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menapace, J A; Ehrmann, P R; Bickel, R C

    2009-11-05

    Over the past year we have been working on specialized MR fluids for polishing KDP crystals. KDP is an extremely difficult material to polish conventionally due to its water solubility, low hardness, and temperature sensitivity. Today, KDP crystals are finished using single-point diamond turning (SPDT) tools and nonaqueous lubricants/coolants. KDP optics fabricated using SPDT, however, are limited to surface corrections due to tool/method characteristics, with surface quality driven by microroughness from machine pitch, speed, force, and diamond tool character. MRF polishing offers a means to circumvent many of these issues since it is deterministic, which makes the technique practical for surface and transmitted-wavefront correction, is low force, and is temperature independent. What is lacking is a usable nonaqueous MR fluid that is chemically and physically compatible with KDP, which can be used for polishing and subsequently cleaned from the optical surface. In this study, we present the fluid parameters important in the design and development of nonaqueous MR fluid formulations capable of polishing KDP and show how these parameters affect MRF polishing. We also discuss requirements peculiar to successful KDP polishing and how they affect optical figure/finish and laser damage performance at 1064 nm and 532 nm.

  18. Material Behavior At The Extreme Cutting Edge In Bandsawing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarwar, Mohammed; Haider, Julfikar; Persson, Martin

    2011-01-17

    In recent years, bandsawing has been widely accepted as a favourite option for metal cutting-off operations where accuracy of cut, good surface finish, low kerf loss, long tool life and high material removal rate are required. Material removal by multipoint cutting tools such as bandsaws is a complex mechanism owing to the geometry of the bandsaw tooth (e.g., limited gullet size, tooth setting, etc.) and the layer of material removed (the undeformed chip thickness or depth of cut, 5 µm-50 µm) being smaller than or equal to the cutting edge radius (5 µm-15 µm). This situation can lead to inefficient material removal in bandsawing. Most research has concentrated on the mechanics of material removal by single-point cutting tools such as lathe tools; such efforts are very limited for multipoint cutting tools such as bandsaws. This paper presents a fundamental understanding of the material behaviour at the extreme cutting edge of the bandsaw tooth, which would help in the design and manufacture of blades with higher cutting performance and longer life. ''High Speed Photography'' has been carried out to analyse the material removal process at the extreme cutting edge of the bandsaw tooth. A geometric model of the chip formation mechanisms, based on the evidence found during ''High Speed Photography'' and ''Quick Stop'' experiments, is presented. Wear modes and mechanisms in bimetal and carbide-tipped bandsaw teeth are also presented.

  19. Water Power Data and Tools | Water Power | NREL

    Science.gov Websites

    NREL's water power pages provide computer modeling tools and data combined with state-of-the-art design and analysis resources, including the National Wind Technology Center's Information Portal and a WEC-Sim fact sheet. The WEC Design Response Toolbox provides extreme response and fatigue analysis tools.

  20. Forensic Analysis of Windows Hosts Using UNIX-based Tools

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cory Altheide

    2004-07-19

    Many forensic examiners are introduced to UNIX-based forensic utilities when faced with investigating a UNIX-like operating system for the first time. They will use these utilities for this very specific task, because in many cases these tools are the only ones for the given job. For example, at the time of this writing, given a FreeBSD 5.x file system, the author's only choice is to use The Coroner's Toolkit running on FreeBSD 5.x. However, many of the same tools examiners use for the occasional UNIX-like system investigation are extremely capable when a Windows system is the target. Indeed, the Linux operating system itself can prove to be an extremely useful forensics platform with very little use of specialized forensics utilities at all.

  1. Compression ultrasonography of the lower extremity with portable vascular ultrasonography can accurately detect deep venous thrombosis in the emergency department.

    PubMed

    Crisp, Jonathan G; Lovato, Luis M; Jang, Timothy B

    2010-12-01

    Compression ultrasonography of the lower extremity is an established method of detecting proximal lower extremity deep venous thrombosis when performed by a certified operator in a vascular laboratory. Our objective is to determine the sensitivity and specificity of bedside 2-point compression ultrasonography performed in the emergency department (ED) with portable vascular ultrasonography for the detection of proximal lower extremity deep venous thrombosis. We did this by directly comparing emergency physician-performed ultrasonography to lower extremity duplex ultrasonography performed by the Department of Radiology. This was a prospective, cross-sectional study and diagnostic test assessment of a convenience sample of ED patients with a suspected lower extremity deep venous thrombosis, conducted at a single-center, urban, academic ED. All physicians had a 10-minute training session before enrolling patients. ED compression ultrasonography occurred before Department of Radiology ultrasonography and involved identification of 2 specific points: the common femoral and popliteal vessels, with subsequent compression of the common femoral and popliteal veins. The study result was considered positive for proximal lower extremity deep venous thrombosis if either vein was incompressible or a thrombus was visualized. Sensitivity and specificity were calculated with the final radiologist interpretation of the Department of Radiology ultrasonography as the criterion standard. A total of 47 physicians performed 199 2-point compression ultrasonographic examinations in the ED. Median number of examinations per physician was 2 (range 1 to 29 examinations; interquartile range 1 to 5 examinations). There were 45 proximal lower extremity deep venous thromboses observed on Department of Radiology evaluation, all correctly identified by ED 2-point compression ultrasonography. 
The 153 patients without proximal lower extremity deep venous thrombosis all had a negative ED compression ultrasonographic result. One patient with a negative Department of Radiology ultrasonographic result was found to have decreased compression of the popliteal vein on ED compression ultrasonography, giving a single false-positive result, yet repeated ultrasonography by the Department of Radiology 1 week later showed a popliteal deep venous thrombosis. The sensitivity and specificity of ED 2-point compression ultrasonography for deep venous thrombosis were 100% (95% confidence interval 92% to 100%) and 99% (95% confidence interval 96% to 100%), respectively. Emergency physician-performed 2-point compression ultrasonography of the lower extremity with a portable vascular ultrasonographic machine, conducted in the ED by this physician group and in this patient sample, accurately identified the presence and absence of proximal lower extremity deep venous thrombosis. Copyright © 2010 American College of Emergency Physicians. Published by Mosby, Inc. All rights reserved.
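The reported accuracy figures can be reproduced from the counts in the abstract (45 true positives, no false negatives, 153 true negatives, 1 false positive). A minimal sketch; the closed-form Clopper-Pearson bound applies only to the all-successes case, and the helper name is illustrative:

```python
# Counts recoverable from the abstract (ED exam vs. radiology criterion standard).
TP, FN = 45, 0   # all 45 proximal DVTs detected by ED compression ultrasonography
TN, FP = 153, 1  # one falsely positive compression finding

sensitivity = TP / (TP + FN)
specificity = TN / (TN + FP)

def exact_lower_all_successes(n, alpha=0.05):
    """Clopper-Pearson lower limit when all n trials succeed: (alpha/2)**(1/n)."""
    return (alpha / 2) ** (1.0 / n)

sens_lower = exact_lower_all_successes(TP + FN)  # lower 95% limit for 45/45
```

The general Clopper-Pearson interval requires a beta-distribution quantile; the closed form above covers the 45/45 sensitivity result and yields the reported lower limit of 92%.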

  2. Field-expedient screening and injury risk algorithm categories as predictors of noncontact lower extremity injury.

    PubMed

    Lehr, M E; Plisky, P J; Butler, R J; Fink, M L; Kiesel, K B; Underwood, F B

    2013-08-01

    In athletics, efficient screening tools are sought to curb the rising number of noncontact injuries and associated health care costs. The authors hypothesized that an injury prediction algorithm that incorporates movement screening performance, demographic information, and injury history can accurately categorize risk of noncontact lower extremity (LE) injury. One hundred eighty-three collegiate athletes were screened during the preseason. The test scores and demographic information were entered into an injury prediction algorithm that weighted the evidence-based risk factors. Athletes were then prospectively followed for noncontact LE injury. Subsequent analysis collapsed the groupings into two risk categories: Low (normal and slight) and High (moderate and substantial). Using these groups and noncontact LE injuries, relative risk (RR), sensitivity, specificity, and likelihood ratios were calculated. Forty-two subjects sustained a noncontact LE injury over the course of the study. Athletes identified as High Risk (n = 63) were at a greater risk of noncontact LE injury (27/63) during the season [RR: 3.4 95% confidence interval 2.0 to 6.0]. These results suggest that an injury prediction algorithm composed of performance on efficient, low-cost, field-ready tests can help identify individuals at elevated risk of noncontact LE injury. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
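The reported relative risk and its confidence interval can be reconstructed from the abstract's counts: 27 of 63 High Risk athletes were injured, leaving 15 injuries among the remaining 120 Low Risk athletes (derived as 183 - 63 and 42 - 27). A minimal sketch using the standard log-RR interval:

```python
import math

# 2x2 table derived from the abstract: 27 of 63 High Risk athletes injured;
# the other 15 of the 42 injuries occurred among the 120 Low Risk athletes.
a, n1 = 27, 63    # injured / total, High Risk
c, n0 = 15, 120   # injured / total, Low Risk

rr = (a / n1) / (c / n0)
se_log = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)
lo = math.exp(math.log(rr) - 1.96 * se_log)
hi = math.exp(math.log(rr) + 1.96 * se_log)
```

Rounded to one decimal, this recovers the published RR of 3.4 with 95% CI 2.0 to 6.0.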

  3. Invited Article: A review of haptic optical tweezers for an interactive microworld exploration

    NASA Astrophysics Data System (ADS)

    Pacoret, Cécile; Régnier, Stéphane

    2013-08-01

    This paper is the first review of haptic optical tweezers, a new technique which associates force feedback teleoperation with optical tweezers. This technique allows users to explore the microworld by sensing and exerting picoNewton-scale forces with trapped microspheres. Haptic optical tweezers also allow improved dexterity of micromanipulation and micro-assembly. One of the challenges of this technique is to sense and magnify picoNewton-scale forces by a factor of 10¹² to enable human operators to perceive interactions that they have never experienced before, such as adhesion phenomena, extremely low inertia, and high frequency dynamics of extremely small objects. The design of optical tweezers for high quality haptic feedback is challenging, given the requirements for very high sensitivity and dynamic stability. The concept, design process, and specification of optical tweezers reviewed here are focused on those intended for haptic teleoperation. In this paper, two new specific designs as well as the current state-of-the-art are presented. Moreover, the remaining important issues are identified for further developments. The initial results obtained are promising and demonstrate that optical tweezers have a significant potential for haptic exploration of the microworld. Haptic optical tweezers will become an invaluable tool for force feedback micromanipulation of biological samples and nano- and micro-assembly parts.

  4. Exposure assessment in different occupational groups at a hospital using Quick Exposure Check (QEC) - a pilot study.

    PubMed

    Ericsson, Pernilla; Björklund, Martin; Wahlström, Jens

    2012-01-01

    In order to test the feasibility and sensitivity of the ergonomic exposure assessment tool Quick Exposure Check (QEC), a pilot study was conducted. The aim was to test QEC in different occupational groups, to compare the exposure in the most common work task with the exposure in the work task perceived as the most strenuous for the neck/shoulder region, and to test intra-observer reliability. One experienced ergonomist observed 23 workers. The mean observation time was 45 minutes, including waiting time and time for complementary questions. The exposure scores varied between the different occupational groups as well as between workers within the occupational groups. Eighteen workers rated their most common work task as also being the most strenuous for the neck/shoulder region. For the remaining five workers, the mean exposure scores were higher for both the neck and the shoulder/arm in the most common work task. Intra-observer reliability showed agreement in 86% of the exposure interactions for the neck and in 71% for the shoulder/arm. QEC seems to fulfill the expectations of being a quick, sensible and practical exposure assessment tool that covers physical risk factors in the neck, upper extremities and low back.

  5. Multiplex Detection of Rare Mutations by Picoliter Droplet Based Digital PCR: Sensitivity and Specificity Considerations.

    PubMed

    Zonta, Eleonora; Garlan, Fanny; Pécuchet, Nicolas; Perez-Toralla, Karla; Caen, Ouriel; Milbury, Coren; Didelot, Audrey; Fabre, Elizabeth; Blons, Hélène; Laurent-Puig, Pierre; Taly, Valérie

    2016-01-01

    In cancer research, the accuracy of the technology used for biomarker detection is remarkably important. In this context, digital PCR represents a highly sensitive and reproducible method that could serve as an appropriate tool for tumor mutational status analysis. In particular, droplet-based digital PCR approaches have been developed for detection of tumor-specific mutated alleles within plasmatic circulating DNA. Such an approach calls for the development and validation of a large number of assays, which can be extremely costly and time consuming. Herein, we evaluated assays for the detection and quantification of various mutations occurring in three genes often misregulated in cancers: the epidermal growth factor receptor (EGFR), the v-Ki-ras2 Kirsten rat sarcoma viral oncogene homolog (KRAS) and the Tumoral Protein p53 (TP53) genes. In particular, commercial competitive allele-specific TaqMan® PCR (castPCR™) technology, as well as TaqMan® and ZEN™ assays, have been evaluated for EGFR p.L858R, p.T790M, p.L861Q point mutations and in-frame deletions Del19. Specificity and sensitivity have been determined on cell line DNA, plasmatic circulating DNA of lung cancer patients or Horizon Diagnostics Reference Standards. To show the multiplexing capabilities of this technology, several multiplex panels for EGFR (several three- and four-plexes) have been developed, offering new "ready-to-use" tests for lung cancer patients.

  6. Finger and foot tapping as alternative outcomes of upper and lower extremity function in multiple sclerosis.

    PubMed

    Tanigawa, Makoto; Stein, Jason; Park, John; Kosa, Peter; Cortese, Irene; Bielekova, Bibiana

    2017-01-01

    While magnetic resonance imaging contrast-enhancing lesions represent an excellent screening tool for disease-modifying treatments in relapsing-remitting multiple sclerosis (RRMS), this biomarker is insensitive for testing therapies against compartmentalized inflammation in progressive multiple sclerosis (MS). Therefore, alternative sensitive outcomes are needed. Using machine learning, clinician-acquired disability scales can be combined with timed measures of neurological functions such as walking speed (e.g. 25-foot walk; 25FW) or fine finger movements (e.g. 9-hole peg test; 9HPT) into sensitive composite clinical scales, such as the recently developed combinatorial, weight-adjusted disability scale (CombiWISE). Ideally, these complementary simplified measurements of certain neurological functions could be performed regularly at patients' homes using smartphones. We asked whether tests amenable to adaptation to smartphone technology, such as finger and foot tapping have comparable sensitivity and specificity to current non-clinician-acquired disability measures. We observed that finger and foot tapping can differentiate RRMS and progressive MS in a cross-sectional study and can also measure yearly and two-year disease progression in the latter, with better power (based on z-scores) in comparison to currently utilized 9HPT and 25FW. Replacing the 9HPT and 25FW with simplified tests broadly adaptable to smartphone technology may enhance the power of composite scales for progressive MS.

  7. Mangrove expansion and contraction at a poleward range limit: Climate extremes and land-ocean temperature gradients

    USGS Publications Warehouse

    Osland, Michael J.; Day, Richard H.; Hall, Courtney T.; Brumfield, Marisa D; Dugas, Jason; Jones, William R.

    2017-01-01

    Within the context of climate change, there is a pressing need to better understand the ecological implications of changes in the frequency and intensity of climate extremes. Along subtropical coasts, less frequent and warmer freeze events are expected to permit freeze-sensitive mangrove forests to expand poleward and displace freeze-tolerant salt marshes. Here, our aim was to better understand the drivers of poleward mangrove migration by quantifying spatiotemporal patterns in mangrove range expansion and contraction across land-ocean temperature gradients. Our work was conducted in a freeze-sensitive mangrove-marsh transition zone that spans a land-ocean temperature gradient in one of the world's most wetland-rich regions (Mississippi River Deltaic Plain; Louisiana, USA). We used historical air temperature data (1893-2014), alternative future climate scenarios, and coastal wetland coverage data (1978-2011) to investigate spatiotemporal fluctuations and climate-wetland linkages. Our analyses indicate that changes in mangrove coverage have been controlled primarily by extreme freeze events (i.e., air temperatures below a threshold zone of -6.3 to -7.6 °C). Over the past 121 years, mangrove range expansion and contraction have occurred repeatedly across land-ocean temperature gradients. Mangrove resistance, resilience, and dominance were all highest in areas closer to the ocean where temperature extremes were buffered by large expanses of water and saturated soil. Under climate change, these areas will likely serve as local hotspots for mangrove dispersal, growth, range expansion, and displacement of salt marsh. Collectively, our results show that the frequency and intensity of freeze events across land-ocean temperature gradients greatly influences spatiotemporal patterns of range expansion and contraction of freeze-sensitive mangroves.
We expect that, along subtropical coasts, similar processes govern the distribution and abundance of other freeze-sensitive organisms. In broad terms, our findings can be used to better understand and anticipate the ecological effects of changing winter climate extremes, especially within the transition zone between tropical and temperate climates.

  8. Who is more vulnerable to death from extremely cold temperatures? A case-only approach in Hong Kong with a temperate climate

    NASA Astrophysics Data System (ADS)

    Qiu, Hong; Tian, Linwei; Ho, Kin-fai; Yu, Ignatius T. S.; Thach, Thuan-Quoc; Wong, Chit-Ming

    2016-05-01

    The short-term effects of ambient cold temperature on mortality have been well documented in the literature worldwide. However, less is known about which subpopulations are more vulnerable to death related to extreme cold. We aimed to examine the personal characteristics and underlying causes of death that modified the association between extreme cold and mortality using a case-only approach. Individual information on 197,680 deaths from natural causes, daily temperature, and air pollution concentrations in the cool season (November-April) during 2002-2011 in Hong Kong was collected. Extreme cold was defined as having, in the preceding week, a daily maximum temperature at or below the 1st percentile of its distribution. Logistic regression models were used to estimate the effect modification, further controlling for age, seasonal pattern, and air pollution. Sensitivity analyses were conducted by using the 5th percentile as the cutoff point to define extreme cold. Subjects aged 85 years and older were more vulnerable to extreme cold, with an odds ratio (OR) of 1.33 (95 % confidence interval (CI), 1.22-1.45). A greater risk of extreme cold-related mortality was observed for total cardiorespiratory diseases and several specific causes including hypertensive diseases, stroke, congestive heart failure, chronic obstructive pulmonary disease (COPD), and pneumonia. Hypertensive diseases exhibited the greatest vulnerability to extreme cold exposure, with an OR of 1.37 (95 % CI, 1.13-1.65). Sensitivity analyses showed the robustness of these effect modifications. This evidence on which subpopulations are vulnerable to the adverse effects of extreme cold is important to inform public health measures to minimize those effects.
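The exposure definition above (daily maximum temperature at or below the 1st percentile of its distribution) amounts to a percentile threshold. A minimal sketch assuming a nearest-rank percentile convention, which the abstract does not specify; the temperature data are hypothetical:

```python
import math

def cold_threshold(daily_max_temps, pct=0.01):
    """Nearest-rank percentile: value at or below which pct of days fall."""
    s = sorted(daily_max_temps)
    k = max(1, math.ceil(pct * len(s)))
    return s[k - 1]

# Hypothetical daily maximum temperatures: 200 distinct values,
# so the 1st percentile is the 2nd-lowest value.
temps = list(range(1, 201))
thr = cold_threshold(temps)
extreme_cold_days = [t for t in temps if t <= thr]
```

By construction, about 1% of days in the series fall at or below the threshold.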

  9. A Non-Stationary Approach for Estimating Future Hydroclimatic Extremes Using Monte-Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Byun, K.; Hamlet, A. F.

    2017-12-01

    There is substantial evidence that observed hydrologic extremes (e.g. floods, extreme stormwater events, and low flows) are changing and that climate change will continue to alter the probability distributions of hydrologic extremes over time. These non-stationary risks imply that conventional approaches for designing hydrologic infrastructure (or making other climate-sensitive decisions) based on retrospective analysis and stationary statistics will become increasingly problematic through time. To develop a framework for assessing risks in a non-stationary environment, our study develops a new approach using a super ensemble of simulated hydrologic extremes based on Monte Carlo (MC) methods. Specifically, using statistically downscaled future GCM projections from the CMIP5 archive (using the Hybrid Delta (HD) method), we extract daily precipitation (P) and temperature (T) at 1/16 degree resolution based on a group of moving 30-yr windows within a given design lifespan (e.g. 10, 25, 50-yr). Using these T and P scenarios we simulate daily streamflow using the Variable Infiltration Capacity (VIC) model for each year of the design lifespan and fit a Generalized Extreme Value (GEV) probability distribution to the simulated annual extremes. MC experiments are then used to construct a random series of 10,000 realizations of the design lifespan, estimating annual extremes using the unique GEV parameters estimated for each individual year of the design lifespan. Our preliminary results for two watersheds in the Midwest show that there are considerable differences in the extreme values for a given percentile between the conventional and non-stationary MC approaches. Design standards based on our non-stationary approach are also directly dependent on the design lifespan of infrastructure, a sensitivity which is notably absent from conventional approaches based on retrospective analysis.
The experimental approach can be applied to a wide range of hydroclimatic variables of interest.
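The Monte Carlo construction described above can be illustrated with a simplified, self-contained sketch: annual extremes are drawn from a Gumbel distribution (a GEV with shape parameter zero) whose location parameter drifts year by year over the design lifespan, rather than from VIC-simulated streamflow fitted to full GEV parameters. All parameter values are hypothetical:

```python
import math
import random

random.seed(0)

def gumbel_sample(mu, beta):
    """Inverse-CDF draw from a Gumbel distribution (GEV with shape 0)."""
    u = random.random()
    return mu - beta * math.log(-math.log(u))

def lifespan_maxima(mu0, trend, beta, lifespan=50, n_real=2000):
    """Monte Carlo realizations of the largest annual extreme over a design
    lifespan, with the location parameter drifting linearly each year."""
    return [max(gumbel_sample(mu0 + trend * yr, beta) for yr in range(lifespan))
            for _ in range(n_real)]

stationary = lifespan_maxima(mu0=100.0, trend=0.0, beta=10.0)
drifting = lifespan_maxima(mu0=100.0, trend=0.5, beta=10.0)
```

Comparing the two super ensembles shows how a non-stationary location trend shifts the distribution of design-lifespan extremes relative to the stationary case.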

  10. Developing a healthcare law library.

    PubMed

    Sconyers, J M

    1998-01-01

    Legal materials are expensive, bulky, and extremely time sensitive. Selecting the appropriate means of ensuring easy access to easily-retrievable, timely legal materials is of extreme importance to any lawyer. The author gives an overview of the various means of retrieving necessary research, including the strengths and weaknesses of each of the various options.

  11. The Wide-Field Infrared Survey Explorer (WISE): Mission Description and Initial On-Orbit Performance

    NASA Technical Reports Server (NTRS)

    Wright, Edward L.; Eisenhardt, Peter R. M.; Mainzer, Amy; Ressler, Michael E.; Cutri, Roc M.; Jarrett, Thomas; Kirkpatrick, J. Davy; Padgett, Deborah; McMillan, Robert S.; Skrutskie, Michael; et al.

    2010-01-01

    The all sky surveys done by the Palomar Observatory Schmidt, the European Southern Observatory Schmidt, and the United Kingdom Schmidt, the InfraRed Astronomical Satellite and the 2 Micron All Sky Survey have proven to be extremely useful tools for astronomy with value that lasts for decades. The Wide-field Infrared Survey Explorer is mapping the whole sky following its launch on 14 December 2009. WISE began surveying the sky on 14 Jan 2010 and completed its first full coverage of the sky on July 17. The survey will continue to cover the sky a second time until the cryogen is exhausted (anticipated in November 2010). WISE is achieving 5 sigma point source sensitivities better than 0.08, 0.11, 1 and 6 mJy in unconfused regions on the ecliptic in bands centered at wavelengths of 3.4, 4.6, 12 and 22 micrometers. Sensitivity improves toward the ecliptic poles due to denser coverage and lower zodiacal background. The angular resolution is 6.1", 6.4", 6.5" and 12.0" at 3.4, 4.6, 12 and 22 micrometers, and the astrometric precision for high SNR sources is better than 0.15".

  12. Optical imaging of RNAi-mediated silencing of cancer

    NASA Astrophysics Data System (ADS)

    Ochiya, Takahiro; Honma, Kimi; Takeshita, Fumitaka; Nagahara, Shunji

    2008-02-01

    RNAi has rapidly become a powerful tool for drug target discovery and validation in an in vitro culture system and, consequently, interest is rapidly growing for extension of its application to in vivo systems, such as animal disease models and human therapeutics. Cancer is one obvious application for RNAi therapeutics, because abnormal gene expression is thought to contribute to the pathogenesis and maintenance of the malignant phenotype of cancer and thereby many oncogenes and cell-signaling molecules present enticing drug target possibilities. RNAi, potent and specific, could silence tumor-related genes and would appear to be a rational approach to inhibit tumor growth. In subsequent in vivo studies, the appropriate cancer model must be developed for an evaluation of siRNA effects on tumors. How to evaluate the effect of siRNA in an in vivo therapeutic model is also important. Accelerating the analyses of these models and improving their predictive value through whole animal imaging methods, which provide cancer inhibition in real time and are sensitive to subtle changes, are crucial for rapid advancement of these approaches. Bioluminescent imaging is one of these optically based imaging methods that enable rapid in vivo analyses of a variety of cellular and molecular events with extreme sensitivity.

  13. The MOLLER Experiment: ``An Ultra-precise Measurement of the Weak Charge of the Electron using Møller Scattering''

    NASA Astrophysics Data System (ADS)

    Beminiwattha, Rakitha; Moller Collaboration

    2017-09-01

    Parity Violating Electron Scattering (PVES) is an extremely successful precision-frontier tool that has been used for testing the Standard Model (SM) and understanding nucleon structure. Several generations of highly successful PVES programs at SLAC, MIT-Bates, MAMI-Mainz, and Jefferson Lab have contributed to the understanding of nucleon structure and testing of the SM. But missing phenomena like the matter-antimatter asymmetry, neutrino flavor oscillations, and dark matter and energy suggest that the SM is only a `low energy' effective theory. The MOLLER experiment at Jefferson Lab will measure the weak charge of the electron, Q^e_W = 1 - 4sin²θ_W, with a precision of 2.4% by measuring the parity violating asymmetry in electron-electron (Møller) scattering, and will be sensitive to subtle but measurable deviations from precisely calculable SM predictions. The MOLLER experiment will provide the best contact interaction search for leptons at low or high energy, making it a probe of physics beyond the Standard Model with sensitivity to mass scales of new PV physics up to 7.5 TeV. An overview of the experiment and recent pre-R&D progress will be reported.

  14. Development and evaluation of probe based real time loop mediated isothermal amplification for Salmonella: A new tool for DNA quantification.

    PubMed

    Mashooq, Mohmad; Kumar, Deepak; Niranjan, Ankush Kiran; Agarwal, Rajesh Kumar; Rathore, Rajesh

    2016-07-01

    A one-step, single-tube, accelerated probe-based real time loop mediated isothermal amplification (RT LAMP) assay was developed for detecting the invasion gene (InvA) of Salmonella. Probe-based RT LAMP is a novel method of gene amplification that amplifies nucleic acid with high specificity and rapidity under isothermal conditions with a set of six primers. The whole procedure is very simple and rapid, and amplification can be obtained in 20 min. Detection of gene amplification was accomplished by the amplification curve and turbidity; addition of a DNA-binding dye at the end of the reaction produces a colour difference that can be visualized under normal daylight and under UV. The sensitivity of the developed assay was found to be 10-fold higher than that of TaqMan-based qPCR. The specificity of the RT LAMP assay was validated by the absence of any cross reaction with other members of the Enterobacteriaceae family and other gram-negative bacteria. These results indicate that the probe-based RT LAMP assay is extremely rapid, cost effective, highly specific and sensitive, and has potential usefulness for rapid Salmonella surveillance. Crown Copyright © 2016. Published by Elsevier B.V. All rights reserved.

  15. Patterning highly ordered arrays of complex nanofeatures through EUV directed polarity switching of non chemically amplified photoresist

    PubMed Central

    Ghosh, Subrata; Satyanarayana, V. S. V.; Pramanick, Bulti; Sharma, Satinder K.; Pradeep, Chullikkattil P.; Morales-Reyes, Israel; Batina, Nikola; Gonsalves, Kenneth E.

    2016-01-01

    Given the importance of complex nanofeatures in the field of micro-/nanoelectronics, particularly in the areas of high-density magnetic recording, photonic crystals, information storage, micro-lens arrays, tissue engineering and catalysis, the present work demonstrates a new methodology for patterning complex nanofeatures using a recently developed non-chemically amplified photoresist (n-CAR), poly((4-(methacryloyloxy)phenyl)dimethylsulfonium triflate) (polyMAPDST), with extreme ultraviolet lithography (EUVL) as the patterning tool. The photosensitivity of polyMAPDST is mainly due to the presence of the radiation-sensitive trifluoromethanesulfonate (triflate) group, which undergoes photodegradation upon exposure to EUV photons and thus brings about a polarity change in the polymer structure. Integration of such a radiation-sensitive unit into the polymer network avoids the need for chemical amplification, which is otherwise required for polarity switching in chemically amplified photoresists (CARs). Indeed, we successfully patterned highly ordered, wide-ranging dense nanofeatures including nanodots, nanowaves, nanoboats, star-elbows, etc. All these developed nanopatterns have been well characterized by FESEM and AFM techniques. Finally, the potential of polyMAPDST has been established by successful transfer of patterns into silicon substrates through adaptation of compatible etch recipes. PMID:26975782

  16. Improving the performance of extreme learning machine for hyperspectral image classification

    NASA Astrophysics Data System (ADS)

    Li, Jiaojiao; Du, Qian; Li, Wei; Li, Yunsong

    2015-05-01

    Extreme learning machine (ELM) and kernel ELM (KELM) can offer performance comparable to the standard powerful classifier, the support vector machine (SVM), but with much lower computational cost due to an extremely simple training step. However, their performance may be sensitive to several parameters, such as the number of hidden neurons. An empirical linear relationship between the number of training samples and the number of hidden neurons is proposed. Such a relationship can be easily estimated with two small training sets and extended to large training sets so as to greatly reduce computational cost. Other parameters, such as the steepness parameter in the sigmoidal activation function and the regularization parameter in the KELM, are also investigated. The experimental results show that classification performance is sensitive to these parameters; fortunately, simple selection strategies still result in satisfactory performance.
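The empirical linear relationship described above can be sketched as a two-point fit extrapolated to large training sets. The calibration pairs below are hypothetical, not values from the paper:

```python
def fit_line(points):
    """Fit hidden = a * n_samples + b through two calibration points."""
    (n1, l1), (n2, l2) = points
    a = (l2 - l1) / (n2 - n1)
    return a, l1 - a * n1

# Hypothetical calibration: hidden-neuron counts tuned on two small training sets,
# then the line is extrapolated to a much larger training set.
a, b = fit_line([(100, 80), (200, 150)])
predicted_hidden = a * 1000 + b
```

The appeal of the approach is that the expensive tuning happens only on the two small sets; the large-set hidden-neuron count is then read off the fitted line.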

  17. Improved Extreme Learning Machine based on the Sensitivity Analysis

    NASA Astrophysics Data System (ADS)

    Cui, Licheng; Zhai, Huawei; Wang, Benchao; Qu, Zengtang

    2018-03-01

    Extreme learning machine and its improved variants have weaknesses, such as computational complexity and learning error. After deep analysis, and drawing on the importance of hidden nodes in SVM, a novel method for analyzing node sensitivity is proposed that matches people's cognitive habits. Based on this, an improved ELM is proposed that can remove hidden nodes before the learning-error target is reached and can efficiently manage the number of hidden nodes, so as to improve performance. Comparative tests show that it performs better in learning time, accuracy, and other respects.

  18. Meteorological risks and impacts on crop production systems in Belgium

    NASA Astrophysics Data System (ADS)

    Gobin, Anne

    2013-04-01

    Extreme weather events such as droughts, heat stress, rain storms and floods can have devastating effects on cropping systems. The perspective of rising risk-exposure is exacerbated further by projected increases of extreme events with climate change. Tighter limits on aid for agricultural damage and an overall reduction of direct income support further strain farmers' resilience. Based on insurance claims, potatoes and rapeseed are the most vulnerable crops, followed by cereals and sugar beets. Damage due to adverse meteorological events is strongly dependent on crop type, crop stage and soil type. Current knowledge gaps exist in the response of arable crops to the occurrence of extreme events. The degree of temporal overlap between extreme weather events and the sensitive periods of the farming calendar requires a modelling approach to capture the mixture of non-linear interactions between the crop and its environment. The regional crop model REGCROP (Gobin, 2010) enabled examination of the likely frequency and magnitude of drought, heat stress and waterlogging in relation to the cropping season and the sensitive stages of six arable crops: winter wheat, winter barley, winter rapeseed, potato, sugar beet and maize. Since crop development is driven by thermal time, crops matured earlier during the warmer 1988-2008 period than during the 1947-1987 period. Drought and heat stress, in particular during the sensitive crop stages, occur at different times in the cropping season and significantly differ between the two climatic periods, 1947-1987 and 1988-2008. Soil moisture deficit increases towards harvesting, such that earlier-maturing winter crops may avoid drought stress that occurs in late spring and summer. This is reflected in a decrease both in magnitude and frequency of soil moisture deficit around the sensitive stages during the 1988-2008 period, when atmospheric drought may be compensated for with soil moisture.
The risk of drought spells during the sensitive stages of summer crops increases and may be further aggravated by atmospheric moisture deficits and heat stress. Summer crops may therefore benefit from earlier planting dates and beneficial moisture conditions during early canopy development, but will suffer from increased drought and heat stress during crop maturity. During the harvesting stages, the number of waterlogged days increases in particular for tuber crops. Physically based crop models assist in understanding the links between different factors causing crop damage. The approach allows for assessing the meteorological impacts on crop growth due to the sensitive stages occurring earlier during the growing season and due to extreme weather events. Though average yields have risen continuously between 1947 and 2008 mainly due to technological advances, there is no evidence that relative tolerance to adverse weather conditions such as atmospheric moisture deficit and temperature extremes has changed.
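The thermal-time control of crop maturity described above is conventionally tracked with growing degree days (GDD): a stage is reached when accumulated thermal time crosses a stage-specific threshold, so warmer seasons cross it earlier. A minimal sketch, assuming a hypothetical base temperature of 5 °C (an illustrative value, not taken from this abstract):

```python
import numpy as np

def growing_degree_days(t_max, t_min, t_base=5.0):
    """Accumulate thermal time (growing degree days) over a season.

    A crop stage is reached when the running GDD sum crosses a
    stage-specific threshold; warmer seasons cross it sooner.
    """
    daily_mean = (np.asarray(t_max, float) + np.asarray(t_min, float)) / 2.0
    daily_gdd = np.maximum(daily_mean - t_base, 0.0)  # no development below t_base
    return np.cumsum(daily_gdd)

# Hypothetical daily temperatures (°C) for a cool and a warm season
cool = growing_degree_days(t_max=[12, 14, 13], t_min=[4, 6, 5])
warm = growing_degree_days(t_max=[16, 18, 17], t_min=[8, 10, 9])
```

The warm season accumulates the same thermal-time total faster, which is the mechanism behind the earlier maturation reported for 1988-2008.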

  19. Colorado River basin sensitivity to disturbance impacts

    NASA Astrophysics Data System (ADS)

    Bennett, K. E.; Urrego-Blanco, J. R.; Jonko, A. K.; Vano, J. A.; Newman, A. J.; Bohn, T. J.; Middleton, R. S.

    2017-12-01

The Colorado River basin is an important river for the food-energy-water nexus in the United States and is projected to change under future scenarios of increased CO2 emissions and warming. Streamflow estimates that account for climate impacts of this warming are often provided using modeling tools which rely on uncertain inputs; to fully understand impacts on streamflow, sensitivity analysis can help determine how models respond to changing disturbances such as climate and vegetation. In this study, we conduct a global sensitivity analysis with a space-filling Latin Hypercube sampling of the model parameter space and statistical emulation of the Variable Infiltration Capacity (VIC) hydrologic model to relate changes in runoff, evapotranspiration, snow water equivalent and soil moisture to model parameters in VIC. Additionally, we examine sensitivities of basin-wide model simulations using an approach that incorporates changes in temperature, precipitation and vegetation to consider impact responses for snow-dominated headwater catchments, low-elevation arid basins, and the upper and lower river basins. We find that for the Colorado River basin, snow-dominated regions are more sensitive to uncertainties. Newly identified parameter sensitivities include runoff and evapotranspiration sensitivity to albedo, while changes in snow water equivalent are sensitive to canopy fraction and Leaf Area Index (LAI). Basin-wide streamflow sensitivities to precipitation, temperature and vegetation vary seasonally and between sub-basins, with the largest sensitivities for smaller, snow-driven headwater systems where forests are dense. For a major headwater basin, the streamflow response to 1 °C of warming was equivalent to that of a 30% loss of forest cover, while a 10% precipitation loss was equivalent to a 90% decline in forest cover. 
Scenarios utilizing multiple disturbances led to unexpected results where changes could either magnify or diminish extremes, such as low and peak flows and streamflow timing, depending on the strength and direction of the forcing. These results indicate the importance of understanding model sensitivities under disturbance impacts in order to manage these shifts, plan for future water resource changes, and determine how the impacts will affect the sustainability and adaptability of food-energy-water systems.
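The space-filling Latin hypercube design used to sample the VIC parameter space can be sketched in a few lines. This is a generic illustration; the two parameters and their bounds (albedo, LAI) are hypothetical stand-ins, not VIC's actual ranges:

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Space-filling Latin hypercube sample over box-bounded parameters.

    Each parameter's range is split into n_samples equal strata; one point
    is drawn per stratum, and strata are shuffled independently per
    dimension so every 1-D projection is evenly covered.
    """
    rng = rng or np.random.default_rng(0)
    bounds = np.asarray(bounds, dtype=float)      # shape (n_params, 2)
    n_samples = int(n_samples)
    n_params = bounds.shape[0]
    # Stratified uniform draws in [0, 1): one per stratum per dimension
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):                     # shuffle strata per dimension
        rng.shuffle(u[:, j])
    lo, hi = bounds[:, 0], bounds[:, 1]
    return lo + u * (hi - lo)

# Hypothetical VIC-like parameter bounds: albedo in [0.1, 0.9], LAI in [0.5, 6.0]
samples = latin_hypercube(100, bounds=[(0.1, 0.9), (0.5, 6.0)])
```

Each row would parameterize one model run; a statistical emulator is then fit to the resulting input-output pairs so sensitivity metrics can be computed without further expensive simulations.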

  20. Spinoff 2009

    NASA Technical Reports Server (NTRS)

    2009-01-01

Topics covered include: Image-Capture Devices Extend Medicine's Reach; Medical Devices Assess, Treat Balance Disorders; NASA Bioreactors Advance Disease Treatments; Robotics Algorithms Provide Nutritional Guidelines; "Anti-Gravity" Treadmills Speed Rehabilitation; Crew Management Processes Revitalize Patient Care; Hubble Systems Optimize Hospital Schedules; Web-based Programs Assess Cognitive Fitness; Electrolyte Concentrates Treat Dehydration; Tools Lighten Designs, Maintain Structural Integrity; Insulating Foams Save Money, Increase Safety; Polyimide Resins Resist Extreme Temperatures; Sensors Locate Radio Interference; Surface Operations Systems Improve Airport Efficiency; Nontoxic Resins Advance Aerospace Manufacturing; Sensors Provide Early Warning of Biological Threats; Robot Saves Soldiers' Lives Overseas (MarcBot); Apollo-Era Life Raft Saves Hundreds of Sailors; Circuits Enhance Scientific Instruments and Safety Devices; Tough Textiles Protect Payloads and Public Safety Officers; Forecasting Tools Point to Fishing Hotspots; Air Purifiers Eliminate Pathogens, Preserve Food; Fabrics Protect Sensitive Skin from UV Rays; Phase Change Fabrics Control Temperature; Tiny Devices Project Sharp, Colorful Images; Star-Mapping Tools Enable Tracking of Endangered Animals; Nanofiber Filters Eliminate Contaminants; Modeling Innovations Advance Wind Energy Industry; Thermal Insulation Strips Conserve Energy; Satellite Respondent Buoys Identify Ocean Debris; Mobile Instruments Measure Atmospheric Pollutants; Cloud Imagers Offer New Details on Earth's Health; Antennas Lower Cost of Satellite Access; Feature Detection Systems Enhance Satellite Imagery; Chlorophyll Meters Aid Plant Nutrient Management; Telemetry Boards Interpret Rocket, Airplane Engine Data; Programs Automate Complex Operations Monitoring; Software Tools Streamline Project Management; Modeling Languages Refine Vehicle Design; Radio Relays Improve Wireless Products; Advanced Sensors Boost Optical 
Communication, Imaging; Tensile Fabrics Enhance Architecture Around the World; Robust Light Filters Support Powerful Imaging Devices; Thermoelectric Devices Cool, Power Electronics; Innovative Tools Advance Revolutionary Weld Technique; Methods Reduce Cost, Enhance Quality of Nanotubes; Gauging Systems Monitor Cryogenic Liquids; Voltage Sensors Monitor Harmful Static; and Compact Instruments Measure Heat Potential.

  1. Surface-Enhanced Raman Scattering Surface Selection Rules for the Proteomic Liquid Biopsy in Real Samples: Efficient Detection of the Oncoprotein c-MYC.

    PubMed

    Pazos, Elena; Garcia-Algar, Manuel; Penas, Cristina; Nazarenus, Moritz; Torruella, Arnau; Pazos-Perez, Nicolas; Guerrini, Luca; Vázquez, M Eugenio; Garcia-Rico, Eduardo; Mascareñas, José L; Alvarez-Puebla, Ramon A

    2016-11-02

Blood-based biomarkers (liquid biopsy) offer extremely valuable tools for the noninvasive diagnosis and monitoring of tumors. The protein c-MYC, a transcription factor that has been shown to be deregulated in up to 70% of human cancers, can be used as a robust proteomic signature for cancer. Herein, we developed a rapid, highly specific, and sensitive surface-enhanced Raman scattering (SERS) assay for the quantification of c-MYC in real blood samples. The sensing scheme relies on the use of specifically designed hybrid plasmonic materials and their bioderivatization with a selective peptidic receptor modified with a SERS transducer. Peptide/c-MYC recognition events translate into measurable alterations of the SERS spectra associated with a molecular reorientation of the transducer, in agreement with the surface selection rules. The efficiency of the sensor is demonstrated in cell lines, healthy donors, and a cancer patient.

  2. The complex fabric of public opinion on space

    NASA Astrophysics Data System (ADS)

    A. Roy, Stephanie; C. Gresham, Elaine; Christensen, Carissa Bryce

    2000-07-01

Survey questions can be useful tools in gauging public interest. A historical analysis of U.S. public opinion on space-related issues presents some valuable results. Space-related poll questions closely track major events in the history of the U.S. space program. Funding questions are consistently asked, although program-related questions are becoming increasingly popular. Support for space funding has remained remarkably stable at approximately 80% since 1965, with only one significant dip in support in the early 1970s. However, responses to funding questions are extremely sensitive to question wording and should be used cautiously. Around 75% of the American public generally approve of the job that NASA is doing. Human space flight wins out over robotic space programs when put head-to-head, although support for a human Mars mission is on the decline. Despite dramatic increases in commercial space activities, opinion polls generally fail to reflect this increasingly dominant sector of the space economy.

  3. Neuronal activity-dependent membrane traffic at the neuromuscular junction

    PubMed Central

    Miana-Mena, Francisco Javier; Roux, Sylvie; Benichou, Jean-Claude; Osta, Rosario; Brûlet, Philippe

    2002-01-01

During development and also in adulthood, synaptic connections are modulated by neuronal activity. To follow such modifications in vivo, new genetic tools have been designed. The nontoxic C-terminal fragment of tetanus toxin (TTC) fused to a reporter gene such as LacZ retains the retrograde and transsynaptic transport abilities of the holotoxin itself. In this work, the hybrid protein was injected intramuscularly to analyze in vivo the mechanisms of intracellular and transneuronal traffic at the neuromuscular junction (NMJ). Traffic on both sides of the synapse is strongly dependent on presynaptic neural cell activity. In muscle, a directional membrane traffic concentrates the β-galactosidase-TTC hybrid protein into the NMJ postsynaptic side. In neurons, the probe is sorted across the cell to dendrites and subsequently to an interconnected neuron. Such a fusion protein, sensitive to presynaptic neuronal activity, would be extremely useful for analyzing morphological changes and plasticity at the NMJ. PMID:11880654

  4. Holographic photolysis of caged neurotransmitters

    PubMed Central

    Lutz, Christoph; Otis, Thomas S.; DeSars, Vincent; Charpak, Serge; DiGregorio, David A.; Emiliani, Valentina

    2009-01-01

Stimulation of light-sensitive chemical probes has become a powerful tool for the study of dynamic signaling processes in living tissue. Classically, this approach has been constrained by limitations of lens-based and point-scanning illumination systems. Here we describe a novel microscope configuration that incorporates a nematic liquid crystal spatial light modulator (LC-SLM) to generate holographic patterns of illumination. This microscope can produce illumination spots of variable size and number and patterns shaped to precisely match user-defined elements in a specimen. Using holographic illumination to photolyse caged glutamate in brain slices, we demonstrate that shaped excitation on segments of neuronal dendrites and simultaneous, multi-spot excitation of different dendrites enables precise spatial and rapid temporal control of glutamate receptor activation. By allowing the excitation volume shape to be tailored precisely, the holographic microscope provides an extremely flexible method for activation of various photosensitive proteins and small molecules. PMID:19160517

  5. The promise and challenge of high-throughput sequencing of the antibody repertoire

    PubMed Central

    Georgiou, George; Ippolito, Gregory C; Beausang, John; Busse, Christian E; Wardemann, Hedda; Quake, Stephen R

    2014-01-01

    Efforts to determine the antibody repertoire encoded by B cells in the blood or lymphoid organs using high-throughput DNA sequencing technologies have been advancing at an extremely rapid pace and are transforming our understanding of humoral immune responses. Information gained from high-throughput DNA sequencing of immunoglobulin genes (Ig-seq) can be applied to detect B-cell malignancies with high sensitivity, to discover antibodies specific for antigens of interest, to guide vaccine development and to understand autoimmunity. Rapid progress in the development of experimental protocols and informatics analysis tools is helping to reduce sequencing artifacts, to achieve more precise quantification of clonal diversity and to extract the most pertinent biological information. That said, broader application of Ig-seq, especially in clinical settings, will require the development of a standardized experimental design framework that will enable the sharing and meta-analysis of sequencing data generated by different laboratories. PMID:24441474

  6. Gravitational Waves From Ultra Short Period Exoplanets

    NASA Astrophysics Data System (ADS)

    Cunha, J. V.; Silva, F. E.; Lima, J. A. S.

    2018-06-01

In the last two decades, thousands of extrasolar planets were discovered based on different observational techniques, and their number should increase substantially by virtue of the ongoing and near-future approved missions and facilities. It is shown that interesting signatures of binary systems formed by nearby exoplanets and their parent stars can also be obtained by measuring the pattern of gravitational waves that will be made available by the new generation of detectors, including the space-based LISA (Laser Interferometer Space Antenna) observatory. As an example, a subset of exoplanets with extremely short periods (less than 80 min) is discussed. All of them have gravitational luminosity L_GW ~ 10^30 erg/s, strain h ~ 10^-22, frequencies f_gw > 10^-4 Hz, and, as such, lie within the standard sensitivity curve of LISA. Our analysis suggests that the emitted gravitational wave pattern may also provide an efficient tool to discover ultra short period exoplanets.
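The quadrupole-formula scalings behind these numbers can be evaluated directly. A back-of-the-envelope sketch: the stellar and planetary masses and the 10 pc distance below are illustrative placeholders, not values from the paper:

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # solar mass, kg
PARSEC = 3.086e16    # parsec, m

def gw_strain(m1, m2, period_s, distance_m):
    """Characteristic strain of a circular binary via the quadrupole formula.

    The GW frequency is twice the orbital frequency; the strain scales with
    the chirp mass M_c = (m1*m2)**(3/5) / (m1+m2)**(1/5).
    """
    f_gw = 2.0 / period_s
    m_chirp = (m1 * m2) ** 0.6 / (m1 + m2) ** 0.2
    h = 4.0 * (G * m_chirp) ** (5.0 / 3.0) * (math.pi * f_gw) ** (2.0 / 3.0) \
        / (C ** 4 * distance_m)
    return f_gw, h

# Illustrative ultra-short-period system: 0.6 M_sun star, Jupiter-mass
# planet, 80-minute orbit, 10 pc away (all placeholder values)
f_gw, h = gw_strain(0.6 * M_SUN, 1.9e27, 80 * 60, 10 * PARSEC)
```

For this placeholder system f_gw ≈ 4×10⁻⁴ Hz, above the 10⁻⁴ Hz band edge the abstract cites, with a strain in the broad 10⁻²³-10⁻²² range quoted there.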

  7. Designed Strategies for Fluorescence-Based Biosensors for the Detection of Mycotoxins

    PubMed Central

    Sharma, Atul; Khan, Reem; Catanante, Gaelle; Sherazi, Tauqir A.; Bhand, Sunil; Hayat, Akhtar; Marty, Jean Louis

    2018-01-01

Small-molecule toxins such as mycotoxins, with low molecular weight, are the most widely studied biological toxins. These biological toxins are responsible for food poisoning and have the potential to be used as biological warfare agents at toxic doses. Due to the poisonous nature of mycotoxins, effective analysis techniques for quantifying their toxicity are indispensable. In this context, biosensors have emerged as a powerful tool to monitor toxins at extremely low levels. Recently, biosensors based on fluorescence detection have attracted special interest with the incorporation of nanomaterials. This review paper focuses on the development of fluorescence-based biosensors for mycotoxin detection, with particular emphasis on their design as well as properties such as sensitivity and specificity. A number of these fluorescent biosensors have shown promising results in food samples for the detection of mycotoxins, suggesting their future potential for food applications. PMID:29751687

  8. Assignment of Pre-edge Features in the Ru K-edge X-ray Absorption Spectra of Organometallic Ruthenium Complexes

    PubMed Central

    Getty, Kendra; Delgado-Jaime, Mario Ulises

    2010-01-01

    The nature of the lowest energy bound-state transition in the Ru K-edge X-ray Absorption Spectra for a series of Grubbs-type ruthenium complexes was investigated. The pre-edge feature was unambiguously assigned as resulting from formally electric dipole forbidden Ru 4d←1s transitions. The intensities of these transitions are extremely sensitive to the ligand environment and the symmetry of the metal centre. In centrosymmetric complexes the pre-edge is very weak since it is limited by the weak electric quadrupole intensity mechanism. By contrast, upon breaking centrosymmetry, Ru 5p-4d mixing allows for introduction of electric dipole allowed character resulting in a dramatic increase in the pre-edge intensity. The information content of this approach is explored as it relates to complexes of importance in olefin metathesis and its relevance as a tool for the study of reactive intermediates. PMID:20151030

  9. Fermi-edge transmission resonance in graphene driven by a single Coulomb impurity.

    PubMed

    Karnatak, Paritosh; Goswami, Srijit; Kochat, Vidya; Pal, Atindra Nath; Ghosh, Arindam

    2014-07-11

    The interaction between the Fermi sea of conduction electrons and a nonadiabatic attractive impurity potential can lead to a power-law divergence in the tunneling probability of charge through the impurity. The resulting effect, known as the Fermi edge singularity (FES), constitutes one of the most fundamental many-body phenomena in quantum solid state physics. Here we report the first observation of FES for Dirac fermions in graphene driven by isolated Coulomb impurities in the conduction channel. In high-mobility graphene devices on hexagonal boron nitride substrates, the FES manifests in abrupt changes in conductance with a large magnitude ≈e(2)/h at resonance, indicating total many-body screening of a local Coulomb impurity with fluctuating charge occupancy. Furthermore, we exploit the extreme sensitivity of graphene to individual Coulomb impurities and demonstrate a new defect-spectroscopy tool to investigate strongly correlated phases in graphene in the quantum Hall regime.

  10. Environmental applications of single collector high resolution ICP-MS.

    PubMed

    Krachler, Michael

    2007-08-01

The number of environmental applications of single collector high resolution ICP-MS (HR-ICP-MS) has increased rapidly in recent years. Many factors combine to make HR-ICP-MS a very powerful tool in environmental analysis: the extremely low detection limits achievable, tremendously high sensitivity, the ability to separate ICP-MS signals of the analyte from spectral interferences, enabling the reliable determination of many trace elements, and the reasonable precision of isotope ratio measurements. These assets are improved even further using high-efficiency sample introduction systems. As a result, external factors such as the stability of laboratory blanks, rather than detection power, are frequently the limiting factor in HR-ICP-MS analysis. This review aims to highlight the most recent applications of HR-ICP-MS in this sector, focusing on matrices and applications where the superior capabilities of the instrumental technique are most useful and often ultimately required.

  11. Designed Strategies for Fluorescence-Based Biosensors for the Detection of Mycotoxins.

    PubMed

    Sharma, Atul; Khan, Reem; Catanante, Gaelle; Sherazi, Tauqir A; Bhand, Sunil; Hayat, Akhtar; Marty, Jean Louis

    2018-05-11

Small-molecule toxins such as mycotoxins, with low molecular weight, are the most widely studied biological toxins. These biological toxins are responsible for food poisoning and have the potential to be used as biological warfare agents at toxic doses. Due to the poisonous nature of mycotoxins, effective analysis techniques for quantifying their toxicity are indispensable. In this context, biosensors have emerged as a powerful tool to monitor toxins at extremely low levels. Recently, biosensors based on fluorescence detection have attracted special interest with the incorporation of nanomaterials. This review paper focuses on the development of fluorescence-based biosensors for mycotoxin detection, with particular emphasis on their design as well as properties such as sensitivity and specificity. A number of these fluorescent biosensors have shown promising results in food samples for the detection of mycotoxins, suggesting their future potential for food applications.

  12. Endocrine Parameters and Phenotypes of the Growth Hormone Receptor Gene Disrupted (GHR−/−) Mouse

    PubMed Central

    List, Edward O.; Sackmann-Sala, Lucila; Berryman, Darlene E.; Funk, Kevin; Kelder, Bruce; Gosney, Elahu S.; Okada, Shigeru; Ding, Juan; Cruz-Topete, Diana

    2011-01-01

Disruption of the GH receptor (GHR) gene eliminates GH-induced intracellular signaling and, thus, its biological actions. Therefore, the GHR gene disrupted mouse (GHR−/−) has been, and remains, a valuable tool for helping to define various parameters of GH physiology. Since its creation in 1995, this mouse strain has been used by our laboratory and others for numerous studies ranging from growth to aging. Among the most notable discoveries is the animals' extreme insulin sensitivity in the presence of obesity. The animals also have an extended lifespan, which has generated a large number of investigations into the roles of GH and IGF-I in the aging process. This review summarizes the many results derived from the GHR−/− mice. We have attempted to present the findings in the context of current knowledge regarding GH action and, where applicable, to discuss how these mice compare to GH insensitivity syndrome in humans. PMID:21123740

  13. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.; Kramer, S.; Koch, T.; Pfützner, B.

    2017-12-01

Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, with problems especially in urban areas near rivers, where waterlogging caused severe damage to buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, the analytical situation, the development of a management tool and the implementation of a groundwater management concept are shown. The central tool is a coupled water budget - groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations in consideration of sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  14. Workshop on Using NASA Data for Time-Sensitive Applications

    NASA Technical Reports Server (NTRS)

    Davies, Diane K.; Brown, Molly E.; Murphy, Kevin J.; Michael, Karen A.; Zavodsky, Bradley T.; Stavros, E. Natasha; Carroll, Mark L.

    2017-01-01

    Over the past decade, there has been an increase in the use of NASA's Earth Observing System (EOS) data and imagery for time-sensitive applications such as monitoring wildfires, floods, and extreme weather events. In September 2016, NASA sponsored a workshop for data users, producers, and scientists to discuss the needs of time-sensitive science applications.

  15. Handheld Chem/Biosensor Using Extreme Conformational Changes in Designed Binding Proteins to Enhance Surface Plasmon Resonance (SPR)

    DTIC Science & Technology

    2016-04-01

AFCEC-CX-TY-TR-2016-0007. Abstract dated 03/24/2016, covering the reporting period 08/14/2015-03/31/2016; presented in Baltimore, Maryland on 17-21 April 2016. We propose the development of a highly sensitive handheld chem/biosensor device using a novel class of engineered

  16. ESH assessment of advanced lithography materials and processes

    NASA Astrophysics Data System (ADS)

    Worth, Walter F.; Mallela, Ram

    2004-05-01

The ESH Technology group at International SEMATECH is conducting environment, safety, and health (ESH) assessments in collaboration with the lithography technologists evaluating the performance of an increasing number of new materials and technologies being considered for advanced lithography, such as 157 nm photoresist and extreme ultraviolet (EUV). By performing data searches for 75 critical data types, emissions characterizations, and industrial hygiene (IH) monitoring during the use of the resist candidates, it has been shown that the best performing resist formulations, so far, appear to be free of potential ESH concerns. The ESH assessment of the EUV lithography tool that is being developed for SEMATECH has identified several features of the tool that are of ESH concern: high energy consumption, poor energy conversion efficiency, tool complexity, potential ergonomic and safety interlock issues, use of high-powered laser(s), generation of ionizing radiation (soft X-rays), need for adequate shielding, and characterization of the debris formed by the extreme temperature of the plasma. By bringing these ESH challenges to the attention of the technologists and tool designers, it is hoped that the processes and tools can be made more ESH friendly.

  17. Ergonomic study on wrist posture when using laparoscopic tools in four different techniques regarding minimally invasive surgery.

    PubMed

    Bartnicka, Joanna; Zietkiewicz, Agnieszka A; Kowalski, Grzegorz J

    2018-03-19

With reference to four different minimally invasive surgery (MIS) cholecystectomy techniques, the aims were: to recognize the factors influencing dominant wrist postures manifested by the surgeon; to detect risk factors involved in maintaining deviated wrist postures; and to compare the wrist postures of surgeons while using laparoscopic tools. Video films were recorded during live surgeries. The films were synchronized with wrist joint angles obtained from wireless electrogoniometers placed on the surgeon's hand. The analysis was conducted for five different laparoscopic tools used across all surgical techniques. The most common wrist posture was extension. In the case of one laparoscopic tool, the mean values defining extended wrist posture were distinct in all four surgical techniques. For the surgical technique considered most beneficial for patients, more extreme postures were observed for all laparoscopic tools. We identified a new factor, apart from the tool's handle design, that influences extreme and deviated wrist postures. It involves three areas of task specification: the type of action, the type of motion patterns and the motion dynamism. The outcomes proved that the surgical technique which is most beneficial for the patient imposes the greatest strain on the surgeon's wrist.

  18. VARS-TOOL: A Comprehensive, Efficient, and Robust Sensitivity Analysis Toolbox

    NASA Astrophysics Data System (ADS)

    Razavi, S.; Sheikholeslami, R.; Haghnegahdar, A.; Esfahbod, B.

    2016-12-01

VARS-TOOL is an advanced sensitivity and uncertainty analysis toolbox, applicable to the full range of computer simulation models, including Earth and Environmental Systems Models (EESMs). The toolbox was developed originally around VARS (Variogram Analysis of Response Surfaces), which is a general framework for Global Sensitivity Analysis (GSA) that utilizes the variogram/covariogram concept to characterize the full spectrum of sensitivity-related information, thereby providing a comprehensive set of "global" sensitivity metrics with minimal computational cost. VARS-TOOL is unique in that, with a single sample set (set of simulation model runs), it generates simultaneously three philosophically different families of global sensitivity metrics, including (1) variogram-based metrics called IVARS (Integrated Variogram Across a Range of Scales - VARS approach), (2) variance-based total-order effects (Sobol approach), and (3) derivative-based elementary effects (Morris approach). VARS-TOOL also offers two novel features: the first is a sequential sampling algorithm, called Progressive Latin Hypercube Sampling (PLHS), which allows progressively increasing the sample size for GSA while maintaining the required sample distributional properties. The second is a "grouping strategy" that adaptively groups the model parameters based on their sensitivity or functioning to maximize the reliability of GSA results. These features, in conjunction with bootstrapping, enable the user to monitor the stability, robustness, and convergence of GSA as the sample size increases for any given case study. VARS-TOOL has been shown to achieve robust and stable results within 1-2 orders of magnitude smaller sample sizes (fewer model runs) than alternative tools. VARS-TOOL, available in MATLAB and Python, is under continuous development and new capabilities and features are forthcoming.
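Of the three metric families listed above, the derivative-based elementary effects (Morris) family is the simplest to illustrate. A minimal one-at-a-time sketch, not VARS-TOOL's implementation; the test function, base point, and step size are arbitrary examples:

```python
import numpy as np

def elementary_effects(f, x0, delta=0.1):
    """Morris-style elementary effects at one base point.

    EE_i = (f(x + delta*e_i) - f(x)) / delta. In a full Morris screening,
    these are averaged (in absolute value) over many random trajectories
    to rank parameter influence.
    """
    x0 = np.asarray(x0, dtype=float)
    f0 = f(x0)
    effects = np.empty(len(x0))
    for i in range(len(x0)):
        x = x0.copy()
        x[i] += delta                 # perturb one parameter at a time
        effects[i] = (f(x) - f0) / delta
    return effects

# Arbitrary test model: strongly sensitive to x0, weakly to x1, ignores x2
model = lambda x: 10.0 * x[0] + 0.5 * x[1] ** 2
ee = elementary_effects(model, [0.5, 0.5, 0.5])
```

The resulting effects rank the inputs as expected: large for the linear term, small for the quadratic term at this base point, and zero for the unused parameter.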

  19. Quantifying the relationship between extreme air pollution events and extreme weather events

    NASA Astrophysics Data System (ADS)

    Zhang, Henian; Wang, Yuhang; Park, Tae-Won; Deng, Yi

    2017-05-01

Extreme weather events can strongly affect surface air quality, which has become a major environmental factor affecting human health. Here, we examined the relationship between extreme ozone and PM2.5 (particulate matter with an aerodynamic diameter less than 2.5 μm) events and representative meteorological parameters such as daily maximum temperature (Tmax), minimum relative humidity (RHmin), and minimum wind speed (Vmin), using the location-specific 95th or 5th percentile threshold derived from historical reanalysis data (30 years for ozone and 10 years for PM2.5). We found that ozone and PM2.5 extremes were decreasing over the years, reflecting EPA's tightened standards and efforts to reduce the corresponding precursor emissions. Annual ozone and PM2.5 extreme days were highly correlated with Tmax and RHmin, especially in the eastern U.S. They were positively (negatively) correlated with Vmin at urban (rural and suburban) stations. The overlapping ratios of ozone extreme days with Tmax were fairly constant, about 32%, and tended to be high in fall and low in winter. Ozone extreme days were most sensitive to Tmax, then RHmin, and least sensitive to Vmin. The majority of ozone extremes occurred when Tmax was between 300 K and 320 K, RHmin was less than 40%, and Vmin was less than 3 m/s. The number of annual extreme PM2.5 days was highly positively correlated with the number of extreme RHmin/Tmax days, with the PM2.5/RHmin correlation highest in urban and suburban regions and the PM2.5/Tmax correlation highest in rural areas. Tmax has more impact on PM2.5 extremes over the eastern U.S. Extreme PM2.5 days were more likely to occur at low RH conditions in the central and southeastern U.S., especially during spring, and at high RH conditions in the northern U.S. and the Great Plains. Most extreme PM2.5 events occurred when Tmax was between 300 K and 320 K and RHmin was between 10% and 50%. 
Extreme PM2.5 days usually occurred when Vmin was under 2 m/s. However, during spring season in the Southeast and fall season in Northwest, high winds were found to accompany extreme PM2.5 days, likely reflecting the impact of fire emissions.
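The location-specific percentile thresholding described above is straightforward to reproduce. A minimal sketch with synthetic data; the distributions, record length, and station values are invented for illustration only:

```python
import numpy as np

def extreme_days(series, pct=95):
    """Flag days whose value exceeds the station's own percentile threshold.

    Using a per-station threshold (rather than one fixed absolute limit)
    makes 'extreme' comparable across sites with different climatologies.
    """
    series = np.asarray(series, dtype=float)
    threshold = np.percentile(series, pct)
    return series > threshold, threshold

rng = np.random.default_rng(1)
ozone = rng.gamma(shape=4.0, scale=10.0, size=3650)   # synthetic 10-yr record
tmax = 290.0 + 10.0 * rng.standard_normal(3650)       # synthetic daily Tmax (K)

o3_ext, _ = extreme_days(ozone, 95)
t_ext, _ = extreme_days(tmax, 95)
# Overlap ratio: fraction of extreme-ozone days that are also hot extremes
overlap = np.sum(o3_ext & t_ext) / np.sum(o3_ext)
```

With real station records in place of the synthetic arrays, the same overlap ratio corresponds to the ~32% ozone/Tmax co-occurrence statistic reported in the abstract.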

  20. The Role of Transport Phenomena in Whispering Gallery Mode Optical Biosensor Performance

    NASA Astrophysics Data System (ADS)

    Gamba, Jason

Whispering gallery mode (WGM) optical resonator sensors have emerged as promising tools for label-free detection of biomolecules in solution. These devices have even demonstrated single-molecule limits of detection in complex biological fluids. This extraordinary sensitivity makes them ideal for low-concentration analytical and diagnostic measurements, but a great deal of work must be done toward understanding and optimizing their performance before they are capable of reliable quantitative measurements. The present work explores the physical processes behind this extreme sensitivity and how best to take advantage of them for practical applications of this technology. I begin by examining the nature of the interaction between the intense electromagnetic fields that build up in the optical biosensor and the biomolecules that bind to its surface. This work addresses the need for a coherent and thorough physical model that can be used to predict sensor behavior for a range of experimental parameters. While this knowledge will prove critical for the development of this technology, it has also shone a light on nonlinear thermo-optical and optical phenomena that these devices are uniquely suited to probing. The surprisingly rapid transient response of toroidal WGM biosensors despite sub-femtomolar analyte concentrations is also addressed. The development of asymmetric boundary layers around these devices under flow is revealed to enhance the capture rate of proteins from solution compared to the spherical sensors used previously. These lessons will guide the design of flow systems to minimize measurement time and consumption of precious sample, a key factor in any medically relevant assay. 
Finally, experimental results suggesting that WGM biosensors could be used to improve the quantitative detection of small-molecule biomarkers in exhaled breath condensate demonstrate how their exceptional sensitivity and transient response can enable the use of this noninvasive method to probe respiratory distress. WGM biosensors are unlike any other analytical tool, and the work presented here focuses on answering engineering questions surrounding their performance and potential.

  1. A pseudoproxy assessment of data assimilation for reconstructing the atmosphere-ocean dynamics of hydroclimate extremes

    NASA Astrophysics Data System (ADS)

    Steiger, Nathan J.; Smerdon, Jason E.

    2017-10-01

    Because of the relatively brief observational record, the climate dynamics that drive multiyear to centennial hydroclimate variability are not adequately characterized and understood. Paleoclimate reconstructions based on data assimilation (DA) optimally fuse paleoclimate proxies with the dynamical constraints of climate models, thus providing a coherent dynamical picture of the past. DA is therefore an important new tool for elucidating the mechanisms of hydroclimate variability over the last several millennia. But DA has so far remained untested for global hydroclimate reconstructions. Here we explore whether or not DA can be used to skillfully reconstruct global hydroclimate variability along with the driving climate dynamics. Through a set of idealized pseudoproxy experiments, we find that an established DA reconstruction approach can in principle be used to reconstruct hydroclimate at both annual and seasonal timescales. We find that the skill of such reconstructions is generally highest near the proxy sites. This set of reconstruction experiments is specifically designed to estimate a realistic upper bound for the skill of this DA approach. Importantly, this experimental framework allows us to see where and for what variables the reconstruction approach may never achieve high skill. In particular for tree rings, we find that hydroclimate reconstructions depend critically on moisture-sensitive trees, while temperature reconstructions depend critically on temperature-sensitive trees. Real-world DA-based reconstructions will therefore likely require a spatial mixture of temperature- and moisture-sensitive trees to reconstruct both temperature and hydroclimate variables. Additionally, we illustrate how DA can be used to elucidate the dynamical mechanisms of drought with two examples: tropical drivers of multiyear droughts in the North American Southwest and in equatorial East Africa. 
This work thus provides a foundation for future DA-based hydroclimate reconstructions using real-proxy networks while also highlighting the utility of this important tool for hydroclimate research.
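The "optimal fusion" step in such DA reconstructions is typically an ensemble Kalman update. A minimal sketch of that update on a synthetic toy state and proxy (this is not the authors' code; the function and variable names are hypothetical):

```python
import numpy as np

def enkf_update(prior, proxy_obs, obs_error_var, h, seed=0):
    """Perturbed-observation ensemble Kalman update of a state ensemble.

    prior: (n_state, n_ens) ensemble of climate-state vectors
    proxy_obs: scalar proxy value to assimilate
    obs_error_var: proxy error variance
    h: (n_state,) linear observation operator (state -> proxy units)
    """
    rng = np.random.default_rng(seed)
    hx = h @ prior                                     # (n_ens,) modeled proxy values
    x_anom = prior - prior.mean(axis=1, keepdims=True)
    cov_xy = x_anom @ (hx - hx.mean()) / (prior.shape[1] - 1)
    gain = cov_xy / (hx.var(ddof=1) + obs_error_var)   # Kalman gain, (n_state,)
    obs_pert = proxy_obs + rng.normal(0.0, np.sqrt(obs_error_var), hx.size)
    return prior + np.outer(gain, obs_pert - hx)       # posterior ensemble

# Toy example: 3-variable state, 50-member ensemble, proxy observes variable 0.
prior = np.random.default_rng(1).normal(size=(3, 50))
post = enkf_update(prior, proxy_obs=1.0, obs_error_var=0.5, h=np.array([1.0, 0.0, 0.0]))
```

Because the gain is built from the ensemble covariance between each state variable and the modeled proxy, unobserved variables are also nudged, which is how a sparse proxy network can constrain full fields.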

  2. The National Extreme Events Data and Research Center (NEED)

    NASA Astrophysics Data System (ADS)

    Gulledge, J.; Kaiser, D. P.; Wilbanks, T. J.; Boden, T.; Devarakonda, R.

    2014-12-01

    The Climate Change Science Institute at Oak Ridge National Laboratory (ORNL) is establishing the National Extreme Events Data and Research Center (NEED), with the goal of transforming how the United States studies and prepares for extreme weather events in the context of a changing climate. NEED will encourage the myriad, distributed extreme events research communities to move toward the adoption of common practices and will develop a new database compiling global historical data on weather- and climate-related extreme events (e.g., heat waves, droughts, hurricanes, etc.) and related information about impacts, costs, recovery, and available research. Currently, extreme event information is not easy to access and is largely incompatible and inconsistent across web sites. NEED's database development will take into account differences in time frames, spatial scales, treatments of uncertainty, and other parameters and variables, and leverage informatics tools developed at ORNL (i.e., the Metadata Editor [1] and Mercury [2]) to generate standardized, robust documentation for each database along with a web-searchable catalog. In addition, NEED will facilitate convergence on commonly accepted definitions and standards for extreme events data and will enable integrated analyses of coupled threats, such as hurricanes/sea-level rise/flooding and droughts/wildfires. Our goal and vision is that NEED will become the premier integrated resource for the general study of extreme events. References: [1] Devarakonda, Ranjeet, et al. "OME: Tool for generating and managing metadata to handle BigData." Big Data (Big Data), 2014 IEEE International Conference on. IEEE, 2014. [2] Devarakonda, Ranjeet, et al. "Mercury: reusable metadata management, data discovery and access system." Earth Science Informatics 3.1-2 (2010): 87-94.

  3. Predictive value of the DASH tool for predicting return to work of injured workers with musculoskeletal disorders of the upper extremity.

    PubMed

    Armijo-Olivo, Susan; Woodhouse, Linda J; Steenstra, Ivan A; Gross, Douglas P

    2016-12-01

    To determine whether the Disabilities of the Arm, Shoulder, and Hand (DASH) tool adds to the predictive ability of established prognostic factors, including patient demographic and clinical variables, for return to work (RTW) in injured workers with musculoskeletal (MSK) disorders of the upper extremity. A retrospective cohort study was conducted using a population-based database from the Workers' Compensation Board of Alberta (WCB-Alberta), focusing on claimants with upper extremity injuries. Besides the DASH, potential predictors included demographic, occupational, clinical and health usage variables. The outcome was receipt of compensation benefits after 3 months. To identify RTW predictors, a purposeful logistic modelling strategy was used. A series of receiver operating characteristic curve analyses was performed to determine which model provided the best discriminative ability. The sample included 3036 claimants with upper extremity injuries. The final model for predicting RTW included the total DASH score in addition to other established predictors. The area under the curve for this model was 0.77, which is interpreted as fair discrimination. This model differed significantly from the model of established predictors alone (p<0.001). When comparing the DASH total score with DASH item 23, the difference between the models was non-significant (p=0.34). The DASH tool, together with other established predictors, significantly helped predict RTW after 3 months in participants with upper extremity MSK disorders. An appealing result for clinicians and busy researchers is that DASH item 23 has predictive ability equal to the total DASH score. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.
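The model-comparison logic here (does adding the DASH score raise the area under the ROC curve over established predictors alone?) can be sketched on synthetic data; none of the numbers or variable names below come from the WCB-Alberta dataset:

```python
import numpy as np

def auc(scores, labels):
    """Rank-based AUC: probability a positive case outranks a negative one."""
    pos, neg = scores[labels == 1], scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

rng = np.random.default_rng(42)
n = 3000                                   # same order as the study's sample size
age = rng.normal(45, 10, n)                # hypothetical established predictor
dash = rng.uniform(0, 100, n)              # hypothetical DASH total score (0-100)
logit = -0.5 + 0.02 * (age - 45) + 0.03 * (dash - 50)
on_benefits = rng.binomial(1, 1 / (1 + np.exp(-logit)))  # outcome at 3 months

auc_base = auc(0.02 * (age - 45), on_benefits)                        # established only
auc_full = auc(0.02 * (age - 45) + 0.03 * (dash - 50), on_benefits)   # + DASH score
```

Because the simulated outcome truly depends on the DASH-like score, the full model's AUC exceeds the base model's, which is the pattern the study reports (0.77 vs. the established-predictors model).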

  4. Statistical analysis and ANN modeling for predicting hydrological extremes under climate change scenarios: the example of a small Mediterranean agro-watershed.

    PubMed

    Kourgialas, Nektarios N; Dokou, Zoi; Karatzas, George P

    2015-05-01

    The purpose of this study was to create a modeling management tool for the simulation of extreme flow events under current and future climatic conditions. This tool is a combination of different components and can be applied in complex hydrogeological river basins, where frequent flood and drought phenomena occur. The first component is the statistical analysis of the available hydro-meteorological data. Specifically, principal components analysis was performed in order to quantify the importance of the hydro-meteorological parameters that affect the generation of extreme events. The second component is a prediction-forecasting artificial neural network (ANN) model that simulates, accurately and efficiently, river flow on an hourly basis. This model is based on a methodology that attempts to resolve a very difficult problem related to the accurate estimation of extreme flows. For this purpose, the available measurements (5 years of hourly data) were divided into two subsets: one for the dry and one for the wet periods of the hydrological year. This way, two ANNs were created, trained, tested and validated for a complex Mediterranean river basin in Crete, Greece. As part of the second management component, a statistical downscaling tool was used for the creation of meteorological data according to the higher and lower emission climate change scenarios A2 and B1. These data are used as input in the ANN for the forecasting of river flow for the next two decades. The final component is the application of a meteorological index on the measured and forecasted precipitation and flow data, in order to assess the severity and duration of extreme events. Copyright © 2015 Elsevier Ltd. All rights reserved.
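The dry/wet partitioning idea (fit a separate model to each regime of the hydrological year, then route each record through the model for its regime) can be sketched as follows, with simple linear fits standing in for the two trained ANNs and entirely synthetic data:

```python
import numpy as np

rng = np.random.default_rng(7)
hours = 5 * 365 * 24                               # five years of hourly records
month = rng.integers(1, 13, hours)
wet = np.isin(month, [11, 12, 1, 2, 3])            # hypothetical wet months
rain = rng.gamma(2.0, 1.5, hours) * np.where(wet, 2.0, 0.3)
flow = 0.8 * rain + rng.normal(0.0, 0.2, hours)    # toy rainfall-runoff relation

models = {}
for name, mask in (("wet", wet), ("dry", ~wet)):
    X = np.column_stack([rain[mask], np.ones(mask.sum())])
    coef, *_ = np.linalg.lstsq(X, flow[mask], rcond=None)
    models[name] = coef                             # (slope, intercept) per regime

# Route each hour through the model trained on its own regime.
pred = np.where(wet,
                models["wet"][0] * rain + models["wet"][1],
                models["dry"][0] * rain + models["dry"][1])
```

Splitting by regime lets each model specialize in the flow dynamics of its own period, which is the rationale the abstract gives for training two ANNs rather than one.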

  5. Behavior Prediction Tools Strengthen Nanoelectronics

    NASA Technical Reports Server (NTRS)

    2013-01-01

    Several years ago, NASA started making plans to send robots to explore the deep, dark craters on the Moon. As part of these plans, NASA needed modeling tools to help engineer unique electronics to withstand extremely cold temperatures. According to Jonathan Pellish, a flight systems test engineer at Goddard Space Flight Center, "An instrument sitting in a shadowed crater on one of the Moon s poles would hover around 43 K", that is, 43 kelvin, equivalent to -382 F. Such frigid temperatures are one of the main factors that make the extreme space environments encountered on the Moon and elsewhere so extreme. Radiation is another main concern. "Radiation is always present in the space environment," says Pellish. "Small to moderate solar energetic particle events happen regularly and extreme events happen less than a handful of times throughout the 7 active years of the 11-year solar cycle." Radiation can corrupt data, propagate to other systems, require component power cycling, and cause a host of other harmful effects. In order to explore places like the Moon, Jupiter, Saturn, Venus, and Mars, NASA must use electronic communication devices like transmitters and receivers and data collection devices like infrared cameras that can resist the effects of extreme temperature and radiation; otherwise, the electronics would not be reliable for the duration of the mission.

  6. Novel application of lower body positive-pressure in the rehabilitation of an individual with multiple lower extremity fractures.

    PubMed

    Takacs, Judit; Leiter, Jeff R S; Peeler, Jason D

    2011-06-01

    Lower extremity fractures, if not treated appropriately, can increase the risk of morbidity. Partial weight-bearing after surgical repair is recommended; however, current methods of partial weight-bearing may cause excessive loads through the lower extremity. A new rehabilitation tool that uses lower body positive-pressure is described, that may allow partial weight-bearing while preventing excessive loads, thereby improving functional outcomes. A patient with multiple lower extremity fractures underwent a 6-month rehabilitation programme using bodyweight support technology 3 times per week, post-surgery. The patient experienced a reduction in pain and an improvement in ankle range of motion (p=0.002), walking speed (p>0.05) and physical function (p=0.004), as assessed by the Foot and Ankle Module of the American Academy of Orthopaedic Surgeons Lower Limb Outcomes Assessment Instrument. Training did not appear to affect fracture healing, as was evident on radiograph. The effect of lower body positive-pressure on effusion, which has not previously been reported in the literature, was also investigated. No significant difference in effusion of the foot and ankle when using lower body positive-pressure was found. Initial results suggest that this new technology may be a useful rehabilitation tool that allows partial weight-bearing during the treatment of lower extremity injuries.

  7. Short physical performance battery for middle-aged and older adult cardiovascular disease patients: implication for strength tests and lower extremity morphological evaluation.

    PubMed

    Yasuda, Tomohiro; Fukumura, Kazuya; Nakajima, Toshiaki

    2017-04-01

    [Purpose] To examine whether Short Physical Performance Battery (SPPB) scores are higher in healthy subjects than in outpatients and higher in outpatients than in inpatients, and whether the SPPB can be a valid assessment tool with respect to strength tests and lower extremity morphological evaluation in cardiovascular disease patients. [Subjects and Methods] Twenty-four middle-aged and older adults with cardiovascular disease were recruited from inpatient and outpatient facilities and assigned to separate experimental groups. Twelve age-matched healthy volunteers were assigned to a control group. The SPPB test was used to assess balance and functional mobility. The test outcomes were compared with level of care (inpatient vs. outpatient), physical characteristics, strength and lower extremity morphology. [Results] Total SPPB scores, strength tests (knee extensor muscle strength), and lower extremity morphological evaluation (muscle thickness of the anterior and posterior mid-thigh and posterior lower leg) were greater in the healthy and outpatient groups than in inpatients. To predict total SPPB scores, the predicted knee extension strength and anterior mid-thigh muscle thickness were calculated. [Conclusion] The SPPB is an effective proxy for strength tests and lower extremity morphological evaluation in middle-aged and older adult cardiovascular disease patients. Notably, high knee extensor muscle strength and quadriceps femoris muscle thickness are positively associated with high SPPB scores.

  8. [The use of a detector of the extremely weak radiation as a variometer of gravitation field].

    PubMed

    Gorshkov, E S; Bondarenko, E G; Shapovalov, S N; Sokolovskiĭ, V V; Troshichev, O A

    2001-01-01

    It was shown that the detector of extremely weak radiation with selectively increased sensitivity to the nonelectromagnetic, including the gravitational component of the spectrum of active physical fields can be used as the basis for constructing a variometer of gravitational field of a new type.

  9. Simulation of future global warming scenarios in rice paddies with an open-field ecosystem warming facility

    USDA-ARS?s Scientific Manuscript database

    Rice (Oryza sativa L.) in the Yangtze River Valley (YRV) suffered serious yield losses in 2003 when an extreme heatwave (HW) hampered the rice reproductive growth phase (RGP). Climate change has induced extreme and asymmetrical fluctuations in temperature during the heat-sensitive stage of the rice growth cycle, i.e., RG...

  10. Cost-Effectiveness Analysis of Preoperative Versus Postoperative Radiation Therapy in Extremity Soft Tissue Sarcoma

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qu, Xuanlu M.; Louie, Alexander V.; Ashman, Jonathan

    Purpose: Surgery combined with radiation therapy (RT) is the cornerstone of multidisciplinary management of extremity soft tissue sarcoma (STS). Although RT can be given in either the preoperative or the postoperative setting with similar local recurrence and survival outcomes, the side effect profiles, costs, and long-term functional outcomes are different. The aim of this study was to use decision analysis to determine optimal sequencing of RT with surgery in patients with extremity STS. Methods and Materials: A cost-effectiveness analysis was conducted using a state transition Markov model, with quality-adjusted life years (QALYs) as the primary outcome. A time horizon of 5 years, a cycle length of 3 months, and a willingness-to-pay threshold of $50,000/QALY were used. One-way deterministic sensitivity analyses were performed to determine the thresholds at which each strategy would be preferred. The robustness of the model was assessed by probabilistic sensitivity analysis. Results: Preoperative RT is a more cost-effective strategy ($26,633/3.00 QALYs) than postoperative RT ($28,028/2.86 QALYs) in our base case scenario. Preoperative RT is the superior strategy with either 3-dimensional conformal RT or intensity-modulated RT. One-way sensitivity analyses identified the relative risk of chronic adverse events as having the greatest influence on the preferred timing of RT. The likelihood of preoperative RT being the preferred strategy was 82% on probabilistic sensitivity analysis. Conclusions: Preoperative RT is more cost effective than postoperative RT in the management of resectable extremity STS, primarily because of the higher incidence of chronic adverse events with RT in the postoperative setting.
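The structure of such a state-transition (Markov cohort) model is straightforward to sketch. The states, transition probabilities, utilities and costs below are illustrative placeholders, not the paper's inputs:

```python
import numpy as np

cycles = 20                                         # 5-year horizon, 3-month cycles

def run_strategy(p_tox):
    """Accumulate cohort costs and QALYs over the horizon for a given
    per-cycle probability of developing chronic toxicity (hypothetical)."""
    # States: well, chronic_toxicity, recurrence, dead (rows/cols in that order).
    P = np.array([
        [0.93 - p_tox, p_tox, 0.05, 0.02],
        [0.00, 0.93, 0.05, 0.02],
        [0.00, 0.00, 0.95, 0.05],
        [0.00, 0.00, 0.00, 1.00],
    ])
    utility = np.array([0.85, 0.70, 0.60, 0.0]) / 4  # QALYs accrued per 3-month cycle
    cost = np.array([500.0, 2000.0, 4000.0, 0.0])    # cost per cycle in each state
    cohort = np.array([1.0, 0.0, 0.0, 0.0])          # everyone starts in "well"
    qalys = costs = 0.0
    for _ in range(cycles):
        qalys += cohort @ utility
        costs += cohort @ cost
        cohort = cohort @ P                          # advance one cycle
    return costs, qalys

cost_pre, qaly_pre = run_strategy(p_tox=0.02)        # preoperative RT: fewer chronic AEs
cost_post, qaly_post = run_strategy(p_tox=0.06)      # postoperative RT: more chronic AEs
```

With these invented inputs the higher chronic-toxicity rate makes the postoperative strategy both more costly and less effective, i.e., dominated, mirroring the direction of the paper's base case ($26,633/3.00 QALYs vs. $28,028/2.86 QALYs).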

  11. Extreme warming challenges sentinel status of kelp forests as indicators of climate change.

    PubMed

    Reed, Daniel; Washburn, Libe; Rassweiler, Andrew; Miller, Robert; Bell, Tom; Harrer, Shannon

    2016-12-13

    The desire to use sentinel species as early warning indicators of impending climate change effects on entire ecosystems is attractive, but we need to verify that such approaches have sound biological foundations. A recent large-scale warming event in the North Pacific Ocean of unprecedented magnitude and duration allowed us to evaluate the sentinel status of giant kelp, a coastal foundation species that thrives in cold, nutrient-rich waters and is considered sensitive to warming. Here, we show that giant kelp and the majority of species that associate with it did not presage ecosystem effects of extreme warming off southern California despite giant kelp's expected vulnerability. Our results challenge the general perception that kelp-dominated systems are highly vulnerable to extreme warming events and expose the more general risk of relying on supposed sentinel species that are assumed to be very sensitive to climate change.

  12. Extreme warming challenges sentinel status of kelp forests as indicators of climate change

    NASA Astrophysics Data System (ADS)

    Reed, Daniel; Washburn, Libe; Rassweiler, Andrew; Miller, Robert; Bell, Tom; Harrer, Shannon

    2016-12-01

    The desire to use sentinel species as early warning indicators of impending climate change effects on entire ecosystems is attractive, but we need to verify that such approaches have sound biological foundations. A recent large-scale warming event in the North Pacific Ocean of unprecedented magnitude and duration allowed us to evaluate the sentinel status of giant kelp, a coastal foundation species that thrives in cold, nutrient-rich waters and is considered sensitive to warming. Here, we show that giant kelp and the majority of species that associate with it did not presage ecosystem effects of extreme warming off southern California despite giant kelp's expected vulnerability. Our results challenge the general perception that kelp-dominated systems are highly vulnerable to extreme warming events and expose the more general risk of relying on supposed sentinel species that are assumed to be very sensitive to climate change.

  13. ATHLETE: Lunar Cargo Unloading from a High Deck

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2010-01-01

    As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. Each vehicle is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through or at least out of extreme terrain, the wheels and wheel actuators can be sized for nominal terrain. There are substantial mass savings in the wheel and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are at least comparable to or larger than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be lighter than a conventional all-terrain mobility chassis. A side benefit of this approach is that each limb has sufficient degrees of freedom to be used as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A power-take-off from the wheel actuates the tools, so that they can take advantage of the 1+ horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.

  14. ATHLETE: a Cargo and Habitat Transporter for the Moon

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2009-01-01

    As part of the NASA Exploration Technology Development Program, the Jet Propulsion Laboratory is developing a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. The vehicle concept is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through (or at least out of) extreme terrain, the wheels and wheel actuators can be sized only for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are comparable to or larger than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be about 25 percent lighter than a conventional mobility chassis for planetary exploration. A side benefit of this approach is that each limb has sufficient degrees of freedom for use as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb. A rotating power-take-off from the wheel actuates the tools, so that they can take advantage of the 1-plus-horsepower motor in each wheel to enable drilling, gripping or other power-tool functions.

  15. Orthogonally combined motion- and diffusion-sensitized driven equilibrium (OC-MDSDE) preparation for vessel signal suppression in 3D turbo spin echo imaging of peripheral nerves in the extremities.

    PubMed

    Cervantes, Barbara; Kirschke, Jan S; Klupp, Elizabeth; Kooijman, Hendrik; Börnert, Peter; Haase, Axel; Rummeny, Ernst J; Karampinos, Dimitrios C

    2018-01-01

    To design a preparation module for vessel signal suppression in MR neurography of the extremities, which causes minimal attenuation of nerve signal and is highly insensitive to eddy currents and motion. The orthogonally combined motion- and diffusion-sensitized driven equilibrium (OC-MDSDE) preparation was proposed, based on the improved motion- and diffusion-sensitized driven equilibrium methods (iMSDE and FC-DSDE, respectively), with specific gradient design and orientation. OC-MDSDE was desensitized against eddy currents using appropriately designed gradient prepulses. The motion sensitivity and vessel signal suppression capability of OC-MDSDE and its components were assessed in vivo in the knee using 3D turbo spin echo (TSE). Nerve-to-vessel signal ratios were measured for iMSDE and OC-MDSDE in 7 subjects. iMSDE was shown to be highly sensitive to motion with increasing flow sensitization. FC-DSDE showed robustness against motion, but resulted in strong nerve signal loss with diffusion gradients oriented parallel to the nerve. OC-MDSDE showed superior vessel suppression compared to iMSDE and FC-DSDE and maintained high nerve signal. Mean nerve-to-vessel signal ratios in 7 subjects were 0.40 ± 0.17 for iMSDE and 0.63 ± 0.37 for OC-MDSDE. OC-MDSDE combined with 3D TSE in the extremities allows high-resolution, near-isotropic imaging of peripheral nerves with reduced vessel contamination and high nerve signal. Magn Reson Med 79:407-415, 2018. © 2017 Wiley Periodicals, Inc. © 2017 International Society for Magnetic Resonance in Medicine.

  16. Eigenvalue Contributon Estimator for Sensitivity Calculations with TSUNAMI-3D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Williams, Mark L

    2007-01-01

    Since the release of the Tools for Sensitivity and Uncertainty Analysis Methodology Implementation (TSUNAMI) codes in SCALE [1], the use of sensitivity and uncertainty analysis techniques for criticality safety applications has greatly increased within the user community. In general, sensitivity and uncertainty analysis is transitioning from a technique used only by specialists to a practical tool in routine use. With the desire to use the tool more routinely comes the need to improve the solution methodology to reduce the input and computational burden on the user. This paper reviews the current solution methodology of the Monte Carlo eigenvalue sensitivity analysis sequence TSUNAMI-3D, describes an alternative approach, and presents results from both methodologies.

  17. Ability of Ultrasonography in Detection of Different Extremity Bone Fractures; a Case Series Study

    PubMed Central

    Bozorgi, Farzad; Shayesteh Azar, Massoud; Montazer, Seyed Hossein; Chabra, Aroona; Heidari, Seyed Farshad; Khalilian, Alireza

    2017-01-01

    Introduction: Despite radiography being the gold standard in the evaluation of orthopedic injuries, bedside ultrasonography has several potential advantages, such as avoiding exposure to ionizing radiation, availability in pre-hospital settings, and wide accessibility at the bedside. The aim of the present study is to evaluate the diagnostic accuracy of ultrasonography in the detection of extremity bone fractures. Methods: This is a case series study, conducted prospectively on multiple blunt trauma patients who were 18 years old or older, had stable hemodynamics, a Glasgow coma scale of 15, and signs or symptoms of a possible extremity bone fracture. After initial assessment, ultrasonography of the suspected bones was performed by a trained emergency medicine resident, and the prevalence of true positive and false negative findings was calculated relative to plain radiography. Results: 108 patients with a mean age of 44.6 ± 20.4 years were studied (67.6% male). Analysis was done on 158 sites of fracture, which were confirmed with plain radiography. 91 (57.6%) cases were suspected to have upper extremity fracture(s) and 67 (42.4%) lower ones. The most frequent sites of injury were the forearm (36.7%) in the upper limbs and the leg (27.8%) in the lower limbs. Prevalence of true positive and false negative cases for fractures detected by ultrasonography was 59 (64.8%) and 32 (35.2%) for the upper and 49 (73.1%) and 18 (26.9%) for the lower extremities, respectively. In addition, prevalence of true positive and false negative detected cases for intra-articular fractures was 24 (48%) and 26 (52%), respectively. Conclusion: The present study shows the moderate sensitivity (68.3%) of ultrasonography in the detection of different extremity bone fractures. Ultrasonography showed the best sensitivity in the detection of femur (100%) and humerus (76.2%) fractures, respectively. It had low sensitivity in the detection of intra-articular fractures. PMID:28286822

  18. Ability of Ultrasonography in Detection of Different Extremity Bone Fractures; a Case Series Study.

    PubMed

    Bozorgi, Farzad; Shayesteh Azar, Massoud; Montazer, Seyed Hossein; Chabra, Aroona; Heidari, Seyed Farshad; Khalilian, Alireza

    2017-01-01

    Despite radiography being the gold standard in the evaluation of orthopedic injuries, bedside ultrasonography has several potential advantages, such as avoiding exposure to ionizing radiation, availability in pre-hospital settings, and wide accessibility at the bedside. The aim of the present study is to evaluate the diagnostic accuracy of ultrasonography in the detection of extremity bone fractures. This is a case series study, conducted prospectively on multiple blunt trauma patients who were 18 years old or older, had stable hemodynamics, a Glasgow coma scale of 15, and signs or symptoms of a possible extremity bone fracture. After initial assessment, ultrasonography of the suspected bones was performed by a trained emergency medicine resident, and the prevalence of true positive and false negative findings was calculated relative to plain radiography. 108 patients with a mean age of 44.6 ± 20.4 years were studied (67.6% male). Analysis was done on 158 sites of fracture, which were confirmed with plain radiography. 91 (57.6%) cases were suspected to have upper extremity fracture(s) and 67 (42.4%) lower ones. The most frequent sites of injury were the forearm (36.7%) in the upper limbs and the leg (27.8%) in the lower limbs. Prevalence of true positive and false negative cases for fractures detected by ultrasonography was 59 (64.8%) and 32 (35.2%) for the upper and 49 (73.1%) and 18 (26.9%) for the lower extremities, respectively. In addition, prevalence of true positive and false negative detected cases for intra-articular fractures was 24 (48%) and 26 (52%), respectively. The present study shows the moderate sensitivity (68.3%) of ultrasonography in the detection of different extremity bone fractures. Ultrasonography showed the best sensitivity in the detection of femur (100%) and humerus (76.2%) fractures, respectively. It had low sensitivity in the detection of intra-articular fractures.
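The reported sensitivities follow directly from the true-positive and false-negative counts given in the abstract (sensitivity = TP / (TP + FN)):

```python
def sensitivity(tp, fn):
    """Fraction of radiography-confirmed fractures detected by ultrasonography."""
    return tp / (tp + fn)

upper = sensitivity(59, 32)                # 59/91  ≈ 0.648, the reported 64.8%
lower = sensitivity(49, 18)                # 49/67  ≈ 0.731, the reported 73.1%
overall = sensitivity(59 + 49, 32 + 18)    # 108/158 ≈ 0.683, the reported 68.3%
intra_articular = sensitivity(24, 26)      # 24/50  = 0.48, the reported 48%
```

Note that with radiography as the reference standard and only confirmed fracture sites analyzed, sensitivity is the only accuracy measure recoverable here; specificity would require the count of ultrasound-negative sites without fracture.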

  19. Reliability of the mangled extremity severity score in combat-related upper and lower extremity injuries.

    PubMed

    Ege, Tolga; Unlu, Aytekin; Tas, Huseyin; Bek, Dogan; Turkan, Selim; Cetinkaya, Aytac

    2015-01-01

    Decision of limb salvage or amputation is generally aided by several trauma scoring systems, such as the mangled extremity severity score (MESS). However, the reliability of these injury scores in the setting of open fractures due to explosives and missiles is challenging. Mortality and morbidity of extremity trauma due to firearms are generally associated with the time delay in revascularization, the injury mechanism, the anatomy of the injured site, associated injuries, age and environmental circumstances. The purpose of this retrospective study was to evaluate the extent of extremity injuries due to ballistic missiles and to assess the reliability of the MESS in both upper and lower extremities. Between 2004 and 2014, 139 Gustilo-Anderson Type III open fractures of both the upper and lower extremities were enrolled in the study. Data for patient age, firearm type, transport time from the field to the hospital (and the method), injury severity scores, MESS scores, fracture types, amputation levels, bone fixation methods and postoperative infections and complications were retrieved from the two level-2 trauma centers' databases. Sensitivity, specificity, and positive and negative predictive values of the MESS were calculated to assess its ability to guide the decision to amputate the mangled limb. Amputation was performed in 39 extremities and limb salvage was attempted in 100 extremities. The mean follow-up time was 14.6 months (range 6-32 months). In the amputated group, the mean MESS scores for the upper and lower extremity were 8.8 (range 6-11) and 9.24 (range 6-11), respectively. In the limb salvage group, the mean MESS scores for the upper and lower extremities were 5.29 (range 4-7) and 5.19 (range 3-8), respectively. Sensitivity of the MESS in the upper and lower extremities was calculated as 80% and 79.4%, and positive predictive values as 55.55% and 83.3%, respectively. 
Specificity of the MESS for the upper and lower extremities was 84% and 86.6%; negative predictive values were calculated as 95.45% and 90.2%, respectively. The MESS is not predictive in combat-related extremity injuries, especially for scores between 6 and 8. Limb ischemia and the presence or absence of shock can be used in initial decision-making for amputation.

  20. Reliability of the mangled extremity severity score in combat-related upper and lower extremity injuries

    PubMed Central

    Ege, Tolga; Unlu, Aytekin; Tas, Huseyin; Bek, Dogan; Turkan, Selim; Cetinkaya, Aytac

    2015-01-01

    Background: The decision between limb salvage and amputation is generally aided by trauma scoring systems such as the mangled extremity severity score (MESS). However, the reliability of such injury scores in the setting of open fractures due to explosives and missiles is questionable. The mortality and morbidity of extremity trauma due to firearms are generally associated with the time delay in revascularization, injury mechanism, anatomy of the injured site, associated injuries, age, and environmental circumstances. The purpose of this retrospective study was to evaluate the extent of extremity injuries due to ballistic missiles and to assess the reliability of the mangled extremity severity score (MESS) in both upper and lower extremities. Materials and Methods: Between 2004 and 2014, 139 Gustilo-Anderson Type III open fractures of the upper and lower extremities were enrolled in the study. Data on patient age, firearm type, transport time from the field to the hospital (and the transport method), injury severity scores, MESS scores, fracture types, amputation levels, bone fixation methods, and postoperative infections and complications were retrieved from the databases of two level-2 trauma centers. Sensitivity, specificity, and positive and negative predictive values of the MESS were calculated to assess its ability to guide the amputation decision for the mangled limb. Results: Amputation was performed in 39 extremities, and limb salvage was attempted in 100 extremities. The mean follow-up time was 14.6 months (range 6–32 months). In the amputated group, the mean MESS scores for the upper and lower extremity were 8.8 (range 6–11) and 9.24 (range 6–11), respectively. In the limb salvage group, the mean MESS scores for the upper and lower extremities were 5.29 (range 4–7) and 5.19 (range 3–8), respectively. The sensitivity of the MESS in the upper and lower extremities was 80% and 79.4%, and the positive predictive values were 55.55% and 83.3%, respectively. The specificity of the MESS for the upper and lower extremities was 84% and 86.6%; the negative predictive values were 95.45% and 90.2%, respectively. Conclusion: The MESS is not predictive in combat-related extremity injuries, especially for scores between 6 and 8. Limb ischemia and the presence or absence of shock can be used in initial decision-making for amputation. PMID:26806974
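
The four screening metrics quoted in this and the preceding record come straight from a 2×2 confusion matrix. A minimal sketch in Python (the counts below are hypothetical, chosen only to illustrate the arithmetic, not taken from the study):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard screening-test metrics from 2x2 confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical counts: 27 amputations correctly flagged (TP), 7 missed (FN),
# 78 salvaged limbs correctly cleared (TN), 12 falsely flagged (FP).
m = diagnostic_metrics(tp=27, fp=12, fn=7, tn=78)
print({k: round(v, 3) for k, v in m.items()})
```

Note that PPV and NPV, unlike sensitivity and specificity, depend on how common amputation is in the cohort, which is one reason they can differ sharply between the upper and lower extremity groups.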

  1. Designing ecological climate change impact assessments to reflect key climatic drivers

    USGS Publications Warehouse

    Sofaer, Helen R.; Barsugli, Joseph J.; Jarnevich, Catherine S.; Abatzoglou, John T.; Talbert, Marian; Miller, Brian W.; Morisette, Jeffrey T.

    2017-01-01

    Identifying the climatic drivers of an ecological system is a key step in assessing its vulnerability to climate change. The climatic dimensions to which a species or system is most sensitive – such as means or extremes – can guide methodological decisions for projections of ecological impacts and vulnerabilities. However, scientific workflows for combining climate projections with ecological models have received little explicit attention. We review Global Climate Model (GCM) performance along different dimensions of change and compare frameworks for integrating GCM output into ecological models. In systems sensitive to climatological means, it is straightforward to base ecological impact assessments on mean projected changes from several GCMs. Ecological systems sensitive to climatic extremes may benefit from what we term the ‘model space’ approach: a comparison of ecological projections based on simulated climate from historical and future time periods. This approach leverages the experimental framework used in climate modeling, in which historical climate simulations serve as controls for future projections. Moreover, it can capture projected changes in the intensity and frequency of climatic extremes, rather than assuming that future means will determine future extremes. Given the recent emphasis on the ecological impacts of climatic extremes, the strategies we describe will be applicable across species and systems. We also highlight practical considerations for the selection of climate models and data products, emphasizing that the spatial resolution of the climate change signal is generally coarser than the grid cell size of downscaled climate model output. Our review illustrates how an understanding of how climate model outputs are derived and downscaled can improve the selection and application of climatic data used in ecological modeling.
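
The contrast between a mean-based ("delta") projection and the "model space" approach described above can be sketched numerically. Everything below is synthetic (a toy ecological response and Gaussian pseudo-climate, not GCM output):

```python
import random

random.seed(0)

def eco_response(temps):
    """Toy ecological impact model: fraction of days past a heat-stress threshold."""
    return sum(t > 30.0 for t in temps) / len(temps)

# Synthetic pseudo-climate, not GCM output:
hist_sim = [random.gauss(25.0, 4.0) for _ in range(3650)]
future_sim = [t + random.gauss(2.0, 1.0) for t in hist_sim]   # warmer, more variable
observed = [random.gauss(25.0, 4.0) for _ in range(3650)]

# Delta method: shift observations by the projected change in the mean only.
delta = sum(future_sim) / len(future_sim) - sum(hist_sim) / len(hist_sim)
delta_projection = eco_response([t + delta for t in observed])

# 'Model space': compare the model's own historical and future simulations, so
# projected changes in variability (hence in extremes) are retained.
model_space_change = eco_response(future_sim) - eco_response(hist_sim)
print(round(delta_projection, 3), round(model_space_change, 3))
```

Because the future simulation is both warmer and more variable, the model-space change in threshold exceedances is larger than a pure mean shift would suggest, which is the point the abstract makes about extremes.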

  2. Designing ecological climate change impact assessments to reflect key climatic drivers.

    PubMed

    Sofaer, Helen R; Barsugli, Joseph J; Jarnevich, Catherine S; Abatzoglou, John T; Talbert, Marian K; Miller, Brian W; Morisette, Jeffrey T

    2017-07-01

    Identifying the climatic drivers of an ecological system is a key step in assessing its vulnerability to climate change. The climatic dimensions to which a species or system is most sensitive - such as means or extremes - can guide methodological decisions for projections of ecological impacts and vulnerabilities. However, scientific workflows for combining climate projections with ecological models have received little explicit attention. We review Global Climate Model (GCM) performance along different dimensions of change and compare frameworks for integrating GCM output into ecological models. In systems sensitive to climatological means, it is straightforward to base ecological impact assessments on mean projected changes from several GCMs. Ecological systems sensitive to climatic extremes may benefit from what we term the 'model space' approach: a comparison of ecological projections based on simulated climate from historical and future time periods. This approach leverages the experimental framework used in climate modeling, in which historical climate simulations serve as controls for future projections. Moreover, it can capture projected changes in the intensity and frequency of climatic extremes, rather than assuming that future means will determine future extremes. Given the recent emphasis on the ecological impacts of climatic extremes, the strategies we describe will be applicable across species and systems. We also highlight practical considerations for the selection of climate models and data products, emphasizing that the spatial resolution of the climate change signal is generally coarser than the grid cell size of downscaled climate model output. Our review illustrates how an understanding of how climate model outputs are derived and downscaled can improve the selection and application of climatic data used in ecological modeling. © 2017 John Wiley & Sons Ltd.

  3. An "Extreme Makeover" of a Course in Special Education

    ERIC Educational Resources Information Center

    Nicoll-Senft, Joan M.

    2009-01-01

    Just as the popular television show "Extreme Makeover: Home Edition" targets the demolition and reconstruction of a home so that it better meets the needs of its owners, Fink's approach to integrated course design (ICD; 2003) provides higher education faculty with the tools to deconstruct and do a major remodel of their college courses. Teaching…

  4. Generalized extreme gust wind speeds distributions

    USGS Publications Warehouse

    Cheng, E.; Yeung, C.

    2002-01-01

    Since the summer of 1996, US wind engineers have used the extreme gust (or 3-s gust) as the basic wind speed for quantifying the destruction caused by extreme winds. To better understand these destructive wind forces, it is important to know the appropriate representations of extreme gust wind speeds. The purpose of this study is therefore to determine the most suitable extreme value distributions for the annual extreme gust wind speeds recorded in large selected areas. To achieve this objective, we use the generalized Pareto distribution as a diagnostic tool for determining the type of extreme gust wind speed distribution. The three-parameter generalized extreme value distribution function is thus reduced to either a Type I Gumbel, Type II Frechet, or Type III reverse Weibull distribution function for the annual extreme gust wind speeds recorded at a specific site. With consideration of the quality and homogeneity of gust wind data collected at more than 750 weather stations throughout the United States, annual extreme gust wind speeds at 143 selected stations in the contiguous United States were used in the study. © 2002 Elsevier Science Ltd. All rights reserved.
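
For the Type I (Gumbel) case named above, a fit to annual extreme gusts can be sketched with a method-of-moments estimate; the wind speeds below are invented for illustration only:

```python
import math

def gumbel_mom_fit(annual_maxima):
    """Method-of-moments fit of a Type I (Gumbel) extreme value distribution."""
    n = len(annual_maxima)
    mean = sum(annual_maxima) / n
    var = sum((x - mean) ** 2 for x in annual_maxima) / (n - 1)
    scale = math.sqrt(6.0 * var) / math.pi        # beta
    loc = mean - 0.5772156649 * scale             # mu, via the Euler-Mascheroni constant
    return loc, scale

def return_level(loc, scale, T):
    """Gust speed exceeded on average once every T years under the fitted Gumbel."""
    return loc - scale * math.log(-math.log(1.0 - 1.0 / T))

# Invented annual extreme gust speeds (m/s), for illustration only:
maxima = [31.2, 28.5, 35.1, 29.9, 33.4, 27.8, 36.0, 30.6, 32.3, 29.1]
loc, scale = gumbel_mom_fit(maxima)
print(round(return_level(loc, scale, 50), 1))     # 50-year design gust
```

In the full three-parameter GEV, a negative fitted shape parameter would instead indicate the Type III (reverse Weibull) case with a bounded upper tail, and a positive one the Type II (Frechet) case.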

  5. Satellite skill in detecting extreme episodes in near-surface air quality

    NASA Astrophysics Data System (ADS)

    Ruiz, D. J.; Prather, M. J.

    2017-12-01

    Ozone (O3) contributes to ambient air pollution, adversely affecting public health, agriculture, and ecosystems. Reliable, long-term, densely distributed surface networks are required to establish the scale, intensity and repeatability of major pollution events (designated here, in a climatological sense, as air quality extremes, AQX, as defined in Schnell's work). Regrettably, such networks are only available for North America (NA) and Europe (EU), a coverage that excludes many populated regions where the deaths associated with air pollution exposure are alarmingly high. Directly measuring surface pollutants from space without lidar is extremely difficult. Mapping of daily pollution events requires cross-track nadir scanners, and these have limited sensitivity to surface O3 levels. This work examines several years of coincident surface and OMI satellite measurements over NA-EU, in combination with a chemistry-transport model (CTM) hindcast of that period, to understand how the large-scale AQX episodes may extend into the free troposphere and thus be more amenable to satellite mapping. We show how extreme NA-EU episodes are measured from OMI and then look for such patterns over other polluted regions of the globe. We gather individual high-quality O3 surface site measurements from these other regions to check our satellite detection. Our approach with global satellite detection avoids issues associated with regional variations in seasonality, chemical regime, and data product biases, and it does not require defining a separate absolute threshold for each data product (surface site and satellite). This also enables coherent linking of the extreme events into large-scale pollution episodes whose magnitude evolves over hundreds of kilometers for several days. Tools used here include the UC Irvine CTM, which shows that much of the O3 surface variability is lost at heights above 2 km, but AQX local events are readily seen in a 0-3 km column average. 
The OMI data are taken from X. Liu's dataset using an improved algorithm for detection of tropospheric O3. Surface site observations outside NA and EU are taken from research stations where possible.

  6. Role of storms and forest practices in sedimentation of an Oregon Coast Range lake

    NASA Astrophysics Data System (ADS)

    Richardson, K.; Hatten, J. A.; Wheatcroft, R. A.; Guerrero, F. J.

    2014-12-01

    The design of better management practices in forested watersheds to face climate change and the associated increase in the frequency of extreme events requires a better understanding of watershed responses to extreme events in the past, including under management regimes. One of the most sensitive watershed processes affected is sediment yield. Lake sediments record events which occur in a watershed and provide an opportunity to examine the interaction of storms and forest management practices in the layers of the stratigraphy. We hypothesize that timber harvesting and road building since the 1900s have resulted in increases in sedimentation; however, the passage of the Oregon Forest Practices Act (OFPA) in 1972 has led to a decrease in sedimentation. Sediment cores were taken at Loon Lake in the Oregon Coast Range. The 32-m deep lake captures sediment from a catchment highly impacted by recent land use and episodic Pacific storms. We use sedimentological tools to measure changes in sediment production driven by extreme floods before settlement, during a major timber harvesting period, and after installation of forestry Best Management Practices. Quantification of changes in particle size and elemental composition (C, N, C/N) throughout the cores can elucidate changes in watershed response to extreme events, as can changes in layer thickness. Age control in the cores is being established by Cesium-137 and radiocarbon dating. Given the instrumental meteorological data and decadal climate reconstructions, we will disentangle climate-driven signals from changes in land use practices. The sediment shows distinct laminations and varying thickness of layers throughout the cores. Background deposition is composed of thin layers (<0.5 cm) of fine silts and clays, punctuated by thicker layers (3-25 cm) every 10 to 75 cm. These thick layers consist of distinctly textured units, generally fining upward. 
We interpret the thick layers in Loon Lake to be deposited by sediment-producing floods throughout much of the 1500-year lifespan of this lake. We will explore the relationship between sedimentation, land use, and climate forcing events to determine if the OFPA is having an effect on reducing sedimentation rates as a result of extreme magnitude storm events.

  7. Prediction of protein-protein interactions from amino acid sequences with ensemble extreme learning machines and principal component analysis

    PubMed Central

    2013-01-01

    Background Protein-protein interactions (PPIs) play crucial roles in the execution of various cellular processes and form the basis of biological mechanisms. Although large amounts of PPI data for different species have been generated by high-throughput experimental techniques, the PPI pairs obtained with experimental methods cover only a fraction of the complete PPI networks; furthermore, the experimental methods for identifying PPIs are both time-consuming and expensive. Hence, it is urgent and challenging to develop automated computational methods to efficiently and accurately predict PPIs. Results We present here a novel hierarchical PCA-EELM (principal component analysis-ensemble extreme learning machine) model to predict protein-protein interactions using only the information of protein sequences. In the proposed method, 11188 protein pairs retrieved from the DIP database were encoded into feature vectors by using four kinds of protein sequence information. Focusing on dimension reduction, an effective feature extraction method, PCA, was then employed to construct the most discriminative new feature set. Finally, multiple extreme learning machines were trained and then aggregated into a consensus classifier by majority voting. The ensembling of extreme learning machines removes the dependence of results on initial random weights and improves the prediction performance. Conclusions When performed on the PPI data of Saccharomyces cerevisiae, the proposed method achieved 87.00% prediction accuracy with 86.15% sensitivity at a precision of 87.59%. Extensive experiments were performed to compare our method with a state-of-the-art technique, the Support Vector Machine (SVM). Experimental results demonstrate that the proposed PCA-EELM outperforms the SVM method under 5-fold cross-validation. Besides, PCA-EELM performs faster than the PCA-SVM based method. Consequently, the proposed approach can be considered a promising and powerful new tool for predicting PPIs with excellent performance in less time. PMID:23815620
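
The PCA-EELM pipeline described in the abstract (PCA for dimension reduction, then an ensemble of extreme learning machines combined by voting) can be sketched as follows. The data here are random stand-ins for the sequence-derived feature vectors, and the hidden-layer size and ensemble size are arbitrary choices, not the paper's settings:

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_fit_transform(X, k):
    """Project X onto its top-k principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

def elm_train(Z, y, hidden=40):
    """One extreme learning machine: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(Z.shape[1], hidden))
    b = rng.normal(size=hidden)
    beta = np.linalg.pinv(np.tanh(Z @ W + b)) @ y   # closed-form solve, no backprop
    return W, b, beta

def elm_predict(Z, W, b, beta):
    return np.tanh(Z @ W + b) @ beta

# Random stand-ins for sequence-derived feature vectors (not DIP data):
X = rng.normal(size=(200, 20))
X[:, :2] *= 3.0                                   # give two features most of the variance
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)    # toy interaction labels

Z = pca_fit_transform(X, k=10)                    # PCA dimension reduction
models = [elm_train(Z, y) for _ in range(7)]      # ensemble of 7 ELMs
votes = np.sign(sum(elm_predict(Z, *m) for m in models))  # majority vote
print("training accuracy:", (votes == y).mean())
```

Because each ELM draws different random hidden weights, averaging their votes is what removes the dependence on any single initialization, as the abstract notes.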

  8. A century of climate and ecosystem change in Western Montana: What do temperature trends portend?

    USGS Publications Warehouse

    Pederson, G.T.; Graumlich, L.J.; Fagre, D.B.; Kipfer, T.; Muhlfeld, C.C.

    2010-01-01

    The physical science linking human-induced increases in greenhouse gases to the warming of the global climate system is well established, but the implications of this warming for ecosystem processes and services at regional scales are still poorly understood. Thus, the objectives of this work were to: (1) describe rates of change in temperature averages and extremes for western Montana, a region containing sensitive resources and ecosystems, (2) investigate associations between Montana temperature change and hemispheric and global temperature change, (3) provide climate analysis tools for land and resource managers responsible for researching and maintaining renewable resources, habitat, and threatened/endangered species, and (4) integrate our findings into a more general assessment of climate impacts on ecosystem processes and services over the past century. Over 100 years of daily and monthly temperature data collected in western Montana, USA are analyzed for long-term changes in seasonal averages and daily extremes. In particular, variability and trends in temperature above or below ecologically and socially meaningful thresholds within this region (e.g., -17.8°C (0°F), 0°C (32°F), and 32.2°C (90°F)) are assessed. The daily temperature time series reveal that extremely cold days (≤ -17.8°C) terminate on average 20 days earlier and decline in number, whereas extremely hot days (≥ 32.2°C) show a three-fold increase in number and a 24-day increase in the seasonal window during which they occur. Results show that regionally important thresholds have been exceeded, the most recent of which include the timing and number of the 0°C freeze/thaw temperatures during spring and fall. Finally, we close with a discussion of the implications for Montana's ecosystems. Special attention is given to critical processes that respond non-linearly as temperatures exceed critical thresholds and have positive feedbacks that amplify the changes. © Springer Science+Business Media B.V. 2009.
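
The threshold-exceedance statistics used in the study (number of days beyond a threshold, and the seasonal window in which they fall) can be sketched as a small helper; the daily series below is synthetic:

```python
from datetime import date, timedelta

def threshold_stats(daily, threshold, above=True):
    """Count days beyond a temperature threshold and the span of days-of-year
    (the seasonal window) in which those exceedances occur."""
    hits = [d for d, t in daily if (t >= threshold if above else t <= threshold)]
    if not hits:
        return {"count": 0, "window_days": 0}
    doys = [d.timetuple().tm_yday for d in hits]
    return {"count": len(hits), "window_days": max(doys) - min(doys) + 1}

# Synthetic daily mean temperatures (degC): 20 degC, spiking to 35 every 10th day.
start = date(2000, 6, 1)
series = [(start + timedelta(days=i), 20.0 + 15.0 * (i % 10 == 0)) for i in range(90)]
print(threshold_stats(series, 32.2, above=True))
```

Comparing these two statistics between early and late decades of a record is one way to express trends like the 24-day widening of the hot-day window reported above.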

  9. Use of SCALE Continuous-Energy Monte Carlo Tools for Eigenvalue Sensitivity Coefficient Calculations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perfetti, Christopher M; Rearden, Bradley T

    2013-01-01

    The TSUNAMI code within the SCALE code system makes use of eigenvalue sensitivity coefficients for an extensive number of criticality safety applications, such as quantifying the data-induced uncertainty in the eigenvalue of critical systems, assessing the neutronic similarity between different critical systems, and guiding nuclear data adjustment studies. The need to model geometrically complex systems with improved fidelity and the desire to extend TSUNAMI analysis to advanced applications has motivated the development of a methodology for calculating sensitivity coefficients in continuous-energy (CE) Monte Carlo applications. The CLUTCH and Iterated Fission Probability (IFP) eigenvalue sensitivity methods were recently implemented in the CE KENO framework to generate the capability for TSUNAMI-3D to perform eigenvalue sensitivity calculations in continuous-energy applications. This work explores the improvements in accuracy that can be gained in eigenvalue and eigenvalue sensitivity calculations through the use of the SCALE CE KENO and CE TSUNAMI continuous-energy Monte Carlo tools as compared to multigroup tools. The CE KENO and CE TSUNAMI tools were used to analyze two difficult models of critical benchmarks, and produced eigenvalue and eigenvalue sensitivity coefficient results that showed a marked improvement in accuracy. The CLUTCH sensitivity method in particular excelled in terms of efficiency and computational memory requirements.

  10. The Psychological Essence of the Child Prodigy Phenomenon: Sensitive Periods and Cognitive Experience.

    ERIC Educational Resources Information Center

    Shavinina, Larisa V.

    1999-01-01

    Examination of the child prodigy phenomenon suggests it is a result of extremely accelerated mental development during sensitive periods that leads to the rapid growth of a child's cognitive resources and their construction into specific exceptional achievements. (Author/DB)

  11. Relationship between sensitizer concentration and resist performance of chemically amplified extreme ultraviolet resists in sub-10 nm half-pitch resolution region

    NASA Astrophysics Data System (ADS)

    Kozawa, Takahiro; Santillan, Julius Joseph; Itani, Toshiro

    2017-01-01

    The development of lithography processes with sub-10 nm resolution is challenging. Stochastic phenomena such as line width roughness (LWR) are significant problems. In this study, the feasibility of sub-10 nm fabrication using chemically amplified extreme ultraviolet resists with photodecomposable quenchers was investigated from the viewpoint of the suppression of LWR. The relationship between sensitizer concentration (the sum of the acid generator and photodecomposable quencher concentrations) and resist performance was clarified using a simulation based on the sensitization and reaction mechanisms of chemically amplified resists. For a total sensitizer concentration of 0.5 nm⁻³ and an effective reaction radius for deprotection of 0.1 nm, the reachable half-pitch while maintaining 10% critical dimension (CD) LWR was 11 nm. The reachable half-pitch was 7 nm for 20% CD LWR. An increase in the effective reaction radius is required to realize sub-10 nm fabrication with 10% CD LWR.

  12. High-resolution crystal spectrometer for the 10-60 Å extreme ultraviolet region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Beiersdorfer, P.; Brown, G.V.; Goddard, R.

    2004-10-01

    A vacuum crystal spectrometer with nominal resolving power approaching 1000 is described for measuring emission lines with wavelengths in the extreme ultraviolet region up to 60 Å. The instrument utilizes a flat octadecyl hydrogen maleate crystal and a thin-window 1D position-sensitive gas proportional detector. This detector employs a 1-μm-thick 100×8 mm² aluminized polyimide window and operates at one atmosphere pressure. The spectrometer has been implemented on the Livermore electron beam ion traps. The performance of the instrument is illustrated in measurements of the newly discovered magnetic field-sensitive line in Ar⁸⁺.

  13. An overview of the extreme ultraviolet explorer and its scientific program

    NASA Technical Reports Server (NTRS)

    Malina, Roger F.; Finley, David S.; Jelinsky, Patrick; Vallerga, John; Bowyer, Stuart

    1987-01-01

    NASA's Extreme Ultraviolet Explorer (EUVE) will carry out an all-sky survey from 8 to 90 nm in four bandpasses; the limiting sensitivity will be 2 to 3 orders of magnitude fainter than the hot white dwarf HZ 43. A deep survey will also be carried out along the ecliptic, with a limiting sensitivity 1 to 2 orders of magnitude fainter than the all-sky survey in the bandpass from 8 to 50 nm. The payload also includes a spectrometer which will be used to observe the brighter sources found in the surveys with a spectral resolution of 1 to 2 Å.

  14. Imaging of upper extremity stress fractures in the athlete.

    PubMed

    Anderson, Mark W

    2006-07-01

    Although it is much less common than injuries in the lower extremities, an upper extremity stress injury can have a significant impact on an athlete. If an accurate and timely diagnosis is to be made, the clinician must have a high index of suspicion of a stress fracture in any athlete who is involved in a throwing, weightlifting, or upper extremity weight-bearing sport and presents with chronic pain in the upper extremity. Imaging should play an integral role in the work-up of these patients; if initial radiographs are unrevealing, further cross-sectional imaging should be strongly considered. Although a three-phase bone scan is highly sensitive in this regard, MRI has become the study of choice at most centers.

  15. An improvement of LLNA:DA to assess the skin sensitization potential of chemicals.

    PubMed

    Zhang, Hongwei; Shi, Ying; Wang, Chao; Zhao, Kangfeng; Zhang, Shaoping; Wei, Lan; Dong, Li; Gu, Wen; Xu, Yongjun; Ruan, Hongjie; Zhi, Hong; Yang, Xiaoyan

    2017-01-01

    We developed a modified local lymph node assay based on ATP (LLNA:DA), termed the Two-Stage LLNA:DA, to further reduce the number of animals used in the identification of sensitizers. For the Two-Stage LLNA:DA procedure, 13 chemicals ranging from non-sensitizers to extreme sensitizers were selected. The first stage used a reduced LLNA:DA (rLLNA:DA) to screen for potentially sensitizing chemicals. The second stage used LLNA:DA based on OECD 442 (A) to classify the potential sensitizers identified in the first stage. In the first stage, the SIs of methyl methacrylate, salicylic acid, methyl salicylate, ethyl salicylate, isopropanol and propanediol were below 1.8, so these did not need to be tested in the second stage. The others were further tested by LLNA:DA. In the second stage, sodium lauryl sulphate and xylene were classified as weak sensitizers; α-hexyl cinnamic aldehyde and eugenol were moderate sensitizers; benzalkonium chloride and glyoxal were strong sensitizers; and phthalic anhydride was an extreme sensitizer. The 9/9, 11/12, 10/11, and 8/13 (positive or negative only) categorizations of the Two-Stage LLNA:DA were consistent with those from the other methods (LLNA, LLNA:DA, GPMT/BT and HMT/HPTA), suggesting that the Two-Stage LLNA:DA has a high coincidence rate with reported data. In conclusion, the Two-Stage LLNA:DA is in line with the "3R" principles and can serve as a modification of LLNA:DA, but it needs further study.
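
The two-stage decision logic can be sketched as follows. The SI cutoff of 1.8 is taken from the abstract, but the potency cutoffs are placeholders invented for illustration, not the assay's published values:

```python
def two_stage_screen(si_stage1, si_stage2=None, cutoff=1.8,
                     potency_cutoffs=((2.5, "weak"), (6.0, "moderate"),
                                      (10.0, "strong"))):
    """Two-stage screen: stage 1 filters out non-sensitizers (SI < cutoff);
    stage 2 bins the remaining chemicals by stimulation index. The potency
    cutoffs here are hypothetical placeholders, not published values."""
    if si_stage1 < cutoff:
        return "non-sensitizer"
    if si_stage2 is None:
        return "potential sensitizer (needs stage 2)"
    for limit, label in potency_cutoffs:
        if si_stage2 < limit:
            return f"{label} sensitizer"
    return "extreme sensitizer"

print(two_stage_screen(1.2))                  # screened out at stage 1
print(two_stage_screen(2.4, si_stage2=12.0))  # classified at stage 2
```

The animal-saving point of the design is visible in the control flow: chemicals that fail the stage-1 cutoff never reach the full second-stage assay.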

  16. Fourier fringe analysis and its application to metrology of extreme physical phenomena: a review [Invited].

    PubMed

    Takeda, Mitsuo

    2013-01-01

    The paper reviews a technique for fringe analysis referred to as Fourier fringe analysis (FFA) or the Fourier transform method, with a particular focus on its application to metrology of extreme physical phenomena. Examples include the measurement of extremely small magnetic fields with subfluxon sensitivity by electron wave interferometry, subnanometer wavefront evaluation of projection optics for extreme UV lithography, the detection of sub-Ångstrom distortion of a crystal lattice, and the measurement of ultrashort optical pulses in the femtosecond to attosecond range, which show how the advantages of FFA are exploited in these cutting-edge applications.

  17. How sensitive are extreme precipitation events on the west coast of Norway to changes in the Sea Surface Temperature?

    NASA Astrophysics Data System (ADS)

    Sandvik, M. I.; Sorteberg, A.

    2013-12-01

    Studies (RegClim, 2005; Caroletti & Barstad, 2010; Bengtsson et al., 2009; Trenberth, 1999; Pall et al., 2007) indicate an increased risk of more frequent precipitation extremes in a warming world, which may result in more frequent flooding, avalanches and landslides. Thus, a better understanding of how processes influence extreme precipitation events could lead to a better representation in the models used in both research and weather forecasting. The Weather Research and Forecasting (WRF) model was run for 26 extreme precipitation events on the west coast of Norway between 1980 and 2011. The goal of the study was to see how sensitive the intensity and distribution of precipitation in these case studies were to a warmer/colder Atlantic Ocean, with a uniform change of ±2°C. To ensure that the large-scale system remained the same when the Sea Surface Temperature (SST) was changed, spectral nudging was introduced. To avoid the need for a convective scheme, and the uncertainties it brings, a nested domain with a 2 km grid resolution was used over Southern Norway. WRF generally underestimated the daily precipitation. The case studies were divided into two clusters, depending on the wind direction towards the coast, to search for patterns within each cluster. Using the ensemble mean, the percentage change between the control run and the two sensitivity runs differed between the two clusters.

  18. Overview of EPA tools for supporting local- and regional-scale decision makers addressing energy and environmental issues

    EPA Science Inventory

    EPA’s Office of Research and Development (ORD) has been developing tools and illustrative case studies for decision makers in local and regional authorities who are facing challenges of establishing resilience to extreme weather events, aging built environment and infrastru...

  19. Manipulations of extracellular Loop 2 in α1 GlyR ultra-sensitive ethanol receptors (USERs) enhance receptor sensitivity to isoflurane, ethanol, and lidocaine, but not propofol

    PubMed Central

    Naito, Anna; Muchhala, Karan H.; Trang, Janice; Asatryan, Liana; Trudell, James R.; Homanics, Gregg E.; Alkana, Ronald L.; Davies, Daryl L.

    2015-01-01

    We recently developed Ultra-Sensitive Ethanol Receptors (USERs) as a novel tool for investigation of single receptor subunit populations sensitized to extremely low ethanol concentrations that do not affect other receptors in the nervous system. To this end, we found that mutations within the extracellular Loop 2 region of glycine receptors (GlyRs) and γ-aminobutyric acid type A receptors (GABAARs) can significantly increase receptor sensitivity to micromolar concentrations of ethanol, resulting in up to a 100-fold increase in ethanol sensitivity relative to wild type (WT) receptors. The current study investigated: 1) Whether structural manipulations of Loop 2 in α1 GlyRs could similarly increase receptor sensitivity to other anesthetics; and 2) If mutations exclusive to the C-terminal end of Loop 2 are sufficient to impart these changes. We expressed α1 GlyR USERs in Xenopus oocytes and tested the effects of three classes of anesthetics, isoflurane (volatile), propofol (intravenous), and lidocaine (local), known to enhance glycine-induced chloride currents using two-electrode voltage clamp electrophysiology. Loop 2 mutations produced a significant 10-fold increase in isoflurane and lidocaine sensitivity, but no increase in propofol sensitivity compared to WT α1 GlyRs. Interestingly, we also found that structural manipulations in the C-terminal end of Loop 2 were sufficient and selective for α1 GlyR modulation by ethanol, isoflurane, and lidocaine. These studies are the first to report the extracellular region of α1 GlyRs as a site of lidocaine action. Overall, the findings suggest that Loop 2 of α1 GlyRs is a key region that mediates isoflurane and lidocaine modulation. Moreover, the results identify important amino acids in Loop 2 that regulate isoflurane, lidocaine, and ethanol action. 
Collectively, these data indicate the commonality of the sites for isoflurane, lidocaine, and ethanol action, and the structural requirements for allosteric modulation on α1 GlyRs within the extracellular Loop 2 region. PMID:25827497

  20. Towards a General Theory of Extremes for Observables of Chaotic Dynamical Systems.

    PubMed

    Lucarini, Valerio; Faranda, Davide; Wouters, Jeroen; Kuna, Tobias

    2014-01-01

    In this paper we provide a connection between the geometrical properties of the attractor of a chaotic dynamical system and the distribution of extreme values. We show that the extremes of so-called physical observables are distributed according to the classical generalised Pareto distribution and derive explicit expressions for the scaling and the shape parameter. In particular, we derive that the shape parameter does not depend on the chosen observables, but only on the partial dimensions of the invariant measure on the stable, unstable, and neutral manifolds. The shape parameter is negative and is close to zero when high-dimensional systems are considered. This result agrees with what was derived recently using the generalized extreme value approach. Combining the results obtained using such physical observables and the properties of the extremes of distance observables, it is possible to derive estimates of the partial dimensions of the attractor along the stable and the unstable directions of the flow. Moreover, by writing the shape parameter in terms of moments of the extremes of the considered observable and by using linear response theory, we relate the sensitivity to perturbations of the shape parameter to the sensitivity of the moments, of the partial dimensions, and of the Kaplan-Yorke dimension of the attractor. Preliminary numerical investigations provide encouraging results on the applicability of the theory presented here. The results presented here do not apply for all combinations of Axiom A systems and observables, but the breakdown seems to be related to very special geometrical configurations.
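
The peaks-over-threshold view underlying the generalized Pareto result can be sketched with a simple method-of-moments fit; exponential pseudo-data are used because their extremes should give a shape parameter near zero:

```python
import random

def gpd_mom_fit(data, threshold):
    """Method-of-moments fit of a generalized Pareto distribution (GPD) to
    exceedances over a high threshold (peaks-over-threshold)."""
    exc = [x - threshold for x in data if x > threshold]
    n = len(exc)
    m = sum(exc) / n
    v = sum((e - m) ** 2 for e in exc) / (n - 1)
    shape = 0.5 * (1.0 - m * m / v)      # xi: its sign selects the tail type
    scale = 0.5 * m * (m * m / v + 1.0)  # sigma
    return shape, scale

random.seed(1)
# Exponential tails lie in the Gumbel domain of attraction, so xi should be ~0:
sample = [random.expovariate(1.0) for _ in range(20000)]
shape, scale = gpd_mom_fit(sample, threshold=2.0)
print(round(shape, 2), round(scale, 2))
```

The estimate of a slightly negative, near-zero shape parameter is exactly the regime the paper describes for physical observables of high-dimensional chaotic systems.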

  1. Towards a General Theory of Extremes for Observables of Chaotic Dynamical Systems

    NASA Astrophysics Data System (ADS)

    Lucarini, Valerio; Faranda, Davide; Wouters, Jeroen; Kuna, Tobias

    2014-02-01

    In this paper we provide a connection between the geometrical properties of the attractor of a chaotic dynamical system and the distribution of extreme values. We show that the extremes of so-called physical observables are distributed according to the classical generalised Pareto distribution and derive explicit expressions for the scaling and the shape parameter. In particular, we derive that the shape parameter does not depend on the chosen observables, but only on the partial dimensions of the invariant measure on the stable, unstable, and neutral manifolds. The shape parameter is negative and is close to zero when high-dimensional systems are considered. This result agrees with what was derived recently using the generalized extreme value approach. Combining the results obtained using such physical observables and the properties of the extremes of distance observables, it is possible to derive estimates of the partial dimensions of the attractor along the stable and the unstable directions of the flow. Moreover, by writing the shape parameter in terms of moments of the extremes of the considered observable and by using linear response theory, we relate the sensitivity to perturbations of the shape parameter to the sensitivity of the moments, of the partial dimensions, and of the Kaplan-Yorke dimension of the attractor. Preliminary numerical investigations provide encouraging results on the applicability of the theory presented here. The results presented here do not apply for all combinations of Axiom A systems and observables, but the breakdown seems to be related to very special geometrical configurations.
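The peaks-over-threshold idea behind the generalised Pareto result above can be illustrated numerically. This is a hedged sketch, not the paper's method: it fits a GPD to threshold exceedances of a synthetic Gaussian "observable" with SciPy; the series, threshold level, and sample size are assumptions chosen for demonstration only.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(0)

# Synthetic "observable" time series. Gaussian data are used purely for
# illustration; their exceedances over a high threshold are well
# approximated by a GPD with a shape parameter close to zero.
series = rng.standard_normal(200_000)

threshold = np.quantile(series, 0.99)            # high threshold
excesses = series[series > threshold] - threshold

# Fit the GPD to the threshold excesses (location fixed at 0).
shape, loc, scale = genpareto.fit(excesses, floc=0)

print(f"shape (xi) = {shape:.3f}, scale = {scale:.3f}")
```

For an observable of a chaotic system one would replace `series` with the time series of the observable along a trajectory; the fitted shape parameter is the quantity the abstract relates to the partial dimensions of the invariant measure.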

  2. Lateral position detection and control for friction stir systems

    DOEpatents

    Fleming, Paul; Lammlein, David H.; Cook, George E.; Wilkes, Don Mitchell; Strauss, Alvin M.; Delapp, David R.; Hartman, Daniel A.

    2012-06-05

    An apparatus and computer program are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.

  3. Lateral position detection and control for friction stir systems

    DOEpatents

    Fleming, Paul [Boulder, CO; Lammlein, David H [Houston, TX; Cook, George E [Brentwood, TN; Wilkes, Don Mitchell [Nashville, TN; Strauss, Alvin M [Nashville, TN; Delapp, David R [Ashland City, TN; Hartman, Daniel A [Fairhope, AL

    2011-11-08

    Friction stir methods are disclosed for processing at least one workpiece using a rotary tool with rotating member for contacting and processing the workpiece. The methods include oscillating the rotary tool laterally with respect to a selected propagation path for the rotating member with respect to the workpiece to define an oscillation path for the rotating member. The methods further include obtaining force signals or parameters related to the force experienced by the rotary tool at least while the rotating member is disposed at the extremes of the oscillation. The force signals or parameters associated with the extremes can then be analyzed to determine a lateral position of the selected path with respect to a target path and a lateral offset value can be determined based on the lateral position. The lateral distance between the selected path and the target path can be decreased based on the lateral offset value.

  4. Comparing the performance of circulating cathodic antigen and Kato-Katz techniques in evaluating Schistosoma mansoni infection in areas with low prevalence in selected counties of Kenya: a cross-sectional study.

    PubMed

    Okoyo, Collins; Simiyu, Elses; Njenga, Sammy M; Mwandawiro, Charles

    2018-04-11

    The Kato-Katz technique has been the mainstay test for Schistosoma mansoni diagnosis in endemic areas. However, recent studies have documented its poor sensitivity in evaluating S. mansoni infection, especially in areas with lower rates of transmission. It is the primary diagnostic tool for monitoring the impact of the Kenya national school-based deworming program on infection transmission, but there is a need to consider a more sensitive technique as prevalence declines. Therefore, this study explored the relationship between results of the stool-based Kato-Katz technique and the urine-based point-of-care circulating cathodic antigen (POC-CCA) test, with a view to informing the program's decision on changing from Kato-Katz to the POC-CCA test. We used two cross-sectional surveys conducted pre- and post-mass drug administration (MDA) with praziquantel in a representative random sample of children from 18 schools across 11 counties. A total of 1944 children were randomly sampled for the study. Stool and urine samples were tested for S. mansoni infection using the Kato-Katz and POC-CCA methods, respectively. S. mansoni prevalence for each technique was calculated, and 95% confidence intervals were obtained using a binomial regression model. Specificity (Sp) and sensitivity (Sn) were determined using 2 × 2 contingency tables and compared using McNemar's chi-square test. A total of 1899 and 1878 children were surveyed at pre- and post-treatment, respectively. S. mansoni infection prevalence was 26.5 and 21.4% at pre- and post-treatment, respectively, using the POC-CCA test, and 4.9 and 1.5%, respectively, using the Kato-Katz technique. Taking POC-CCA as the gold standard, Kato-Katz was found to have significantly lower sensitivity both at pre- and post-treatment (Sn = 12.5% and Sn = 5.2%, respectively; McNemar's test χ² = 782.0, p < 0.001).
Overall, the results showed slight/poor agreement between the two methods (kappa index k = 0.11, p < 0.001; inter-rater agreement = 77.1%). The results showed the POC-CCA technique to be an effective, sensitive and accurate screening tool for Schistosoma mansoni infection in areas of low prevalence; it was up to 14-fold more accurate than Kato-Katz, which had extremely inadequate sensitivity. We recommend the use of POC-CCA alongside Kato-Katz examinations by schistosomiasis control programs in low-prevalence areas.
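The 2 × 2 comparison of two diagnostic tests described above can be sketched numerically. The counts below are illustrative, not the study's data; the formulas for sensitivity, specificity, Cohen's kappa, and McNemar's chi-square are the standard ones.

```python
# Agreement of an index test against a reference test in a 2x2 table.
# Counts are made-up for illustration:
#                 reference +   reference -
# index test +        a             b
# index test -        c             d
a, b, c, d = 40, 15, 12, 133
n = a + b + c + d

sensitivity = a / (a + c)   # true positives / all reference-positives
specificity = d / (b + d)   # true negatives / all reference-negatives

# Cohen's kappa: agreement corrected for chance agreement.
p_observed = (a + d) / n
p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n**2
kappa = (p_observed - p_expected) / (1 - p_expected)

# McNemar's chi-square uses only the discordant cells b and c.
mcnemar_chi2 = (b - c) ** 2 / (b + c)

print(f"Sn={sensitivity:.3f} Sp={specificity:.3f} "
      f"kappa={kappa:.3f} McNemar chi2={mcnemar_chi2:.3f}")
```

With real survey data, `a`..`d` would come from cross-tabulating the per-child Kato-Katz and POC-CCA results.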

  5. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, F.

    2016-12-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, we present the analytical situation, the development of a management tool and the implementation of a groundwater management concept. The central tool is a coupled water budget-groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes account of sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  6. Extreme groundwater levels caused by extreme weather conditions - the highest ever measured groundwater levels in Middle Germany and their management

    NASA Astrophysics Data System (ADS)

    Reinstorf, Frido; Kramer, Stefanie; Koch, Thomas; Seifert, Sven; Monninkhoff, Bertram; Pfützner, Bernd

    2017-04-01

    Extreme weather conditions during the years 2009-2011, in combination with changes in regional water management and possible impacts of climate change, led to maximum groundwater levels in large areas of Germany in 2011. This resulted in extensive waterlogging, especially in urban areas near rivers, where it caused huge problems for buildings and infrastructure. The acute situation still exists in many areas and requires the development of solution concepts. Taking the example of the Elbe-Saale region in the Federal State of Saxony-Anhalt, where a pilot research project was carried out, we present the analytical situation, the development of a management tool and the implementation of a groundwater management concept. The central tool is a coupled water budget-groundwater flow model. In combination with sophisticated multi-scale parameter estimation, a high-resolution groundwater level simulation was carried out. A decision support process with very intensive stakeholder interaction, combined with high-resolution simulations, enables the development of a management concept for extreme groundwater situations that takes account of sustainable and environmentally sound solutions, mainly on the basis of passive measures.

  7. Hough transform for clustered microcalcifications detection in full-field digital mammograms

    NASA Astrophysics Data System (ADS)

    Fanizzi, A.; Basile, T. M. A.; Losurdo, L.; Amoroso, N.; Bellotti, R.; Bottigli, U.; Dentamaro, R.; Didonna, V.; Fausto, A.; Massafra, R.; Moschetta, M.; Tamborra, P.; Tangaro, S.; La Forgia, D.

    2017-09-01

    Many screening programs use mammography as the principal diagnostic tool for detecting breast cancer at a very early stage. Despite the efficacy of mammograms in highlighting breast diseases, the detection of some lesions remains difficult for radiologists. In particular, the extremely minute and elongated salt-like particles of microcalcifications are sometimes no larger than 0.1 mm and represent approximately half of all cancers detected by means of mammograms. Hence the need for automatic tools able to support radiologists in their work. Here, we propose a computer-assisted diagnostic tool to support radiologists in identifying microcalcifications in full (native) digital mammographic images. The proposed CAD system consists of a pre-processing step, which improves contrast and reduces noise by applying the Sobel edge detection algorithm and a Gaussian filter, followed by a microcalcification detection step that exploits the circular Hough transform. The procedure's performance was tested on 200 images from the Breast Cancer Digital Repository (BCDR), a publicly available database. The automatically detected clusters of microcalcifications were evaluated by skilled radiologists, who assessed the validity of the correctly identified regions of interest as well as the system error in cases of missed clustered microcalcifications. The system's performance was evaluated in terms of sensitivity and false positives per image (FPi) rate, and the model was able to accurately predict the microcalcification clusters, with results (sensitivity = 91.78% and FPi rate = 3.99) that compare favorably with state-of-the-art approaches.
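A minimal sketch of the circular Hough transform step described above, assuming a known radius and a synthetic edge map. The real CAD pipeline also applies Sobel and Gaussian pre-processing and searches a range of radii on mammograms; here each edge pixel simply votes for all candidate circle centres.

```python
import numpy as np

def hough_circle_fixed_radius(edge_img, radius, n_angles=360):
    """Accumulate votes for circle centres at a known radius.

    Each edge pixel votes for every point that could be the centre of a
    circle of the given radius passing through it; the accumulator peak
    marks the most likely centre.
    """
    h, w = edge_img.shape
    acc = np.zeros((h, w), dtype=np.int32)
    ys, xs = np.nonzero(edge_img)
    thetas = np.linspace(0, 2 * np.pi, n_angles, endpoint=False)
    for y, x in zip(ys, xs):
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return acc

# Synthetic edge map: a single circle of radius 20 centred at (50, 60).
img = np.zeros((100, 120), dtype=bool)
t = np.linspace(0, 2 * np.pi, 500)
img[np.round(50 + 20 * np.sin(t)).astype(int),
    np.round(60 + 20 * np.cos(t)).astype(int)] = True

acc = hough_circle_fixed_radius(img, radius=20)
cy, cx = np.unravel_index(acc.argmax(), acc.shape)
print(cy, cx)  # detected centre, near (50, 60)
```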

  8. Sliding into happiness: A new tool for measuring affective responses to words

    PubMed Central

    Warriner, Amy Beth; Shore, David I.; Schmidt, Louis A.; Imbault, Constance L.; Kuperman, Victor

    2016-01-01

    Reliable measurement of affective responses is critical for research into human emotion. Affective evaluation of words is most commonly gauged on multiple dimensions—including valence (positivity) and arousal—using a rating scale. Despite its popularity, this scale is open to criticism: it generates ordinal data that are often misinterpreted as interval, it does not provide the fine resolution that recent theoretical accounts of emotion deem essential, and its extremes may not be properly calibrated. In five experiments, we introduce a new slider tool for affective evaluation of words on a continuous, well-calibrated and high-resolution scale. In Experiment 1, participants were shown a word and asked to move a manikin representing themselves closer to or farther away from the word. The manikin’s distance from the word strongly correlated with the word’s valence. In Experiment 2, individual differences in shyness and sociability elicited reliable differences in distance from the words. Experiment 3 validated the results of Experiments 1 and 2 using a demographically more diverse population of responders. Experiment 4 (along with Experiment 2) suggested that task demand is not a potential cause of scale recalibration. Finally, in Experiment 5, men and women placed a manikin closer to or farther from words that showed sex differences in valence, highlighting the sensitivity of this measure to group differences. These findings shed new light on interactions among affect, language, and individual differences, and demonstrate the utility of a new tool for measuring word affect. PMID:28252996

  9. The East London glaucoma prediction score: web-based validation of glaucoma risk screening tool

    PubMed Central

    Stephen, Cook; Benjamin, Longo-Mbenza

    2013-01-01

    AIM It is difficult for optometrists and general practitioners to know which patients are at risk of glaucoma. The East London glaucoma prediction score (ELGPS) is a web-based risk calculator developed to determine glaucoma risk at the time of screening. Multiple risk factors that are available in a low-tech environment are assessed to provide a risk assessment. This is extremely useful in settings where access to specialist care is difficult. Use of the calculator is educational. It is a free web-based service, and data capture is user specific. METHOD The scoring system is a web-based questionnaire that captures and subsequently calculates the relative risk of the presence of glaucoma at the time of screening. Three categories of patient are described: Unlikely to have Glaucoma; Glaucoma Suspect; and Glaucoma. A case review methodology of patients with a known diagnosis is employed to validate the calculator's risk assessment. RESULTS Data from the records of 400 patients with an established diagnosis have been captured and used to validate the screening tool. The website reports that the calculated diagnosis correlates with the actual diagnosis 82% of the time. Biostatistical analysis showed: sensitivity = 88%; positive predictive value = 97%; specificity = 75%. CONCLUSION Analysis of the first 400 patients validates the web-based screening tool as a good method of screening the at-risk population. The validation is ongoing. The web-based format will allow more widespread recruitment across different geographic, population and personnel variables. PMID:23550097

  10. The parser generator as a general purpose tool

    NASA Technical Reports Server (NTRS)

    Noonan, R. E.; Collins, W. R.

    1985-01-01

    The parser generator has proven to be an extremely useful, general-purpose tool. It can be used effectively by programmers having only a knowledge of grammars and no training at all in the theory of formal parsing. Some of the application areas in which a table-driven parser can be used include interactive query languages, menu systems, translators, and programming support tools. Each of these is illustrated by an example grammar.
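A table-driven parser of the kind described can be sketched as a tiny LL(1) recognizer. The toy grammar below (strings of n 'a's followed by n 'b's) is an assumption for illustration, not one of the paper's example grammars; a parser generator would produce the `TABLE` automatically from the grammar.

```python
# Table-driven LL(1) recognizer for the toy grammar  S -> 'a' S 'b' | epsilon.
# The parse table maps (nonterminal, lookahead) to the production to expand.
TABLE = {
    ('S', 'a'): ['a', 'S', 'b'],   # expand S -> a S b on lookahead 'a'
    ('S', 'b'): [],                # expand S -> epsilon on lookahead 'b'
    ('S', '$'): [],                # expand S -> epsilon at end of input
}

def parse(tokens):
    """Return True iff the token string belongs to the language."""
    stack = ['$', 'S']             # '$' marks the bottom / end of input
    tokens = list(tokens) + ['$']
    pos = 0
    while stack:
        top = stack.pop()
        look = tokens[pos]
        if top == look:            # terminal (or '$') matches the input
            pos += 1
        elif (top, look) in TABLE: # nonterminal: consult the parse table
            stack.extend(reversed(TABLE[(top, look)]))
        else:                      # no table entry: syntax error
            return False
    return pos == len(tokens)

print(parse("aabb"), parse("aab"))  # -> True False
```

The same driver loop works for any LL(1) grammar; only `TABLE` changes, which is exactly why the generated-table approach makes the parser reusable as a general-purpose tool.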

  11. Mangrove expansion and contraction at a poleward range limit: climate extremes and land-ocean temperature gradients.

    PubMed

    Osland, Michael J; Day, Richard H; Hall, Courtney T; Brumfield, Marisa D; Dugas, Jason L; Jones, William R

    2017-01-01

    Within the context of climate change, there is a pressing need to better understand the ecological implications of changes in the frequency and intensity of climate extremes. Along subtropical coasts, less frequent and warmer freeze events are expected to permit freeze-sensitive mangrove forests to expand poleward and displace freeze-tolerant salt marshes. Here, our aim was to better understand the drivers of poleward mangrove migration by quantifying spatiotemporal patterns in mangrove range expansion and contraction across land-ocean temperature gradients. Our work was conducted in a freeze-sensitive mangrove-marsh transition zone that spans a land-ocean temperature gradient in one of the world's most wetland-rich regions (Mississippi River Deltaic Plain; Louisiana, USA). We used historical air temperature data (1893-2014), alternative future climate scenarios, and coastal wetland coverage data (1978-2011) to investigate spatiotemporal fluctuations and climate-wetland linkages. Our analyses indicate that changes in mangrove coverage have been controlled primarily by extreme freeze events (i.e., air temperatures below a threshold zone of -6.3 to -7.6°C). We expect that in the past 121 yr, mangrove range expansion and contraction have occurred across land-ocean temperature gradients. Mangrove resistance, resilience, and dominance were all highest in areas closer to the ocean where temperature extremes were buffered by large expanses of water and saturated soil. Under climate change, these areas will likely serve as local hotspots for mangrove dispersal, growth, range expansion, and displacement of salt marsh. Collectively, our results show that the frequency and intensity of freeze events across land-ocean temperature gradients greatly influence spatiotemporal patterns of range expansion and contraction of freeze-sensitive mangroves.
We expect that, along subtropical coasts, similar processes govern the distribution and abundance of other freeze-sensitive organisms. In broad terms, our findings can be used to better understand and anticipate the ecological effects of changing winter climate extremes, especially within the transition zone between tropical and temperate climates. © 2016 by the Ecological Society of America.

  12. An anxiety, personality and altitude symptomatology study during a 31-day period of hypoxia in a hypobaric chamber (experiment 'Everest-Comex 1997').

    PubMed

    Nicolas, M; Thullier-Lestienne, F; Bouquet, C; Gardette, B; Gortan, C; Joulia, F; Bonnon, M; Richalet, J P; Therme, P; Abraini, J H

    1999-12-01

    Extreme environmental situations are useful tools for investigating the general processes of adaptation. Among such situations, high altitude of more than 3000 m produces a set of pathological disorders that includes both cerebral (cAS) and respiratory (RAS) altitude symptoms. High-altitude exposure further induces anxiety responses and behavioural disturbances. The authors report an investigation of anxiety responses, personality traits, and altitude symptoms (AS) in climbers participating in a 31-day period of confinement and gradual decompression in a hypobaric chamber, equivalent to a climb from sea level to the summit of Mount Everest (8848 m altitude). Personality traits, state-trait anxiety, and AS were assessed using the Cattell 16 Personality Factor questionnaire (16PF), Spielberger's State-Trait Anxiety Inventory (STAI), and the Lake Louise consensus questionnaire. Results show a significant group effect for state-anxiety and AS: both increased as altitude increased. They also show that state-type anxiety followed a similar time course to cAS, but not RAS. In addition, our results demonstrate a significant negative correlation between state-anxiety and Factor M of the 16PF questionnaire, a personality trait that ranges from praxernia to autia. In contrast, no significant correlation was found between personality traits and AS. This suggests that AS could not be predicted using personality traits, and it further supports the view that personality traits such as praxernia could play a major role in the occurrence of state-type anxiety responses in extreme environments. Finally, the general processes of coping and adaptation in individuals participating in extreme environmental experiments are discussed.

  13. When is Chemical Similarity Significant? The Statistical Distribution of Chemical Similarity Scores and Its Extreme Values

    PubMed Central

    Baldi, Pierre

    2010-01-01

    As repositories of chemical molecules continue to expand and become more open, it becomes increasingly important to develop tools to search them efficiently and to assess the statistical significance of chemical similarity scores. Here we develop a general framework for understanding, modeling, predicting, and approximating the distribution of chemical similarity scores and its extreme values in large databases. The framework can be applied to different chemical representations and similarity measures, but is demonstrated here using the most common binary fingerprints with the Tanimoto similarity measure. After introducing several probabilistic models of fingerprints, including the Conditional Gaussian Uniform model, we show that the distribution of Tanimoto scores can be approximated by the distribution of the ratio of two correlated Normal random variables associated with the corresponding unions and intersections. This remains true when the distribution of similarity scores is conditioned on the size of the query molecules in order to derive more fine-grained results and improve chemical retrieval. The corresponding extreme value distributions for the maximum scores are approximated by Weibull distributions. From these various distributions and their analytical forms, Z-scores, E-values, and p-values are derived to assess the significance of similarity scores. In addition, the framework allows one to predict the value of standard chemical retrieval metrics, such as sensitivity and specificity at fixed thresholds, or ROC (Receiver Operating Characteristic) curves at multiple thresholds, and to detect outliers in the form of atypical molecules. Numerous and diverse experiments, carried out in part with large sets of molecules from ChemDB, show remarkable agreement between theory and empirical results. PMID:20540577
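For concreteness, a small sketch of the Tanimoto similarity on binary fingerprints. The random 1024-bit fingerprints below are stand-ins for real chemical fingerprints; the measure itself is simply intersection over union of the set bits.

```python
import numpy as np

def tanimoto(a: np.ndarray, b: np.ndarray) -> float:
    """Tanimoto (Jaccard) similarity of two binary fingerprints."""
    both = np.count_nonzero(a & b)     # bits set in the intersection
    either = np.count_nonzero(a | b)   # bits set in the union
    return both / either if either else 1.0

rng = np.random.default_rng(42)
fp_a = rng.random(1024) < 0.1          # sparse 1024-bit fingerprint
fp_b = fp_a.copy()
flip = rng.choice(1024, size=50, replace=False)
fp_b[flip] = ~fp_b[flip]               # perturb 50 bit positions

print(f"T(a, b) = {tanimoto(fp_a, fp_b):.3f}")
```

Computing such scores for a query against a whole database yields the empirical score distribution whose tail behaviour the abstract models with ratio-of-Normals and Weibull approximations.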

  14. Dynamical Characterization of Galaxies at z ˜ 4-6 via Tilted Ring Fitting to ALMA [C II] Observations

    NASA Astrophysics Data System (ADS)

    Jones, G. C.; Carilli, C. L.; Shao, Y.; Wang, R.; Capak, P. L.; Pavesi, R.; Riechers, D. A.; Karim, A.; Neeleman, M.; Walter, F.

    2017-12-01

    Until recently, determining the rotational properties of galaxies in the early universe (z > 4, universe age < 1.5 Gyr) was impractical, with the exception of a few strongly lensed systems. Combining the high resolution and sensitivity of ALMA at (sub-)millimeter wavelengths with the typically high strength of the [C II] 158 μm emission line from galaxies and long-developed dynamical modeling tools raises the possibility of characterizing the gas dynamics in both extreme starburst galaxies and normal star-forming disk galaxies at z ~ 4-7. Using a procedure centered around GIPSY's ROTCUR task, we have fit tilted ring models to some of the best available ALMA [C II] data of a small set of galaxies: the MS galaxies HZ9 and HZ10, the damped Lyα absorber host galaxy ALMA J0817+1351, the submm galaxies AzTEC/C159 and COSMOS J1000+0234, and the quasar host galaxy ULAS J1319+0950. This procedure directly derives rotation curves and dynamical masses as functions of radius for each object. In one case, we present evidence for a dark matter halo of O(10^11) M_⊙. We present an analysis of the possible velocity dispersions of two sources based on matching simulated observations to the integrated [C II] line profiles. Finally, we test the effects of observation resolution and sensitivity on our results. While the conclusions remain limited at the resolution and signal-to-noise ratios of these observations, the results demonstrate the viability of the modeling tools at high redshift, and the exciting potential for detailed dynamical analysis of the earliest galaxies as ALMA achieves full observational capabilities.
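For a circular rotation velocity, the enclosed dynamical mass follows from the standard relation M_dyn(r) = v_circ(r)^2 r / G. The sketch below evaluates it for illustrative values (v = 200 km/s at r = 10 kpc are assumptions, not the paper's fitted values), which lands in the O(10^11) M_⊙ regime quoted above; the paper's full ROTCUR tilted-ring fitting is of course much more involved.

```python
# Enclosed dynamical mass from a circular rotation velocity:
#   M_dyn(r) = v_circ(r)^2 * r / G
G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
KPC = 3.0857e19        # metres per kiloparsec
M_SUN = 1.989e30       # solar mass, kg

def dynamical_mass(v_kms: float, r_kpc: float) -> float:
    """Return M_dyn in solar masses for v in km/s and r in kpc."""
    v = v_kms * 1e3    # km/s -> m/s
    r = r_kpc * KPC    # kpc  -> m
    return v**2 * r / G / M_SUN

m = dynamical_mass(200.0, 10.0)    # illustrative rotation-curve point
print(f"M_dyn ~ {m:.2e} M_sun")
```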

  15. Differences in Dietary Preferences, Personality and Mental Health in Australian Adults with and without Food Addiction.

    PubMed

    Burrows, Tracy; Hides, Leanne; Brown, Robyn; Dayas, Christopher V; Kay-Lambkin, Frances

    2017-03-15

    Increased obesity rates, an evolving food supply and the overconsumption of energy-dense foods have led to an increase in research exploring addictive eating behaviours. This study aimed to investigate food addiction in a sample of Australian adults using the revised Yale Food Addiction Survey (YFAS) 2.0 tool, and how it is associated with dietary intake, personality traits and mental health issues. Australian adults were invited to complete an online survey collecting information on demographics, dietary intake, depression, anxiety, stress and personality dimensions including impulsivity, sensation seeking, hopelessness and anxiety sensitivity. A total of 1344 individuals were recruited, with the sample comprising 75.7% females, mean age 39.8 ± 13.1 years (range 18-91 years) and mean body mass index (BMI) 27.7 ± 9.5. Food addiction was identified in 22.2% of participants using the YFAS 2.0 tool, which classified the severity of food addiction as "mild" in 0.7% of cases, "moderate" in 2.6% and "severe" in 18.9% of cases. Predictors of severe food addiction were female gender (odds ratio (OR) 3.65, 95% CI 1.86-7.11), higher levels of soft drink consumption (OR 1.36, 95% CI 1.07-1.72) and confectionery consumption, and anxiety sensitivity (OR 1.16, 95% CI 1.07-1.26). Overall, people with severe (OR 13.2, 95% CI 5.8-29.8) or extremely severe (OR 15.6, 95% CI 7.1-34.3) depressive symptoms had the highest odds of having severe food addiction. The only variable that reduced the odds of having severe food addiction was vegetable intake. The current study highlights that addictive food behaviours are associated with a complex pattern of poor dietary choices and a clustering with mental health issues, particularly depression.
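Odds ratios with 95% confidence intervals of the kind reported above are conventionally computed from 2 × 2 counts with a Wald interval on the log scale. A sketch with made-up counts (not the study's data):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 exposure/outcome table.

                outcome +   outcome -
    exposed         a           b
    unexposed       c           d
    """
    or_ = (a * d) / (b * c)
    # Standard error of log(OR) is sqrt of the summed reciprocal counts.
    se_log = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Illustrative counts only (e.g. exposure = severe depressive symptoms,
# outcome = severe food addiction).
or_, lo, hi = odds_ratio_ci(30, 70, 10, 90)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```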

  16. Quantitative measures detect sensory and motor impairments in multiple sclerosis.

    PubMed

    Newsome, Scott D; Wang, Joseph I; Kang, Jonathan Y; Calabresi, Peter A; Zackowski, Kathleen M

    2011-06-15

    Sensory and motor dysfunction in multiple sclerosis (MS) is often assessed with rating scales that rely heavily on clinical judgment. Quantitative devices may be more precise than rating scales. To quantify lower extremity sensorimotor measures in individuals with MS, evaluate the extent to which they can detect functional systems impairments, and determine their relationship to global disability measures. We tested 145 MS subjects and 58 controls. Vibration thresholds were quantified using a Vibratron-II device. Strength was quantified with a hand-held dynamometer. We also recorded the Expanded Disability Status Scale (EDSS) and Timed 25-Foot Walk (T25FW). t-tests and Wilcoxon rank-sum tests were used to compare group data. Spearman correlations were used to assess relationships between the measures. We also used a stepwise linear regression model to determine how much the quantitative measures explain the variance in the respective functional systems scores (FSS). EDSS scores ranged from 0-7.5, mean disease duration was 10.4 ± 9.6 years, and 66% of subjects were female. In relapsing-remitting MS, but not progressive MS, poorer vibration sensation correlated with a worse EDSS score, whereas the progressive groups' ankle/hip strength changed significantly with EDSS progression. Interestingly, not only did sensorimotor measures significantly correlate with global disability measures (i.e., EDSS), but they also had improved sensitivity, detecting impairments in up to 32% of MS subjects with normal sensory and pyramidal FSS. Sensory and motor deficits in MS can be quantified using clinically accessible tools and can distinguish differences among MS subtypes. We show that quantitative sensorimotor measures are more sensitive than FSS from the EDSS. These tools have the potential to be used as clinical outcome measures in practice and in future MS clinical trials of neurorehabilitative and neuroreparative interventions. Copyright © 2011 Elsevier B.V. All rights reserved.
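The Spearman correlations used above rank both variables and then correlate the ranks, which captures any monotonic relationship. A minimal sketch with illustrative (hypothetical) vibration-threshold and EDSS values, not the study's data:

```python
import numpy as np

def spearman_rho(x, y):
    """Spearman rank correlation via Pearson correlation of the ranks.

    argsort-of-argsort ranking is adequate here because the sample
    values are untied; tied data would require average ranks.
    """
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx * ry).sum() / np.sqrt((rx**2).sum() * (ry**2).sum()))

# Hypothetical data: a disability score that increases monotonically
# (but non-linearly) with vibration threshold gives rho = 1.
threshold = np.array([1.0, 1.5, 2.2, 3.1, 4.0, 5.5, 7.2])
edss      = np.array([0.0, 1.0, 1.5, 2.5, 3.0, 4.5, 6.0])

print(f"rho = {spearman_rho(threshold, edss):.3f}")
```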

  17. Quantitative measures detect sensory and motor impairments in multiple sclerosis

    PubMed Central

    Newsome, Scott D.; Wang, Joseph I.; Kang, Jonathan Y.; Calabresi, Peter A.; Zackowski, Kathleen M.

    2011-01-01

    Background Sensory and motor dysfunction in multiple sclerosis (MS) is often assessed with rating scales that rely heavily on clinical judgment. Quantitative devices may be more precise than rating scales. Objective To quantify lower extremity sensorimotor measures in individuals with MS, evaluate the extent to which they can detect functional systems impairments, and determine their relationship to global disability measures. Methods We tested 145 MS subjects and 58 controls. Vibration thresholds were quantified using a Vibratron-II device. Strength was quantified with a hand-held dynamometer. We also recorded the Expanded Disability Status Scale (EDSS) and timed 25-foot walk (T25FW). t-tests and Wilcoxon rank-sum tests were used to compare group data. Spearman correlations were used to assess relationships between the measures. We also used a stepwise linear regression model to determine how much the quantitative measures explain the variance in the respective functional systems scores (FSS). Results EDSS scores ranged from 0-7.5, mean disease duration was 10.4±9.6 years, and 66% of subjects were female. In RRMS, but not progressive MS, poorer vibration sensation correlated with a worse EDSS score, whereas the progressive groups’ ankle/hip strength changed significantly with EDSS progression. Interestingly, not only did sensorimotor measures significantly correlate with global disability measures (EDSS), but they also had improved sensitivity, detecting impairments in up to 32% of MS subjects with normal sensory FSS. Conclusions Sensory and motor deficits can be quantified using clinically accessible tools and can distinguish differences among MS subtypes. We show that quantitative sensorimotor measures are more sensitive than FSS from the EDSS. These tools have the potential to be used as clinical outcome measures in practice and in future MS clinical trials of neurorehabilitative and neuroreparative interventions. PMID:21458828

  18. [Analysis of the pathogenic characteristics of 162 severely burned patients with bloodstream infection].

    PubMed

    Gong, Y L; Yang, Z C; Yin, S P; Liu, M X; Zhang, C; Luo, X Q; Peng, Y Z

    2016-09-20

    To analyze the distribution and drug resistance of pathogen isolated from severely burned patients with bloodstream infection, so as to provide reference for the clinical treatment of these patients. Blood samples of 162 severely burned patients (including 120 patients with extremely severe burn) with bloodstream infection admitted into our burn ICU from January 2011 to December 2014 were collected. Pathogens were cultured by fully automatic blood culture system, and API bacteria identification panels were used to identify pathogen. Kirby-Bauer paper disk diffusion method was used to detect the drug resistance of major Gram-negative and -positive bacteria to 37 antibiotics including ampicillin, piperacillin and teicoplanin, etc. (resistance to vancomycin was detected by E test), and drug resistance of fungi to 5 antibiotics including voriconazole and amphotericin B, etc. Modified Hodge test was used to further identify imipenem and meropenem resistant Klebsiella pneumonia. D test was used to detect erythromycin-induced clindamycin resistant Staphylococcus aureus. The pathogen distribution and drug resistance rate were analyzed by WHONET 5.5. Mortality rate and infected pathogens of patients with extremely severe burn and patients with non-extremely severe burn were recorded. Data were processed with Wilcoxon rank sum test. (1) Totally 1 658 blood samples were collected during the four years, and 339 (20.4%) strains of pathogens were isolated. The isolation rate of Gram-negative bacteria, Gram-positive bacteria, and fungi were 68.4% (232/339), 24.5% (83/339), and 7.1% (24/339), respectively. The top three pathogens with isolation rate from high to low were Acinetobacter baumannii, Staphylococcus aureus, and Pseudomonas aeruginosa in turn. (2) Except for the low drug resistance rate to polymyxin B and minocycline, drug resistance rate of Acinetobacter baumannii to the other antibiotics were relatively high (81.0%-100.0%). 
Pseudomonas aeruginosa was sensitive to polymyxin B but highly resistant to the other antibiotics (57.7%-100.0%). Enterobacter cloacae was sensitive to imipenem and meropenem, while its resistance rates to ciprofloxacin, levofloxacin, cefoperazone/sulbactam, cefepime, and piperacillin/tazobactam were 25.0%-49.0%, and those to the other antibiotics were 66.7%-100.0%. The resistance rates of Klebsiella pneumoniae to cefoperazone/sulbactam, imipenem, and meropenem were low (5.9%-15.6%; two imipenem- and meropenem-resistant strains were identified by the modified Hodge test), while its resistance rates to amoxicillin/clavulanic acid, piperacillin/tazobactam, cefepime, cefoxitin, amikacin, and levofloxacin were 35.3%-47.1%, and those to the other antibiotics were 50.0%-100.0%. (3) The resistance rates of methicillin-resistant Staphylococcus aureus (MRSA) to most antibiotics were higher than those of methicillin-sensitive Staphylococcus aureus (MSSA). MRSA was sensitive to linezolid, vancomycin, and teicoplanin, while its resistance rates to compound sulfamethoxazole, clindamycin, minocycline, and erythromycin were 5.3%-31.6%, and those to the other antibiotics were 81.6%-100.0%. MSSA was fully resistant to penicillin G and tetracycline but sensitive to the other antibiotics. Fourteen Staphylococcus aureus strains showed erythromycin-induced clindamycin resistance. Enterococcus was sensitive to vancomycin and teicoplanin; its resistance rates to linezolid, chloramphenicol, nitrofurantoin, and high-level gentamicin were low (10.0%-30.0%), while those to ciprofloxacin, erythromycin, minocycline, and ampicillin were high (60.0%-80.0%). Enterococcus was fully resistant to rifampicin. (4) The fungi were sensitive to amphotericin B, and their resistance rates to voriconazole, fluconazole, itraconazole, and ketoconazole were 7.2%-12.5%.
(5) The mortality of patients with extremely severe burns was higher than that of patients with non-extremely severe burns, and the variety of infecting pathogens was significantly greater in patients with extremely severe burns (Z=-2.985, P=0.005). In conclusion, a wide variety of pathogens causes bloodstream infection in severely burned patients, the main ones being Acinetobacter baumannii, Staphylococcus aureus, and Pseudomonas aeruginosa, and drug resistance among them is severe. The infecting pathogens in patients with extremely severe burns are more varied, and the mortality of these patients is higher than that of patients with non-extremely severe burns.
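The between-group comparison above relies on a Wilcoxon rank sum test. As a hedged illustration of the Z statistic the study reports, here is a minimal normal-approximation sketch with made-up counts; it is not the study's data or its WHONET workflow:

```python
from statistics import NormalDist

def rank_sum_test(a, b):
    """Wilcoxon rank sum test via the normal approximation.
    Assumes no tied values; returns (z, two-sided p)."""
    n1, n2 = len(a), len(b)
    rank = {v: i + 1 for i, v in enumerate(sorted(a + b))}  # 1-based ranks
    w = sum(rank[v] for v in a)                  # rank sum of the first group
    mu = n1 * (n1 + n2 + 1) / 2                  # mean of W under H0
    sigma = (n1 * n2 * (n1 + n2 + 1) / 12) ** 0.5
    z = (w - mu) / sigma
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p

# Hypothetical pathogen-variety counts for two patient groups:
z, p = rank_sum_test([1, 2, 3], [4, 5, 6])  # z < 0: the first group ranks lower
```

Real analyses of data with ties need a tie-corrected variance; library implementations such as scipy.stats.ranksums handle that.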

  19. Surface Desorption Dielectric-Barrier Discharge Ionization Mass Spectrometry.

    PubMed

    Zhang, Hong; Jiang, Jie; Li, Na; Li, Ming; Wang, Yingying; He, Jing; You, Hong

    2017-07-18

A variant of dielectric-barrier discharge, named surface desorption dielectric-barrier discharge ionization (SDDBDI) mass spectrometry, was developed for high-efficiency ion transmission and high-spatial-resolution imaging. In SDDBDI, a tungsten nanotip and the inlet of the mass spectrometer serve as the electrodes, and a coverslip serves as both the sample plate and the insulating dielectric barrier, which simplifies the instrument configuration and thus its operation. Unlike volume dielectric-barrier discharge (VDBD), in SDDBDI the microdischarges are generated on the surface, so the plasma density is extremely high. Analyte ions are guided directly into the MS inlet without any deflection. This configuration significantly improves ion transmission efficiency and thus sensitivity. The dependence of the sensitivity and spatial resolution of SDDBDI on the operating parameters was systematically investigated. The application of SDDBDI was demonstrated by the analysis of multiple species, including amino acids, pharmaceuticals, putative cancer biomarkers, and mixtures of both fatty acids and hormones. Limits of detection (S/N = 3) were determined to be 0.84 and 0.18 pmol for l-alanine and metronidazole, respectively. A spatial resolution of 22 μm was obtained for the analysis of an imprinted cyclophosphamide pattern, and imaging of a "T" character was demonstrated under ambient conditions. These results indicate that SDDBDI offers high-efficiency ion transmission, high sensitivity, and high spatial resolution, which render it a potential tool for mass spectrometry imaging.

  20. Multiplex Detection of Rare Mutations by Picoliter Droplet Based Digital PCR: Sensitivity and Specificity Considerations

    PubMed Central

    Zonta, Eleonora; Garlan, Fanny; Pécuchet, Nicolas; Perez-Toralla, Karla; Caen, Ouriel; Milbury, Coren; Didelot, Audrey; Fabre, Elizabeth; Blons, Hélène; Laurent-Puig, Pierre; Taly, Valérie

    2016-01-01

In cancer research, the accuracy of the technology used for biomarker detection is critically important. In this context, digital PCR is a highly sensitive and reproducible method that could serve as an appropriate tool for tumor mutational status analysis. In particular, droplet-based digital PCR approaches have been developed for the detection of tumor-specific mutated alleles within plasmatic circulating DNA. Such an approach calls for the development and validation of a large number of assays, which can be extremely costly and time consuming. Herein, we evaluated assays for the detection and quantification of various mutations occurring in three genes often misregulated in cancers: the epidermal growth factor receptor (EGFR), the v-Ki-ras2 Kirsten rat sarcoma viral oncogene homolog (KRAS), and the Tumoral Protein p53 (TP53) genes. In particular, commercial competitive allele-specific TaqMan® PCR (castPCR™) technology, as well as TaqMan® and ZEN™ assays, were evaluated for the EGFR p.L858R, p.T790M, and p.L861Q point mutations and the in-frame deletions Del19. Specificity and sensitivity were determined on cell line DNA, plasmatic circulating DNA of lung cancer patients, or Horizon Diagnostics Reference Standards. To show the multiplexing capabilities of this technology, several multiplex panels for EGFR (several three- and four-plexes) were developed, offering new "ready-to-use" tests for lung cancer patients. PMID:27416070

  1. Sensitivity to psychostimulants in mice bred for high and low stimulation to methamphetamine.

    PubMed

    Kamens, H M; Burkhart-Kasch, S; McKinnon, C S; Li, N; Reed, C; Phillips, T J

    2005-03-01

    Methamphetamine (MA) and cocaine induce behavioral effects primarily through modulation of dopamine neurotransmission. However, the genetic regulation of sensitivity to these two drugs may be similar or disparate. Using selective breeding, lines of mice were produced with extreme sensitivity (high MA activation; HMACT) and insensitivity (low MA activation; LMACT) to the locomotor stimulant effects of acute MA treatment. Studies were performed to determine whether there is pleiotropic genetic influence on sensitivity to the locomotor stimulant effect of MA and to other MA- and cocaine-related behaviors. The HMACT line exhibited more locomotor stimulation in response to several doses of MA and cocaine, compared to the LMACT line. Both lines exhibited locomotor sensitization to 2 mg/kg of MA and 10 mg/kg of cocaine; the magnitude of sensitization was similar in the two lines. However, the lines differed in the magnitude of sensitization to a 1 mg/kg dose of MA, a dose that did not produce a ceiling effect that may confound interpretation of studies using higher doses. The LMACT line consumed more MA and cocaine in a two-bottle choice drinking paradigm; the lines consumed similar amounts of saccharin and quinine, although the HMACT line exhibited slightly elevated preference for a low concentration of saccharin. These results suggest that some genes that influence sensitivity to the acute locomotor stimulant effect of MA have a pleiotropic influence on the magnitude of behavioral sensitization to MA and sensitivity to the stimulant effects of cocaine. Further, extreme sensitivity to MA may protect against MA and cocaine self-administration.

  2. RETINOID METABOLISM IN FISH EMBRYOS FROM SENSITIVE AND RESISTANT POPULATIONS EXPOSED TO DIOXIN-LIKE COMPOUNDS

    EPA Science Inventory

    Early developmental stages of fish are extremely sensitive to a class of toxic and persistent environmental contaminants known as dioxin-like compounds (DLCs). Most of the toxicological actions of DLCs are mediated via the Aryl hydrocarbon Receptor (AhR) that regulates transcript...

  3. Strangeness Production in the ALICE Experiment at the LHC

    NASA Astrophysics Data System (ADS)

    Johnson, Harold; Fenner, Kiara; Harton, Austin; Garcia-Solis, Edmundo; Soltz, Ron

    2015-04-01

The study of strange particle production is an important tool for understanding the properties of the hot, dense medium, the quark-gluon plasma (QGP), created in heavy-ion collisions at ultra-relativistic energies. This quark-gluon plasma is believed to have been present just after the big bang. The standard model of physics contains six types of quarks. Strange quarks are not among the valence quarks found in protons and neutrons, and strange quark production is sensitive to the extremely high temperatures of the QGP. CERN's Large Hadron Collider accelerates particles to nearly the speed of light before colliding them to create this QGP state. In high-energy particle collisions, hadrons are formed out of quarks and gluons as the system cools from extremely high temperatures. A jet is a highly collimated cone of particles arising from the hadronization of a single quark or gluon. Understanding jet interactions may give us clues about the QGP. Using FastJet (a popular jet-finding algorithm), we extracted the strange-particle characteristics of jets contained within proton-proton collisions during our research at CERN. We have identified jets with and without strange particles in proton-proton collisions, and we will present a comparison of the pT spectra in both cases. This material is based upon work supported by the National Science Foundation under grants PHY-1305280 and PHY-1407051.

  4. Highly Sensitive and Selective Gas Sensor Using Hydrophilic and Hydrophobic Graphenes

    PubMed Central

    Some, Surajit; Xu, Yang; Kim, Youngmin; Yoon, Yeoheung; Qin, Hongyi; Kulkarni, Atul; Kim, Taesung; Lee, Hyoyoung

    2013-01-01

New hydrophilic 2D graphene oxide (GO) nanosheets with various oxygen functional groups were employed to maintain high sensitivity in highly unfavorable environments (extremely high humidity, strongly acidic or basic conditions). Novel one-headed polymer optical fiber sensor arrays using hydrophilic GO and hydrophobic reduced graphene oxide (rGO) were carefully designed, leading to the selective sensing of volatile organic gases for the first time. The two physically different surfaces of GO and rGO provide the ability to distinguish between tetrahydrofuran (THF) and dichloromethane (MC), respectively, which is the most challenging issue in the area of gas sensors. The eco-friendly physical properties of GO allowed faster sensing and higher sensitivity than previous results for rGO, even in extreme environments of over 90% humidity, making it the best choice for an environmentally friendly gas sensor. PMID:23736838

  5. Fundamentals and practice for ultrasensitive laser-induced fluorescence detection in microanalytical systems.

    PubMed

    Johnson, Mitchell E; Landers, James P

    2004-11-01

Laser-induced fluorescence is an extremely sensitive method for detection in chemical separations. In addition, it is well suited to detection in small volumes, and as such is widely used for capillary electrophoresis and microchip-based separations. This review explores the detailed instrumental conditions required for sub-zeptomole, sub-picomolar detection limits. The key to achieving the best sensitivity is to use an excitation and emission volume that is matched to the separation system and that, simultaneously, keeps scattering and luminescence background to a minimum. We discuss how this is accomplished with confocal detection, 90-degree on-capillary detection, and sheath-flow detection. It is shown that each of these methods has its advantages and disadvantages, but that all can be used to produce extremely sensitive detectors for capillary- or microchip-based separations. Analysis of these capabilities allows prediction of the optimal means of achieving ultrasensitive detection on microchips.
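The cited "sub-zeptomole, sub-picomolar" limits become concrete with a unit conversion from concentration and probed volume to molecule counts; a quick sketch with illustrative numbers (not values from the review):

```python
AVOGADRO = 6.022e23  # molecules per mole

def molecules_in_probe(conc_molar, volume_liters):
    """Number of analyte molecules present in the probed volume."""
    return conc_molar * volume_liters * AVOGADRO

# A 1 pM analyte probed in a 1 pL detection volume:
n = molecules_in_probe(1e-12, 1e-12)  # 1e-24 mol, i.e. ~0.6 molecules on average
# For scale, one zeptomole (1e-21 mol) corresponds to ~602 molecules.
```

At these counts the detector is effectively operating near the single-molecule regime, which is why background suppression dominates the instrument design choices discussed above.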

  6. Robust, Thin Optical Films for Extreme Environments

    NASA Technical Reports Server (NTRS)

    2006-01-01

    The environment of space presents scientists and engineers with the challenges of a harsh, unforgiving laboratory in which to conduct their scientific research. Solar astronomy and X-ray astronomy are two of the more challenging areas into which NASA scientists delve, as the optics for this high-tech work must be extremely sensitive and accurate, yet also be able to withstand the battering dished out by radiation, extreme temperature swings, and flying debris. Recent NASA work on this rugged equipment has led to the development of a strong, thin film for both space and laboratory use.

  7. Digital image profilers for detecting faint sources which have bright companions

    NASA Technical Reports Server (NTRS)

    Morris, Elena; Flint, Graham; Slavey, Robert

    1992-01-01

For this program, an image profiling system was developed which offers the potential for detecting extremely faint optical sources located in close proximity to bright companions. The approach employed is novel in three respects. First, it does not require an optical system wherein extraordinary measures must be taken to minimize diffraction and scatter. Second, it does not require detectors possessing either extreme uniformity in sensitivity or extreme temporal stability. Finally, the system can readily be calibrated, or nulled, in space by testing against a single unresolved stellar source.

  8. FastID: Extremely Fast Forensic DNA Comparisons

    DTIC Science & Technology

    2017-05-19

FastID: Extremely Fast Forensic DNA Comparisons. Darrell O. Ricke, PhD, Bioengineering Systems & Technologies, Massachusetts Institute of Technology Lincoln Laboratory, Lexington, MA USA. Darrell.Ricke@ll.mit.edu. Abstract—Rapid analysis of DNA forensic samples can have a critical impact on time-sensitive investigations. Analysis of forensic DNA samples by massively parallel sequencing is creating the next gold standard for DNA

  9. Synthesis and characterization of attosecond light vortices in the extreme ultraviolet

    PubMed Central

    Géneaux, R.; Camper, A.; Auguste, T.; Gobert, O.; Caillat, J.; Taïeb, R.; Ruchon, T.

    2016-01-01

    Infrared and visible light beams carrying orbital angular momentum (OAM) are currently thoroughly studied for their extremely broad applicative prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study for the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. These breakthroughs pave the route for the study of a series of fundamental phenomena and the development of new ultrafast diagnosis tools using either photonic or electronic vortices. PMID:27573787

  10. Possible alternatives to critical elements in coatings for extreme applications

    NASA Astrophysics Data System (ADS)

    Grilli, Maria Luisa; Valerini, Daniele; Piticescu, Radu Robert; Bellezze, Tiziano; Yilmaz, Mehmet; Rinaldi, Antonio; Cuesta-López, Santiago; Rizzo, Antonella

    2018-03-01

Surface functionalisation and protection have long been used to improve specific properties of materials, such as lubrication, water repellence, and brightness, and to increase the durability of objects and tools. Among the different kinds of surface treatments used to achieve the required properties, coatings are fundamental to guaranteeing substrate durability in harsh environments. Extreme working conditions of temperature, pressure, irradiation, wear, and corrosion occur in several applications, very often requiring bulk material protection by means of coatings. In this study, three main classes of coatings used in extreme conditions are considered: i) hard and superhard coatings for machining tools, ii) coatings for high temperatures (thermal barrier coatings), and iii) coatings against corrosion. The presence of critical elements in such coatings (Cr, Y, W, Co, etc.) is analysed, and the possibility of using CRM-free substitutes is reviewed. The role of multilayers and nanocomposites in tailoring coating performance is also discussed for thermal barrier and superhard coatings.

  11. Synthesis and characterization of attosecond light vortices in the extreme ultraviolet

    DOE PAGES

    Géneaux, R.; Camper, A.; Auguste, T.; ...

    2016-08-30

Infrared and visible light beams carrying orbital angular momentum (OAM) are currently thoroughly studied for their extremely broad applicative prospects, among which are quantum information, micromachining and diagnostic tools. Here we extend these prospects, presenting a comprehensive study for the synthesis and full characterization of optical vortices carrying OAM in the extreme ultraviolet (XUV) domain. We confirm the upconversion rules of a femtosecond infrared helically phased beam into its high-order harmonics, showing that each harmonic order carries the total number of OAM units absorbed in the process up to very high orders (57). This allows us to synthesize and characterize helically shaped XUV trains of attosecond pulses. To demonstrate a typical use of these new XUV light beams, we show our ability to generate and control, through photoionization, attosecond electron beams carrying OAM. Furthermore, these breakthroughs pave the route for the study of a series of fundamental phenomena and the development of new ultrafast diagnosis tools using either photonic or electronic vortices.

  12. The influence of weather on health-related help-seeking behavior of senior citizens in Hong Kong.

    PubMed

    Wong, Ho Ting; Chiu, Marcus Yu Lung; Wu, Cynthia Sau Ting; Lee, Tsz Cheung

    2015-03-01

It is believed that extreme hot and cold weather has a negative impact on general health conditions. Much research focuses on mortality, but there is relatively little community health research. This study aimed to identify high-risk groups who are sensitive to extreme weather conditions, in particular very hot and cold days, through an analysis of the health-related help-seeking patterns of over 60,000 Personal Emergency Link (PE-link) users in Hong Kong relative to weather conditions. In the study, 1,659,716 PE-link calls to the help center were analyzed. Results showed that females, older elderly, people who did not live alone, non-subsidized (relatively high-income) users, and those without medical histories of heart disease, hypertension, stroke, or diabetes were more sensitive to extreme weather conditions. The results suggest that using official government weather forecast reports to predict health-related help-seeking behavior is feasible. An evidence-based strategic plan could be formulated by using a method similar to that used in this study to identify high-risk groups, and preventive measures could be established to protect the target groups when extreme weather conditions are forecast.

  13. The influence of weather on health-related help-seeking behavior of senior citizens in Hong Kong

    NASA Astrophysics Data System (ADS)

    Wong, Ho Ting; Chiu, Marcus Yu Lung; Wu, Cynthia Sau Ting; Lee, Tsz Cheung

    2015-03-01

It is believed that extreme hot and cold weather has a negative impact on general health conditions. Much research focuses on mortality, but there is relatively little community health research. This study aimed to identify high-risk groups who are sensitive to extreme weather conditions, in particular very hot and cold days, through an analysis of the health-related help-seeking patterns of over 60,000 Personal Emergency Link (PE-link) users in Hong Kong relative to weather conditions. In the study, 1,659,716 PE-link calls to the help center were analyzed. Results showed that females, older elderly, people who did not live alone, non-subsidized (relatively high-income) users, and those without medical histories of heart disease, hypertension, stroke, or diabetes were more sensitive to extreme weather conditions. The results suggest that using official government weather forecast reports to predict health-related help-seeking behavior is feasible. An evidence-based strategic plan could be formulated by using a method similar to that used in this study to identify high-risk groups, and preventive measures could be established to protect the target groups when extreme weather conditions are forecast.

  14. [Membranotropic effects of electromagnetic radiation of extremely high frequency on Escherichia coli].

    PubMed

    Trchunian, A; Ogandzhanian, E; Sarkisian, E; Gonian, S; Oganesian, A; Oganesian, S

    2001-01-01

It was found that "sound" electromagnetic radiation of extremely high frequency (53.5-68 GHz), i.e., millimeter waves (wavelength range of 4.2-5.6 mm), of low intensity (power density 0.01 mW) has a bactericidal effect on Escherichia coli. It was shown that exposure to extremely-high-frequency irradiation increases the electrokinetic potential and surface charge density of the bacteria and decreases the membrane potential. The total secretion of hydrogen ions was suppressed, the H+ flux from the cytoplasm to the medium decreased, and the flux of N,N'-dicyclohexylcarbodiimide-sensitive potassium ions increased, which was accompanied by changes in the stoichiometry of these fluxes and an increase in the sensitivity of the H+ flux to N,N'-dicyclohexylcarbodiimide. The effects depended on the duration of exposure: as exposure time increased, the bactericidal effect increased, whereas the membranotropic effects decreased. The effects also depended on the growth phase of the bacteria: irradiation affected cells in the stationary but not the logarithmic phase. It is assumed that the H(+)-ATPase complex F0F1 is involved in the membranotropic effects of electromagnetic radiation of extremely high frequency. Presumably, there are compensatory mechanisms that eliminate the membranotropic effects.

  15. Evaluation of in silico tools to predict the skin sensitization potential of chemicals.

    PubMed

    Verheyen, G R; Braeken, E; Van Deun, K; Van Miert, S

    2017-01-01

Public domain and commercial in silico tools were compared for their performance in predicting the skin sensitization potential of chemicals. The packages were either statistical (Vega, CASE Ultra) or rule based (OECD Toolbox, Toxtree, Derek Nexus). In practice, several of these in silico tools are used in gap filling and read-across, but here their use was limited to making predictions based on the presence or absence of structural features associated with sensitization. The top 400 ranking substances of the ATSDR 2011 Priority List of Hazardous Substances were selected as a starting point. Experimental information was identified for 160 chemically diverse substances (82 positive and 78 negative). The predictions of skin sensitization potential were compared with the experimental data. Rule-based tools performed slightly better, with accuracies ranging from 0.6 (OECD Toolbox) to 0.78 (Derek Nexus), compared with statistical tools, whose accuracies ranged from 0.48 (Vega) to 0.73 (CASE Ultra - LLNA weak model). Combining models increased performance, with positive and negative predictive values of up to 80% and 84%, respectively. However, the number of substances predicted positive or negative for skin sensitization by both models was low. Adding more substances to the dataset will increase the confidence in the conclusions reached. The insights obtained in this evaluation are incorporated in a web database, www.asopus.weebly.com, that provides a potential end user with context for the scope and performance of different in silico tools with respect to a common dataset of curated skin sensitization data.
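The accuracy and predictive-value figures above are standard 2x2 confusion-matrix metrics; a minimal sketch (the counts below are hypothetical, not taken from the paper):

```python
def classification_metrics(tp, fp, tn, fn):
    """Confusion-matrix metrics of the kind used to compare the in silico tools."""
    total = tp + fp + tn + fn
    return {
        "accuracy": (tp + tn) / total,
        "sensitivity": tp / (tp + fn),  # true positive rate
        "specificity": tn / (tn + fp),  # true negative rate
        "ppv": tp / (tp + fp),          # positive predictive value
        "npv": tn / (tn + fn),          # negative predictive value
    }

# Hypothetical tool output on 82 positive and 78 negative substances:
m = classification_metrics(tp=64, fp=14, tn=64, fn=18)  # accuracy = 0.8
```

Note that accuracy alone can mask an imbalance between sensitivity and specificity, which is why the paper reports predictive values for the combined models as well.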

  16. ATHLETE: A Limbed Vehicle for Solar System Exploration

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    2012-01-01

As part of the Human-Robot Systems project funded by NASA, the Jet Propulsion Laboratory has developed a vehicle called ATHLETE: the All-Terrain Hex-Limbed Extra-Terrestrial Explorer. Each vehicle is based on six wheels at the ends of six multi-degree-of-freedom limbs. Because each limb has enough degrees of freedom for use as a general-purpose leg, the wheels can be locked and used as feet to walk out of excessively soft or other extreme terrain. Since the vehicle has this alternative mode of traversing through, or at least out of, extreme terrain, the wheels and wheel actuators can be sized for nominal terrain. There are substantial mass savings in the wheels and wheel actuators associated with designing for nominal instead of extreme terrain. These mass savings are comparable to or larger than the extra mass associated with the articulated limbs. As a result, the entire mobility system, including wheels and limbs, can be about 25% lighter than a conventional mobility chassis. A side benefit of this approach is that each limb has sufficient degrees of freedom to use as a general-purpose manipulator (hence the name "limb" instead of "leg"). Our prototype ATHLETE vehicles have quick-disconnect tool adapters on the limbs that allow tools to be drawn out of a "tool belt" and maneuvered by the limb.

  17. Using the SWAT model to improve process descriptions and define hydrologic partitioning in South Korea

    NASA Astrophysics Data System (ADS)

    Shope, C. L.; Maharjan, G. R.; Tenhunen, J.; Seo, B.; Kim, K.; Riley, J.; Arnhold, S.; Koellner, T.; Ok, Y. S.; Peiffer, S.; Kim, B.; Park, J.-H.; Huwe, B.

    2014-02-01

Watershed-scale modeling can be a valuable tool to aid in the quantification of water quality and yield; however, several challenges remain. In many watersheds it is difficult to adequately quantify hydrologic partitioning: data scarcity is prevalent, the accuracy of spatially distributed meteorology is difficult to quantify, forest encroachment and land use issues are common, and surface water and groundwater abstractions substantially modify watershed-based processes. Our objective is to assess the capability of the Soil and Water Assessment Tool (SWAT) model to capture event-based and long-term monsoonal rainfall-runoff processes in complex mountainous terrain. To accomplish this, we developed a quality-control, gap-filling algorithm for the interpolation of high-frequency meteorological data. We used a novel multi-location, multi-optimization calibration technique to improve estimates of catchment-wide hydrologic partitioning. The interdisciplinary model was calibrated against a combination of statistical, hydrologic, and plant growth metrics. Our results indicate scale-dependent sensitivity of hydrologic partitioning and a substantial influence of engineered features. The addition of hydrologic and plant growth objective functions identified the importance of culverts in catchment-wide flow distribution. While this study shows the challenges of applying the SWAT model to complex terrain and extreme environments, it also shows that incorporating anthropogenic features into modeling scenarios can enhance our understanding of hydroecological impacts.
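The gap-filling step for the high-frequency meteorological series can be illustrated with plain linear interpolation across missing samples; a sketch under simplifying assumptions (the paper's quality-control algorithm is more elaborate, and `fill_gaps` is a hypothetical helper, not the authors' code):

```python
def fill_gaps(series):
    """Linearly interpolate None gaps in a regularly sampled series.
    Leading and trailing gaps are left as-is (no extrapolation)."""
    out = list(series)
    known = [i for i, v in enumerate(out) if v is not None]
    for a, b in zip(known, known[1:]):
        for i in range(a + 1, b):        # fill the gap between two known samples
            frac = (i - a) / (b - a)
            out[i] = out[a] + frac * (out[b] - out[a])
    return out

# Hourly temperatures with a two-sample gap:
filled = fill_gaps([10.0, None, None, 16.0, 18.0])
```

In practice a quality-control pass (range checks, neighbor-station comparison) would precede any interpolation so that bad readings are flagged as gaps rather than propagated.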

  18. Diagnosing clean margins through Raman spectroscopy in human and animal mammary tumour surgery: a short review

    PubMed Central

    Birtoiu, I. A.; Rizea, C.; Togoe, D.; Munteanu, R. M.; Micsa, C.; Rusu, M. I.; Tautan, M.; Braic, L.; Scoicaru, L. O.; Parau, A.; Becherescu-Barbu, N. D.; Udrea, M. V.; Tonetto, A.; Notonier, R.

    2016-01-01

Breast cancer frequency in human and other mammalian female populations has increased worryingly of late. The acute need for a taxonomy of the aetiological factors, along with the search for new diagnostic tools and therapy procedures aimed at reducing mortality, has led to an intense research effort worldwide. Surgery is a standard method to counteract the extensive development of breast cancer and prevent metastases, provided that negative surgical margins are achieved. This highly technical challenge requires fast, extremely sensitive, and selective discrimination between malignant and benign tissues, down to the molecular level. The particular advantages of Raman spectroscopy, such as high chemical specificity and the ability to measure raw samples and optical responses in the visible or near-infrared spectral range, have recently recommended it as a technique with high potential for precise diagnostics in oncological surgery. This review spans mainly the last 10 years of exceptional efforts by scientists implementing Raman spectroscopy as a nearly real-time diagnostic tool for clean-margin assessment in mastectomy and lumpectomy. Although they have contributed greatly to medical discoveries for the benefit of humanity, animals as patients have benefitted less from advances in surgical diagnostics using Raman spectroscopy. This work also dedicates a few lines to applications of surface-enhanced Raman spectroscopy in veterinary oncological surgery. PMID:27920899

  19. KAPS (kinematic assessment of passive stretch): a tool to assess elbow flexor and extensor spasticity after stroke using a robotic exoskeleton.

    PubMed

    Centen, Andrew; Lowrey, Catherine R; Scott, Stephen H; Yeh, Ting-Ting; Mochizuki, George

    2017-06-19

    Spasticity is a common sequela of stroke. Traditional assessment methods include relatively coarse scales that may not capture all characteristics of elevated muscle tone. Thus, the aim of this study was to develop a tool to quantitatively assess post-stroke spasticity in the upper extremity. Ninety-six healthy individuals and 46 individuals with stroke participated in this study. The kinematic assessment of passive stretch (KAPS) protocol consisted of passive elbow stretch in flexion and extension across an 80° range in 5 movement durations. Seven parameters were identified and assessed to characterize spasticity (peak velocity, final angle, creep (or release), between-arm peak velocity difference, between-arm final angle, between-arm creep, and between-arm catch angle). The fastest movement duration (600 ms) was most effective at identifying impairment in each parameter associated with spasticity. A decrease in peak velocity during passive stretch between the affected and unaffected limb was most effective at identifying individuals as impaired. Spasticity was also associated with a decreased passive range (final angle) and a classic 'catch and release' as seen through between-arm catch and creep metrics. The KAPS protocol and robotic technology can provide a sensitive and quantitative assessment of post-stroke elbow spasticity not currently attainable through traditional measures.

  20. Adaptive Optics: Arroyo Simulation Tool and Deformable Mirror Actuation Using Golay Cells

    NASA Technical Reports Server (NTRS)

    Lint, Adam S.

    2005-01-01

    The Arroyo C++ libraries, written by Caltech postdoctoral researcher Matthew Britton, can simulate optical systems and atmospheric signal interference. This program was chosen for use in an end-to-end simulation model of a laser communication system because it is freely distributed and can be controlled by a remote system or "smart agent." Proposed operation of this program by a smart agent has been demonstrated, and the results show it to be a suitable simulation tool. Deformable mirrors, as part of modern adaptive optics systems, may contain thousands of tiny, independently controlled actuators used to modify the shape of the mirror. Each actuator is connected to two wires, creating a cumbersome and expensive device. Recently, an alternative actuation method that uses gas-filled tubes known as Golay cells has been explored. Golay cells, operated by infrared lasers instead of electricity, would replace the actuator system, thereby creating a more compact deformable mirror. The operation of Golay cells and their ability to move a deformable mirror in excess of the required 20 microns has been demonstrated. Experimentation has shown them to be extremely sensitive to pressure and temperature, making them best suited to use in a controlled environment.

  1. Automated condition-invariable neurite segmentation and synapse classification using textural analysis-based machine-learning algorithms

    PubMed Central

    Kandaswamy, Umasankar; Rotman, Ziv; Watt, Dana; Schillebeeckx, Ian; Cavalli, Valeria; Klyachko, Vitaly

    2013-01-01

    High-resolution live-cell imaging studies of neuronal structure and function are characterized by large variability in image acquisition conditions, due to background and sample variations as well as low signal-to-noise ratio. The lack of automated image analysis tools that can be generalized across varying image acquisition conditions represents one of the main challenges in the field of biomedical image analysis. Specifically, segmentation of axonal/dendritic arborizations in brightfield or fluorescence imaging studies is extremely labor-intensive and still performed mostly manually. Here we describe a fully automated machine-learning approach based on textural analysis algorithms for segmenting neuronal arborizations in high-resolution brightfield images of live cultured neurons. We compare the performance of our algorithm to manual segmentation and show that it achieves 90% accuracy with similarly high levels of specificity and sensitivity. Moreover, the algorithm maintains high performance levels under a wide range of image acquisition conditions, indicating that it is largely condition-invariable. We further describe an application of this algorithm to fully automated synapse localization and classification in fluorescence imaging studies based on synaptic activity. The textural analysis-based machine-learning approach thus offers a high-performance, condition-invariable tool for automated neurite segmentation. PMID:23261652
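    Textural analysis of the kind described typically starts from gray-level co-occurrence statistics computed over local image patches. A minimal sketch of one such feature (the contrast of a horizontal co-occurrence matrix), purely illustrative and not the authors' algorithm:

```python
# Sketch: a single texture feature from a gray-level co-occurrence matrix
# (GLCM) with horizontal offset dx=1. Illustrative only; a real pipeline
# would feed many such features per patch into a trained classifier.

def glcm_contrast(image, levels):
    """Contrast of the horizontal (dx=1) GLCM of a 2D integer image.

    image: 2D list of gray levels in [0, levels).
    Returns sum over (i, j) of P(i, j) * (i - j)**2.
    """
    counts = [[0] * levels for _ in range(levels)]
    total = 0
    for row in image:
        for a, b in zip(row, row[1:]):
            counts[a][b] += 1
            total += 1
    return sum(counts[i][j] * (i - j) ** 2
               for i in range(levels) for j in range(levels)) / total

flat = [[1, 1, 1], [1, 1, 1]]      # uniform patch: zero contrast
stripy = [[0, 1, 0], [1, 0, 1]]    # alternating pattern: high contrast
```

    Patches from smooth background score low on such features, while textured neurite regions score high, which is what makes them usable as classifier inputs.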

  2. TAxonomy of Self-reported Sedentary behaviour Tools (TASST) framework for development, comparison and evaluation of self-report tools: content analysis and systematic review

    PubMed Central

    Dall, PM; Coulter, EH; Fitzsimons, CF; Skelton, DA; Chastin, SFM

    2017-01-01

    Objective Sedentary behaviour (SB) has distinct deleterious health outcomes, yet there is no consensus on best practice for its measurement. This study aimed to identify the optimal self-report tool for population surveillance of SB, using a systematic framework. Design A framework, the TAxonomy of Self-reported Sedentary behaviour Tools (TASST), consisting of four domains (type of assessment, recall period, temporal unit and assessment period), was developed based on a systematic inventory of existing tools. The inventory was compiled through a systematic review of studies reporting SB, tracing each tool back to its original description. A systematic review of the accuracy and sensitivity to change of these tools was then mapped against the TASST domains. Data sources Systematic searches were conducted via EBSCO, reference lists and expert opinion. Eligibility criteria for selecting studies The inventory included tools measuring SB in adults that could be self-completed at one sitting, and excluded tools measuring SB in specific populations or contexts. The systematic review included studies reporting on accuracy against an objective measure of SB and/or sensitivity to change of a tool in the inventory. Results The systematic review initially identified 32 distinct tools (141 questions), which were used to develop the TASST framework. Twenty-two studies evaluated accuracy and/or sensitivity to change, representing only eight taxa. Assessing SB as a sum of behaviours and using a previous-day recall were the most promising features of existing tools. Accuracy was poor for all existing tools, with both underestimation and overestimation of SB, and there was a lack of evidence about sensitivity to change.
    Conclusions Despite the limited evidence, mapping existing SB tools onto the TASST framework has enabled informed recommendations about the most promising features for a surveillance tool and has identified the aspects on which future research and development of SB surveillance tools should focus. Trial registration number International prospective register of systematic reviews (PROSPERO)/CRD42014009851. PMID:28391233

  3. Protection efficiency of a standard compliant EUV reticle handling solution

    NASA Astrophysics Data System (ADS)

    He, Long; Lystad, John; Wurm, Stefan; Orvek, Kevin; Sohn, Jaewoong; Ma, Andy; Kearney, Patrick; Kolbow, Steve; Halbmaier, David

    2009-03-01

    For successful implementation of extreme ultraviolet lithography (EUVL) technology for late-cycle insertion at 32 nm half-pitch (hp) and full introduction for 22 nm hp high-volume production, the mask development infrastructure must be in place by 2010. The central element of the mask infrastructure is contamination-free reticle handling and protection. Today, the industry has already developed and balloted an EUV pod standard for shipping, transporting, transferring, and storing EUV masks. We have previously demonstrated that the EUV pod reticle handling method represents the best approach to meeting EUVL high-volume production requirements, based on the then state-of-the-art inspection capability at ~53 nm polystyrene latex (PSL) equivalent sensitivity. In this paper, we present our latest data showing that defect-free reticle handling is achievable down to 40 nm particle sizes, using the same EUV pod carriers as in the previous study and the recently established world's most advanced defect inspection capability of ~40 nm SiO2 equivalent sensitivity. The EUV pod is thus a solution that meets EUVL pilot-line and pre-production exposure tool development requirements. We also discuss the technical challenges facing the industry in refining the EUV pod solution to meet 22 nm hp EUVL production requirements and beyond.

  4. Label-Free Bioanalyte Detection from Nanometer to Micrometer Dimensions-Molecular Imprinting and QCMs †.

    PubMed

    Mujahid, Adnan; Mustafa, Ghulam; Dickert, Franz L

    2018-06-01

    Modern diagnostic tools and immunoassay protocols call for direct analyte recognition based on its intrinsic behavior, without any labeling indicator. This not only improves detection reliability but also reduces the sample preparation time and complexity involved in the labeling step. Label-free biosensor devices can monitor analyte physicochemical properties such as binding sensitivity and selectivity, affinity constants, and other dynamics of molecular recognition. The interface of a typical biosensor can range from natural antibodies to synthetic receptors, for example molecularly imprinted polymers (MIPs). The foremost advantages of MIPs are high binding selectivity comparable to natural antibodies, straightforward synthesis in a short time, high thermal/chemical stability, and compatibility with different transducers. Quartz crystal microbalance (QCM) resonators are leading acoustic devices that are extensively used for mass-sensitive measurements. Highlights of QCM devices include low-cost fabrication, room-temperature operation, and, most importantly, the ability to monitor extremely low mass shifts, making the QCM potentially a universal transducer. The combination of MIPs with QCMs has turned out to be a prominent sensing system for label-free recognition of diverse bioanalytes. In this article, we encompass the potential applications of MIP-QCM sensors, exclusively for label-free recognition of bacteria and virus species as representative micro- and nanosized bioanalytes.
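    The "extremely low mass shifts" a QCM resolves follow from the Sauerbrey relation, which converts a rigid mass load on the crystal into a resonance frequency shift. A sketch using the standard AT-cut quartz constants; the 5 MHz fundamental frequency and 1 cm^2 active area are example values, not from this article:

```python
import math

# Sauerbrey relation for an AT-cut quartz crystal:
#   delta_f = -(2 * f0**2 / sqrt(rho_q * mu_q)) * delta_m / A
# Standard quartz constants (cgs units):
RHO_Q = 2.648      # density of quartz, g/cm^3
MU_Q = 2.947e11    # shear modulus of AT-cut quartz, g/(cm*s^2)

def sauerbrey_shift(f0_hz, delta_m_g, area_cm2=1.0):
    """Frequency shift (Hz) for a rigid, thin mass load delta_m_g (grams)."""
    c = 2.0 * f0_hz ** 2 / math.sqrt(RHO_Q * MU_Q)
    return -c * delta_m_g / area_cm2

# Example: 1 microgram on 1 cm^2 of a 5 MHz crystal
shift = sauerbrey_shift(5e6, 1e-6)   # about -56.6 Hz
```

    This reproduces the familiar sensitivity constant of roughly 56.6 Hz per microgram/cm^2 for a 5 MHz crystal, which is why sub-nanogram surface loadings produce measurable shifts.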

  5. The smallest natural high-active luciferase: cloning and characterization of novel 16.5-kDa luciferase from copepod Metridia longa.

    PubMed

    Markova, Svetlana V; Larionova, Marina D; Burakova, Ludmila P; Vysotski, Eugene S

    2015-01-30

    Coelenterazine-dependent copepod luciferases containing a natural signal peptide for secretion are a very convenient analytical tool, as they enable monitoring of intracellular events with high sensitivity without destroying cells or tissues. This property is well suited for application in biomedical research and development of cell-based assays for high-throughput screening. We report the cloning of a cDNA gene encoding a novel secreted non-allelic 16.5-kDa isoform (MLuc7) of Metridia longa luciferase, which is, in fact, the smallest natural luciferase known to date. Despite its small size, the isoform contains 10 conserved Cys residues, suggesting the presence of up to five disulfide bonds. This hampers the efficient production of functionally active recombinant luciferase in bacterial expression systems. With the use of the baculovirus expression system, we produced substantial amounts of properly folded MLuc7 luciferase with a yield of ∼3 mg/L of high-purity protein. We demonstrate that MLuc7 produced in insect cells is highly active and extremely thermostable, and is well suited as a secreted reporter when expressed in mammalian cells, ensuring higher sensitivity of detection compared to another Metridia luciferase isoform (MLuc164) that is widely employed in real-time imaging. Copyright © 2014 Elsevier Inc. All rights reserved.

  6. Development of an Ergonomics Checklist for Investigation of Work-Related Whole-Body Disorders in Farming - AWBA: Agricultural Whole-Body Assessment.

    PubMed

    Kong, Y K; Lee, S J; Lee, K S; Kim, G R; Kim, D M

    2015-10-01

    Researchers have been using various ergonomic tools to study occupational musculoskeletal diseases in industrial contexts. However, in agricultural work, where the work environment is poorer and the socio-psychological stress is high due to the high labor intensities of the industry, current research efforts have been scarce, and the number of available tools is small. In our preliminary studies, which focused on a limited number of body parts and other working elements, we developed separate evaluation tools for the upper and lower extremities. The current study was conducted to develop a whole-body ergonomic assessment tool for agricultural work that integrates the existing assessment tools for lower and upper extremities developed in the preliminary studies and to verify the relevance of the integrated assessment tool. To verify the relevance of the Agricultural Whole-Body Assessment (AWBA) tool, we selected 50 different postures that occur frequently in agricultural work. Our results showed that the AWBA-determined risk levels were similar to the subjective risk levels determined by experts. In addition, as the risk level increased, the average risk level increased to a similar extent. Moreover, the differences in risk levels between the AWBA and expert assessments were mostly smaller than the differences in risk levels between other assessment tools and the expert assessments in this study. In conclusion, the AWBA tool developed in this study was demonstrated to be appropriate for use as a tool for assessing various postures commonly assumed in agricultural work. Moreover, we believe that our verification of the assessment tools will contribute to the enhancement of the quality of activities designed to prevent and control work-related musculoskeletal diseases in other industries.

  7. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xiu, Dongbin

    2017-03-03

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  8. Pain sensitivity and temperament in extremely low-birth-weight premature toddlers and preterm and full-term controls.

    PubMed

    Grunau, R V; Whitfield, M F; Petrie, J H

    1994-09-01

    High-technology medical care of extremely low-birth-weight (ELBW) infants (< 1001 g) involves repeated medical interventions which are potentially painful and may later affect reaction to pain. At 18 months corrected chronological age (CCA), we examined parent ratings of pain sensitivity and how pain sensitivity ratings related to child temperament and parenting style in 2 groups of ELBW children (49 with a birth weight of 480-800 g and 75 with a birth weight of 801-1000 g) and 2 control groups: 42 heavier preterm children (1500-2499 g) and 29 full-birth-weight (FBW) children (> 2500 g). Both groups of ELBW toddlers were rated by parents as significantly lower in pain sensitivity compared with both control groups. The relationships between child temperament and pain sensitivity rating varied systematically across the groups. Temperament was strongly related to rated pain sensitivity in the FBW group, moderately related in the heavier preterm and ELBW 801-1000 g groups, and not related in the lowest birth-weight group (< 801 g). Parental style did not mediate ratings of pain sensitivity. The results suggest that parents perceive differences in the pain behavior of ELBW toddlers compared with heavier preterm and FBW toddlers, especially those under 801 g. Longitudinal research into the development of pain behavior in infants who experience lengthy hospitalization is warranted.

  9. Decreasing spatial variability in precipitation extremes in southwestern China and the local/large-scale influencing factors

    NASA Astrophysics Data System (ADS)

    Liu, Meixian; Xu, Xianli; Sun, Alex

    2015-07-01

    Climate extremes can cause devastating damage to human society and ecosystems. Recent studies have drawn many conclusions about trends in climate extremes, but few have focused on quantitative analysis of their spatial variability and underlying mechanisms. By using the techniques of overlapping moving windows, the Mann-Kendall trend test, correlation, and stepwise regression, this study examined the spatial-temporal variation of precipitation extremes and investigated the potential key factors influencing this variation in southwestern (SW) China, a globally important biodiversity hot spot and climate-sensitive region. Results showed that the changing trends of precipitation extremes were not spatially uniform, but the spatial variability of these precipitation extremes decreased from 1959 to 2012. Further analysis found that atmospheric circulations rather than local factors (land cover, topographic conditions, etc.) were the main cause of such precipitation extremes. This study suggests that droughts or floods may become more homogenously widespread throughout SW China. Hence, region-wide assessments and coordination are needed to help mitigate the economic and ecological impacts.
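    The Mann-Kendall trend test used in this study reduces to a sign count over all ordered pairs of a time series. A minimal pure-Python sketch of the statistic and its normal approximation (no correction for tied values, which a full implementation would include):

```python
import math

def mann_kendall(series):
    """Mann-Kendall trend test: returns (S, Z).

    S sums the signs of all pairwise differences; Z is the
    continuity-corrected normal approximation (no tie correction).
    """
    n = len(series)
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = series[j] - series[i]
            s += (diff > 0) - (diff < 0)
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z
```

    A strictly increasing record gives the maximum S = n(n-1)/2 and a large positive Z; |Z| above the chosen critical value (e.g. 1.96 at the 5% level) indicates a significant monotonic trend.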

  10. Tropical precipitation extremes: Response to SST-induced warming in aquaplanet simulations

    NASA Astrophysics Data System (ADS)

    Bhattacharya, Ritthik; Bordoni, Simona; Teixeira, João.

    2017-04-01

    Scaling of tropical precipitation extremes in response to warming is studied in aquaplanet experiments using the global Weather Research and Forecasting (WRF) model. We show how the scaling of precipitation extremes is highly sensitive to spatial and temporal averaging: while instantaneous grid point extreme precipitation scales more strongly than the percentage increase (~7% K-1) predicted by the Clausius-Clapeyron (CC) relationship, extremes for zonally and temporally averaged precipitation follow a slight sub-CC scaling, in agreement with results from Climate Model Intercomparison Project (CMIP) models. The scaling depends crucially on the employed convection parameterization. This is particularly true when grid point instantaneous extremes are considered. These results highlight how understanding the response of precipitation extremes to warming requires consideration of dynamic changes in addition to the thermodynamic response. Changes in grid-scale precipitation, unlike those in convective-scale precipitation, scale linearly with the resolved flow. Hence, dynamic changes include changes in both large-scale and convective-scale motions.
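    The ~7% K-1 Clausius-Clapeyron rate quoted above follows from the fractional temperature derivative of saturation vapor pressure, d(ln e_s)/dT = L_v / (R_v T^2). A quick check with standard constants; the 288 K reference is an illustrative near-surface temperature:

```python
# Clausius-Clapeyron scaling rate for saturation vapor pressure.
L_V = 2.5e6    # latent heat of vaporization of water, J/kg
R_V = 461.5    # specific gas constant for water vapor, J/(kg K)

def cc_rate(temp_k):
    """Fractional increase of saturation vapor pressure per kelvin."""
    return L_V / (R_V * temp_k ** 2)

rate = cc_rate(288.0)   # about 0.065, i.e. roughly 6.5-7% per kelvin
```

    The rate falls slowly with temperature, which is one reason the quoted CC benchmark is usually stated as an approximate 7% K-1 rather than an exact constant.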

  11. Large scale variability, long-term trends and extreme events in total ozone over the northern mid-latitudes based on satellite time series

    NASA Astrophysics Data System (ADS)

    Rieder, H. E.; Staehelin, J.; Maeder, J. A.; Ribatet, M.; Davison, A. C.

    2009-04-01

    Various generations of satellites (e.g. TOMS, GOME, OMI) have made spatial datasets of column ozone available to the scientific community. This study focuses on column ozone over the northern mid-latitudes. Tools from geostatistics and extreme value theory are applied to analyze variability, long-term trends and frequency distributions of extreme events in total ozone. In a recent case study (Rieder et al., 2009), new tools from extreme value theory (Coles, 2001; Ribatet, 2007) were applied to the world's longest total ozone record from Arosa, Switzerland (e.g. Staehelin 1998a,b), in order to describe extreme events in low and high total ozone. Within the current study this analysis is extended to satellite datasets for the northern mid-latitudes. Further special emphasis is given to patterns and spatial correlations and the influence of changes in atmospheric dynamics (e.g. tropospheric and lower stratospheric pressure systems) on column ozone. References: Coles, S.: An Introduction to Statistical Modeling of Extreme Values, Springer Series in Statistics, ISBN:1852334592, Springer, Berlin, 2001. Ribatet, M.: POT: Modelling peaks over a threshold, R News, 7, 34-36, 2007. Rieder, H.E., Staehelin, J., Maeder, J.A., Ribatet, M., Stübi, R., Weihs, P., Holawe, F., Peter, T., and Davison, A.C.: From ozone mini holes and mini highs towards extreme value theory: New insights from extreme events and non stationarity, submitted to J. Geophys. Res., 2009. Staehelin, J., Kegel, R., and Harris, N. R.: Trend analysis of the homogenized total ozone series of Arosa (Switzerland), 1929-1996, J. Geophys. Res., 103(D7), 8389-8400, doi:10.1029/97JD03650, 1998a. Staehelin, J., Renaud, A., Bader, J., McPeters, R., Viatte, P., Hoegger, B., Bugnion, V., Giroud, M., and Schill, H.: Total ozone series at Arosa (Switzerland): Homogenization and data comparison, J. Geophys. Res., 103(D5), 5827-5842, doi:10.1029/97JD02402, 1998b.
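    The peaks-over-threshold idea behind the cited POT package can be sketched in a few lines: choose a high empirical quantile as the threshold and keep the excesses above it, which extreme value theory then models with a generalized Pareto distribution. The 95th-percentile choice below is illustrative, not from the study:

```python
def pot_exceedances(series, quantile=0.95):
    """Peaks-over-threshold extraction: returns (threshold, excesses).

    The threshold is an empirical quantile of the record; the excesses
    above it are the quantities a generalized Pareto distribution would
    be fitted to in a full POT analysis.
    """
    data = sorted(series)
    idx = int(quantile * (len(data) - 1))
    threshold = data[idx]
    excesses = [x - threshold for x in series if x > threshold]
    return threshold, excesses
```

    In practice the threshold is chosen by diagnostics (e.g. mean excess plots) rather than a fixed quantile, and clustered exceedances are first declustered.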

  12. Optimal sensitivity for molecular recognition MAC-mode AFM

    PubMed

    Schindler; Badt; Hinterdorfer; Kienberger; Raab; Wielert-Badt; Pastushenko

    2000-02-01

    Molecular recognition force microscopy (MRFM) using the magnetic AC mode (MAC mode) atomic force microscope (AFM) was recently investigated as a way to locate and probe recognition sites. A flexible crosslinker carrying a ligand is bound to the tip for molecular recognition of receptors on the surface of a sample. In this report, the driving frequency that optimizes the sensitivity (S) is calculated. The sensitivity of MRFM is defined as the relative change of the magnetically excited cantilever deflection amplitude arising from a crosslinker/antibody/antigen connection, which is characterized by a very small force constant. The sensitivity is calculated in a damped oscillator model with quality factor Q, which, together with the load, defines the frequency response (an unloaded oscillator shows resonance only at Q > 0.707). If Q < 1, the greatest value of S corresponds to zero driving frequency omega (measured in units of the eigenfrequency); therefore, for Q < 1, MAC mode has no advantage over DC mode. Two additional extremes are found at omegaL = (1 - 1/Q)^(1/2) and omegaR = (1 + 1/Q)^(1/2), with corresponding sensitivities S(L) = Q^2/(2Q - 1) and S(R) = Q^2/(2Q + 1). The L-extreme exists only for Q > 1, and then S(L) > S(R), i.e. the L-extreme is the main one. For Q > 1, S(L) > 1, and for Q > 2.41, S(R) > 1. These are the critical Q values above which selecting a driving frequency equal to omegaL or omegaR gives MAC mode an advantage over DC mode. The satisfactory quality of the oscillator model is demonstrated by comparing some results with those calculated within the classical description of cantilevers.
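    The closed-form sensitivities quoted in this abstract are easy to check numerically. A sketch encoding S(L) and S(R) and the critical Q above which the R-extreme also beats DC mode:

```python
import math

# Sensitivities at the two driving-frequency extremes from the abstract:
#   S(L) = Q^2 / (2Q - 1) at omega_L = sqrt(1 - 1/Q)   (exists for Q > 1)
#   S(R) = Q^2 / (2Q + 1) at omega_R = sqrt(1 + 1/Q)

def s_left(q):
    """Sensitivity at the L-extreme; defined for Q > 1."""
    return q * q / (2 * q - 1)

def s_right(q):
    """Sensitivity at the R-extreme."""
    return q * q / (2 * q + 1)

# Critical Q above which S(R) > 1 solves Q^2 = 2Q + 1, i.e. Q = 1 + sqrt(2)
q_crit_r = 1 + math.sqrt(2)   # ~2.414, matching the ~2.41 quoted above
```

    For any Q > 1, S(L) exceeds both 1 and S(R), confirming the abstract's claim that the L-extreme is the main one.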

  13. Early Ambulation After Microsurgical Reconstruction of the Lower Extremity.

    PubMed

    Orseck, Michael J; Smith, Christopher Robert; Kirby, Sean; Trujillo, Manuel

    2018-06-01

    Successful outcomes after microsurgical reconstruction of the lower extremity include timely return to ambulation. Some combination of physical examination, ViOptix tissue oxygen saturation monitoring, and the implantable venous Doppler has shown promise in increasing the sensitivity of current flap monitoring. We have incorporated this system into our postoperative monitoring protocol in an effort to initiate earlier dependency protocols. A prospective analysis of 36 anterolateral thigh free flaps and radial forearm flaps for lower extremity reconstruction was performed. Indications for reconstruction were acute and chronic wounds, as well as oncologic resection. Twenty-three patients were able to ambulate and three were able to dangle their leg on the first postoperative day. One flap showed early mottling that improved immediately after elevation. After reelevation and return to baseline, the dependency protocol was successfully implemented on postoperative day 3. All flaps went on to successful healing. Physical examination, implantable venous Doppler, and ViOptix can be used reliably as adjuncts to increase the sensitivity of detecting poorly performing flaps during the postoperative progression of dependency.

  14. The extreme ultraviolet explorer

    NASA Technical Reports Server (NTRS)

    Bowyer, Stuart; Malina, Roger F.

    1990-01-01

    The Extreme Ultraviolet Explorer (EUVE) mission, currently scheduled for launch in September 1991, is described. The primary purpose of the mission is to survey the celestial sphere for astronomical sources of extreme ultraviolet (EUV) radiation. The survey will be accomplished with three EUV telescopes, each sensitive to a different segment of the EUV band. A fourth telescope will perform a high-sensitivity search of a limited sample of the sky in the shortest wavelength bands. The all-sky survey will be carried out in the first six months of the mission and will be made in four bands, or colors. The second phase of the mission, conducted entirely by guest observers selected by NASA, will be devoted to spectroscopic observations of EUV sources. The performance of the instrument components is described. An end-to-end model of the mission, from a stellar source to the resulting scientific data, was constructed. Hypothetical data from astronomical sources processed through this model are shown.

  15. Methodology for sensitivity analysis, approximate analysis, and design optimization in CFD for multidisciplinary applications

    NASA Technical Reports Server (NTRS)

    Taylor, Arthur C., III; Hou, Gene W.

    1994-01-01

    The straightforward automatic-differentiation and the hand-differentiated incremental iterative methods are interwoven to produce a hybrid scheme that captures some of the strengths of each strategy. With this compromise, discrete aerodynamic sensitivity derivatives are calculated with the efficient incremental iterative solution algorithm of the original flow code. Moreover, the principal advantage of automatic differentiation is retained: all complicated source code for the derivative calculations is constructed quickly and accurately. The basic equations for second-order sensitivity derivatives are presented, and four methods are compared. Each scheme requires that large systems be solved first for the first-order derivatives and, in all but one method, for the first-order adjoint variables. Of these latter three schemes, two require no solutions of large systems thereafter; for the other two, in which additional systems are solved, the equations and solution procedures are analogous to those for the first-order derivatives. From a practical viewpoint, implementation of the second-order methods is feasible only with software tools such as automatic differentiation, because of the extreme complexity and large number of terms. First- and second-order sensitivities are calculated accurately for two airfoil problems, including a turbulent flow example; both geometric-shape and flow-condition design variables are considered. Several methods are tested; results are compared on the basis of accuracy, computational time, and computer memory. For first-order derivatives, the hybrid incremental iterative scheme obtained with automatic differentiation is competitive with the best hand-differentiated method; for six independent variables, it is at least two to four times faster than central finite differences and requires only 60 percent more memory than the original code; the performance is expected to improve further in the future.
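    The automatic differentiation at the core of the hybrid scheme above can be illustrated with forward-mode dual numbers, which carry a value and its derivative through each operation. This is the basic mechanism only, not the paper's incremental iterative formulation:

```python
import math

class Dual:
    """Forward-mode AD value: carries f(x) and f'(x) together."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)
    __radd__ = __add__
    def __mul__(self, other):
        # Product rule: (fg)' = f'g + fg'
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)
    __rmul__ = __mul__

def sin(x):
    """Chain rule for sine: (sin f)' = cos(f) * f'."""
    return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

def derivative(f, x):
    """Evaluate f'(x) by seeding the derivative slot with 1."""
    return f(Dual(x, 1.0)).dot

# d/dx [x*x + sin(x)] = 2x + cos(x), obtained without symbolic algebra
```

    AD tools apply this same rule-by-rule propagation to entire flow solvers, which is what makes the "constructed quickly and accurately" claim feasible.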

  16. Significant enhancement of 11-Hydroxy-THC detection by formation of picolinic acid esters and application of liquid chromatography/multi stage mass spectrometry (LC-MS(3) ): Application to hair and oral fluid analysis.

    PubMed

    Thieme, Detlef; Sachs, Ulf; Sachs, Hans; Moore, Christine

    2015-07-01

    Formation of picolinic acid esters of hydroxylated drugs or their biotransformation products is a promising tool to improve their mass spectrometric ionization efficiency, alter their fragmentation behaviour, and enhance the sensitivity and specificity of their detection. The procedure was optimized and tested for the detection of cannabinoids, which proved to be most challenging when dealing with alternative specimens, for example hair and oral fluid. In particular, the detection of the THC metabolites hydroxy-THC and carboxy-THC requires the utmost sensitivity because of their poor incorporation into hair or saliva. Both biotransformation products are widely accepted as incorporation markers to distinguish drug consumption from passive contamination. The derivatization procedure was carried out by adding a mixture of picolinic acid, 4-(dimethylamino)pyridine and 2-methyl-6-nitrobenzoic anhydride in tetrahydrofuran/triethylamine to the dry extraction residues. The resulting derivatives were found to be very stable and could be reconstituted in aqueous or organic buffers and subsequently analyzed by liquid chromatography-mass spectrometry (LC-MS). Owing to the complex consecutive fragmentation patterns, the application of multistage MS3 proved to be extremely useful for sensitive identification of doubly picolinated hydroxy-THC in complex matrices. The detection limits - estimated by comparison of the corresponding signal-to-noise ratios - improved by a factor of 100 following picolination. All other species examined, such as cannabinol, THC, cannabidiol, and carboxy-THC, could also be derivatized, exhibiting only moderate sensitivity improvements. The assay was systematically tested using hair samples and applied to oral fluid as an example. Concentrations of OH-THC identified in THC-positive hair samples ranged from 0.02 to 0.29 pg/mg. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Evaluation of a focussed protocol for hand-held echocardiography and computer-assisted auscultation in detecting latent rheumatic heart disease in scholars.

    PubMed

    Zühlke, Liesl J; Engel, Mark E; Nkepu, Simpiwe; Mayosi, Bongani M

    2016-08-01

    Introduction Echocardiography is the diagnostic test of choice for latent rheumatic heart disease. The utility of echocardiography for large-scale screening is limited by high cost, complex diagnostic protocols, and time to acquire multiple images. We evaluated the performance of a brief hand-held echocardiography protocol and computer-assisted auscultation in detecting latent rheumatic heart disease with or without pathological murmur. A total of 27 asymptomatic patients with latent rheumatic heart disease based on the World Heart Federation criteria and 66 healthy controls were examined by standard cardiac auscultation to detect pathological murmur. Hand-held echocardiography using a focussed protocol that utilises one view - that is, the parasternal long-axis view - and one measurement - that is, mitral regurgitant jet - and a computer-assisted auscultation utilising an automated decision tool were performed on all patients. The sensitivity and specificity of computer-assisted auscultation in latent rheumatic heart disease were 4% (95% CI 1.0-20.4%) and 93.7% (95% CI 84.5-98.3%), respectively. The sensitivity and specificity of the focussed hand-held echocardiography protocol for definite rheumatic heart disease were 92.3% (95% CI 63.9-99.8%) and 100%, respectively. The test reliability of hand-held echocardiography was 98.7% for definite and 94.7% for borderline disease, and the adjusted diagnostic odds ratios were 1041 and 263.9 for definite and borderline disease, respectively. Computer-assisted auscultation has extremely low sensitivity but high specificity for pathological murmur in latent rheumatic heart disease. Focussed hand-held echocardiography has fair sensitivity but high specificity and diagnostic utility for definite or borderline rheumatic heart disease in asymptomatic patients.
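    The sensitivity and specificity figures reported in this study, like those elsewhere in this collection, come straight from a 2x2 confusion matrix. A sketch with hypothetical counts (illustrative, not the study's raw data):

```python
def sens_spec(tp, fn, tn, fp):
    """Diagnostic test metrics from a 2x2 confusion matrix.

    Sensitivity = TP / (TP + FN): proportion of true cases detected.
    Specificity = TN / (TN + FP): proportion of non-cases correctly cleared.
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Hypothetical screening result: 12 of 13 cases detected,
# 62 of 66 controls correctly negative
sensitivity, specificity = sens_spec(tp=12, fn=1, tn=62, fp=4)
```

    A screening tool can trade these off: the computer-assisted auscultation above illustrates high specificity with very low sensitivity, the opposite of what surveillance screening needs.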

  18. Rasch analysis of the Italian Lower Extremity Functional Scale: insights on dimensionality and suggestions for an improved 15-item version.

    PubMed

    Bravini, Elisabetta; Giordano, Andrea; Sartorio, Francesco; Ferriero, Giorgio; Vercelli, Stefano

    2017-04-01

    To investigate the dimensionality and measurement properties of the Italian Lower Extremity Functional Scale using both classical test theory and Rasch analysis methods, and to provide insights for an improved version of the questionnaire. Rasch analysis of individual patient data. Rehabilitation centre. A total of 135 patients with musculoskeletal diseases of the lower limb. Patients were assessed with the Lower Extremity Functional Scale before and after rehabilitation. Rasch analysis showed some problems related to rating scale category functioning, item fit, and item redundancy. After an iterative process, which resulted in the reduction of rating scale categories from 5 to 4 and in the deletion of 5 items, the psychometric properties of the Italian Lower Extremity Functional Scale improved. The retained 15 items with a 4-level response format fitted the Rasch model (internal construct validity) and demonstrated unidimensionality and good reliability indices (person-separation reliability 0.92; Cronbach's alpha 0.94). The analysis also showed differential item functioning for six of the retained items. The sensitivity to change of the Italian 15-item Lower Extremity Functional Scale was nearly equal to that of the original version (effect size: 0.93 and 0.98; standardized response mean: 1.20 and 1.28, respectively, for the 15-item and 20-item versions). The Italian Lower Extremity Functional Scale had unsatisfactory measurement properties; however, removing five items and simplifying the scoring from 5 to 4 levels resulted in a more valid measure with good reliability and sensitivity to change.
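    The effect size and standardized response mean reported above are standard responsiveness indices: mean change divided by the baseline standard deviation, and by the standard deviation of the change scores, respectively. A sketch with hypothetical pre/post scores, not the study's data:

```python
import math

def effect_size(baseline, followup):
    """Cohen-style effect size: mean change / SD of baseline scores."""
    changes = [f - b for b, f in zip(baseline, followup)]
    mean_change = sum(changes) / len(changes)
    mb = sum(baseline) / len(baseline)
    sd_base = math.sqrt(sum((x - mb) ** 2 for x in baseline)
                        / (len(baseline) - 1))
    return mean_change / sd_base

def srm(baseline, followup):
    """Standardized response mean: mean change / SD of the changes."""
    changes = [f - b for b, f in zip(baseline, followup)]
    mc = sum(changes) / len(changes)
    sd_change = math.sqrt(sum((c - mc) ** 2 for c in changes)
                          / (len(changes) - 1))
    return mc / sd_change

# Hypothetical pre/post scores for four patients
baseline = [10.0, 20.0, 30.0, 40.0]
followup = [15.0, 26.0, 35.0, 44.0]
```

    When change is consistent across patients, the SD of the changes is small, so the SRM exceeds the effect size, as in the figures quoted above (SRM 1.20 vs. ES 0.93).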

  19. Modernizing Distribution System Restoration to Achieve Grid Resiliency Against Extreme Weather Events: An Integrated Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Chen; Wang, Jianhui; Ton, Dan

    Recent severe power outages caused by extreme weather hazards have highlighted the importance and urgency of improving the resilience of the electric power grid. As the distribution grids still remain vulnerable to natural disasters, the power industry has focused on methods of restoring distribution systems after disasters in an effective and quick manner. The current distribution system restoration practice for utilities is mainly based on predetermined priorities and tends to be inefficient and suboptimal, and the lack of situational awareness after the hazard significantly delays the restoration process. As a result, customers may experience an extended blackout, which causes large economic loss. On the other hand, the emerging advanced devices and technologies enabled through grid modernization efforts have the potential to improve the distribution system restoration strategy. However, utilizing these resources to aid the utilities in better distribution system restoration decision-making in response to extreme weather events is a challenging task. Therefore, this paper proposes an integrated solution: a distribution system restoration decision support tool designed by leveraging resources developed for grid modernization. We first review the current distribution restoration practice and discuss why it is inadequate in response to extreme weather events. Then we describe how the grid modernization efforts could benefit distribution system restoration, and we propose an integrated solution in the form of a decision support tool to achieve the goal. The advantages of the solution include improving situational awareness of the system damage status and facilitating survivability for customers. The paper provides a comprehensive review of how the existing methodologies in the literature could be leveraged to achieve the key advantages.
The benefits of the developed system restoration decision support tool include the optimal and efficient allocation of repair crews and resources, the expediting of the restoration process, and the reduction of outage durations for customers, in response to severe blackouts due to extreme weather hazards.

  20. Synthesizing Neurophysiology, Genetics, Behaviour and Learning to Produce Whole-Insect Programmable Sensors to Detect Volatile Chemicals.

    USDA-ARS?s Scientific Manuscript database

    Most insects have evolved highly sensitive olfactory systems which respond to odors in their environment. The extremely sensitive nature of the insect olfaction system is enhanced by the ability to learn to associate external stimuli with resources, such as food, hosts, and mates. There have been a ...

  1. Ruggedized downhole tool for real-time measurements and uses thereof

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hess, Ryan Falcone; Lindblom, Scott C.; Yelton, William G.

    The present invention relates to ruggedized downhole tools and sensors, as well as uses thereof. In particular, these tools can operate under extreme conditions and, therefore, allow for real-time measurements in geothermal reservoirs or other potentially harsh environments. One exemplary sensor includes a ruggedized ion selective electrode (ISE) for detecting tracer concentrations in real-time. In one embodiment, the ISE includes a solid, non-conductive potting material and an ion selective material, which are disposed in a temperature-resistant electrode body. Other electrode configurations, tools, and methods are also described.

  2. Experimental droughts: Are precipitation variability and methodological trends hindering our understanding of ecological sensitivities to drought?

    NASA Astrophysics Data System (ADS)

    Hoover, D. L.; Wilcox, K.; Young, K. E.

    2017-12-01

    Droughts are projected to increase in frequency and intensity with climate change, which may have dramatic and prolonged effects on ecosystem structure and function. There are currently hundreds of published, ongoing, and new drought experiments worldwide aimed to assess ecosystem sensitivities to drought and identify the mechanisms governing ecological resistance and resilience. However, to date, the results from these experiments have varied widely, and thus patterns of drought sensitivities have been difficult to discern. This lack of consensus at the field scale, limits the abilities of experiments to help improve land surface models, which often fail to realistically simulate ecological responses to extreme events. This is unfortunate because models offer an alternative, yet complementary approach to increase the spatial and temporal assessment of ecological sensitivities to drought that are not possible in the field due to logistical and financial constraints. Here we examined 89 published drought experiments, along with their associated historical precipitation records to (1) identify where and how drought experiments have been imposed, (2) determine the extremity of drought treatments in the context of historical climate, and (3) assess the influence of precipitation variability on drought experiments. We found an overall bias in drought experiments towards short-term, extreme experiments in water-limited ecosystems. When placed in the context of local historical precipitation, most experimental droughts were extreme, with 61% below the 5th, and 43% below the 1st percentile. Furthermore, we found that interannual precipitation variability had a large and potentially underappreciated effect on drought experiments due to the co-varying nature of control and drought treatments. 
Thus, detecting ecological effects in experimental droughts is strongly influenced by the interaction between drought treatment magnitude, precipitation variability, and key physiological thresholds. The results of this study have important implications for the design and interpretation of drought experiments, as well as for integrating field results with land surface models.
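The percentile framing used above (61% of treatments below the 5th percentile of local history) can be sketched with a toy calculation. The 100-year record and the 50% reduction level below are hypothetical illustrations, not data from the study:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 100-year annual precipitation record (mm); a log-normal
# gives the right-skew typical of precipitation data.
hist = rng.lognormal(mean=6.0, sigma=0.3, size=100)

def drought_percentile(hist, reduction):
    """Percent of historical years drier than a treatment that removes
    `reduction` of mean precipitation (cf. the 5th/1st percentile test)."""
    treated = hist.mean() * (1.0 - reduction)
    return 100.0 * np.mean(hist < treated)

# A 50% rainfall-exclusion treatment typically falls far outside the
# historical record, i.e. well below the 5th percentile.
print(drought_percentile(hist, 0.50))
```

Placing each experiment's treatment on its own site's historical distribution, as here, is what allows extremity to be compared across sites with very different mean rainfall.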

  3. On the Performance of Carbon Nanotubes in Extreme Conditions and in the Presence of Microwaves

    DTIC Science & Technology

    2013-01-01

    been considered for use as transparent conductors include: transparent conducting oxides (TCOs), intrinsically conducting polymers (ICPs), graphene ...optical transmission properties, but are extremely sensitive to environmental conditions (such as temperature and humidity). Graphene has recently...during the dicing procedure, silver paint was applied to the sample to serve as improvised contact/probe-landing points. Figure 1 shows the CNT thin

  4. Weather based risks and insurances for crop production in Belgium

    NASA Astrophysics Data System (ADS)

    Gobin, Anne

    2014-05-01

    Extreme weather events such as late frosts, droughts, heat waves and rain storms can have devastating effects on cropping systems. Damages due to extreme events are strongly dependent on crop type, crop stage, soil type and soil conditions. The perspective of rising risk-exposure is exacerbated further by limited aid received for agricultural damage, an overall reduction of direct income support to farmers and projected intensification of weather extremes with climate change. According to both the agriculture and finance sectors, a risk assessment of extreme weather events and their impact on cropping systems is needed. The impact of extreme weather events, particularly during the sensitive periods of the farming calendar, requires a modelling approach to capture the mixture of non-linear interactions between the crop, its environment and the occurrence of the meteorological event. The risk of soil moisture deficit increases towards harvesting, such that drought stress occurs in spring and summer. Conversely, waterlogging occurs mostly during early spring and autumn. Risks of temperature stress appear during winter and spring for chilling and during summer for heat. Since crop development is driven by thermal time and photoperiod, the regional crop model REGCROP (Gobin, 2010) made it possible to examine the likely frequency, magnitude and impacts of frost, drought, heat stress and waterlogging in relation to the cropping season and crop-sensitive stages. The risk profiles were subsequently confronted with yields, yield losses and insurance claims for different crops. Physically based crop models such as REGCROP assist in understanding the links between different factors causing crop damage, as demonstrated for cropping systems in Belgium. Extreme weather events have already precipitated contraction of insurance coverage in some markets (e.g. hail insurance), and the process can be expected to continue if the losses or damages from such events increase in the future.
Climate change will stress this further and impacts on crop growth are expected to be twofold, owing to the sensitive stages occurring earlier during the growing season and to the changes in return period of extreme weather events. Though average yields have risen continuously due to technological advances, there is no evidence that relative tolerance to adverse weather events has improved. The research is funded by the Belgian Science Policy Organisation (Belspo) under contract nr SD/RI/03A.

  5. Cliffbot Maestro

    NASA Technical Reports Server (NTRS)

    Norris, Jeffrey S.; Powell, Mark W.; Fox, Jason M.; Crockett, Thomas M.; Joswig, Joseph C.

    2009-01-01

    Cliffbot Maestro permits teleoperation of remote rovers for field testing in extreme environments. The application user interface provides two sets of tools for operations: stereo image browsing and command generation.

  6. Machinability of Stellite 6 hardfacing

    NASA Astrophysics Data System (ADS)

    Benghersallah, M.; Boulanouar, L.; Le Coz, G.; Devillez, A.; Dudzinski, D.

    2010-06-01

    This paper reports experimental findings concerning the machinability at high cutting speed of nickel-base weld-deposited hardfacings for the manufacture of hot tooling. The forging work involves extreme impacts, forces, stresses and temperatures; mould dies must therefore be extremely resistant. The aim of the project is to create a rapid prototyping process suited to forging conditions, integrating a Stellite 6 hardfacing deposited by the plasma transferred arc (PTA) process. This study addresses dry machining of the hardfacing, using a two-tip machining tool and a high-speed milling machine equipped with a Wattpilote power consumption recorder. The aim is to assess the machinability of the hardfacing, measuring the power consumption and the tip wear by optical microscopy and white-light interferometry, using different strategies and cutting conditions.

  7. Utility of quantitative sensory testing and screening tools in identifying HIV-associated peripheral neuropathy in Western Kenya: pilot testing.

    PubMed

    Cettomai, Deanna; Kwasa, Judith; Kendi, Caroline; Birbeck, Gretchen L; Price, Richard W; Bukusi, Elizabeth A; Cohen, Craig R; Meyer, Ana-Claire

    2010-12-08

    Neuropathy is the most common neurologic complication of HIV but is widely under-diagnosed in resource-constrained settings. We aimed to identify tools that accurately distinguish individuals with moderate/severe peripheral neuropathy and can be administered by non-physician healthcare workers (HCW) in resource-constrained settings. We enrolled a convenience sample of 30 HIV-infected outpatients from a Kenyan HIV-care clinic. A HCW administered the Neuropathy Severity Score (NSS), Single Question Neuropathy Screen (Single-QNS), Subjective Peripheral Neuropathy Screen (Subjective-PNS), and Brief Peripheral Neuropathy Screen (Brief-PNS). Monofilament, graduated tuning fork, and two-point discrimination examinations were performed. Tools were validated against a neurologist's clinical assessment of moderate/severe neuropathy. The sample was 57% male, mean age 38.6 years, and mean CD4 count 324 cells/µL. Neurologist's assessment identified 20% (6/30) with moderate/severe neuropathy. Diagnostic utilities for moderate/severe neuropathy were: Single-QNS--83% sensitivity, 71% specificity; Subjective-PNS-total--83% sensitivity, 83% specificity; Subjective-PNS-max and NSS--67% sensitivity, 92% specificity; Brief-PNS--0% sensitivity, 92% specificity; monofilament--100% sensitivity, 88% specificity; graduated tuning fork--83% sensitivity, 88% specificity; two-point discrimination--75% sensitivity, 58% specificity. Pilot testing suggests Single-QNS, Subjective-PNS, and monofilament examination accurately identify HIV-infected patients with moderate/severe neuropathy and may be useful diagnostic tools in resource-constrained settings.
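The sensitivity/specificity figures above all come from the standard 2x2 confusion matrix against the neurologist's gold standard. A minimal sketch, using a hypothetical cohort merely shaped like this one (6 of 30 positive) rather than the study's actual data:

```python
def sens_spec(test_positive, gold_positive):
    """Sensitivity and specificity of a screening tool against a
    gold-standard assessment (parallel lists of booleans)."""
    pairs = list(zip(test_positive, gold_positive))
    tp = sum(1 for t, g in pairs if t and g)          # true positives
    fn = sum(1 for t, g in pairs if not t and g)      # false negatives
    fp = sum(1 for t, g in pairs if t and not g)      # false positives
    tn = sum(1 for t, g in pairs if not t and not g)  # true negatives
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical cohort: 6 of 30 positive by the neurologist's assessment;
# the screen flags all 6 true cases plus 3 false positives.
gold   = [True] * 6 + [False] * 24
screen = [True] * 9 + [False] * 21

sens, spec = sens_spec(screen, gold)
print(f"sensitivity={sens:.0%}, specificity={spec:.0%}")
```

With these invented counts the result reproduces the shape of the monofilament row (100% sensitivity, 88% specificity); in such a small sample each reclassified patient moves the estimates by several percentage points, which is why pilot results like these carry wide uncertainty.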

  8. Chest Ultrasonography in Modern Day Extreme Settings: From Military Setting and Natural Disasters to Space Flights and Extreme Sports

    PubMed Central

    Mucci, Viviana

    2018-01-01

    Chest ultrasonography (CU) is a noninvasive imaging technique able to provide an immediate diagnosis of the underlying aetiology of acute respiratory failure and traumatic chest injuries. Given recent technological advances, it is now possible to perform accurate CU in remote and adverse environments, including the combat field, extreme sport settings, and environmental disasters, as well as during space missions. Today, the usage of CU in the extreme emergency setting is more likely to occur, as this technique has proved to be a fast diagnostic tool to assist resuscitation manoeuvres and interventional procedures in many cases. A scientific literature review is presented here, based on a systematic search of published literature in the online databases PubMed and Scopus. The following words were used: “chest sonography,” “thoracic ultrasound,” and “lung sonography,” in different combinations with “extreme sport,” “extreme environment,” “wilderness,” “catastrophe,” and “extreme conditions.” This manuscript reports the most relevant usages of CU in the extreme setting as well as technological improvements and current limitations. CU application in the extreme setting is further encouraged here. PMID:29736195

  9. A High-Sensitivity Current Sensor Utilizing CrNi Wire and Microfiber Coils

    PubMed Central

    Xie, Xiaodong; Li, Jie; Sun, Li-Peng; Shen, Xiang; Jin, Long; Guan, Bai-ou

    2014-01-01

    We obtain an extremely high current sensitivity by wrapping a section of microfiber on a thin-diameter chromium-nickel wire. Our detected current sensitivity is as high as 220.65 nm/A² for a structure length of only 35 μm. Such sensitivity is two orders of magnitude higher than the counterparts reported in the literature. Analysis shows that a higher resistivity and/or a thinner diameter of the metal wire may produce higher sensitivity. The effects of varying the structure parameters on sensitivity are discussed. The presented structure has potential for low-current sensing or highly electrically-tunable filtering applications. PMID:24824372

  10. A high-sensitivity current sensor utilizing CrNi wire and microfiber coils.

    PubMed

    Xie, Xiaodong; Li, Jie; Sun, Li-Peng; Shen, Xiang; Jin, Long; Guan, Bai-ou

    2014-05-12

    We obtain an extremely high current sensitivity by wrapping a section of microfiber on a thin-diameter chromium-nickel wire. Our detected current sensitivity is as high as 220.65 nm/A² for a structure length of only 35 μm. Such sensitivity is two orders of magnitude higher than the counterparts reported in the literature. Analysis shows that a higher resistivity and/or a thinner diameter of the metal wire may produce higher sensitivity. The effects of varying the structure parameters on sensitivity are discussed. The presented structure has potential for low-current sensing or highly electrically-tunable filtering applications.

  11. Design and Manufacturing of Extremely Low Mass Flight Systems

    NASA Technical Reports Server (NTRS)

    Johnson, Michael R.

    2002-01-01

    Extremely small flight systems pose some unusual design and manufacturing challenges. The small size of the components that make up the system generally must be built with extremely tight tolerances to maintain the functionality of the assembled item. Additionally, the total mass of the system is extremely sensitive to what would be considered small perturbations in a larger flight system. The MUSES C mission, designed, built, and operated by Japan, has a small rover provided by NASA that falls into this small flight system category. This NASA-provided rover is used as a case study of an extremely small flight system design. The issues that were encountered with the rover portion of the MUSES C program are discussed and conclusions about the recommended mass margins at different stages of a small flight system project are presented.

  12. Comparison of lymphoscintigraphy and indocyanine green lymphography for the diagnosis of extremity lymphoedema.

    PubMed

    Akita, Shinsuke; Mitsukawa, Nobuyuki; Kazama, Toshiki; Kuriyama, Motone; Kubota, Yoshitaka; Omori, Naoko; Koizumi, Tomoe; Kosaka, Kentaro; Uno, Takashi; Satoh, Kaneshige

    2013-06-01

    Lymphoscintigraphy is the gold-standard examination for extremity lymphoedema. Indocyanine green lymphography may be useful for diagnosis as well. We compared the utility of these two examination methods for patients with suspected extremity lymphoedema and for those in whom surgical treatment of lymphoedema was under consideration. A total of 169 extremities with lymphoedema secondary to lymph node dissection and 65 extremities with idiopathic oedema (suspected primary lymphoedema) were evaluated; the utility of indocyanine green lymphography for diagnosis was compared with lymphoscintigraphy. Regression analysis between lymphoscintigraphy type and indocyanine green lymphography stage was conducted in the secondary lymphoedema group. In secondary oedema, the sensitivity of indocyanine green lymphography, compared with lymphoscintigraphy, was 0.972, the specificity was 0.548 and the accuracy was 0.816. When patients with lymphoscintigraphy type I and indocyanine green lymphography stage I were regarded as negative, the sensitivity of the indocyanine green lymphography was 0.978, the specificity was 0.925 and the accuracy was 0.953. There was a significant positive correlation between the lymphoscintigraphy type and the indocyanine green lymphography stage. In idiopathic oedema, the sensitivity of indocyanine green lymphography was 0.974, the specificity was 0.778 and the accuracy was 0.892. In secondary lymphoedema, earlier and less severe dysfunction could be detected by indocyanine green lymphography. Indocyanine green lymphography is recommended to determine patients' suitability for lymphaticovenular anastomosis, because its diagnostic ability and its capability to evaluate disease severity are similar to those of lymphoscintigraphy, but with less invasiveness and a lower cost.
To detect primary lymphoedema, indocyanine green lymphography should be used first as a screening examination; when the results are positive, lymphoscintigraphy is useful to obtain further information. Copyright © 2013 British Association of Plastic, Reconstructive and Aesthetic Surgeons. Published by Elsevier Ltd. All rights reserved.

  13. Linear regression metamodeling as a tool to summarize and present simulation model results.

    PubMed

    Jalal, Hawre; Dowd, Bryan; Sainfort, François; Kuntz, Karen M

    2013-10-01

    Modelers lack a tool to systematically and clearly present complex model results, including those from sensitivity analyses. The objective was to propose linear regression metamodeling as a tool to increase transparency of decision analytic models and better communicate their results. We used a simplified cancer cure model to demonstrate our approach. The model computed the lifetime cost and benefit of 3 treatment options for cancer patients. We simulated 10,000 cohorts in a probabilistic sensitivity analysis (PSA) and regressed the model outcomes on the standardized input parameter values in a set of regression analyses. We used the regression coefficients to describe measures of sensitivity analyses, including threshold and parameter sensitivity analyses. We also compared the results of the PSA to deterministic full-factorial and one-factor-at-a-time designs. The regression intercept represented the estimated base-case outcome, and the other coefficients described the relative parameter uncertainty in the model. We defined simple relationships that compute the average and incremental net benefit of each intervention. Metamodeling produced outputs similar to traditional deterministic 1-way or 2-way sensitivity analyses but was more reliable since it used all parameter values. Linear regression metamodeling is a simple, yet powerful, tool that can assist modelers in communicating model characteristics and sensitivity analyses.
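The metamodeling approach described (regressing PSA outcomes on standardized inputs, so that the intercept estimates the base-case outcome and the coefficients rank parameter influence) can be sketched as follows. The three-parameter toy model and its distributions are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # PSA cohorts, matching the abstract's 10,000 simulations

# Hypothetical PSA inputs (NOT the paper's model): cure probability,
# treatment cost, and utility gain.
p_cure = rng.beta(20, 80, n)            # mean ~0.20
cost   = rng.normal(5_000, 1_000, n)
util   = rng.normal(0.8, 0.05, n)

# Toy outcome: net monetary benefit at a willingness-to-pay threshold.
lam = 50_000
nmb = lam * p_cure * util - cost

# Regress the outcome on standardized inputs, as the abstract describes.
X = np.column_stack([p_cure, cost, util])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)
A = np.column_stack([np.ones(n), Xs])
coef, *_ = np.linalg.lstsq(A, nmb, rcond=None)
intercept, betas = coef[0], coef[1:]

# With centered predictors the intercept equals the mean (base-case)
# outcome, and |beta| ranks parameter influence like a tornado diagram.
print(intercept, betas)
```

Because the inputs are standardized, each coefficient is the outcome change per one standard deviation of that parameter, which is what makes the coefficients directly comparable across parameters with different units.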

  14. Tarantula huwentoxin-IV inhibits neuronal sodium channels by binding to receptor site 4 and trapping the domain II voltage sensor in the closed configuration.

    PubMed

    Xiao, Yucheng; Bingham, Jon-Paul; Zhu, Weiguo; Moczydlowski, Edward; Liang, Songping; Cummins, Theodore R

    2008-10-03

    Peptide toxins with high affinity, divergent pharmacological functions, and isoform-specific selectivity are powerful tools for investigating the structure-function relationships of voltage-gated sodium channels (VGSCs). Although a number of interesting inhibitors have been reported from tarantula venoms, little is known about the mechanism for their interaction with VGSCs. We show that huwentoxin-IV (HWTX-IV), a 35-residue peptide from tarantula Ornithoctonus huwena venom, preferentially inhibits neuronal VGSC subtypes rNav1.2, rNav1.3, and hNav1.7 compared with muscle subtypes rNav1.4 and hNav1.5. Of the five VGSCs examined, hNav1.7 was most sensitive to HWTX-IV (IC50 approximately 26 nM). Following application of 1 μM HWTX-IV, hNav1.7 currents could only be elicited with extreme depolarizations (>+100 mV). Recovery of hNav1.7 channels from HWTX-IV inhibition could be induced by extreme depolarizations or moderate depolarizations lasting several minutes. Site-directed mutagenesis analysis indicated that the toxin docked at neurotoxin receptor site 4 located at the extracellular S3-S4 linker of domain II. Mutations E818Q and D816N in hNav1.7 decreased toxin affinity for hNav1.7 by approximately 300-fold, whereas the reverse mutations in rNav1.4 (N655D/Q657E) and the corresponding mutations in hNav1.5 (R812D/S814E) greatly increased the sensitivity of the muscle VGSCs to HWTX-IV. Our data identify a novel mechanism for sodium channel inhibition by tarantula toxins involving binding to neurotoxin receptor site 4. In contrast to scorpion beta-toxins that trap the IIS4 voltage sensor in an outward configuration, we propose that HWTX-IV traps the voltage sensor of domain II in the inward, closed configuration.

  15. GeoProMT: A Collaborative Platform Supporting Natural Hazards Project Management From Assessment to Resilience

    NASA Astrophysics Data System (ADS)

    Renschler, C.; Sheridan, M. F.; Patra, A. K.

    2008-05-01

    The impact and consequences of extreme geophysical events (hurricanes, floods, wildfires, volcanic flows, mudflows, etc.) on properties and processes should be continuously assessed by a well-coordinated interdisciplinary research and outreach approach addressing risk assessment and resilience. Communication between various involved disciplines and stakeholders is the key to a successful implementation of an integrated risk management plan. These issues become apparent at the level of decision support tools for extreme events/disaster management in natural and managed environments. The Geospatial Project Management Tool (GeoProMT) is a collaborative platform for research and training to document and communicate the fundamental steps in transforming information for extreme events at various scales for analysis and management. GeoProMT is an internet-based interface for the management of shared geo-spatial and multi-temporal information such as measurements, remotely sensed images, and other GIS data. This tool enhances collaborative research activities and the ability to assimilate data from diverse sources by integrating information management. This facilitates a better understanding of natural processes and enhances the integrated assessment of resilience against both the slow and fast onset of hazard risks. Fundamental to understanding and communicating complex natural processes are: (a) representation of spatiotemporal variability, extremes, and uncertainty of environmental properties and processes in the digital domain, (b) transformation of their spatiotemporal representation across scales (e.g. interpolation, aggregation, disaggregation)
during data processing and modeling in the digital domain, and designing and developing tools for (c) geo-spatial data management, and (d) geo-spatial process modeling and effective implementation, and (e) supporting decision- and policy-making in natural resources and hazard management at various spatial and temporal scales of interest. GeoProMT is useful for researchers, practitioners, and decision-makers, because it provides an integrated environmental system assessment and data management approach that considers the spatial and temporal scales and variability in natural processes. Particularly in the occurrence or onset of extreme events it can utilize the latest data sources that are available at variable scales, combine them with existing information, and update assessment products such as risk and vulnerability assessment maps. Because integrated geo-spatial assessment requires careful consideration of all the steps in utilizing data, modeling and decision-making formats, each step in the sequence must be assessed in terms of how information is being scaled. At the process scale various geophysical models (e.g. TITAN, LAHARZ, or many other examples) are appropriate for incorporation in the tool. Some examples that illustrate our approach include: 1) coastal parishes impacted by Hurricane Rita (Southwestern Louisiana), 2) a watershed affected by extreme rainfall induced debris-flows (Madison County, Virginia; Panabaj, Guatemala; Casita, Nicaragua), and 3) the potential for pyroclastic flows to threaten a city (Tungurahua, Ecuador). This research was supported by the National Science Foundation.

  16. 78 FR 13874 - Watershed Modeling To Assess the Sensitivity of Streamflow, Nutrient, and Sediment Loads to...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-01

    ... an improved understanding of methodological challenges associated with integrating existing tools and... methodological challenges associated with integrating existing tools (e.g., climate models, downscaling... sensitivity to methodological choices such as different approaches for downscaling global climate change...

  17. Computational Failure Modeling of Lower Extremities

    DTIC Science & Technology

    2012-01-01

    bone fracture, ligament tear, and muscle rupture . While these injuries may seem well-defined through medical imaging, the process of injury and the...to vehicles from improvised explosives cause severe injuries to the lower extremities, in- cluding bone fracture, ligament tear, and muscle rupture ...modeling offers a powerful tool to explore the insult-to-injury process with high-resolution. When studying a complex dynamic process such as this, it is

  18. Experimental evaluation of tool wear throughout a continuous stroke blanking process of quenched 22MnB5 ultra-high-strength steel

    NASA Astrophysics Data System (ADS)

    Vogt, S.; Neumayer, F. F.; Serkyov, I.; Jesner, G.; Kelsch, R.; Geile, M.; Sommer, A.; Golle, R.; Volk, W.

    2017-09-01

    Steel is the most common material used in vehicles’ chassis, which makes its research an important topic for the automotive industry. Recently developed ultra-high-strength steels (UHSS) provide extreme tensile strength up to 1,500 MPa and combine great crashworthiness with good weight reduction potential. However, in order to reach the final shape of sheet metal parts, additional cutting steps such as trimming and piercing are often required. The final trimming of quenched metal sheets presents a huge challenge to a conventional process, mainly because of the required extreme cutting force. The high cutting impact, due to the material's brittleness, causes excessive tool wear or even sudden tool failure. Therefore, a laser is commonly used for the cutting process, which is time- and energy-consuming. The purpose of this paper is to demonstrate the capability of a conventional blanking tool design in a continuous stroke piercing process using boron steel 22MnB5 sheets. Two different types of tool steel were tested for their suitability as active cutting elements: electro-slag remelted (ESR) cold work tool steel Bohler K340 ISODUR and powder-metallurgic (PM) high speed steel Bohler S390 MICROCLEAN. A FEM study provided information about an optimized punch design, which withstands buckling under high cutting forces. The wear behaviour of the process was assessed by the tool wear of the active cutting elements as well as the quality of cut surfaces.

  19. Ion beam deposition system for depositing low defect density extreme ultraviolet mask blanks

    NASA Astrophysics Data System (ADS)

    Jindal, V.; Kearney, P.; Sohn, J.; Harris-Jones, J.; John, A.; Godwin, M.; Antohe, A.; Teki, R.; Ma, A.; Goodwin, F.; Weaver, A.; Teora, P.

    2012-03-01

    Extreme ultraviolet lithography (EUVL) is the leading next-generation lithography (NGL) technology to succeed optical lithography at the 22 nm node and beyond. EUVL requires a low defect density reflective mask blank, which is considered to be one of the top two critical technology gaps for commercialization of the technology. At the SEMATECH Mask Blank Development Center (MBDC), research on defect reduction in EUV mask blanks is being pursued using the Veeco Nexus deposition tool. The defect performance of this tool is one of the factors limiting the availability of defect-free EUVL mask blanks. SEMATECH identified the key components in the ion beam deposition system that are currently impeding the reduction of defect density and the yield of EUV mask blanks. SEMATECH's current research is focused on in-house tool components to reduce their contributions to mask blank defects. SEMATECH is also working closely with the supplier to incorporate this learning into a next-generation deposition tool. This paper will describe requirements for the next-generation tool that are essential to realize low defect density EUV mask blanks. The goal of our work is to enable model-based predictions of defect performance and defect improvement, so that targeted process improvement and component learning can feed into the new deposition tool design. This paper will also highlight the defect reduction resulting from process improvements, and will outline the restrictions inherent in the current tool geometry and components that impede the production of HVM-quality EUV mask blanks.

  20. Avian-specific real-time PCR assay for authenticity control in farm animal feeds and pet foods.

    PubMed

    Pegels, Nicolette; González, Isabel; García, Teresa; Martín, Rosario

    2014-01-01

    A highly sensitive TaqMan real-time PCR assay targeting the mitochondrial 12S rRNA gene was developed for detection of an avian-specific DNA fragment (68 bp) in farm animal and pet feeds. The specificity of the assay was verified against a wide representation of animal and plant species. Applicability assessment of the avian real-time PCR was conducted through representative analysis of two types of compound feeds: industrial farm animal feeds (n=60) subjected to extreme temperatures, and commercial dog and cat feeds (n=210). Results obtained demonstrated the suitability of the real-time PCR assay to detect the presence of low percentages of highly processed avian material in the feed samples analysed. Although quantification results were well reproducible under the experimental conditions tested, an accurate estimation of the target content in feeds is impossible in practice. Nevertheless, the method may be useful as an alternative tool for traceability purposes within the framework of feed control. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Distribution of tunnelling times for quantum electron transport.

    PubMed

    Rudge, Samuel L; Kosov, Daniel S

    2016-03-28

    In electron transport, the tunnelling time is the time taken for an electron to tunnel out of a system after it has tunnelled in. We define the tunnelling time distribution for quantum processes in a dissipative environment and develop a practical approach for calculating it, where the environment is described by a general Markovian master equation. We illustrate the theory by using the rate equation to compute the tunnelling time distribution for electron transport through a molecular junction. The tunnelling time distribution is exponential, which indicates that Markovian quantum tunnelling is a Poissonian statistical process. The tunnelling time distribution is used not only to study the quantum statistics of tunnelling in the direction of the average electric current but also to analyse extreme quantum events in which an electron jumps against the applied voltage bias. The average tunnelling time shows distinctly different temperature dependence for p- and n-type molecular junctions and therefore provides a sensitive tool for probing the alignment of molecular orbitals relative to the electrode Fermi energy.
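
For a Markovian channel with a constant tunnelling rate Γ, the exponential waiting-time distribution w(t) = Γe^(−Γt) described above is exactly the Poissonian signature, with mean waiting time 1/Γ. A minimal numerical sketch (the rate value below is an illustrative assumption, not a number from the paper):

```python
import random

def sample_tunnelling_times(rate, n, seed=1):
    """Draw n waiting times from w(t) = rate * exp(-rate * t), the
    tunnelling-time distribution of a Markovian (Poissonian) process."""
    rng = random.Random(seed)
    return [rng.expovariate(rate) for _ in range(n)]

rate = 2.0  # hypothetical tunnelling rate (events per unit time)
times = sample_tunnelling_times(rate, 100_000)
mean_wait = sum(times) / len(times)
# For a Poissonian process the mean waiting time should approach 1/rate = 0.5.
print(mean_wait)
```

The sample mean converging to 1/Γ is the quickest numerical check that a simulated waiting-time sequence is consistent with Poissonian tunnelling statistics.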

  2. Challenges and requirements of mask data processing for multi-beam mask writer

    NASA Astrophysics Data System (ADS)

    Choi, Jin; Lee, Dong Hyun; Park, Sinjeung; Lee, SookHyun; Tamamushi, Shuichi; Shin, In Kyun; Jeon, Chan Uk

    2015-07-01

    To overcome the resolution and throughput limits of current mask writers for advanced lithography technologies, the e-beam writer platform has evolved through developments in writer hardware and software. In particular, aggressive optical proximity correction (OPC) for the unprecedented extension of optical lithography, together with the need for low-sensitivity resists for high resolution, pushes the limits of the variable-shaped-beam writers widely used for mass production. The multi-beam mask writer is an attractive candidate for photomask writing for sub-10 nm devices because of its high speed and its large degree of freedom, which enables high dose and per-pixel dose modulation. However, the higher dose and an almost unlimited appetite for dose modulation challenge mask data processing (MDP) in terms of extreme data volume and correction methods. Here, we discuss the requirements of mask data processing for multi-beam mask writers and present new challenges in data format, data flow, and correction methods for users and suppliers of MDP tools.

  3. Performance-Enhancing Methods for Au Film over Nanosphere Surface-Enhanced Raman Scattering Substrate and Melamine Detection Application

    PubMed Central

    Wang, Jun Feng; Wu, Xue Zhong; Xiao, Rui; Dong, Pei Tao; Wang, Chao Guang

    2014-01-01

    A new high-performance surface-enhanced Raman scattering (SERS) substrate with extremely high SERS activity was produced. This SERS substrate combines the advantages of an Au film over nanosphere (AuFON) substrate and Ag nanoparticles (AgNPs). A three-order-of-magnitude SERS enhancement was observed when Rhodamine 6G (R6G) was used as a probe molecule to compare the SERS effects of the new substrate and a commonly used AuFON substrate. These new SERS substrates can detect R6G down to 1 nM. The new substrate was also utilized to detect melamine, with a limit of detection (LOD) of 1 ppb. A linear relationship was also observed between the SERS intensity at the 682 cm−1 Raman peak and the logarithm of melamine concentrations ranging from 10 ppm to 1 ppb. This ultrasensitive SERS substrate is a promising tool for detecting trace chemical molecules because of its simple and effective fabrication procedure, high sensitivity, and highly reproducible SERS effect. PMID:24886913
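
The reported calibration (SERS intensity at 682 cm−1 vs. the logarithm of melamine concentration, 10 ppm down to 1 ppb) amounts to a least-squares line in log-concentration. A sketch of such a fit, with made-up intensity values that are only placeholders and not the paper's data:

```python
import math

# Hypothetical calibration points: (melamine concentration in ppb, SERS
# intensity at the 682 cm^-1 peak). Intensities are illustrative placeholders.
data = [(1, 120.0), (10, 260.0), (100, 410.0), (1000, 545.0), (10000, 690.0)]

x = [math.log10(c) for c, _ in data]   # fit against log10(concentration)
y = [i for _, i in data]
n = len(data)
x_bar, y_bar = sum(x) / n, sum(y) / n
slope = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / \
        sum((xi - x_bar) ** 2 for xi in x)
intercept = y_bar - slope * x_bar

def predict_intensity(conc_ppb):
    """Predicted SERS intensity from the fitted log-linear calibration."""
    return slope * math.log10(conc_ppb) + intercept

print(round(slope, 1), round(intercept, 1))  # → 142.5 120.0
```

Inverting the fitted line (solving for concentration given an intensity) is how an unknown sample would be quantified against such a calibration curve.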

  4. Optical analysis of nanoparticles via enhanced backscattering facilitated by 3-D photonic nanojets

    NASA Astrophysics Data System (ADS)

    Li, Xu; Chen, Zhigang; Taflove, Allen; Backman, Vadim

    2005-01-01

    We report the phenomenon of ultra-enhanced backscattering of visible light by nanoparticles, facilitated by the 3-D photonic nanojet, a sub-diffraction light beam appearing on the shadow side of a plane-wave-illuminated dielectric microsphere. Our rigorous numerical simulations show that the backscattering intensity of a nanoparticle can be enhanced by up to eight orders of magnitude when it is located in the nanojet. As a result, the enhanced backscattering from a nanoparticle with a diameter on the order of 10 nm is well above the background signal generated by the dielectric microsphere itself. We also report that nanojet-enhanced backscattering is extremely sensitive to the size of the nanoparticle, permitting in principle the resolution of sub-nanometer size differences using visible light. Finally, we show how the position of a nanoparticle could be determined with sub-diffractional accuracy by recording the angular distribution of the backscattered light. These properties of photonic nanojets promise to make this phenomenon a useful tool for optically detecting, differentiating, and sorting nanoparticles.

  5. In vivo insertion pool sequencing identifies virulence factors in a complex fungal–host interaction

    PubMed Central

    Uhse, Simon; Pflug, Florian G.; Stirnberg, Alexandra; Ehrlinger, Klaus; von Haeseler, Arndt

    2018-01-01

    Large-scale insertional mutagenesis screens can be powerful genome-wide tools if they are streamlined with efficient downstream analysis, which is a serious bottleneck in complex biological systems. A major impediment to the success of next-generation sequencing (NGS)-based screens for virulence factors is that the genetic material of pathogens is often underrepresented within the eukaryotic host, making detection extremely challenging. We therefore established insertion Pool-Sequencing (iPool-Seq) on maize infected with the biotrophic fungus Ustilago maydis. iPool-Seq features tagmentation, unique molecular barcodes, and affinity purification of pathogen insertion mutant DNA from in vivo-infected tissues. In a proof of concept using iPool-Seq, we identified 28 virulence factors, including 23 that were previously uncharacterized, from an initial pool of 195 candidate effector mutants. Because of its sensitivity and quantitative nature, iPool-Seq can be applied to any insertional mutagenesis library and is especially suitable for genetically complex setups like pooled infections of eukaryotic hosts. PMID:29684023

  6. PTSD and key somatic complaints and cultural syndromes among rural Cambodians: the results of a needs assessment survey.

    PubMed

    Hinton, Devon E; Hinton, Alexander L; Eng, Kok-Thay; Choung, Sophearith

    2012-09-01

    This article describes a culturally sensitive assessment tool for traumatized Cambodians, the Cambodian "Somatic Symptom and Syndrome Inventory" (SSI), and reports the outcome of a needs assessment conducted in rural Cambodia using the instrument. Villagers locally identified (N = 139) as still suffering the effects of the Pol Pot genocide were evaluated. All 139 had post-traumatic stress disorder (PTSD) as assessed by the PTSD Checklist (PCL), and they had elevated SSI scores. The severity of the SSI items varied by level of PTSD severity, and several items--for example, dizziness, dizziness on standing, khyâl (a windlike substance) attacks, and "thinking a lot"--were extremely elevated in those participants with higher levels of PTSD. The SSI was more highly correlated with self-perceived health (Short Form Health Survey-3) and past trauma events (Harvard Trauma Questionnaire) than was the PCL. The study shows the SSI items to be a core aspect of the Cambodian trauma ontology.

  7. Optical Magnetic Induction Tomography of the Heart

    NASA Astrophysics Data System (ADS)

    Marmugi, Luca; Renzoni, Ferruccio

    2016-04-01

    Atrial Fibrillation (AF) affects a significant fraction of the ageing population, causing a high level of morbidity and mortality. Despite its significance, the causes of AF are still not uniquely identified. This, combined with the lack of precise diagnostic and guiding tools, makes the clinical treatment of AF sub-optimal. We identify magnetic induction tomography as the most promising technique for investigating the causes of fibrillation and for clinical practice. We therefore propose a novel instrument based on optical atomic magnetometers, fulfilling the requirements for diagnostic mapping of the heart’s conductivity. The feasibility of the device is discussed here in view of the final application. Thanks to the potential of atomic magnetometers for miniaturisation and extreme sensitivity at room temperature, a new generation of compact and non-invasive diagnostic instrumentation, with both bedside and intra-operative operation capability, is envisioned. Possible scenarios in both clinical practice and biomedical research are then discussed. The flexibility of the system also makes it promising for applications in other fields, such as neurology and oncology.

  8. Action-FRET of a Gaseous Protein

    NASA Astrophysics Data System (ADS)

    Daly, Steven; Knight, Geoffrey; Halim, Mohamed Abdul; Kulesza, Alexander; Choi, Chang Min; Chirot, Fabien; MacAleese, Luke; Antoine, Rodolphe; Dugourd, Philippe

    2017-01-01

    Mass spectrometry is an extremely powerful technique for analysis of biological molecules, in particular proteins. One aspect that has been contentious is how much native solution-phase structure is preserved upon transposition to the gas phase by soft ionization methods such as electrospray ionization. To address this question—and thus further develop mass spectrometry as a tool for structural biology—structure-sensitive techniques must be developed to probe the gas-phase conformations of proteins. Here, we report Förster resonance energy transfer (FRET) measurements on a ubiquitin mutant using specific photofragmentation as a reporter of the FRET efficiency. The FRET data is interpreted in the context of circular dichroism, molecular dynamics simulation, and ion mobility data. Both the dependence of the FRET efficiency on the charge state—where a systematic decrease is observed—and on methanol concentration are considered. In the latter case, a decrease in FRET efficiency with methanol concentration is taken as evidence that the conformational ensemble of gaseous protein cations retains a memory of the solution phase conformational ensemble upon electrospray ionization.

  10. Quantum dynamics simulations in an ultraslow bath using hierarchy of stochastic Schrödinger equations

    NASA Astrophysics Data System (ADS)

    Ke, Yaling; Zhao, Yi

    2018-04-01

    The hierarchy of stochastic Schrödinger equations, previously developed for unpolarised initial bath states, is extended in this paper to open quantum dynamics under polarised initial bath conditions. The method proves to be a powerful tool for investigating quantum dynamics exposed to an ultraslow Ohmic bath, as in this case the hierarchical truncation level and the random sampling number can be kept relatively small. By systematically increasing the system-bath coupling strength, the symmetric Ohmic spin-boson dynamics is investigated at finite temperature, with a very small cut-off frequency. It is confirmed that the slow bath makes the system dynamics extremely sensitive to the initial bath conditions. The localisation tendency is stronger under polarised initial bath conditions. Moreover, the oscillatory coherent dynamics persists even when the system-bath coupling is very strong, in correspondence with what was found recently in the deep sub-Ohmic bath, where low-frequency modes also dominate.

  11. Simple Nutrition Screening Tool for Pediatric Inpatients.

    PubMed

    White, Melinda; Lawson, Karen; Ramsey, Rebecca; Dennis, Nicole; Hutchinson, Zoe; Soh, Xin Ying; Matsuyama, Misa; Doolan, Annabel; Todd, Alwyn; Elliott, Aoife; Bell, Kristie; Littlewood, Robyn

    2016-03-01

    Pediatric nutrition risk screening tools are not routinely implemented throughout many hospitals, despite prevalence studies demonstrating that malnutrition is common in hospitalized children. Existing tools lack the simplicity of those used to assess nutrition risk in the adult population. This study reports the accuracy of a new, quick, and simple pediatric nutrition screening tool (PNST) designed to be used for pediatric inpatients. The pediatric Subjective Global Nutrition Assessment (SGNA) and anthropometric measures were used to develop and assess the validity of the 4 simple nutrition screening questions comprising the PNST. Participants were pediatric inpatients in 2 tertiary pediatric hospitals and 1 regional hospital. Two affirmative answers to the PNST questions were found to maximize the sensitivity and specificity relative to the pediatric SGNA and body mass index (BMI) z scores for malnutrition in 295 patients. The PNST identified 37.6% of patients as being at nutrition risk, whereas the pediatric SGNA identified 34.2%. The sensitivity and specificity of the PNST compared with the pediatric SGNA were 77.8% and 82.1%, respectively. The sensitivity of the PNST at detecting patients with a BMI z score of less than -2 was 89.3%, and the specificity was 66.2%. Both the PNST and pediatric SGNA were relatively poor at detecting patients who were stunted or overweight, with sensitivity and specificity less than 69%. The PNST provides a sensitive, valid, and simpler alternative to existing pediatric nutrition screening tools such as the Screening Tool for the Assessment of Malnutrition in Pediatrics (STAMP), Screening Tool Risk on Nutritional status and Growth (STRONGkids), and Paediatric Yorkhill Malnutrition Score (PYMS) to ensure the early detection of hospitalized children at nutrition risk.
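
The sensitivity and specificity figures above follow from a 2x2 screening table: sensitivity = TP/(TP + FN), specificity = TN/(TN + FP). A sketch with hypothetical counts (not the study's raw data):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP / (TP + FN); specificity = TN / (TN + FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for a screening tool vs. a reference assessment:
sens, spec = sensitivity_specificity(tp=78, fn=22, tn=82, fp=18)
print(sens, spec)  # → 0.78 0.82
```

The same arithmetic underlies every sensitivity/specificity pair quoted in these records; only the reference standard (here, the pediatric SGNA or BMI z score) changes.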

  12. Observation of Anderson localization in disordered nanophotonic structures

    NASA Astrophysics Data System (ADS)

    Sheinfux, Hanan Herzig; Lumer, Yaakov; Ankonina, Guy; Genack, Azriel Z.; Bartal, Guy; Segev, Mordechai

    2017-06-01

    Anderson localization is an interference effect crucial to the understanding of waves in disordered media. However, localization is expected to become negligible when the features of the disordered structure are much smaller than the wavelength. Here we experimentally demonstrate the localization of light in a disordered dielectric multilayer with an average layer thickness of 15 nanometers, deep into the subwavelength regime. We observe strong disorder-induced reflections that show that the interplay of localization and evanescence can lead to a substantial decrease in transmission, or the opposite feature of enhanced transmission. This deep-subwavelength Anderson localization exhibits extreme sensitivity: Varying the thickness of a single layer by 2 nanometers changes the reflection appreciably. This sensitivity, approaching the atomic scale, holds the promise of extreme subwavelength sensing.

  13. A sub-sampled approach to extremely low-dose STEM

    DOE PAGES

    Stevens, A.; Luzi, L.; Yang, H.; ...

    2018-01-22

    The inpainting of deliberately and randomly sub-sampled images offers a potential means to image specimens at high resolution and under extremely low-dose conditions (≤1 e⁻/Å²) using a scanning transmission electron microscope. We show that deliberate sub-sampling acquires images at least an order of magnitude faster than conventional low-dose methods for an equivalent electron dose. More importantly, when adaptive sub-sampling is implemented to acquire the images, there is a significant increase in the resolution and sensitivity which accompanies the increase in imaging speed. Lastly, we demonstrate the potential of this method for beam-sensitive materials and in-situ observations by experimentally imaging the node distribution in a metal-organic framework.

  14. Sensitive detection of strong acidic condition by a novel rhodamine-based fluorescent pH chemosensor.

    PubMed

    Tan, Jia-Lian; Yang, Ting-Ting; Liu, Yu; Zhang, Xue; Cheng, Shu-Jin; Zuo, Hua; He, Huawei

    2016-05-01

    A novel rhodamine-based fluorescent pH probe responding to extremely low pH values has been synthesized and characterized. This probe showed an excellent photophysical response to pH: the colorless spirocyclic structure present under basic conditions opens to a colored and highly fluorescent form under extreme acidity. The quantitative relationship between fluorescence intensity and pH value (1.75-2.62) was consistent with the equilibrium equation pH = pKa + log[(Imax - I)/(I - Imin)]. This sensitive pH probe also showed good reversibility and no interference from metal ions, and was successfully applied to image Escherichia coli under strong acidity.
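
The stated calibration equation can be applied directly to read pH from a measured fluorescence intensity: pH = pKa + log10((Imax − I)/(I − Imin)). A sketch with hypothetical probe parameters (the pKa and intensity bounds below are illustrative, not the paper's fitted values):

```python
import math

def ph_from_intensity(intensity, pka, i_min, i_max):
    """Evaluate the calibration pH = pKa + log10((Imax - I) / (I - Imin)).
    Valid only for i_min < intensity < i_max."""
    return pka + math.log10((i_max - intensity) / (intensity - i_min))

# Hypothetical parameters; at the half-maximal intensity the log term
# vanishes, so the returned pH equals the pKa.
print(ph_from_intensity(intensity=550.0, pka=2.2, i_min=100.0, i_max=1000.0))  # → 2.2
```

The half-maximal point is a useful sanity check for any sigmoidal optical pH probe: there the two intensity differences cancel and the measured pH equals the probe's pKa.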

  16. Uncertainty Modeling for Robustness Analysis of Control Upset Prevention and Recovery Systems

    NASA Technical Reports Server (NTRS)

    Belcastro, Christine M.; Khong, Thuan H.; Shin, Jong-Yeob; Kwatny, Harry; Chang, Bor-Chin; Balas, Gary J.

    2005-01-01

    Formal robustness analysis of aircraft control upset prevention and recovery systems could play an important role in their validation and ultimate certification. Such systems (developed for failure detection, identification, and reconfiguration, as well as upset recovery) need to be evaluated over broad regions of the flight envelope and under extreme flight conditions, and should include various sources of uncertainty. However, formulation of linear fractional transformation (LFT) models for representing system uncertainty can be very difficult for complex parameter-dependent systems. This paper describes a preliminary LFT modeling software tool which uses a matrix-based computational approach that can be directly applied to parametric uncertainty problems involving multivariate matrix polynomial dependencies. Several examples are presented (including an F-16 at an extreme flight condition, a missile model, and a generic example with numerous cross-product terms), and comparisons are given with other LFT modeling tools that are currently available. The LFT modeling method and preliminary software tool presented in this paper are shown to compare favorably with these methods.

  17. Physical Exam Risk Factors for Lower Extremity Injury in High School Athletes: A Systematic Review

    PubMed Central

    Onate, James A.; Everhart, Joshua S.; Clifton, Daniel R.; Best, Thomas M.; Borchers, James R.; Chaudhari, Ajit M.W.

    2016-01-01

    Objective A stated goal of the preparticipation physical evaluation (PPE) is to reduce musculoskeletal injury, yet the musculoskeletal portion of the PPE is reportedly of questionable use in assessing lower extremity injury risk in high school-aged athletes. The objectives of this study are: (1) to identify clinical assessment tools demonstrated to effectively determine lower extremity injury risk in a prospective setting, and (2) to critically assess the methodological quality of prospective lower extremity risk assessment studies that use these tools. Data Sources A systematic search was performed in PubMed, CINAHL, UpToDate, Google Scholar, Cochrane Reviews, and SPORTDiscus. Inclusion criteria were prospective injury risk assessment studies involving athletes primarily ages 13 to 19 that used screening methods not requiring highly specialized equipment. Methodological quality was evaluated with a modified Physiotherapy Evidence Database (PEDro) scale. Main Results Nine studies were included. The mean modified PEDro score was 6.0/10 (SD, 1.5). Multidirectional balance (odds ratio [OR], 3.0; CI, 1.5–6.1; P < 0.05) and physical maturation status (P < 0.05) were predictive of overall injury risk; knee hyperextension was predictive of anterior cruciate ligament injury (OR, 5.0; CI, 1.2–18.4; P < 0.05), the hip external:internal rotator strength ratio of patellofemoral pain syndrome (P = 0.02), and the foot posture index of ankle sprain (r = −0.339, P = 0.008). Conclusions Minimal prospective evidence supports or refutes the use of the functional musculoskeletal exam portion of the current PPE to assess lower extremity injury risk in high school athletes. Limited evidence does support inclusion of multidirectional balance assessment and physical maturation status in a musculoskeletal exam, as both are generalizable risk factors for lower extremity injury. PMID:26978166
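
The odds ratios quoted above (e.g., OR 3.0; CI 1.5–6.1) come from 2x2 exposure-by-outcome tables; a common Wald 95% CI is exp(ln OR ± 1.96·SE) with SE = sqrt(1/a + 1/b + 1/c + 1/d). A sketch with hypothetical counts (not data from the reviewed studies):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI for the 2x2 table
    [[a, b], [c, d]] = [[exposed injured, exposed uninjured],
                        [unexposed injured, unexposed uninjured]]."""
    odds_ratio = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(odds_ratio) - z * se)
    upper = math.exp(math.log(odds_ratio) + z * se)
    return odds_ratio, lower, upper

# Hypothetical counts chosen only to illustrate the calculation:
or_, lo, hi = odds_ratio_ci(a=20, b=80, c=10, d=120)
print(round(or_, 1), round(lo, 2), round(hi, 2))
```

A CI that excludes 1.0, as in this sketch, is what licenses calling a screening finding "predictive" at the stated significance level.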

  18. Modelling the occurrence of heat waves in maximum and minimum temperatures over Spain and projections for the period 2031-60

    NASA Astrophysics Data System (ADS)

    Abaurrea, J.; Asín, J.; Cebrián, A. C.

    2018-02-01

    The occurrence of extreme heat events in maximum and minimum daily temperatures is modelled using a non-homogeneous common Poisson shock process. It is applied to five Spanish locations, representative of the most common climates over the Iberian Peninsula. The model is based on an excess-over-threshold approach and distinguishes three types of extreme events: only in maximum temperature, only in minimum temperature, and in both of them (simultaneous events). It takes into account the dependence between the occurrence of extreme events in both temperatures, and its parameters are expressed as functions of time and temperature-related covariates. The fitted models allow us to characterize the occurrence of extreme heat events and to compare their evolution in the different climates during the observed period. This model is also a useful tool for obtaining local projections of the occurrence rate of extreme heat events under climate change conditions, using the future downscaled temperature trajectories generated by Earth System Models. The projections for 2031-60 under scenarios RCP4.5, RCP6.0 and RCP8.5 are obtained and analysed using the trajectories from four Earth System Models which have successfully passed a preliminary control analysis. Different graphical tools and summary measures of the projected daily intensities are used to quantify the climate change signal on a local scale. A high increase in the occurrence of extreme heat events, mainly in July and August, is projected in all the locations, all types of event and all three scenarios, although in 2051-60 the increase is higher under RCP8.5. However, relevant differences are found between the evolution in the different climates and the types of event, with an especially high increase in the simultaneous ones.
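
A non-homogeneous Poisson process with a time-dependent occurrence rate λ(t), as used above, can be simulated by Lewis-Shedler thinning: generate candidate events at a constant majorant rate λ_max and accept a candidate at time t with probability λ(t)/λ_max. A sketch with a made-up seasonal intensity (not the paper's fitted model):

```python
import math
import random

def simulate_nhpp(rate_fn, rate_max, t_end, seed=0):
    """Simulate event times of a non-homogeneous Poisson process on [0, t_end]
    by thinning: candidates arrive at constant rate rate_max and are kept
    with probability rate_fn(t) / rate_max (requires rate_fn(t) <= rate_max)."""
    rng = random.Random(seed)
    t, events = 0.0, []
    while True:
        t += rng.expovariate(rate_max)
        if t > t_end:
            return events
        if rng.random() < rate_fn(t) / rate_max:
            events.append(t)

# Hypothetical occurrence rate of extreme-heat events (events/day) over a
# 92-day summer, peaking mid-season:
rate = lambda t: 0.05 + 0.25 * math.sin(math.pi * t / 92.0) ** 2
events = simulate_nhpp(rate, rate_max=0.30, t_end=92.0)
print(len(events))  # number of simulated extreme-heat events this season
```

Replacing the toy λ(t) with a rate driven by downscaled temperature covariates is, in spirit, how such a fitted model generates local projections under the RCP scenarios.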

  19. ECMWF Extreme Forecast Index for water vapor transport: A forecast tool for atmospheric rivers and extreme precipitation

    NASA Astrophysics Data System (ADS)

    Lavers, David A.; Pappenberger, Florian; Richardson, David S.; Zsoter, Ervin

    2016-11-01

    In winter, heavy precipitation and floods along the west coasts of midlatitude continents are largely caused by intense water vapor transport (integrated vapor transport (IVT)) within the atmospheric river of extratropical cyclones. This study builds on previous findings that showed that forecasts of IVT have higher predictability than precipitation, by applying and evaluating the European Centre for Medium-Range Weather Forecasts Extreme Forecast Index (EFI) for IVT in ensemble forecasts during three winters across Europe. We show that the IVT EFI is more able (than the precipitation EFI) to capture extreme precipitation in forecast week 2 during forecasts initialized in a positive North Atlantic Oscillation (NAO) phase; conversely, the precipitation EFI is better during the negative NAO phase and at shorter leads. An IVT EFI example for storm Desmond in December 2015 highlights its potential to identify upcoming hydrometeorological extremes, which may prove useful to the user and forecasting communities.

  20. The Sensitive Infrared Signal Detection by Sum Frequency Generation

    NASA Technical Reports Server (NTRS)

    Wong, Teh-Hwa; Yu, Jirong; Bai, Yingxin

    2013-01-01

    An up-conversion device that converts 2.05-micron light to a 700 nm signal by sum frequency generation using a periodically poled lithium niobate crystal is demonstrated. The achieved 92% up-conversion efficiency paves the path to detecting extremely weak 2.05-micron signals with a well-established silicon avalanche photodiode detector for sensitive lidar applications.
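
Sum frequency generation conserves photon energy, so 1/λ_SFG = 1/λ_signal + 1/λ_pump; converting 2.05 µm light to roughly 700 nm therefore implies a pump near 1.06 µm. The pump wavelength below is inferred from this arithmetic, not stated in the record:

```python
def sfg_wavelength_nm(signal_nm, pump_nm):
    """Sum-frequency output wavelength from photon-energy conservation:
    1/lambda_SFG = 1/lambda_signal + 1/lambda_pump (all in nm)."""
    return 1.0 / (1.0 / signal_nm + 1.0 / pump_nm)

# A 2050 nm signal mixed with an assumed 1064 nm pump up-converts to ~700 nm:
print(round(sfg_wavelength_nm(2050.0, 1064.0)))  # → 700
```

The appeal of the scheme is visible in the numbers: the 2.05 µm photon, outside silicon's response range, is shifted to 700 nm where silicon avalanche photodiodes are highly efficient.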

  1. Rational synthesis of an exceptionally stable Zn(II) metal-organic framework for the highly selective and sensitive detection of picric acid.

    PubMed

    Hu, Yingli; Ding, Meili; Liu, Xiao-Qin; Sun, Lin-Bing; Jiang, Hai-Long

    2016-04-28

    Based on an organic ligand involving both carboxylate and tetrazole groups, a chemically stable Zn(II) metal-organic framework has been rationally synthesized and behaves as a fluorescence chemosensor for the highly selective and sensitive detection of picric acid, an extremely hazardous and strong explosive.

  2. A pilot study to evaluate the role of the Spinal Cord Impairment Pressure Ulcer Monitoring Tool (SCI-PUMT) in clinical decisions for pressure ulcer treatment.

    PubMed

    Thomason, Susan S; Graves, Barbara Ann; Madaris, Linda

    2014-12-01

    The Spinal Cord Impairment Pressure Ulcer Monitoring Tool (SCI-PUMT) was designed to assess pressure ulcer (PrU) healing in the spinal cord impaired (SCI) population. The tool contains 7 variables: wound surface area, depth, edges, tunneling, undermining, exudate type, and necrotic tissue amount. A 2-phase, quantitative pilot study based on the Theory of Reasoned Action and Theory of Planned Behavior was conducted at a large SCI/Disorders Center in the Department of Veterans Affairs (VA). In the first phase of the study, a convenience sample of 5 physicians, 3 advanced practice registered nurses, and 3 certified wound care nurses (CWCN) was surveyed using a 2-part questionnaire to assess use of the SCI-PUMT instrument, its anticipated improvement in PrU assessment, and intent to use the SCI-PUMT in clinical practice. Attitudes, subjective norms, perceived behavioral controls, and barriers related to the intent to use the SCI-PUMT were evaluated using a 5-point Likert scale (range: 1 = extremely likely, 5 = extremely unlikely). In the second phase of the study, the electronic health records (EHR) of 24 veterans (with 30 PrUs) who had at least 2 completed SCI-PUMT scores during a 4-week period were used to evaluate whether an association existed between the magnitude of change in total SCI-PUMT scores and ordered changes in PrU treatment. The overall mean score for intent to use the SCI-PUMT was 1.80 (SD 0.75). The least favorable scores were for convenience and motivation to use the SCI-PUMT. Analysis of EHR data showed no significant association between magnitudes of change in the SCI-PUMT score and changes in PrU treatment recommendations made by the CWCNs. The significance was not affected regardless of an increase or no change in the score (χ2 with 1 degree of freedom = 1.158, P = 0.282) or a decrease in the score (χ2 with 1 degree of freedom = 0.5, P = 0.478).
In this pilot study, the expressed intent to use the SCI-PUMT in making clinical decisions was generally positive but reservations remain. Additional research is being conducted to determine the barriers and facilitators to SCI-PUMT implementation. The SCI-PUMT was the first tool found to be valid, reliable, and sensitive to assess PrU healing in persons with SCI, and studies to examine the prospective validity of using this instrument on ulcer treatment decisions and outcomes are warranted.

  3. TAxonomy of Self-reported Sedentary behaviour Tools (TASST) framework for development, comparison and evaluation of self-report tools: content analysis and systematic review.

    PubMed

    Dall, P M; Coulter, E H; Fitzsimons, C F; Skelton, D A; Chastin, Sfm

    2017-04-08

    Sedentary behaviour (SB) has distinct deleterious health outcomes, yet there is no consensus on best practice for its measurement. This study aimed to identify the optimal self-report tool for population surveillance of SB, using a systematic framework. A framework, the TAxonomy of Self-reported Sedentary behaviour Tools (TASST), consisting of four domains (type of assessment, recall period, temporal unit and assessment period), was developed based on a systematic inventory of existing tools. The inventory was achieved through a systematic review of studies reporting SB and tracing back to the original description of each tool. A systematic review of the accuracy and sensitivity to change of these tools was then mapped against the TASST domains. Systematic searches were conducted via EBSCO, reference lists and expert opinion. The inventory included tools measuring SB in adults that could be self-completed at one sitting, and excluded tools measuring SB in specific populations or contexts. The systematic review included studies reporting on accuracy against an objective measure of SB and/or sensitivity to change of a tool in the inventory. The systematic review initially identified 32 distinct tools (141 questions), which were used to develop the TASST framework. Twenty-two studies evaluated accuracy and/or sensitivity to change, representing only eight taxa. Assessing SB as a sum of behaviours and using a previous-day recall were the most promising features of existing tools. Accuracy was poor for all existing tools, with both underestimation and overestimation of SB, and there was a lack of evidence about sensitivity to change. Despite the limited evidence, mapping existing SB tools onto the TASST framework has enabled informed recommendations about the most promising features for a surveillance tool and has identified aspects on which future research and development of SB surveillance tools should focus.
International prospective register of systematic reviews (PROSPERO)/CRD42014009851. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  4. WRF model sensitivity to land surface model and cumulus parameterization under short-term climate extremes over the southern Great Plains of the United States

    Treesearch

    Lisi Pei; Nathan Moore; Shiyuan Zhong; Lifeng Luo; David W. Hyndman; Warren E. Heilman; Zhiqiu Gao

    2014-01-01

    Extreme weather and climate events, especially short-term excessive drought and wet periods over agricultural areas, have received increased attention. The Southern Great Plains (SGP) is one of the largest agricultural regions in North America and overlies the Ogallala-High Plains Aquifer system, which is of great economic value in large part due to production...

  5. Comparing regional precipitation and temperature extremes in climate model and reanalysis products

    DOE PAGES

    Angélil, Oliver; Perkins-Kirkpatrick, Sarah; Alexander, Lisa V.; ...

    2016-07-12

    A growing field of research aims to characterise the contribution of anthropogenic emissions to the likelihood of extreme weather and climate events. These analyses can be sensitive to the shapes of the tails of simulated distributions. If tails are found to be unrealistically short or long, the anthropogenic signal emerges more or less clearly, respectively, from the noise of possible weather. Here we compare the chance of daily land-surface precipitation and near-surface temperature extremes generated by three Atmospheric Global Climate Models typically used for event attribution, with distributions from six reanalysis products. The likelihoods of extremes are compared for area-averages over grid-cell and regional-sized spatial domains. Results suggest a bias favouring overly strong attribution estimates for hot and cold events over many regions of Africa and Australia, and a bias favouring overly weak attribution estimates over regions of North America and Asia. For rainfall, results are more sensitive to geographic location. Although the three models show similar results over many regions, they do disagree over others. Equally, results highlight the discrepancy amongst reanalysis products. This emphasises the importance of using multiple reanalysis and/or observation products, as well as multiple models, in event attribution studies.
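    The likelihood comparisons described here connect to the standard event-attribution metrics: the probability ratio P1/P0 and the fraction of attributable risk (FAR = 1 - P0/P1). A minimal sketch of those estimators from two ensembles, using synthetic Gaussian data rather than any of the models or reanalyses named above:

```python
import numpy as np

def exceedance_prob(samples, threshold):
    """Empirical probability that a sample exceeds the event threshold."""
    return float(np.mean(np.asarray(samples) > threshold))

def attribution_metrics(factual, counterfactual, threshold):
    """Probability ratio PR = P1/P0 and fraction of attributable risk
    FAR = 1 - P0/P1, estimated from a factual (all-forcings) ensemble and
    a counterfactual (natural-forcings-only) ensemble."""
    p1 = exceedance_prob(factual, threshold)
    p0 = exceedance_prob(counterfactual, threshold)
    return p1 / p0, 1.0 - p0 / p1

# Toy example: a 1-sigma warm shift in the factual climate
rng = np.random.default_rng(0)
pr, far = attribution_metrics(rng.normal(1.0, 1.0, 100_000),
                              rng.normal(0.0, 1.0, 100_000),
                              threshold=2.0)
```

    The point made in the abstract follows directly: if a model's simulated tail is too short or too long relative to reanalyses, the estimated P1 (and hence PR and FAR) is biased.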

  6. Extreme Weather Events and Climate Change Attribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomas, Katherine

    A report from the National Academies of Sciences, Engineering, and Medicine concludes it is now possible to estimate the influence of climate change on some types of extreme events. The science of extreme event attribution has advanced rapidly in recent years, giving new insight into the ways that human-caused climate change can influence the magnitude or frequency of some extreme weather events. This report examines the current state of the science of extreme weather attribution, and identifies ways to move the science forward to improve attribution capabilities. Confidence is strongest in attributing types of extreme events that are influenced by climate change through a well-understood physical mechanism, such as the more frequent heat waves that are closely connected to human-caused global temperature increases, the report finds. Confidence is lower for other types of events, such as hurricanes, whose relationship to climate change is more complex and less understood at present. For any extreme event, the results of attribution studies hinge on how questions about the event's causes are posed, and on the data, modeling approaches, and statistical tools chosen for the analysis.

  7. Exercise Black Skies 2008: Enhancing Live Training Through Virtual Preparation -- Part Two: An Evaluation of Tools and Techniques

    DTIC Science & Technology

    2009-06-01

    visualisation tool. These tools are currently in use at the Surveillance and Control Training Unit (SACTU) in Williamtown, New South Wales, and the School...itself by facilitating the brevity and sharpness of learning points. The playback of video and audio was considered an extremely useful method of...The task assessor’s comments were supported by wall projections and audio replays of relevant mission segments that were controlled by an AAR

  8. Sliding into happiness: A new tool for measuring affective responses to words.

    PubMed

    Warriner, Amy Beth; Shore, David I; Schmidt, Louis A; Imbault, Constance L; Kuperman, Victor

    2017-03-01

    Reliable measurement of affective responses is critical for research into human emotion. Affective evaluation of words is most commonly gauged on multiple dimensions, including valence (positivity) and arousal, using a rating scale. Despite its popularity, this scale is open to criticism: it generates ordinal data that are often misinterpreted as interval, it does not provide the fine resolution that recent theoretical accounts of emotion deem essential, and its extremes may not be properly calibrated. In 5 experiments, the authors introduce a new slider tool for affective evaluation of words on a continuous, well-calibrated and high-resolution scale. In Experiment 1, participants were shown a word and asked to move a manikin representing themselves closer to or farther away from the word. The manikin's distance from the word strongly correlated with the word's valence. In Experiment 2, individual differences in shyness and sociability elicited reliable differences in distance from the words. Experiment 3 validated the results of Experiments 1 and 2 using a demographically more diverse population of responders. Experiment 4 (along with Experiment 2) suggested that task demand is not a potential cause for scale recalibration. Finally, in Experiment 5, men and women placed a manikin closer to or farther from words that showed sex differences in valence, highlighting the sensitivity of this measure to group differences. These findings shed new light on interactions among affect, language, and individual differences, and demonstrate the utility of a new tool for measuring word affect. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  9. Evaluation of Intersection Traffic Control Measures through Simulation

    NASA Astrophysics Data System (ADS)

    Asaithambi, Gowri; Sivanandan, R.

    2015-12-01

    Traffic flow is stochastic in nature due to randomness in variables such as vehicle arrivals and speeds. Because of this, and because of complex vehicular interactions and manoeuvres, it is extremely difficult to model traffic flow through analytical methods. To study this type of complex traffic system and its vehicle interactions, simulation is considered an effective tool. Application of homogeneous traffic models to heterogeneous traffic may not capture the complex manoeuvres and interactions in such flows. Hence, a microscopic simulation model for heterogeneous traffic is developed using object-oriented concepts. This simulation model acts as a tool for evaluating various control measures at signalized intersections. The present study focuses on the evaluation of Right Turn Lane (RTL) and Channelised Left Turn Lane (CLTL) treatments. A sensitivity analysis was performed to evaluate RTL and CLTL by varying the approach volumes, turn proportions and turn lane lengths. RTL is found to be advantageous only up to certain approach volumes and right-turn proportions, beyond which it is counter-productive. CLTL is found to be advantageous at lower approach volumes for all turn proportions, signifying its benefits; it is counter-productive for higher approach volumes and lower turn proportions. This study pinpoints the break-even points for various scenarios. The developed simulation model can be used as an appropriate intersection lane control tool for enhancing the efficiency of flow at intersections. This model can also be employed for scenario analysis and can be valuable to field traffic engineers in implementing vehicle-type-based and lane-based traffic control measures.
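    As a toy illustration of why simulation is useful here, the sketch below models a single fixed-cycle signal with Poisson arrivals and deterministic discharge during green. All parameter names and values are invented for illustration and are not taken from the authors' model:

```python
import numpy as np

def max_queue_at_signal(arrival_rate, cycle=60, green=30, sat_flow=0.5,
                        horizon=3600, seed=0):
    """Toy fixed-cycle signal: Poisson arrivals (veh/s); vehicles discharge at
    `sat_flow` veh/s while the signal is green. Returns the peak queue length
    (vehicles) over the simulated horizon (s)."""
    rng = np.random.default_rng(seed)
    queue, peak = 0.0, 0.0
    for t in range(horizon):
        queue += rng.poisson(arrival_rate)      # arrivals in this one-second step
        if t % cycle < green:                   # first `green` seconds are green
            queue = max(queue - sat_flow, 0.0)  # deterministic discharge
        peak = max(peak, queue)
    return peak
```

    With these defaults the approach capacity is sat_flow * green / cycle = 0.25 veh/s, so demand below that yields a bounded cyclic queue while demand above it makes the queue grow without bound, the kind of break-even behaviour the study maps out for RTL and CLTL.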

  10. Flood Change Assessment and Attribution in Austrian alpine Basins

    NASA Astrophysics Data System (ADS)

    Claps, Pierluigi; Allamano, Paola; Como, Anastasia; Viglione, Alberto

    2016-04-01

    The present paper aims to investigate the sensitivity of flood peaks to global warming in Austrian alpine basins. A group of 97 Austrian watersheds, with areas ranging from 14 to 6000 km2 and average elevations ranging from 1000 to 2900 m a.s.l., has been considered. Annual maximum floods are available for the basins from 1890 to 2007, with two densities of observation: in a first period, until 1950, an average of 42 contemporary flood-peak records are available, while from 1951 to 2007 the density of observation increases to an average of 85. This information is very important with reference to the statistical tool used for the empirical assessment of change over time, namely linear quantile regression. Application of this tool to the data set unveils trends in extreme events, confirmed by statistical testing, for the 0.75 and 0.95 empirical quantiles. All applications are made with specific (discharge/area) values. Similarly to a previous approach, multiple quantile regressions have also been applied, confirming the presence of trends even when accounting for the possible interference of specific discharge and morphoclimatic parameters (i.e., mean elevation and catchment area). Application of the geomorphoclimatic model by Allamano et al. (2009) makes it possible to assess to what extent the empirically observed increases in air temperature and annual rainfall can justify the attribution of the change detected by the empirical statistical tools. A comparison with data from Swiss alpine basins treated in a previous paper is finally undertaken.
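    The core idea, a trend in the upper quantiles of annual flood peaks, can be sketched without the full quantile-regression machinery by fitting a line through block-wise empirical quantiles. This is a simplification of the linear quantile regression the paper uses, shown here on synthetic data:

```python
import numpy as np

def block_quantile_trend(years, peaks, q=0.75, block=20):
    """Slope (per year) of the q-th empirical quantile of annual flood peaks,
    estimated from non-overlapping `block`-year windows."""
    years = np.asarray(years, float)
    peaks = np.asarray(peaks, float)
    mids, quants = [], []
    for start in range(0, len(years) - block + 1, block):
        window = slice(start, start + block)
        mids.append(years[window].mean())
        quants.append(np.quantile(peaks[window], q))
    slope, _ = np.polyfit(mids, quants, 1)  # least-squares line through block quantiles
    return slope

# Synthetic 1890-2007 record whose upper tail intensifies over time
rng = np.random.default_rng(0)
yrs = np.arange(1890, 2008)
flows = rng.gamma(2.0, 50.0, size=yrs.size) * (1 + 0.02 * (yrs - 1890))
trend = block_quantile_trend(yrs, flows, q=0.75)
```

    A proper quantile regression instead minimizes the pinball (check) loss directly on the annual series, which uses every observation and supports significance testing; the block version above only conveys the idea.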

  11. A Review of Recent Advances in Research on Extreme Heat Events

    NASA Technical Reports Server (NTRS)

    Horton, Radley M.; Mankin, Justin S.; Lesk, Corey; Coffel, Ethan; Raymond, Colin

    2016-01-01

    Reviewing recent literature, we report that changes in extreme heat event characteristics such as magnitude, frequency, and duration are highly sensitive to changes in mean global-scale warming. Numerous studies have detected significant changes in the observed occurrence of extreme heat events, irrespective of how such events are defined. Further, a number of these studies have attributed present-day changes in the risk of individual heat events and the documented global-scale increase in such events to anthropogenic-driven warming. Advances in process-based studies of heat events have focused on the proximate land-atmosphere interactions through soil moisture anomalies, and changes in occurrence of the underlying atmospheric circulation associated with heat events in the mid-latitudes. While evidence for a number of hypotheses remains limited, climate change nevertheless points to tail risks of possible changes in heat extremes that could exceed estimates generated from model outputs of mean temperature. We also explore risks associated with compound extreme events and nonlinear impacts associated with extreme heat.

  12. Indications of 24-h esophageal pH monitoring, capsule pH monitoring, combined pH monitoring with multichannel impedance, esophageal manometry, radiology and scintigraphy in gastroesophageal reflux disease?

    PubMed

    Vardar, Rukiye; Keskin, Muharrem

    2017-12-01

    Ambulatory esophageal pH monitoring is an essential method for making an objective diagnosis in patients exhibiting signs of non-erosive reflux disease (NERD). Intra-esophageal pH monitoring is important in patients who are non-responsive to medications and in those with extraesophageal symptoms, particularly in NERD, before surgical interventions. With wireless capsule pH monitoring, measurements can be made under more physiological conditions, and longer recordings can be performed because the investigation is better tolerated by patients. Ambulatory esophageal pH monitoring can be within normal limits in 17%-31.4% of patients with endoscopic esophagitis; therefore, normal pH monitoring cannot exclude the diagnosis of gastroesophageal reflux disease (GERD). Multi-channel intraluminal impedance pH (MII-pH) technology has been developed and is currently the most sensitive tool for evaluating patients with both typical and atypical reflux symptoms. Compared with MII-pH monitoring, the sensitivity of a pH catheter test is 58% for the detection of acid reflux and 28% for the detection of weak acid reflux. By adding impedance to the pH catheter in patients with reflux symptoms, particularly in those receiving PPIs, it has been demonstrated that higher rates of diagnosis and better symptom analyses can be obtained than with a pH catheter alone. Esophageal manometry is used in the evaluation of patients with functional dysphagia and unexplained noncardiac chest pain and prior to antireflux surgery. It is suitable for the detection of esophageal motor patterns and extreme motor abnormalities (e.g., achalasia and extreme hypomotility). Esophageal manometry and ambulatory pH monitoring are often used in assessments prior to laparoscopic antireflux surgery and in patients with reflux symptoms refractory to medical treatment.
Although esophageal motility is predominantly normal in patients with non-acid reflux, ineffective esophageal motility is often observed in patients with acid reflux. In the literature, there are contradictory findings and an insufficient number of studies regarding radiological methods for the diagnosis of GERD, with inconsistent sensitivity and specificity values among the barium studies. Studies involving scintigraphic examinations in the diagnosis of GERD are likewise scarce, and a majority of existing studies have been conducted in pediatric groups; the results of the few available studies do not contribute sufficiently toward implementation in clinical practice.

  13. Evaluation of extreme temperature events in northern Spain based on process control charts

    NASA Astrophysics Data System (ADS)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. A series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 was evaluated by applying statistical process control charts to statistically test whether evidence existed for an increase or a decrease of extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that considers the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
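    The attribute control chart described here is, in its simplest form, a p-chart on the annual fraction of extreme days; the paper's binomial Markov extension additionally models autocorrelation between extreme days, which this sketch omits. Synthetic counts stand in for the Spanish temperature data:

```python
import numpy as np

def p_chart(extreme_day_counts, n_days=365):
    """Attribute (p) control chart for annual fractions of extreme days.
    Returns the center line, (LCL, UCL) at 3 sigma, and indices of
    out-of-control (flagged) years."""
    p = np.asarray(extreme_day_counts, float) / n_days
    pbar = p.mean()                                   # center line
    sigma = np.sqrt(pbar * (1 - pbar) / n_days)       # binomial std. dev. of a fraction
    lcl = max(pbar - 3 * sigma, 0.0)
    ucl = pbar + 3 * sigma
    flagged = np.flatnonzero((p < lcl) | (p > ucl))
    return pbar, (lcl, ucl), flagged

# 30 ordinary years at a 5% daily extreme rate, plus one anomalous year
rng = np.random.default_rng(0)
counts = rng.binomial(365, 0.05, size=31)
counts[-1] = 60                      # an unusually extreme final year
_, _, flagged = p_chart(counts)
```

    A run of flagged years, or a drift of annual fractions toward the upper limit, is the chart's signal that the process (here, the climate of an area) has changed.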

  14. Predictive markers for the response to 5-fluorouracil therapy in cancer cells: Constant-field gel electrophoresis as a tool for prediction of response to 5-fluorouracil-based chemotherapy

    PubMed Central

    SALEH, E. M.; EL-AWADY, R. A.; ANIS, N.

    2013-01-01

    The prediction of response or severe toxicity and therapy individualisation are extremely important in cancer chemotherapy, yet there are few tools to predict chemoresponse or toxicity in cancer patients. We investigated the correlation between the induction and repair of DNA double-strand breaks (DSBs), measured using constant-field gel electrophoresis (CFGE), cell cycle progression, and the sensitivity of four cancer cell lines to 5-fluorouracil (5FU). Using a sulphorhodamine-B assay, colon carcinoma cells (HCT116) were found to be the most sensitive to 5FU, followed by liver carcinoma cells (HepG2) and breast carcinoma cells (MCF-7); cervical carcinoma cells (HeLa) were the most resistant. As measured by CFGE, DSB induction, but not residual DSBs, exhibited a significant correlation with the sensitivity of the cell lines to 5FU. Flow cytometric cell cycle analysis revealed that 14% of HCT116 or HepG2 cells and 2% of MCF-7 cells shifted to the sub-G1 phase after a 96-h incubation with 5FU. Another 5FU-induced cell cycle change in HCT116, HepG2 and MCF-7 cells was a mild arrest of cells in the G1 and/or G2/M phases of the cell cycle. In addition, 5FU treatment resulted in the accumulation of HeLa cells in the S and G2/M phases. Determination of Fas ligand (Fas-L) and caspase 9 as representative markers for the extrinsic and intrinsic pathways of apoptosis, respectively, revealed that 5FU-induced apoptosis in HCT116 and HepG2 cells results from the expression of Fas-L (extrinsic pathway). Therefore, the induction of DNA DSBs by 5FU, detected using CFGE, and the induction of apoptosis are candidate predictive markers that may distinguish cancer cells likely to benefit from 5FU treatment, and the measurement of DSBs using CFGE may aid the prediction of clinical outcome. PMID:23255942

  15. Integrating forest stand projections with wildlife occupancy models to develop a decision support tool

    Treesearch

    Michelle F. Tacconelli; Edward F. Loewenstein

    2012-01-01

    Natural resource managers must often balance multiple objectives on a single property. When these objectives are seemingly conflicting, the manager’s job can be extremely difficult and complex. This paper presents a decision support tool, designed to aid land managers in optimizing wildlife habitat needs while accomplishing additional objectives such as ecosystem...

  16. WOrk-Related Questionnaire for UPper extremity disorders (WORQ-UP): Factor Analysis and Internal Consistency.

    PubMed

    Aerts, Bas R; Kuijer, P Paul; Beumer, Annechien; Eygendaal, Denise; Frings-Dresen, Monique H

    2018-04-17

    To test a 17-item questionnaire, the WOrk-Related Questionnaire for UPper extremity disorders (WORQ-UP), for dimensionality of the items (factor analysis) and internal consistency. Cross-sectional study. Outpatient clinic. A consecutive sample of patients (N=150) consisting of all new referral patients (either from a general physician or another hospital) who visited the orthopedic outpatient clinic because of an upper extremity musculoskeletal disorder. Not applicable. Number and dimensionality of the factors in the WORQ-UP. Four factors with eigenvalues (EVs) >1.0 were found. The factors were named exertion, dexterity, tools & equipment, and mobility. The EVs of the factors were, respectively, 5.78, 2.38, 1.81, and 1.24. Together, the factors explained 65.9% of the variance. The Cronbach alpha values for these factors were, respectively, .88, .74, .87, and .66. The 17 items of the WORQ-UP load onto 4 factors, exertion, dexterity, tools & equipment, and mobility, with good internal consistency. Copyright © 2018 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.
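    For reference, the internal-consistency statistic reported here, Cronbach's alpha, can be computed directly from an item-score matrix. A minimal numpy sketch (operating on synthetic data, not the WORQ-UP responses):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)
```

    Items that all track one underlying trait drive alpha toward 1, while statistically independent items give alpha near 0; values like the .88 and .66 reported above fall between these poles, one alpha being computed per factor's item subset.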

  17. Force Field Accelerated Density Functional Theory Molecular Dynamics for Simulation of Reactive Systems at Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Lindsey, Rebecca; Goldman, Nir; Fried, Laurence

    2017-06-01

    Atomistic modeling of chemistry at extreme conditions remains a challenge, despite continuing advances in computing resources and simulation tools. While first-principles methods provide a powerful predictive tool, the time and length scales associated with chemistry at extreme conditions (ns and μm, respectively) largely preclude extension of such models to molecular dynamics. In this work, we develop a simulation approach that retains the accuracy of density functional theory (DFT) while decreasing computational effort by several orders of magnitude. We generate n-body descriptions of atomic interactions by mapping forces arising from short DFT trajectories onto simple Chebyshev polynomial series. We examine the importance of including greater-than-two-body interactions and model transferability to different state points, and discuss approaches to ensure smooth and reasonable model shape outside of the distance domain sampled by the DFT training set. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
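    The force-matching step can be illustrated with numpy's Chebyshev utilities: map interatomic distances onto [-1, 1], the polynomials' natural domain, and least-squares fit a series to sampled forces. This is a schematic of the two-body piece only; the actual model includes higher-order n-body terms fit to DFT forces:

```python
import numpy as np
from numpy.polynomial import chebyshev as cheb

def fit_pair_force(r, f, deg, rmin, rmax):
    """Least-squares Chebyshev fit of a sampled pairwise force curve."""
    x = 2.0 * (np.asarray(r) - rmin) / (rmax - rmin) - 1.0  # map [rmin, rmax] -> [-1, 1]
    return cheb.chebfit(x, f, deg)

def eval_pair_force(r, coef, rmin, rmax):
    """Evaluate the fitted Chebyshev series at distances r."""
    x = 2.0 * (np.asarray(r) - rmin) / (rmax - rmin) - 1.0
    return cheb.chebval(x, coef)
```

    In the real workflow, f would be forces sampled from short DFT trajectories; the transferability and extrapolation issues discussed above arise because the fit is only constrained inside the sampled range [rmin, rmax].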

  18. Streamflow variability in the Chilean Temperate-Mediterranean climate transition (35°S-42°S) during the last 400 years inferred from tree-ring records

    NASA Astrophysics Data System (ADS)

    Muñoz, Ariel A.; González-Reyes, Alvaro; Lara, Antonio; Sauchyn, David; Christie, Duncan; Puchi, Paulina; Urrutia-Jalabert, Rocío; Toledo-Guerrero, Isadora; Aguilera-Betti, Isabella; Mundo, Ignacio; Sheppard, Paul R.; Stahle, Daniel; Villalba, Ricardo; Szejner, Paul; LeQuesne, Carlos; Vanstone, Jessica

    2016-12-01

    As rainfall in South-Central Chile has decreased in recent decades, local communities and industries have developed an understandable concern about their threatened water supply. Reconstructing streamflows from tree-ring data has been recognized as a useful paleoclimatic tool in providing long-term perspectives on the temporal characteristics of hydroclimate systems. Multi-century streamflow reconstructions can be compared to relatively short instrumental observations in order to analyze the frequency of low and high water availability through time. In this work, we have developed a Biobío River streamflow reconstruction to explore the long-term hydroclimate variability at the confluence of the Mediterranean-subtropical and Temperate-humid climate zones, two regions represented by previous reconstructions of the Maule and Puelo Rivers, respectively. In a suite of analyses, the Biobío River reconstruction proves to be more similar to the Puelo River than to the Maule River, despite its closer geographic proximity to the latter. This finding corroborates other studies with instrumental data that identify 37.5°S as a latitudinal confluence of two climate zones. The analyzed rivers are affected by climate forcings on interannual and interdecadal time-scales, both Tropical (El Niño-Southern Oscillation) and Antarctic (Southern Annular Mode; SAM). Longer cycles, of around 80 years, are well correlated only with SAM variation, which explains most of the variance in the Biobío and Puelo rivers; this cycle has also been attributed to orbital forcing by other authors. All three rivers showed an increase in the frequency of extreme high- and low-flow events in the twentieth century. The most extreme dry and wet years in the instrumental record (1943-2000) were not the most extreme of the past 400 years reconstructed for the three rivers (1600-2000), yet both instrumental-record years did rank in the five most extreme of the streamflow reconstructions as a whole.
These findings suggest a high level of natural variability in the hydro-climatic conditions of the region, where extremes characterized the twentieth century. This information is particularly useful when evaluating and improving a wide variety of water management models that apply to water resources that are sensitive to agricultural and hydropower industries.

  19. Plasmon Ruler with Ångstrom Length Resolution

    PubMed Central

    Hill, Ryan T.; Mock, Jack J.; Hucknall, Angus; Wolter, Scott D.; Jokerst, Nan M.; Smith, David R.; Chilkoti, Ashutosh

    2012-01-01

    We demonstrate a plasmon nanoruler using a coupled film-nanoparticle (film-NP) format that is well suited for investigating the sensitivity extremes of plasmonic coupling. Because it is relatively straightforward to functionalize bulk, surface plasmon supporting films such as gold, we are able to precisely control plasmonic gap dimensions by creating ultra-thin molecular spacer layers on the gold films, on top of which we immobilize plasmon resonant nanoparticles (NPs). Each immobilized NP becomes coupled to the underlying film and functions as a plasmon nanoruler, exhibiting a distance-dependent resonance red-shift in its peak plasmon wavelength as it approaches the film. Due to the uniformity of response from the film-NPs to separation distance, we are able to use extinction and scattering measurements from ensembles of film-NPs to characterize the coupling effect over a series of very short separation distances (ranging from 5 to 20 Å) and combine these measurements with similar data from larger separation distances extending out to 27 nm. We find that the film-NP plasmon nanoruler is extremely sensitive at very short film-NP separation distances, yielding spectral shifts as large as 5 nm for every 1 Å change in separation distance. The film-NP coupling at extremely small spacings is so uniform and reliable that we are able to usefully probe gap dimensions where the classical Drude model of the conducting electrons in the metals is no longer descriptive; for gap sizes smaller than a few nanometers, either quantum or semi-classical models of the carrier response must be employed to predict the observed wavelength shifts. We find that, despite the limitations, large field enhancements and extreme sensitivity persist down to even the smallest gap sizes. PMID:22966857

  20. Plasmon ruler with angstrom length resolution.

    PubMed

    Hill, Ryan T; Mock, Jack J; Hucknall, Angus; Wolter, Scott D; Jokerst, Nan M; Smith, David R; Chilkoti, Ashutosh

    2012-10-23

    We demonstrate a plasmon nanoruler using a coupled film nanoparticle (film-NP) format that is well-suited for investigating the sensitivity extremes of plasmonic coupling. Because it is relatively straightforward to functionalize bulk surface plasmon supporting films, such as gold, we are able to precisely control plasmonic gap dimensions by creating ultrathin molecular spacer layers on the gold films, on top of which we immobilize plasmon resonant nanoparticles (NPs). Each immobilized NP becomes coupled to the underlying film and functions as a plasmon nanoruler, exhibiting a distance-dependent resonance red shift in its peak plasmon wavelength as it approaches the film. Due to the uniformity of response from the film-NPs to separation distance, we are able to use extinction and scattering measurements from ensembles of film-NPs to characterize the coupling effect over a series of very short separation distances-ranging from 5 to 20 Å-and combine these measurements with similar data from larger separation distances extending out to 27 nm. We find that the film-NP plasmon nanoruler is extremely sensitive at very short film-NP separation distances, yielding spectral shifts as large as 5 nm for every 1 Å change in separation distance. The film-NP coupling at extremely small spacings is so uniform and reliable that we are able to usefully probe gap dimensions where the classical Drude model of the conducting electrons in the metals is no longer descriptive; for gap sizes smaller than a few nanometers, either quantum or semiclassical models of the carrier response must be employed to predict the observed wavelength shifts. We find that, despite the limitations, large field enhancements and extreme sensitivity persist down to even the smallest gap sizes.
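    A plasmon ruler of this kind is used in inverse mode: measure a spectral shift, then read the gap off a calibration curve. A hedged sketch with a synthetic, exponentially decaying calibration; the real film-NP response must be measured or computed electromagnetically (and, as the abstract notes, is not classically describable at the smallest gaps), so the decay length below is purely illustrative:

```python
import numpy as np

def invert_shift(measured_shift_nm, gaps_angstrom, shifts_nm):
    """Interpolate a measured peak red shift back to a gap size using a
    monotonically decreasing calibration curve (shift falls as gap grows)."""
    order = np.argsort(shifts_nm)                 # np.interp needs ascending x
    return np.interp(measured_shift_nm,
                     np.asarray(shifts_nm)[order],
                     np.asarray(gaps_angstrom)[order])

# Synthetic calibration: shift decays with gap (decay length is illustrative)
gaps = np.linspace(5.0, 270.0, 200)               # angstroms
shifts = 120.0 * np.exp(-gaps / 60.0)             # nm

gap = invert_shift(shifts[50], gaps, shifts)      # round-trip a calibration point
```

    The reported ~5 nm shift per 1 Å at the shortest separations is what makes the inversion usable at angstrom resolution: the steeper the calibration curve, the smaller the gap change a given spectral resolution can resolve.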

  1. Victimization, social anxiety, and body dysmorphic concerns: appearance-based rejection sensitivity as a mediator.

    PubMed

    Lavell, Cassie H; Zimmer-Gembeck, Melanie J; Farrell, Lara J; Webb, Haley

    2014-09-01

    Body dysmorphic disorder (BDD) is characterized by extreme preoccupation with perceived deficits in physical appearance, and sufferers experience severe impairment in functioning. Previous research has indicated that individuals with BDD are high in social anxiety, and often report being the victims of appearance-based teasing. However, there is little research into the possible mechanisms that might explain these relationships. The current study examined appearance-based rejection sensitivity as a mediator between perceived appearance-based victimization, social anxiety, and body dysmorphic symptoms in a sample of 237 Australian undergraduate psychology students. Appearance-based rejection sensitivity fully mediated the relationship between appearance-based victimization and body dysmorphic symptoms, and partially mediated the relationship between social anxiety and body dysmorphic symptoms. Findings suggest that individuals high in social anxiety or those who have a history of more appearance-based victimization may have a bias towards interpreting further appearance-based rejection, which may contribute to extreme appearance concerns such as BDD. Copyright © 2014 Elsevier Ltd. All rights reserved.
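    The mediation analysis reported here follows the standard product-of-coefficients logic: the effect of X on the mediator M, times the effect of M on Y controlling for X. A minimal OLS sketch on synthetic data (not the study's dataset or software, which likely used bootstrap confidence intervals):

```python
import numpy as np

def mediation_effects(x, m, y):
    """Indirect effect a*b and direct effect c' for the path X -> M -> Y:
    a is the slope of M ~ X; b and c' come from Y ~ X + M."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), x, m])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    c_prime, b = coef[1], coef[2]
    return a * b, c_prime

# Simulated full mediation: X affects Y only through M
rng = np.random.default_rng(0)
x = rng.normal(size=5000)
m = 0.8 * x + 0.5 * rng.normal(size=5000)   # X -> M (a = 0.8)
y = 0.5 * m + 0.3 * rng.normal(size=5000)   # M -> Y (b = 0.5), no direct path
indirect, direct = mediation_effects(x, m, y)
```

    "Full mediation", as found for victimization here, corresponds to a substantial indirect effect a*b with a direct effect c' near zero once the mediator is controlled.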

  2. Extreme ultraviolet patterning of tin-oxo cages

    NASA Astrophysics Data System (ADS)

    Haitjema, Jarich; Zhang, Yu; Vockenhuber, Michaela; Kazazis, Dimitrios; Ekinci, Yasin; Brouwer, Albert M.

    2017-07-01

    We report on the extreme ultraviolet (EUV) patterning performance of tin-oxo cages. These cage molecules were already known to function as a negative tone photoresist for EUV radiation, but in this work, we significantly optimized their performance. Our results show that sensitivity and resolution are only meaningful photoresist parameters if the process conditions are optimized. We focus on contrast curves of the materials using large area EUV exposures and patterning of the cages using EUV interference lithography. It is shown that baking steps, such as postexposure baking, can significantly affect both the sensitivity and contrast in the open-frame experiments as well as the patterning experiments. A layer thickness increase reduced the necessary dose to induce a solubility change but decreased the patterning quality. The patterning experiments were affected by minor changes in processing conditions such as an increased rinsing time. In addition, we show that the anions of the cage can influence the sensitivity and quality of the patterning, probably through their effect on physical properties of the materials.
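    The contrast curves mentioned above reduce to a single figure of merit: for a negative-tone resist, the contrast gamma is the slope of normalized remaining thickness versus log10(dose) over the linear rise of the curve. A hedged sketch, with the 10-90% fitting window and the synthetic curve both chosen for illustration:

```python
import numpy as np

def resist_contrast(dose, thickness):
    """Contrast (gamma) of a negative-tone resist from an open-frame contrast
    curve: slope of normalized remaining thickness vs log10(dose), fitted over
    the roughly linear 10-90% portion of the rise."""
    t = np.asarray(thickness, float) / np.max(thickness)
    x = np.log10(np.asarray(dose, float))
    rise = (t > 0.1) & (t < 0.9)
    gamma, _ = np.polyfit(x[rise], t[rise], 1)
    return gamma

# Synthetic curve with gamma = 2 and a gelation dose of 3 (arbitrary units)
dose = np.logspace(0, 2, 60)
thickness = np.clip((np.log10(dose) - np.log10(3.0)) * 2.0, 0.0, 1.0)
gamma = resist_contrast(dose, thickness)
```

    This is why the abstract stresses process conditions: a postexposure bake or a longer rinse moves both the onset dose and the slope of this curve, changing the extracted sensitivity and contrast.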

  3. Fiber-optic refractometer based on an etched high-Q π-phase-shifted fiber-Bragg-grating.

    PubMed

    Zhang, Qi; Ianno, Natale J; Han, Ming

    2013-07-10

    We present a compact and highly sensitive fiber-optic refractometer based on a high-Q π-phase-shifted fiber-Bragg-grating (πFBG) that is chemically etched to the core of the fiber. Due to the π phase-shift, a strong πFBG forms a high-Q optical resonator whose reflection spectrum features an extremely narrow notch that can be used for highly sensitive refractive index measurement. The etched πFBG demonstrated here has a diameter of ~9.3 μm and a length of only 7 mm, leading to a refractive index responsivity of 2.9 nm/RIU (RIU: refractive index unit) at an ambient refractive index of 1.318. The reflection spectrum of the etched πFBG features an extremely narrow notch with a linewidth of only 2.1 pm in water centered at ~1,550 nm, corresponding to a Q-factor of 7.4 × 10(5), which allows for potentially significantly improved sensitivity over refractometers based on regular fiber Bragg gratings.
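    The quoted Q-factor follows directly from the notch linewidth via Q = λ0/Δλ. A quick check of the numbers in the abstract:

```python
# Q-factor of the etched pi-FBG notch: Q = center wavelength / linewidth
center_wavelength_nm = 1550.0
linewidth_nm = 2.1e-3          # the 2.1 pm linewidth, expressed in nm

Q = center_wavelength_nm / linewidth_nm
# Q comes out near 7.4e5, matching the abstract
```

    Dividing the 2.1 pm linewidth by the 2.9 nm/RIU responsivity likewise suggests the scale of the smallest resolvable index change, on the order of 10^-3 RIU before linewidth-splitting techniques are considered; that inference is ours, not the abstract's.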

  4. Fast Coherent Differential Imaging for Exoplanet Imaging

    NASA Astrophysics Data System (ADS)

    Gerard, Benjamin; Marois, Christian; Galicher, Raphael; Veran, Jean-Pierre; Macintosh, B.; Guyon, O.; Lozi, J.; Pathak, P.; Sahoo, A.

    2018-06-01

    Direct detection and detailed characterization of exoplanets using extreme adaptive optics (ExAO) is a key science goal of future extremely large telescopes and space observatories. However, quasi-static wavefront errors will limit the sensitivity of this endeavor. Additional limitations for ground-based telescopes arise from residual AO-corrected atmospheric wavefront errors, generating short-lived aberrations that will average into a halo over a long exposure, also limiting the sensitivity of exoplanet detection. We develop the framework for a solution to both of these problems using the self-coherent camera (SCC), to be applied to ground-based telescopes, called the Fast Atmospheric SCC Technique (FAST). Simulations show that for typical ExAO targets the FAST approach can reach a raw contrast ~100 times better than what is currently achieved with ExAO instruments when extrapolated to an hour of observing time, illustrating that the sensitivity improvement from this method could play an essential role in the future ground-based detection and characterization of lower mass/colder exoplanets.

  5. Development of a music therapy assessment tool for patients in low awareness states.

    PubMed

    Magee, Wendy L

    2007-01-01

    People in low awareness states following profound brain injury typically demonstrate subtle changes in functional behaviors which challenge the sensitivity of measurement tools. Failure to identify and measure changes in functioning can lead to misdiagnosis and withdrawal of treatment with this population. Thus, the development of tools which are sensitive to responsiveness is of central concern. As the auditory modality has been found to be particularly sensitive in identifying responses indicating awareness, a convincing case can be made for music therapy as a treatment medium. However, little has been recommended about protocols for intervention or tools for measuring patient responses within the music therapy setting. This paper presents the rationale for an assessment tool specifically designed to measure responses in the music therapy setting with patients who are diagnosed as minimally conscious or in a vegetative state. Developed over fourteen years as part of interdisciplinary assessment and treatment, the music therapy assessment tool for low awareness states (MATLAS) contains fourteen items which rate behavioral responses across a number of domains. The tool can provide important information for interdisciplinary assessment and treatment particularly in the auditory and communication domains. Recommendations are made for testing its reliability and validity through research.

  6. A single pH fluorescent probe for biosensing and imaging of extreme acidity and extreme alkalinity.

    PubMed

    Chao, Jian-Bin; Wang, Hui-Juan; Zhang, Yong-Bin; Li, Zhi-Qing; Liu, Yu-Hong; Huo, Fang-Jun; Yin, Cai-Xia; Shi, Ya-Wei; Wang, Juan-Juan

    2017-07-04

    A simple tailor-made pH fluorescent probe, 2-benzothiazole (N-ethylcarbazole-3-yl) hydrazone (Probe), is facilely synthesized by the condensation reaction of 2-hydrazinobenzothiazole with N-ethylcarbazole-3-formaldehyde and serves as a useful fluorescent probe for quantitatively monitoring extremely acidic and alkaline pH. The pH titrations indicate that Probe displays a remarkable emission enhancement with a pKa of 2.73 and responds linearly to minor pH fluctuations within the extremely acidic range of 2.21-3.30. Interestingly, Probe also exhibits strong pH-dependent characteristics with a pKa of 11.28 and a linear response over the extremely alkaline range of 10.41-12.43. In addition, Probe shows a large Stokes shift of 84 nm under extremely acidic and alkaline conditions, high selectivity, excellent sensitivity, good water-solubility and fine stability, all of which are favorable for intracellular pH imaging. The probe is further successfully applied to image pH fluctuations at extremely acidic and alkaline values in E. coli cells.
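The reported pKa values and narrow linear ranges are consistent with a standard Henderson-Hasselbalch response, in which the fraction of the probe in its emissive form switches sigmoidally around the pKa. A generic sketch of that model (illustrative only, not the paper's fitted curve):

```python
def frac_emissive(pH: float, pKa: float) -> float:
    """Fraction of probe in the emissive form (Henderson-Hasselbalch model)."""
    return 1.0 / (1.0 + 10 ** (pKa - pH))

# At pH == pKa the probe is exactly half-switched:
assert abs(frac_emissive(2.73, 2.73) - 0.5) < 1e-12
# A sigmoid is near-linear over roughly pKa +/- 1, consistent with the
# reported linear ranges of 2.21-3.30 and 10.41-12.43 around the two pKa's.
print(f"{frac_emissive(3.30, 2.73):.2f}")
```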

  7. Hypersensitivity of skin fibroblasts from basal cell nevus syndrome patients to killing by ultraviolet B but not by ultraviolet C radiation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Applegate, L.A.; Goldberg, L.H.; Ley, R.D.

    Basal cell nevus syndrome (BCNS) is an autosomal dominant genetic disorder in which the afflicted individuals are extremely susceptible to sunlight-induced skin cancers, particularly basal cell carcinomas. However, the cellular and molecular basis for BCNS is unknown. To ascertain whether there is any relationship between genetic predisposition to skin cancer and increased sensitivity of somatic cells from BCNS patients to killing by UV radiation, we exposed skin fibroblasts established from unexposed skin biopsies of several BCNS and age- and sex-matched normal individuals to either UV-B (280-320 nm) or UV-C (254 nm) radiation and determined their survival. The results indicated that skin fibroblasts from BCNS patients were hypersensitive to killing by UV-B but not UV-C radiation as compared to skin fibroblasts from normal individuals. DNA repair studies indicated that the increased sensitivity of BCNS skin fibroblasts to killing by UV-B radiation was not due to a defect in the excision repair of pyrimidine dimers. These results indicate that there is an association between hypersensitivity of somatic cells to killing by UV-B radiation and the genetic predisposition to skin cancer in BCNS patients. In addition, these results suggest that DNA lesions (and repair processes) other than the pyrimidine dimer are also involved in the pathogenesis of sunlight-induced skin cancers in BCNS patients. More important, the UV-B sensitivity assay described here may be used as a diagnostic tool to identify presymptomatic individuals with BCNS.

  8. A simple Bird Sensitivity to Oil Index as a management tool in coastal and marine areas subject to oil spills when little biological information is available.

    PubMed

    Romero, A F; Oliveira, M; Abessa, D M S

    2018-03-01

    This study sought to develop a simple index for ranking birds' environmental sensitivity to oil in which birds are used as biological indicators. The study area consisted of both the Santos Estuarine System (SES), and the Laje de Santos Marine State Park (LSMSP), located in Southeastern Brazil. Information on the bird species and their feeding and nesting behaviors were obtained from the literature and were the basis of the sensitivity index created. The SES had a higher number of species, but only about 30% were found to be highly sensitive. The LSMSP presented a much lower number of species, but all of them were considered to be highly sensitive to oil. Due to its simplicity, this index can be employed worldwide as a decision-making tool that may be integrated into other management tools, particularly when robust information on the biology of birds is lacking.
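The index is built from literature data on feeding and nesting behavior. A hypothetical sketch of how such a behavior-based score might be combined and thresholded follows; the categories, weights, and threshold below are invented for illustration and do not reproduce the authors' scheme:

```python
# Hypothetical oil-sensitivity scoring: feeding and nesting risk scores are
# summed, then thresholded into a sensitivity rank. All values illustrative.
FEEDING_RISK = {"surface_diving": 3, "plunge_diving": 2, "aerial": 1}
NESTING_RISK = {"ground_colonial": 3, "cliff": 2, "inland": 1}

def oil_sensitivity_index(feeding: str, nesting: str) -> int:
    return FEEDING_RISK[feeding] + NESTING_RISK[nesting]

def rank(feeding: str, nesting: str, high_threshold: int = 5) -> str:
    score = oil_sensitivity_index(feeding, nesting)
    return "high" if score >= high_threshold else "low-moderate"

print(rank("surface_diving", "ground_colonial"))  # high
print(rank("aerial", "inland"))                   # low-moderate
```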

  9. Measurement and control of bias in patient reported outcomes using multidimensional item response theory.

    PubMed

    Dowling, N Maritza; Bolt, Daniel M; Deng, Sien; Li, Chenxi

    2016-05-26

    Patient-reported outcome (PRO) measures play a key role in the advancement of patient-centered care research. The accuracy of inferences, relevance of predictions, and the true nature of the associations made with PRO data depend on the validity of these measures. Errors inherent to self-report measures can seriously bias the estimation of constructs assessed by the scale. A well-documented disadvantage of self-report measures is their sensitivity to response style (RS) effects such as the respondent's tendency to select the extremes of a rating scale. Although the biasing effect of extreme responding on constructs measured by self-reported tools has been widely acknowledged and studied across disciplines, little attention has been given to the development and systematic application of methodologies to assess and control for this effect in PRO measures. We review the methodological approaches that have been proposed to study extreme RS effects (ERS). We applied a multidimensional item response theory model to simultaneously estimate and correct for the impact of ERS on trait estimation in a PRO instrument. Model estimates were used to study the biasing effects of ERS on sum scores for individuals with the same amount of the targeted trait but different levels of ERS. We evaluated the effect of joint estimation of multiple scales and ERS on trait estimates and demonstrated the biasing effects of ERS on these trait estimates when used as explanatory variables. A four-dimensional model accounting for ERS bias provided a better fit to the response data. Increasing levels of ERS showed bias in total scores as a function of trait estimates. The effect of ERS was greater when the pattern of extreme responding was the same across multiple scales modeled jointly. The estimated item category intercepts provided evidence of content independent category selection. Uncorrected trait estimates used as explanatory variables in prediction models showed downward bias. 
A comprehensive evaluation of the psychometric quality and soundness of PRO assessment measures should incorporate the study of ERS as a potential nuisance dimension affecting the accuracy and validity of scores and the impact of PRO data in clinical research and decision making.
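The biasing mechanism can be illustrated with a toy simulation: two respondents with identical trait levels but different extreme-response tendencies produce different sum scores. This is a deliberately simplified response model, not the paper's multidimensional IRT model:

```python
import random

def respond(trait: float, ers: float, categories: int = 5) -> int:
    """Toy response model: the answer reflects the trait level, but extreme
    responders push it to a scale endpoint with probability `ers`."""
    base = max(1, min(categories, round(trait)))
    if random.random() < ers:
        return categories if base > categories / 2 else 1
    return base

def sum_score(trait: float, ers: float, n_items: int = 20, seed: int = 0) -> int:
    random.seed(seed)
    return sum(respond(trait, ers) for _ in range(n_items))

# Identical trait, different extreme-response tendency -> different totals:
print(sum_score(4.0, 0.0), sum_score(4.0, 0.9))
```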

  10. Evolution of a genetic polymorphism with climate change in a Mediterranean landscape

    PubMed Central

    Thompson, John; Charpentier, Anne; Bouguet, Guillaume; Charmasson, Faustine; Roset, Stephanie; Buatois, Bruno; Vernet, Philippe; Gouyon, Pierre-Henri

    2013-01-01

    Many species show changes in distribution and phenotypic trait variation in response to climatic warming. Evidence of genetically based trait responses to climate change is, however, less common. Here, we detected evolutionary variation in the landscape-scale distribution of a genetically based chemical polymorphism in Mediterranean wild thyme (Thymus vulgaris) in association with modified extreme winter freezing events. By comparing current data on morph distribution with that observed in the early 1970s, we detected a significant increase in the proportion of morphs that are sensitive to winter freezing. This increase in frequency was observed in 17 of the 24 populations in which, since the 1970s, annual extreme winter freezing temperatures have risen above the thresholds that cause mortality of freezing-sensitive morphs. Our results provide an original example of rapid ongoing evolutionary change associated with relaxed selection (less extreme freezing events) on a local landscape scale. In species whose distribution and genetic variability are shaped by strong selection gradients, there may be little time lag associated with their ecological and evolutionary response to long-term environmental change. PMID:23382198

  11. The Extreme Ultraviolet Explorer Mission

    NASA Technical Reports Server (NTRS)

    Bowyer, S.; Malina, R. F.

    1991-01-01

    The Extreme Ultraviolet Explorer (EUVE) mission, currently scheduled for launch in September 1991, is described. The primary purpose of the mission is to survey the celestial sphere for astronomical sources of extreme ultraviolet (EUV) radiation with the use of three EUV telescopes, each sensitive to a different segment of the EUV band. A fourth telescope is planned to perform a high-sensitivity search of a limited sample of the sky in the shortest wavelength bands. The all-sky survey is planned to be carried out in the first six months of the mission in four bands, or colors: 70-180 A, 170-250 A, 400-600 A, and 500-700 A. The second phase of the mission is devoted to spectroscopic observations of EUV sources. A high-efficiency grazing-incidence spectrometer using variable line-space gratings is planned to provide spectral data with about 1-A resolution. An end-to-end model of the mission, from a stellar source to the resulting scientific data, is presented. Hypothetical data from astronomical sources were processed through this model and are shown.

  12. Funding of community-based interventions for HIV prevention.

    PubMed

    Poku, Nana K; Bonnel, René

    2016-07-01

    Since the start of the HIV epidemic, community responses have been at the forefront of the response. Following the extraordinary expansion of global resources, the funding of community responses rose to reach at least US$690 million per year in the period 2005-2009. Since then, many civil society organisations (CSOs) have reported a drop in funding. Yet, the need for strong community responses is even more urgent, as shown by their role in reaching the Joint United Nations Programme on HIV/AIDS (UNAIDS) Fast-Track targets. In the case of antiretroviral treatment, interventions need to be adopted by most people at risk of HIV in order to have a substantial effect on the prevention of HIV at the population level. This paper reviews the published literature on community responses, funding and effectiveness. Additional funding is certainly needed to increase the coverage of community-based interventions (CBIs), but current evidence on their effectiveness is extremely mixed, which does not provide clear guidance to policy makers. This is especially an issue for adolescent girls and young women in Eastern and Southern Africa, who face extremely high infection risk, but the biomedical prevention tools that have been proven effective for the general population still remain pilot projects for this group. Research is especially needed to isolate the factors affecting the likelihood that interventions targeting this group are consistently successful. Such work could be focused on the community organisations that are currently involved in delivering gender-sensitive interventions.

  13. Peritoneal sarcomatosis: site of origin for the establishment of an in vitro and in vivo cell line model to study therapeutic resistance in dedifferentiated liposarcoma.

    PubMed

    Mersch, Sabrina; Riemer, Jasmin C; Schlünder, Philipp M; Ghadimi, Markus P; Ashmawy, Hany; Möhlendick, Birte; Topp, Stefan A; Arent, Tanja; Kröpil, Patric; Stoecklein, Nikolas H; Gabbert, Helmut E; Knoefel, Wolfram T; Krieg, Andreas

    2016-02-01

    Approximately 50-70 % of patients with retroperitoneal or intraabdominal sarcoma develop a relapse after surgical therapy, including peritoneal sarcomatosis, an extremely rare site of metastatic disease which is associated with an extremely poor prognosis. Accordingly, the establishment of a permanent cell line derived from peritoneal sarcomatosis might provide a helpful tool to understand the biological behavior and to develop new therapeutic strategies. Thus, we established and characterized a liposarcoma cell line (Lipo-DUE1) from a peritoneal sarcomatosis that was permanently cultured without showing any morphological changes. Lipo-DUE1 cells exhibited a spindle-shaped morphology and positive staining for S100. Tumorigenicity was demonstrated in vitro by invasion and migration assays and in vivo by using a subcutaneous xenograft mouse model. In addition, aCGH analysis revealed concordant copy number variations on chromosome 12q in the primary tumor, peritoneal sarcomatosis, and Lipo-DUE1 cells that are commonly observed in liposarcoma. Chemotherapeutic sensitivity assays revealed a pronounced drug-resistant phenotype of Lipo-DUE1 cells to conventionally used chemotherapeutic agents. In conclusion, we describe for the first time the establishment and characterization of a liposarcoma cell line derived from a peritoneal sarcomatosis. Hence, in the future, the newly established cell line Lipo-DUE1 might serve as a useful in vitro and in vivo model to investigate the biological behavior of liposarcoma and to assess novel targeted therapies.

  14. Probing the electronic and spintronic properties of buried interfaces by extremely low energy photoemission spectroscopy

    PubMed Central

    Fetzer, Roman; Stadtmüller, Benjamin; Ohdaira, Yusuke; Naganuma, Hiroshi; Oogane, Mikihiko; Ando, Yasuo; Taira, Tomoyuki; Uemura, Tetsuya; Yamamoto, Masafumi; Aeschlimann, Martin; Cinchetti, Mirko

    2015-01-01

    Ultraviolet photoemission spectroscopy (UPS) is a powerful tool to study the electronic spin and symmetry features at both surfaces and interfaces to ultrathin top layers. However, the very low mean free path of the photoelectrons usually prevents a direct access to the properties of buried interfaces. The latter are of particular interest since they crucially influence the performance of spintronic devices like magnetic tunnel junctions (MTJs). Here, we introduce spin-resolved extremely low energy photoemission spectroscopy (ELEPS) to provide a powerful way for overcoming this limitation. We apply ELEPS to the interface formed between the half-metallic Heusler compound Co2MnSi and the insulator MgO, prepared as in state-of-the-art Co2MnSi/MgO-based MTJs. The high accordance between the spintronic fingerprint of the free Co2MnSi surface and the Co2MnSi/MgO interface buried below up to 4 nm MgO provides clear evidence for the high interface sensitivity of ELEPS to buried interfaces. Although the absolute values of the interface spin polarization are well below 100%, the now accessible spin- and symmetry-resolved wave functions are in line with the predicted existence of non-collinear spin moments at the Co2MnSi/MgO interface, one of the mechanisms evoked to explain the controversially discussed performance loss of Heusler-based MTJs at room temperature. PMID:25702631

  15. Performance verification of the Gravity and Extreme Magnetism Small explorer (GEMS) x-ray polarimeter

    NASA Astrophysics Data System (ADS)

    Enoto, Teruaki; Black, J. Kevin; Kitaguchi, Takao; Hayato, Asami; Hill, Joanne E.; Jahoda, Keith; Tamagawa, Toru; Kaneko, Kenta; Takeuchi, Yoko; Yoshikawa, Akifumi; Marlowe, Hannah; Griffiths, Scott; Kaaret, Philip E.; Kenward, David; Khalid, Syed

    2014-07-01

    Polarimetry is a powerful tool for astrophysical observations that has yet to be exploited in the X-ray band. For satellite-borne and sounding rocket experiments, we have developed a photoelectric gas polarimeter to measure X-ray polarization in the 2-10 keV range utilizing a time projection chamber (TPC) and advanced micro-pattern gas electron multiplier (GEM) techniques. We carried out performance verification of a flight equivalent unit (1/4 model) which was planned to be launched on the NASA Gravity and Extreme Magnetism Small Explorer (GEMS) satellite. The test was performed at Brookhaven National Laboratory, National Synchrotron Light Source (NSLS) facility in April 2013. The polarimeter was irradiated with linearly-polarized monochromatic X-rays between 2.3 and 10.0 keV and scanned with a collimated beam at 5 different detector positions. After a systematic investigation of the detector response, a modulation factor >=35% above 4 keV was obtained with the expected polarization angle. At energies below 4 keV where the photoelectron track becomes short, diffusion in the region between the GEM and readout strips leaves an asymmetric photoelectron image. A correction method retrieves an expected modulation angle, and the expected modulation factor, ~20% at 2.7 keV. Folding the measured values of modulation through an instrument model gives sensitivity, parameterized by minimum detectable polarization (MDP), nearly identical to that assumed at the preliminary design review (PDR).
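The sensitivity figure quoted at the end, the minimum detectable polarization, follows a standard formula for counting polarimeters: MDP99 = 4.29 / (mu * R_s) * sqrt((R_s + R_b) / T), with modulation factor mu, source and background count rates R_s and R_b, and exposure time T. A sketch with illustrative inputs (the rates and exposure below are not the GEMS values):

```python
import math

def mdp99(mu: float, src_rate: float, bkg_rate: float, t_sec: float) -> float:
    """Minimum detectable polarization at 99% confidence:
    MDP99 = 4.29 / (mu * R_s) * sqrt((R_s + R_b) / T)."""
    return 4.29 / (mu * src_rate) * math.sqrt((src_rate + bkg_rate) / t_sec)

# Illustrative inputs (not from the paper): mu = 0.35 as measured above
# 4 keV, 10 ct/s source, 1 ct/s background, 100 ks exposure.
print(f"MDP99 = {mdp99(0.35, 10.0, 1.0, 1e5):.2%}")
```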

  16. Attributing extreme precipitation in the Black Sea region to sea surface warming

    NASA Astrophysics Data System (ADS)

    Meredith, Edmund; Semenov, Vladimir; Maraun, Douglas; Park, Wonsun; Chernokulsky, Alexander

    2016-04-01

    Higher sea surface temperatures (SSTs) warm and moisten the overlying atmosphere, increasing the low-level atmospheric instability, the moisture available to precipitating systems and, hence, the potential for intense convective systems. Both the Mediterranean and Black Sea regions have seen a steady increase in summertime SSTs since the early 1980s, by over 2 K in places. This raises the question of how this SST increase has affected convective precipitation extremes in the region, and through which mechanisms any effects are manifested. In particular, the Black Sea town of Krymsk suffered an unprecedented precipitation extreme in July 2012, which may have been influenced by Black Sea warming, causing over 170 deaths. To address this question, we adopt two distinct modelling approaches to event attribution and compare their relative merits. In the first, we use the traditional probabilistic event attribution approach involving global climate model ensembles representative of the present and a counterfactual past climate where regional SSTs have not increased. In the second, we use the conditional event attribution approach, taking the 2012 Krymsk precipitation extreme as a showcase example. Under the second approach, we carry out ensemble sensitivity experiments of the Krymsk event at convection-permitting resolution with the WRF regional model, and test the sensitivity of the event to a range of SST forcings. Both experiments show the crucial role of recent Black Sea warming in amplifying the 2012 Krymsk precipitation extreme. In the conditional event attribution approach, though, the explicit simulation of convective processes provides detailed insight into the physical mechanisms behind the extremeness of the event, revealing the dominant role of dynamical (i.e. static stability and vertical motions) over thermodynamical (i.e. increased atmospheric moisture) changes. 
Additionally, the wide range of SST states tested in the regional setup, which would be infeasible under the global modelling approach, reveals that the intensity of the Krymsk event responds highly nonlinearly to Black Sea warming and suggests a role for regional SST thresholds in more intense coastal convective extremes.
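In the probabilistic approach, attribution is commonly summarized by the fraction of attributable risk, FAR = 1 - p0/p1, where p0 and p1 are the probabilities of exceeding the event threshold in the counterfactual and factual ensembles. A minimal sketch (the probabilities are illustrative, not the study's values):

```python
def fraction_attributable_risk(p_counterfactual: float, p_factual: float) -> float:
    """FAR = 1 - p0/p1: the fraction of the event probability attributable
    to the forcing (here, Black Sea surface warming)."""
    return 1.0 - p_counterfactual / p_factual

# Illustrative threshold-exceedance probabilities for the two ensembles
# (not the study's actual numbers):
print(f"FAR = {fraction_attributable_risk(0.001, 0.005):.2f}")
```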

  17. Autoerythrocyte sensitization syndrome presenting with general neurodermatitis

    PubMed Central

    Oh, In Young; Ko, Eun Jung

    2013-01-01

    Autoerythrocyte sensitization syndrome (AES) was first described by Gardner and Diamond in 1955, when four women with painful bruising were depicted. Patients with AES typically present with the development of recurrent, spontaneous, painful ecchymosis, frequently preceded by a prodrome of pain or itching of the skin. The patients are sensitive to their own red blood cells injected intradermally, and underlying coagulopathies are thought to be absent. We introduce a 70-year-old woman presenting with recurrent episodes of painful bruising on the trunk and extremities. PMID:23956968

  18. The Complementary Role of High Sensitivity C-Reactive Protein in the Diagnosis and Severity Assessment of Autism

    ERIC Educational Resources Information Center

    Khakzad, Mohammad Reza; Javanbakht, Maryam; Shayegan, Mohammad Reza; Kianoush, Sina; Omid, Fatemeh; Hojati, Maryam; Meshkat, Mojtaba

    2012-01-01

    C-reactive protein (CRP) is a beneficial diagnostic test for the evaluation of inflammatory response. Extremely low levels of CRP can be detected using high-sensitivity CRP (hs-CRP) test. A considerable body of evidence has demonstrated that inflammatory response has an important role in the pathophysiology of autism. In this study, we evaluated…

  19. Assessing Instructional Sensitivity Using the Pre-Post Difference Index: A Nontechnical Tool for Extension Educators

    ERIC Educational Resources Information Center

    Adedokun, Omolola A.

    2018-01-01

    This article provides an illustrative description of the pre-post difference index (PPDI), a simple, nontechnical yet robust tool for examining the instructional sensitivity of assessment items. Extension educators often design pretest-posttest instruments to assess the impact of their curricula on participants' knowledge and understanding of the…
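The PPDI itself is simple arithmetic: for each assessment item, the proportion of participants answering correctly at posttest minus the proportion at pretest. A minimal sketch (the counts are invented for illustration):

```python
def ppdi(pre_correct: int, post_correct: int, n_pre: int, n_post: int) -> float:
    """Pre-post difference index for one item: proportion correct at
    posttest minus proportion correct at pretest."""
    return post_correct / n_post - pre_correct / n_pre

# Hypothetical item: 15 of 50 correct before instruction, 40 of 50 after.
# A large positive PPDI indicates an instructionally sensitive item.
print(f"PPDI = {ppdi(15, 40, 50, 50):.2f}")
```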

  20. The enhanced effects of antibiotics irradiated by an extremely high frequency electromagnetic field on Escherichia coli growth properties.

    PubMed

    Torgomyan, Heghine; Trchounian, Armen

    2015-01-01

    The effects of extremely high frequency electromagnetic irradiation and antibiotics on Escherichia coli can create new opportunities for applications in different areas: medicine, agriculture, and the food industry. It was previously shown that irradiation changes bacterial sensitivity to antibiotics. In this work, we present results showing that antibiotics irradiated before being added to the growth medium exerted a stronger bactericidal action than non-irradiated antibiotics. The selected antibiotics (tetracycline, kanamycin, chloramphenicol, and ceftriaxone) were from different groups. Antibiotic irradiation was performed at low intensity at a frequency of 53 GHz for 1 h. The E. coli growth properties (lag-phase duration and specific growth rate) were markedly changed. The enhanced bacterial sensitivity to irradiated antibiotics is similar to the effect of antibiotics at higher concentrations.
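The specific growth rate mentioned above is conventionally obtained from two measurements in the exponential phase as mu = ln(N2/N1) / (t2 - t1). A minimal sketch with illustrative optical-density readings (not data from the paper):

```python
import math

def specific_growth_rate(od1: float, od2: float, t1_h: float, t2_h: float) -> float:
    """Specific growth rate (per hour) from two optical-density readings
    taken during exponential growth: mu = ln(OD2/OD1) / (t2 - t1)."""
    return math.log(od2 / od1) / (t2_h - t1_h)

# Illustrative readings (not from the paper): OD 0.1 -> 0.4 over 2 h.
mu = specific_growth_rate(0.1, 0.4, 1.0, 3.0)
print(f"mu = {mu:.3f} /h, doubling time = {math.log(2) / mu:.2f} h")
```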

  1. A study of the stress wave factor technique for nondestructive evaluation of composite materials

    NASA Technical Reports Server (NTRS)

    Sarrafzadeh-Khoee, A.; Kiernan, M. T.; Duke, J. C., Jr.; Henneke, E. G., II

    1986-01-01

    The acousto-ultrasonic method of nondestructive evaluation is an extremely sensitive means of assessing material response. Efforts continue to complete the understanding of this method. In order to achieve the full sensitivity of the technique, extreme care must be taken in its performance. This report provides an update of the efforts to advance the understanding of this method and to increase its application to the nondestructive evaluation of composite materials. Included are descriptions of a novel optical system that is capable of measuring in-plane and out-of-plane displacements, an IBM PC-based data acquisition system, an extensive data analysis software package, the azimuthal variation of acousto-ultrasonic behavior in graphite/epoxy laminates, and preliminary examination of processing variation in graphite-aluminum tubes.

  2. Sugar nanowires based on cyclodextrin on quartz crystal microbalance for gas sensing with ultra-high sensitivity

    NASA Astrophysics Data System (ADS)

    Asano, Atsushi; Maeyoshi, Yuta; Watanabe, Shogo; Saeki, Akinori; Sugimoto, Masaki; Yoshikawa, Masahito; Nanto, Hidehito; Tsukuda, Satoshi; Tanaka, Shun-Ichiro; Seki, Shu

    2013-03-01

    Cyclodextrins (CDs), which selectively host a wide range of guest molecules in their hydrophobic cavity, were directly fabricated into 1-dimensional nanostructures with extremely wide surface area by the single-particle nanofabrication technique (SPNT) in the present paper. Copolymers of acrylamide and mono(6-allyl)-β-CD were synthesized, and the crosslinking reaction of the polymer alloys with poly(4-bromostyrene) (PBrS) in SPNT gave nanowires on the quartz substrate with a high number density of 5×10^9 cm^-2. Quartz crystal microbalance (QCM) measurement indicated a 320-fold higher sensitivity for formic acid vapor adsorption on the nanowire-fabricated surfaces compared with a thin solid film of PBrS, due to the incorporation of CD units and the extremely wide surface area of the nanowires.
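QCM gas sensing rests on the Sauerbrey relation, which converts adsorbed mass into a resonance-frequency shift; the wider nanowire surface area simply adsorbs more analyte per unit footprint. A sketch of the relation (the crystal frequency, mass load, and area below are illustrative, not the paper's data):

```python
def sauerbrey_shift_hz(f0_hz: float, dm_g: float, area_cm2: float) -> float:
    """Sauerbrey relation for a rigid film on a QCM (CGS units):
    df = -2 f0^2 dm / (A * sqrt(rho_q * mu_q)), with quartz density
    rho_q = 2.648 g/cm^3 and shear modulus mu_q = 2.947e11 g/(cm s^2)."""
    rho_q, mu_q = 2.648, 2.947e11
    return -2.0 * f0_hz ** 2 * dm_g / (area_cm2 * (rho_q * mu_q) ** 0.5)

# Illustrative load: 1 ng on a 9 MHz crystal with 0.2 cm^2 active area.
print(f"{sauerbrey_shift_hz(9e6, 1e-9, 0.2):.2f} Hz")
```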

  3. Climate extremes in the Pacific: improving seasonal prediction of tropical cyclones and extreme ocean temperatures to improve resilience

    NASA Astrophysics Data System (ADS)

    Kuleshov, Y.; Jones, D.; Spillman, C. M.

    2012-04-01

    Climate change and climate extremes have a major impact on Australia and Pacific Island countries. Of particular concern are tropical cyclones and extreme ocean temperatures, the first being the most destructive events for terrestrial systems, while the latter has the potential to devastate ocean ecosystems through coral bleaching. As a practical response to climate change, under the Pacific-Australia Climate Change Science and Adaptation Planning program (PACCSAP), we are developing enhanced web-based information tools for providing seasonal forecasts of climatic extremes in the Western Pacific. Tropical cyclones are the most destructive weather systems that impact on coastal areas. Interannual variability in the intensity and distribution of tropical cyclones is large, and presently greater than any trends that are ascribable to climate change. In a warming environment, predicting tropical cyclone occurrence from historical relationships becomes increasingly difficult: predictors such as sea surface temperatures (SSTs) now frequently lie outside the range of past variability, meaning that it is not possible to find historical analogues for the seasonal conditions often faced by Pacific countries. Elevated SSTs are the primary trigger for mass coral bleaching events, which can lead to widespread damage and mortality on reef systems. Degraded coral reefs present many problems, including long-term loss of tourism and potential loss or degradation of fisheries. The monitoring and prediction of thermal stress events enables the support of a range of adaptive and management activities that could improve reef resilience to extreme conditions. Using the climate model POAMA (Predictive Ocean-Atmosphere Model for Australia), we aim to improve the accuracy of seasonal forecasts of tropical cyclone activity and extreme SSTs for the regions of the Western Pacific. 
Improved knowledge of extreme climatic events, with the assistance of tailored forecast tools, will help enhance the resilience and adaptive capacity of Australia and Pacific Island Countries under climate change. Acknowledgement: The research discussed in this paper was conducted with the support of the PACCSAP program, funded by AusAID and the Department of Climate Change and Energy Efficiency and delivered by the Bureau of Meteorology and CSIRO.

  4. [Bactericidal activity of serum and chemotherapy against sensitive and resistant pathogens (author's transl)].

    PubMed

    Eyer, H; Metz, H; Preac-Mursic, V

    1975-11-21

    Comparative experiments with Ampicillin-sensitive and -resistant bacterial strains show that the bactericidal activity of serum depends on the bacterial strain, on the Ampicillin sensitivity of the particular pathogen, and on the number of bacteria/ml (germ count). A bactericidal effect could always be obtained with sensitive strains as a result of additional chemotherapy. With several resistant strains a bactericidal effect could not be obtained; in this case, continuous optimal Ampicillin addition was the decisive factor. Because bactericidal action is an extremely complicated process, one should not draw general conclusions from individual experimental results.

  5. Using synchrotron light to accelerate EUV resist and mask materials learning

    NASA Astrophysics Data System (ADS)

    Naulleau, Patrick; Anderson, Christopher N.; Baclea-an, Lorie-Mae; Denham, Paul; George, Simi; Goldberg, Kenneth A.; Jones, Gideon; McClinton, Brittany; Miyakawa, Ryan; Mochi, Iacopo; Montgomery, Warren; Rekawa, Seno; Wallow, Tom

    2011-03-01

    As commercialization of extreme ultraviolet lithography (EUVL) progresses, direct industry activities are being focused on near term concerns. The question of long term extendibility of EUVL, however, remains crucial given the magnitude of the investments yet required to make EUVL a reality. Extendibility questions are best addressed using advanced research tools such as the SEMATECH Berkeley microfield exposure tool (MET) and actinic inspection tool (AIT). Utilizing Lawrence Berkeley National Laboratory's Advanced Light Source facility as the light source, these tools benefit from the unique properties of synchrotron light enabling research at nodes generations ahead of what is possible with commercial tools. The MET for example uses extremely bright undulator radiation to enable a lossless fully programmable coherence illuminator. Using such a system, resolution enhancing illuminations achieving k1 factors of 0.25 can readily be attained. Given the MET numerical aperture of 0.3, this translates to an ultimate resolution capability of 12 nm. Using such methods, the SEMATECH Berkeley MET has demonstrated resolution in resist to 16-nm half pitch and below in an imageable spin-on hard mask. At a half pitch of 16 nm, this material achieves a line-edge roughness of 2 nm with a correlation length of 6 nm. These new results demonstrate that the observed stall in ultimate resolution progress in chemically amplified resists is a materials issue rather than a tool limitation. With a resolution limit of 20-22 nm, the CAR champion from 2008 remains as the highest performing CAR tested to date. To enable continued advanced learning in EUV resists, SEMATECH has initiated a plan to implement a 0.5 NA microfield tool at the Advanced Light Source synchrotron facility. This tool will be capable of printing down to 8-nm half pitch.
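The resolution figures quoted here follow the Rayleigh scaling, half-pitch = k1 * lambda / NA. A minimal check using the numbers in the abstract (for the planned 0.5 NA tool, k1 = 0.30 is an assumed value chosen to match the quoted 8 nm target at the same 13.5 nm wavelength):

```python
def half_pitch_nm(k1: float, wavelength_nm: float, na: float) -> float:
    """Rayleigh resolution scaling: half-pitch = k1 * lambda / NA."""
    return k1 * wavelength_nm / na

print(half_pitch_nm(0.25, 13.5, 0.3))  # ~11.25 nm, i.e. the ~12 nm quoted
print(half_pitch_nm(0.30, 13.5, 0.5))  # ~8.1 nm, near the 0.5 NA tool's goal
```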

  6. On uses, misuses and potential abuses of fractal analysis in zooplankton behavioral studies: A review, a critique and a few recommendations

    NASA Astrophysics Data System (ADS)

    Seuront, Laurent

    2015-08-01

    Fractal analysis is increasingly used to describe, and provide further understanding of, zooplankton swimming behavior. This may be related to the fact that fractal analysis and the related fractal dimension D have the desirable properties of being independent of measurement scale and of being very sensitive to even subtle behavioral changes that may be undetectable by other behavioral variables. As claimed early on by Coughlin et al. (1992), this creates "the need for fractal analysis" in behavioral studies, which hence has the potential to become a valuable tool in zooplankton behavioral ecology. However, this paper stresses that fractal analysis, as well as the more elaborate multifractal analysis, is also a risky business that may lead to irrelevant results unless extreme attention is paid to a series of both conceptual and practical steps, any of which is likely to bias the results of an analysis. These biases are reviewed and exemplified on the basis of the published literature, and remedial procedures are provided not only for geometric and stochastic fractal analyses, but also for the more complicated multifractal analysis. The concept of multifractals is then introduced as a direct, objective and quantitative tool to identify models of motion behavior, such as Brownian motion, fractional Brownian motion, ballistic motion, Lévy flight/walk and the multifractal random walk. I finally briefly review the state of this emerging field in zooplankton behavioral research.
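
    The fractal dimension D of a trajectory is commonly estimated by box counting: count occupied grid cells at several resolutions and fit the slope of log N against log n. A minimal sketch for a 2-D track normalized to the unit square (function name ours; real analyses need the bias checks discussed above):

```python
import math

def box_counting_dimension(points, grid_sizes):
    """Estimate fractal dimension D of a 2-D trajectory by box counting:
    count occupied grid cells N(n) at grid resolutions n, then take the
    least-squares slope of log N(n) versus log n."""
    logs = []
    for n in grid_sizes:
        occupied = {(min(int(x * n), n - 1), min(int(y * n), n - 1))
                    for x, y in points}
        logs.append((math.log(n), math.log(len(occupied))))
    mx = sum(x for x, _ in logs) / len(logs)
    my = sum(y for _, y in logs) / len(logs)
    num = sum((x - mx) * (y - my) for x, y in logs)
    den = sum((x - mx) ** 2 for x, _ in logs)
    return num / den

# sanity check: a straight-line track should give D close to 1
track = [(t / 999, t / 999) for t in range(1000)]
print(box_counting_dimension(track, [4, 8, 16, 32, 64]))
```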

  7. Novel Screening Tool for Stroke Using Artificial Neural Network.

    PubMed

    Abedi, Vida; Goyal, Nitin; Tsivgoulis, Georgios; Hosseinichimeh, Niyousha; Hontecillas, Raquel; Bassaganya-Riera, Josep; Elijovich, Lucas; Metter, Jeffrey E; Alexandrov, Anne W; Liebeskind, David S; Alexandrov, Andrei V; Zand, Ramin

    2017-06-01

    The timely diagnosis of stroke at the initial examination is extremely important given the disease morbidity and the narrow time window for intervention. The goal of this study was to develop a supervised learning method to recognize acute cerebral ischemia (ACI) and differentiate it from stroke mimics in an emergency setting. Consecutive patients presenting to the emergency department with stroke-like symptoms, within 4.5 hours of symptom onset, in 2 tertiary care stroke centers were randomized for inclusion in the model. We developed an artificial neural network (ANN) model. The learning algorithm was based on backpropagation. To validate the model, we used a 10-fold cross-validation method. A total of 260 patients (equal numbers of stroke mimics and ACIs) were enrolled for the development and validation of our ANN model. Our analysis indicated that the average sensitivity and specificity of the ANN for the diagnosis of ACI based on the 10-fold cross-validation analysis were 80.0% (95% confidence interval, 71.8-86.3) and 86.2% (95% confidence interval, 78.7-91.4), respectively. The median precision of the ANN for the diagnosis of ACI was 92% (95% confidence interval, 88.7-95.3). Our results show that an ANN can be an effective tool for the recognition of ACI and differentiation of ACI from stroke mimics at the initial examination. © 2017 American Heart Association, Inc.
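
    The sensitivity, specificity, and precision figures reported for the model are standard confusion-matrix statistics; a minimal sketch (counts illustrative, not the study's data):

```python
def screening_metrics(tp, fp, tn, fn):
    """Sensitivity, specificity, and precision from a 2x2 confusion matrix,
    the statistics reported for the ANN stroke screen."""
    return {
        "sensitivity": tp / (tp + fn),   # ACI cases correctly flagged
        "specificity": tn / (tn + fp),   # stroke mimics correctly ruled out
        "precision":   tp / (tp + fp),   # flagged cases that are true ACI
    }

# illustrative counts only (not taken from the study)
print(screening_metrics(tp=104, fp=18, tn=112, fn=26))
```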

  8. SURA-IOOS Coastal Inundation Testbed Inter-Model Evaluation of Tides, Waves, and Hurricane Surge in the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Kerr, P. C.; Donahue, A.; Westerink, J. J.; Luettich, R.; Zheng, L.; Weisberg, R. H.; Wang, H. V.; Slinn, D. N.; Davis, J. R.; Huang, Y.; Teng, Y.; Forrest, D.; Haase, A.; Kramer, A.; Rhome, J.; Feyen, J. C.; Signell, R. P.; Hanson, J. L.; Taylor, A.; Hope, M.; Kennedy, A. B.; Smith, J. M.; Powell, M. D.; Cardone, V. J.; Cox, A. T.

    2012-12-01

    The Southeastern Universities Research Association (SURA), in collaboration with the NOAA Integrated Ocean Observing System program and other federal partners, developed a testbed to help accelerate progress in both research on and the transition to operational use of models for coastal and estuarine prediction. This testbed facilitates cyber-based sharing of data and tools, archival of observation data, and the development of cross-platform tools to efficiently access, visualize, skill-assess, and evaluate model results. In addition, the testbed enables the modeling community to quantitatively assess the behavior (e.g., skill, robustness, execution speed) and implementation requirements (e.g., resolution, parameterization, computer capacity) that characterize the suitability and performance of selected models from both operational and fundamental-science perspectives. This presentation focuses on the tropical coastal inundation component of the testbed and compares a variety of model platforms and grids in simulating tides and the wave and surge environments for two extremely well-documented historical hurricanes, Rita (2005) and Ike (2008). Model platforms included are ADCIRC, FVCOM, SELFE, SLOSH, SWAN, and WWMII. Model validation assessments were performed on simulation results using numerous station observations in the form of decomposed harmonic constituents, high-water marks, and hydrographs of water level and wave data. In addition, execution speed, inundation extents defined by differences in wetting/drying schemes, and resolution and parameterization sensitivities are also explored.
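
    The station-by-station validation described above rests on simple point-wise comparison statistics between modeled and observed series; a minimal sketch (data invented, metric choice ours):

```python
import math

def skill_stats(observed, modeled):
    """Basic point-wise validation metrics: mean bias, RMSE, and RMSE
    normalized by the observed range, for paired model/observation values."""
    n = len(observed)
    bias = sum(m - o for o, m in zip(observed, modeled)) / n
    rmse = math.sqrt(sum((m - o) ** 2 for o, m in zip(observed, modeled)) / n)
    nrmse = rmse / (max(observed) - min(observed))
    return bias, rmse, nrmse

# invented water levels (m) at one station over five output times
obs = [0.0, 0.5, 1.2, 2.4, 1.1]
mod = [0.1, 0.4, 1.5, 2.2, 1.0]
print(skill_stats(obs, mod))
```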

  9. Ultrasensitive liquid chromatography-tandem mass spectrometric methodologies for quantification of five HIV-1 integrase inhibitors in plasma for a microdose clinical trial.

    PubMed

    Sun, Li; Li, Hankun; Willson, Kenneth; Breidinger, Sheila; Rizk, Matthew L; Wenning, Larissa; Woolf, Eric J

    2012-10-16

    HIV-1 integrase strand transfer inhibitors are an important class of compounds targeted for the treatment of HIV-1 infection. Microdosing has emerged as an attractive tool to assist in drug candidate screening for clinical development, but it necessitates extremely sensitive bioanalytical assays, typically in the pg/mL concentration range. Currently, accelerator mass spectrometry is the predominant tool for microdosing support, which requires a specialized facility and the synthesis of radiolabeled compounds. Few studies have attempted to comprehensively assess a liquid chromatography-tandem mass spectrometry (LC-MS/MS) approach in the context of microdosing applications. Herein, we describe the development of automated LC-MS/MS methods to quantify five integrase inhibitors in plasma, with limits of quantification of 1 pg/mL for raltegravir and 2 pg/mL for four proprietary compounds. The assays involved double extractions followed by UPLC coupled with negative-ion electrospray MS/MS analysis. All methods were fully validated to the rigor of regulated bioanalysis requirements, with intraday precision between 1.20 and 14.1% and accuracy between 93.8 and 107% across the standard curve concentration range. These methods were successfully applied to a human microdose study and demonstrated to be accurate, reproducible, and cost-effective. Results of the study indicate that raltegravir displayed linear pharmacokinetics between a microdose and a pharmacologically active dose.
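
    The intraday precision (%CV) and accuracy (% of nominal) figures quoted above are simple statistics over replicate QC measurements; a minimal sketch (replicate values hypothetical):

```python
import math

def precision_accuracy(measured_pg_ml, nominal_pg_ml):
    """Intraday precision (%CV = sample std / mean) and accuracy
    (% of nominal concentration) for replicate QC measurements."""
    n = len(measured_pg_ml)
    mean = sum(measured_pg_ml) / n
    var = sum((x - mean) ** 2 for x in measured_pg_ml) / (n - 1)
    cv_pct = 100 * math.sqrt(var) / mean
    acc_pct = 100 * mean / nominal_pg_ml
    return cv_pct, acc_pct

# hypothetical replicates near the 1 pg/mL raltegravir LLOQ
print(precision_accuracy([0.95, 1.00, 1.05], nominal_pg_ml=1.00))  # ≈ (5.0, 100.0)
```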

  10. Infrared imaging: a potential powerful tool for neuroimaging and neurodiagnostics

    PubMed Central

    Khoshakhlagh, Arezou; Gunapala, Sarath D.

    2017-01-01

    Infrared (IR) imaging is used to detect the subtle changes in temperature needed to accurately detect and monitor disease. Technological advances have made IR a highly sensitive and reliable detection tool with strong potential in medical and neurophotonics applications. An overview of IR imaging is provided, specifically examining quantum-well IR detectors developed at the Jet Propulsion Laboratory as a noninvasive, nonradiating imaging tool that could be applied in neuroscience and neurosurgery, where sensitive detection of cellular temperature change is involved. PMID:28382311

  11. Tool Integration Framework for Bio-Informatics

    DTIC Science & Technology

    2007-04-01

    Java NetBeans [11] based Integrated Development Environment (IDE) for developing modules and packaging computational tools. The framework is extremely...integrate an Eclipse front-end for Desktop Integration. Eclipse was chosen over Netbeans owing to a higher acceptance, better infrastructure...5.0. This version of Dashboard ran with NetBeans IDE 3.6 requiring Java Runtime 1.4 on a machine with Windows XP. The toolchain is executed by

  12. An approach to value-based simulator selection: The creation and evaluation of the simulator value index tool.

    PubMed

    Rooney, Deborah M; Hananel, David M; Covington, Benjamin J; Dionise, Patrick L; Nykamp, Michael T; Pederson, Melvin; Sahloul, Jamal M; Vasquez, Rachael; Seagull, F Jacob; Pinsky, Harold M; Sweier, Domenica G; Cooke, James M

    2018-04-01

    Currently there is no reliable, standardized mechanism to support health care professionals during the evaluation and procurement of simulators. A tool founded on best practices could facilitate simulator purchase processes. In a 3-phase process, we identified the top factors considered during the simulator purchase process through expert consensus (n = 127), created the Simulator Value Index (SVI) tool, evaluated targeted validity evidence, and evaluated the practical value of the SVI. A web-based survey was sent to simulation professionals. Participants (n = 79) used the SVI and provided feedback. We evaluated the practical value of 4 tool variations by calculating their sensitivity to predict a preferred simulator. Seventeen top factors were identified and ranked. The top 2 were technical stability/reliability of the simulator and customer service, with no practical differences in rank across institution or stakeholder role. Full SVI variations successfully predicted the preferred simulator with good (87%) sensitivity, whereas sensitivity decreased (≤54%) for variations restricted to cost and customer service or to cost and technical stability. The majority (73%) of participants agreed that the SVI was helpful at guiding simulator purchase decisions, and 88% agreed the SVI tool would help facilitate discussion with peers and leadership. Our findings indicate the SVI supports the process of simulator purchase using a standardized framework. Sensitivity of the tool improved when factors extended beyond those traditionally targeted. We propose the tool will facilitate discussion amongst simulation professionals, provide essential information for finance and procurement professionals, and improve the long-term value of simulation solutions. Limitations and applications of the tool are discussed. Copyright © 2017 Elsevier Inc. All rights reserved.
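
    A value index of this kind is, at heart, a weighted combination of per-factor ratings; a generic weighted-sum sketch (not the published SVI formula; factor names, weights, and ratings are all illustrative):

```python
def svi_score(ratings, weights):
    """Weighted-sum value index: combine per-factor ratings (e.g. 1-5)
    with factor weights that sum to 1."""
    return sum(weights[f] * ratings[f] for f in weights)

# illustrative factors and candidate simulators
weights = {"technical stability": 0.40, "customer service": 0.35, "cost": 0.25}
sim_a = {"technical stability": 5, "customer service": 4, "cost": 2}
sim_b = {"technical stability": 3, "customer service": 3, "cost": 5}

scores = {name: svi_score(r, weights) for name, r in [("A", sim_a), ("B", sim_b)]}
print(max(scores, key=scores.get))  # the preferred simulator under these weights
```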

  13. WEC Design Response Toolbox v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey

    2016-03-30

    The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
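
    Short-term extreme response of the kind the toolbox addresses is commonly characterized by raising the response-peak distribution to the number of peaks in a sea state; a generic sketch (not the WDRT API), assuming Rayleigh-distributed response peaks:

```python
import math

def extreme_quantile(sigma, n_peaks, p):
    """p-quantile of the largest of n iid Rayleigh(sigma) response peaks.
    P(max <= x) = F(x)**n with F(x) = 1 - exp(-x**2 / (2 sigma**2)),
    inverted in closed form."""
    return sigma * math.sqrt(-2.0 * math.log(1.0 - p ** (1.0 / n_peaks)))

# median largest response peak over a sea state with ~1000 response cycles
print(extreme_quantile(sigma=1.0, n_peaks=1000, p=0.5))
```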

  14. Early Diagnosis and Intervention Strategies for Post-Traumatic Heterotopic Ossification in Severely Injured Extremities

    DTIC Science & Technology

    2013-10-01

    study will recruit wounded warriors with severe extremity trauma, which places them at high risk for heterotopic ossification (HO); bone formation at...involved in HO; 2) to define accurate and practical methods to predict where HO will develop; and 3) to define potential therapies for prevention or...elicit HO. These tools also need to provide effective methods for early diagnosis or risk assessment (prediction) so that therapies for prevention or

  15. TSUNAMI Primer: A Primer for Sensitivity/Uncertainty Calculations with SCALE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rearden, Bradley T; Mueller, Don; Bowman, Stephen M

    2009-01-01

    This primer presents examples of the application of the SCALE/TSUNAMI tools to generate k_eff sensitivity data for one- and three-dimensional models using TSUNAMI-1D and -3D, and to examine uncertainties in the computed k_eff values due to uncertainties in the cross-section data used in their calculation. The proper use of unit-cell data and the need to confirm the appropriate selection of input parameters through direct perturbations are described. The uses of sensitivity and uncertainty data to identify and rank potential sources of computational bias in an application system, and TSUNAMI tools for assessment of system similarity using sensitivity and uncertainty criteria, are demonstrated. Uses of these criteria in trending analyses to assess computational biases, bias uncertainties, and gap analyses are also described. Additionally, an application of the data adjustment tool TSURFER is provided, including identification of specific details of sources of computational bias.
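
    The "direct perturbation" check mentioned above can be sketched as a central-difference estimate of a relative sensitivity coefficient, the quantity the adjoint-based tools compute directly (a generic sketch, not the TSUNAMI interface):

```python
import math

def direct_perturbation_sensitivity(k_fn, sigma, rel=0.01):
    """Relative sensitivity S = (dk/k)/(dsigma/sigma) estimated from two
    directly perturbed runs via a central difference in log space -- the
    kind of spot-check used to confirm adjoint-based sensitivities."""
    k_plus = k_fn(sigma * (1 + rel))
    k_minus = k_fn(sigma * (1 - rel))
    return (math.log(k_plus) - math.log(k_minus)) / (math.log(1 + rel) - math.log(1 - rel))

# toy model with a known answer: k ~ sigma**0.3 has sensitivity exactly 0.3
print(direct_perturbation_sensitivity(lambda s: 2.0 * s ** 0.3, sigma=5.0))
```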

  16. State-of-the-Art Fusion-Finder Algorithms Sensitivity and Specificity

    PubMed Central

    Carrara, Matteo; Beccuti, Marco; Lazzarato, Fulvio; Cavallo, Federica; Cordero, Francesca; Donatelli, Susanna; Calogero, Raffaele A.

    2013-01-01

    Background. Gene fusions arising from chromosomal translocations have been implicated in cancer. RNA-seq has the potential to discover such rearrangements generating functional proteins (chimera/fusion). Recently, many methods for chimera detection have been published. However, the specificity and sensitivity of those tools have not been extensively investigated in a comparative way. Results. We tested eight fusion-detection tools (FusionHunter, FusionMap, FusionFinder, MapSplice, deFuse, Bellerophontes, ChimeraScan, and TopHat-fusion) on their ability to detect fusion events using synthetic and real datasets encompassing chimeras. A comparison run only on synthetic data can generate misleading results, since we found no counterpart in the real dataset. Furthermore, most tools report a very high number of false-positive chimeras. In particular, the most sensitive tool, ChimeraScan, reports a large number of false positives that we were able to reduce significantly by devising and applying two filters to remove fusions not supported by fusion junction-spanning reads or encompassing large intronic regions. Conclusions. The discordant results obtained using synthetic and real datasets suggest that synthetic datasets encompassing fusion events may not fully capture the complexity of an RNA-seq experiment. Moreover, fusion-detection tools are still limited in sensitivity or specificity; thus, there is space for further improvement in fusion-finder algorithms. PMID:23555082
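
    The two false-positive filters described above, requiring junction-spanning read support and rejecting calls that enclose very large intronic regions, can be sketched as a simple post-filter (field names and thresholds illustrative, not the authors' exact cutoffs):

```python
def filter_chimeras(calls, min_spanning_reads=1, max_intron_bp=200_000):
    """Drop fusion calls with no junction-spanning reads and calls whose
    breakpoints enclose an implausibly large intronic region."""
    return [c for c in calls
            if c["spanning_reads"] >= min_spanning_reads
            and c["intron_span_bp"] <= max_intron_bp]

calls = [
    {"name": "BCR-ABL1",  "spanning_reads": 12, "intron_span_bp": 140_000},
    {"name": "artifact1", "spanning_reads": 0,  "intron_span_bp": 9_000},
    {"name": "artifact2", "spanning_reads": 3,  "intron_span_bp": 900_000},
]
print([c["name"] for c in filter_chimeras(calls)])  # ['BCR-ABL1']
```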

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favorite, Jeffrey A.

    SENSMG is a tool for computing first-order sensitivities of neutron reaction rates, reaction-rate ratios, leakage, k_eff, and α using the PARTISN multigroup discrete-ordinates code. SENSMG computes sensitivities to all of the transport cross sections and data (total, fission, nu, chi, and all scattering moments), two edit cross sections (absorption and capture), and the density of every isotope and energy group. It also computes sensitivities to the mass density of every material and derivatives with respect to all interface locations. The tool can be used for one-dimensional spherical (r) and two-dimensional cylindrical (r-z) geometries, and for both fixed-source and eigenvalue problems. The tool implements Generalized Perturbation Theory (GPT) as discussed by Williams and Stacey. Section II of this report describes the theory behind adjoint-based sensitivities, gives the equations that SENSMG solves, and defines the sensitivities that are output. Section III describes the user interface, including the input file and command-line options. Section IV describes the output. Section V gives some notes about the coding that may be of interest. Section VI discusses verification, which is ongoing. Section VII lists needs and ideas for future work. Appendix A lists all of the input files whose results are presented in Sec. VI.
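
    For the reaction-rate ratios mentioned above, first-order sensitivities combine by logarithmic differentiation: for R = A/B, dR/R = dA/A − dB/B, so the ratio's relative sensitivity to any parameter is the difference of the component sensitivities. A minimal sketch (per-group values hypothetical):

```python
def ratio_sensitivity(s_numerator, s_denominator):
    """For a reaction-rate ratio R = A/B, the relative sensitivity to any
    parameter is S_R = S_A - S_B, applied here per energy group."""
    return {g: s_numerator[g] - s_denominator[g] for g in s_numerator}

# hypothetical per-group sensitivities of a fission/capture rate ratio
s_fis = {"g1": 0.42, "g2": 0.18}
s_cap = {"g1": 0.30, "g2": 0.25}
print(ratio_sensitivity(s_fis, s_cap))
```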

  18. msgbsR: An R package for analysing methylation-sensitive restriction enzyme sequencing data.

    PubMed

    Mayne, Benjamin T; Leemaqz, Shalem Y; Buckberry, Sam; Rodriguez Lopez, Carlos M; Roberts, Claire T; Bianco-Miotto, Tina; Breen, James

    2018-02-01

    Genotyping-by-sequencing (GBS) or restriction-site associated DNA marker sequencing (RAD-seq) is a practical and cost-effective method for analysing large genomes from high-diversity species. This method of sequencing, coupled with methylation-sensitive enzymes (often referred to as methylation-sensitive restriction enzyme sequencing or MRE-seq), is an effective tool to study DNA methylation in parts of the genome that are inaccessible to other sequencing techniques or are not annotated in microarray technologies. Current software tools do not fully support methylation-sensitive restriction sequencing assays for determining differences in DNA methylation between samples. To fill this computational need, we present msgbsR, an R package that contains tools for the analysis of methylation-sensitive restriction enzyme sequencing experiments. msgbsR can be used to identify and quantify read counts at methylated sites directly from alignment files (BAM files) and enables verification of restriction enzyme cut sites with the correct recognition sequence of the individual enzyme. In addition, msgbsR assesses DNA methylation based on read coverage, similar to RNA sequencing experiments, rather than methylation proportion, and is a useful tool for analysing differential methylation in large populations. The package is fully documented and available freely online as a Bioconductor package ( https://bioconductor.org/packages/release/bioc/html/msgbsR.html ).
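
    msgbsR itself is an R/Bioconductor package; as a language-neutral illustration of its core idea (locate enzyme recognition sites, verify the sequence, and count reads aligned at each cut site), here is a simplified sketch in Python operating on plain strings and positions (names ours, not the msgbsR API):

```python
def count_reads_at_cut_sites(genome, read_starts, recognition="CCGG"):
    """Find every occurrence of the enzyme recognition sequence (CCGG is
    the real MspI/HpaII site) in the reference, then count reads whose
    0-based alignment start coincides with a verified site."""
    k = len(recognition)
    sites = [i for i in range(len(genome) - k + 1)
             if genome[i:i + k] == recognition]
    counts = {s: 0 for s in sites}
    for start in read_starts:
        if start in counts:           # reads elsewhere are ignored
            counts[start] += 1
    return counts

genome = "ATCCGGTTACCGGA"
print(count_reads_at_cut_sites(genome, read_starts=[2, 2, 9, 5]))  # {2: 2, 9: 1}
```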

  19. Sensitivities of the hydrologic cycle to model physics, grid resolution, and ocean type in the aquaplanet Community Atmosphere Model

    NASA Astrophysics Data System (ADS)

    Benedict, James J.; Medeiros, Brian; Clement, Amy C.; Pendergrass, Angeline G.

    2017-06-01

    Precipitation distributions and extremes play a fundamental role in shaping Earth's climate and yet are poorly represented in many global climate models. Here, a suite of idealized Community Atmosphere Model (CAM) aquaplanet simulations is examined to assess the aquaplanet's ability to reproduce hydroclimate statistics of real-Earth configurations and to investigate sensitivities of precipitation distributions and extremes to model physics, horizontal grid resolution, and ocean type. Little difference in precipitation statistics is found between aquaplanets using time-constant sea-surface temperatures and those implementing a slab ocean model with a 50 m mixed-layer depth. In contrast, CAM version 5.3 (CAM5.3) produces more time mean, zonally averaged precipitation than CAM version 4 (CAM4), while CAM4 generates significantly larger precipitation variance and frequencies of extremely intense precipitation events. The largest model configuration-based precipitation sensitivities relate to choice of horizontal grid resolution in the selected range 1-2°. Refining grid resolution has significant physics-dependent effects on tropical precipitation: for CAM4, time mean zonal mean precipitation increases along the Equator and the intertropical convergence zone (ITCZ) narrows, while for CAM5.3 precipitation decreases along the Equator and the twin branches of the ITCZ shift poleward. Increased grid resolution also reduces light precipitation frequencies and enhances extreme precipitation for both CAM4 and CAM5.3 resulting in better alignment with observational estimates. A discussion of the potential implications these hydrologic cycle sensitivities have on the interpretation of precipitation statistics in future climate projections is also presented.Plain Language SummaryPrecipitation plays a fundamental role in shaping Earth's climate. 
Global climate models predict the average precipitation reasonably well but often struggle to accurately represent how often it precipitates and at what intensity. Model precipitation errors are closely tied to imperfect representations of physical processes too small to be resolved on the model grid. The problem is compounded by the complexity of contemporary climate models and the many model configuration options available. In this study, we use an aquaplanet, a simplified global climate model entirely devoid of land masses, to explore the response of precipitation to several aspects of model configuration in a present-day climate state. Our results suggest that critical precipitation patterns, including extreme precipitation events that have large socio-economic impacts, are strongly sensitive to horizontal grid resolution and the representation of unresolved physical processes. Identification and understanding of such model configuration-related precipitation responses in the present-day climate will provide a more accurate estimate of model uncertainty necessary for an improved interpretation of precipitation changes in global warming projections.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19840017643','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19840017643"><span>Fabrication development for ODS-superalloy, air-cooled turbine blades</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Moracz, D. J.</p> <p>1984-01-01</p> <p>MA-600 is a gamma prime and oxide dispersion strengthened superalloy made by mechanical alloying. At the initiation of this program, MA-6000 was available as an experimental alloy only and did not go into production until late in the program. 
The objective of this program was to develop a thermal-mechanical-processing approach which would yield the necessary elongated grain structure and desirable mechanical properties after conventional press forging. Forging evaluations were performed to select optimum thermal-mechanical-processing conditions. These forging evaluations indicated that MA-6000 was extremely sensitive to die chilling. In order to conventionally hot forge the alloy, an adherent cladding, either the original extrusion can or a thick plating, was required to prevent cracking of the workpiece. Die design must reflect the requirement of cladding. MA-6000 was found to be sensitive to the forging temperature. The correct temperature required to obtain the proper grain structure after recrystallization was found to be between 1010-1065 C (1850-1950 F). The deformation level did not affect subsequent crystallization; however, sharp transition areas in tooling designs should be avoided in forming a blade shape because of the potential for grain structure discontinuities. Starting material to be used for forging should be processed so that it is capable of being zone annealed to a coarse elongated grain structure as bar stock. 
This conclusion means that standard processed bar materials can be used.</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li class="active"><span>23</span></li> <li><a href="#" onclick='return showDiv("page_24");'>24</a></li> <li><a href="#" onclick='return showDiv("page_25");'>25</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_23 --> <div id="page_24" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li class="active"><span>24</span></li> <li><a href="#" onclick='return showDiv("page_25");'>25</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="461"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25109979','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25109979"><span>About the dangers, costs and benefits of living an aerobic lifestyle.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Knoefler, Daniela; Leichert, Lars I O; Thamsen, Maike; Cremers, Claudia M; Reichmann, Dana; Gray, Michael J; Wholey, Wei-Yun; Jakob, Ursula</p> <p>2014-08-01</p> <p>The era in which ROS (reactive oxygen species) were simply the 'bad boys of biology' is clearly over. 
High levels of ROS are still rightfully considered to be toxic to many cellular processes and, as such, contribute to disease conditions and cell death. However, the high toxicity of ROS is also extremely beneficial, particularly as it is used to kill invading micro-organisms during mammalian host defence. Moreover, a transient, often more localized, increase in ROS levels appears to play a major role in signal transduction processes and positively affects cell growth, development and differentiation. At the heart of all these processes are redox-regulated proteins, which use oxidation-sensitive cysteine residues to control their function and by extension the function of the pathways that they are part of. Our work has contributed to changing the view about ROS through: (i) our characterization of Hsp33 (heat-shock protein 33), one of the first redox-regulated proteins identified, whose function is specifically activated by ROS, (ii) the development of quantitative tools that reveal extensive redox-sensitive processes in bacteria and eukaryotes, and (iii) the discovery of a link between early exposure to oxidants and aging. 
Our future research programme aims to generate an integrated and system-wide view of the beneficial and deleterious effects of ROS with the central goal to develop more effective antioxidant strategies and more powerful antimicrobial agents.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/6188758','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/6188758"><span>Toxic ligand conjugates as tools in the study of receptor-ligand interactions.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Herschman, H R; Simpson, D L; Cawley, D B</p> <p>1982-01-01</p> <p>We have constructed hybrid proteins in which the toxic A chains of ricin or diptheria toxin have been linked to either asialofetuin, fetuin, or epidermal growth factor (EGF). Both ASF-RTA and ASF-DTA are potent toxins on cultured rat hepatocytes, cells that display the asialoglycoprotein receptor. Toxicity of these two compounds is restricted to hepatocytes and can be blocked by asialoglycoproteins but not the native glycoproteins or asialoagalactoglycoprotein derivatives, indicating that the toxicity of the conjugates is mediated by the hepatic asialoglycoprotein receptor. The EGF-RTA conjugate is an extremely potent toxin on cells that can bind the hormone, but is only poorly effective on cells that are unable to bind EGF. The EGF-DTA conjugate, in contrast, is unable to kill 3T3 cells and is at least two orders of magnitude less effective than EGF-RTA on A431 cells, a cell line with 1-2 X 10(6) EGF receptors per cell. However, when EGF-RTA and EGF-DTA were tested on primary liver hepatocyte cultures, which were susceptible to both ASF-RTA and ASF-DTA, both EGF conjugates were potent toxins. Sensitivity of the hepatocyte cultures to ricin toxicity increases slightly during a 52-hr culture period. 
In contrast, sensitivity to EGF-RTA and ASF-RTA decline dramatically during this period. Receptors for both ligands remain plentiful on the cell surface during this time.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017JPCM...29X3001T','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017JPCM...29X3001T"><span>Conformational effects in photoelectron circular dichroism</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Turchini, S.</p> <p>2017-12-01</p> <p>Photoelectron circular dichroism (PECD) is a novel type of spectroscopy, which presents surprising sensitivity to conformational effects in chiral systems. While classical photoelectron spectroscopy mainly responds to conformational effects in terms of energy level shifts, PECD provides a rich and detailed response to tiny changes in electronic and structural properties by means of the intensity dispersion of the circular dichroism as a function of photoelectron kinetic energy. In this work, the basics of PECD will be outlined, emphasizing the role of interference from the l,l+/- 1 outgoing partial wave of the photoelectron in the PECD transition matrix element, which is responsible for the extreme sensitivity to conformational effects. Examples using molecular systems and interfaces will shed light on the powerful application of PECD to classical conformational effects such as group substitution, isomerism, conformer population and clustering. Moreover, the PECD results will be reported in challenging new fields where conformations play a key role, such as vibrational effects, transient chirality and time- resolved experiments. To date, PECD has mostly been based on synchrotron radiation facilities, but it also has a future as a table-top lab experiment by means of multiphoton ionization. 
An important application of PECD as an analytical tool will be reported. The aim of this review is to illustrate that in PECD, the presence of conformational effects is essential for understanding a wide range of effects from a new perspective, making it different from classical spectroscopy.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://cfpub.epa.gov/si/si_public_record_report.cfm?direntryid=335068&showcriteria=2&timstype=journal&datebeginpublishedpresented=02/22/2012&dateendpublishedpresented=02/22/2017&sortby=pubdateyear&','PESTICIDES'); return false;" href="https://cfpub.epa.gov/si/si_public_record_report.cfm?direntryid=335068&showcriteria=2&timstype=journal&datebeginpublishedpresented=02/22/2012&dateendpublishedpresented=02/22/2017&sortby=pubdateyear&"><span>Canine olfaction as an alternative to analytical instruments for ...</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.epa.gov/pesticides/search.htm">EPA Pesticide Factsheets</a></p> <p></p> <p></p> <p>Recent literature has touted the use of canine olfaction as a diagnostic tool for identifying pre-clinical disease status, especially cancer and infection from biological media samples. Studies have shown a wide range of outcomes, ranging from almost perfect discrimination, all the way to essentially random results. This disparity is not likely to be a detection issue; dogs have been shown to have extremely sensitive noses as proven by their use for tracking, bomb detection and search and rescue. However, in contrast to analytical instruments, dogs are subject to boredom, fatigue, hunger and external distractions. These challenges are of particular importance in a clinical environment where task repetition is prized, but not as entertaining for a dog as chasing odours outdoors. 
The question addressed here is how to exploit the intrinsic sensitivity and simplicity of having a dog simply sniff out disease, in the face of variability in behavior and response. There is no argument that living cells emanate a variety of gas- and liquid-phase compounds as waste from normal metabolism, and that these compounds become measurable from various biological media including skin, blood, urine, breath, feces, etc. [1, 2] The overarching term for this phenomenon from the perspective of systems biology analysis is “cellular respiration”, which has become an important topic for the interpretation and documentation of the human exposome, the chemical counterpart to the genome.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2013AGUSMGC23A..05P','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2013AGUSMGC23A..05P"><span>About climate variability leading the hydric condition of the soil in the rainfed region of Argentina</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Pántano, V. C.; Penalba, O. C.</p> <p>2013-05-01</p> <p>Extreme events of temperature and rainfall have a socio-economic impact in the rainfed agriculture production region in Argentina. The magnitude of the impact can be analyzed through the water balance, which integrates the characteristics of the soil and climate conditions. Changes observed in climate variables during the last decades affected the components of the water balance. As a result, a displacement of the agriculture border towards the west was produced, improving the agricultural production of the region. The objective of this work is to analyze how the variability of rainfall and temperature leads the hydric condition of the soil, with special focus on extreme events. 
The hydric condition of the soil (HC = Excess − Deficit) was estimated from the monthly water balance (Thornthwaite and Mather method, 1957), using monthly potential evapotranspiration (PET) and monthly accumulated rainfall (R) for 33 stations (period 1970-2006). Temperature and rainfall data were provided by the National Weather Service, and the effective soil water capacity was taken from Forte Lay and Spescha (2001). An agricultural extreme condition occurs when soil moisture and rainfall are inadequate or excessive for the development of the crops. In this study, we define an extreme event as one in which the variable falls below its 20th and 10th percentiles (or above its 80th and 90th percentiles). To evaluate how sensitive the HC is to water and heat stress in the region, different conditional probabilities were computed. The HC responds weakly to extremely low PET, while extremely low R leads to high values of HC. However, this behavior is not always observed, especially in the western region, where extremely high and low PET show a stronger influence on the HC. Finally, to analyze the temporal variability of extreme PET and R leading the hydric condition of the soil, the number of stations presenting extreme conditions was computed for each month. As an example, interesting results were observed for April. During this month, the water recharge of the soil is crucial for letting the winter crops cope with the scarce rainfall of the following months. In 1970, 1974, 1977, 1978 and 1997, more than 50% of the stations were under extremely high PET, while 1970, 1974, 1978 and 1988 presented more than 40% under extremely low R. Thus, the 1970s was the most threatened decade of the period. Since the 1980s (except for 1997), extreme dry events due to one variable or the other have mostly occurred separately, over smaller areas. The response of the spatial distribution of HC is stronger when both variables present extreme conditions. 
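The percentile-based extreme-event definition used in this study can be sketched directly; the monthly series below is illustrative, not the study's 1970-2006 station data.

```python
import numpy as np

def flag_extremes(series, low_pct=20, high_pct=80):
    """Flag extreme-low values (below the low percentile) and
    extreme-high values (above the high percentile) of a series."""
    lo = np.percentile(series, low_pct)
    hi = np.percentile(series, high_pct)
    return series < lo, series > hi

# Illustrative monthly accumulated rainfall R (mm) for one station.
rain = np.array([12, 85, 40, 5, 120, 60, 30, 95, 18, 70, 55, 140])
extreme_dry, extreme_wet = flag_extremes(rain)  # 20th/80th percentile cuts
```

Tightening `low_pct`/`high_pct` to 10/90 reproduces the stricter of the two thresholds the study applies.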
In particular, during 1997 the region presented extremely low values of HC as a consequence of extremely low R and high PET. Communities dependent on agriculture are highly sensitive to climate variability and its extremes. In the studied region, it was shown that scarce water and heat stress contribute to the resulting hydric condition, producing a strong impact on different productive activities. Extreme temperature seems to have a stronger influence on extreme unfavorable hydric conditions.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/21718117','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/21718117"><span>Costs of necrotizing enterocolitis and cost-effectiveness of exclusively human milk-based products in feeding extremely premature infants.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ganapathy, Vaidyanathan; Hay, Joel W; Kim, Jae H</p> <p>2012-02-01</p> <p>This study evaluated the cost-effectiveness of a 100% human milk-based diet composed of mother's milk fortified with a donor human milk-based human milk fortifier (HMF) versus mother's milk fortified with bovine milk-based HMF to initiate enteral nutrition among extremely premature infants in the neonatal intensive care unit (NICU). A net expected costs calculator was developed to compare the total NICU costs among extremely premature infants who were fed either a bovine milk-based HMF-fortified diet or a 100% human milk-based diet, based on the previously observed risks of overall necrotizing enterocolitis (NEC) and surgical NEC in a randomized controlled study that compared outcomes of these two feeding strategies among 207 very low birth weight infants. 
The average NICU costs for an extremely premature infant without NEC and the incremental costs due to medical and surgical NEC were derived from a separate analysis of hospital discharges in the state of California in 2007. The sensitivity of cost-effectiveness results to the risks and costs of NEC and to prices of milk supplements was studied. The adjusted incremental costs of medical NEC and surgical NEC over and above the average costs incurred for extremely premature infants without NEC, in 2011 US$, were $74,004 (95% confidence interval, $47,051-$100,957) and $198,040 (95% confidence interval, $159,261-$236,819) per infant, respectively. Extremely premature infants fed with 100% human-milk based products had lower expected NICU length of stay and total expected costs of hospitalization, resulting in net direct savings of 3.9 NICU days and $8,167.17 (95% confidence interval, $4,405-$11,930) per extremely premature infant (p < 0.0001). Costs savings from the donor HMF strategy were sensitive to price and quantity of donor HMF, percentage reduction in risk of overall NEC and surgical NEC achieved, and incremental costs of surgical NEC. 
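A net-expected-costs comparison of this kind reduces to risk-weighted incremental costs. The sketch below uses the incremental NEC costs reported in the abstract ($74,004 medical, $198,040 surgical, 2011 US$); the NEC risks and base NICU cost are hypothetical placeholders, not the study's figures.

```python
def expected_nicu_cost(p_medical_nec, p_surgical_nec, base_cost,
                       c_medical=74004, c_surgical=198040):
    """Expected per-infant NICU cost: base cost for an infant without NEC
    plus risk-weighted incremental costs of medical and surgical NEC."""
    return (base_cost
            + p_medical_nec * c_medical
            + p_surgical_nec * c_surgical)

# Hypothetical NEC risks under the two feeding strategies (illustrative only).
bovine_hmf = expected_nicu_cost(0.10, 0.06, base_cost=200_000)
human_hmf = expected_nicu_cost(0.05, 0.02, base_cost=200_000)
net_savings = bovine_hmf - human_hmf  # per-infant expected saving, US$
```

With placeholder risks like these, the expected saving is driven almost entirely by the avoided surgical-NEC cases, mirroring the sensitivity the study reports.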
Compared with feeding extremely premature infants with mother's milk fortified with bovine milk-based supplements, a 100% human milk-based diet that includes mother's milk fortified with donor human milk-based HMF may result in potential net savings on medical care resources by preventing NEC.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/9268820','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/9268820"><span>Strategic thinking for radiology.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Schilling, R B</p> <p>1997-08-01</p> <p>We have now analyzed the use and benefits of four Strategic Thinking Tools for Radiology: the Vision Statement, the High Five, the Two-by-Two, and Real-Win-Worth. Additional tools will be provided during the tutorial. The tools provided above should be considered as examples. They all contain the 10 benefits outlined earlier to varying degrees. It is extremely important that the tools be used in a manner consistent with the Vision Statement of the organization. The specific situation, the effectiveness of the team, and the experience developed with the tools over time will determine the true benefits of the process. It has also been shown that with active use of the types of tools provided above, teams have learned to modify the tools for increased effectiveness and have created additional tools for specific purposes. 
Once individuals in the organization become committed to improving communication and to using tools/frameworks for solving problems as a team, effectiveness becomes boundless.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://kidshealth.org/en/parents/vision.html','NIH-MEDLINEPLUS'); return false;" href="https://kidshealth.org/en/parents/vision.html"><span>Your Child's Vision</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://medlineplus.gov/">MedlinePlus</a></p> <p></p> <p></p> <p>... 3½, kids should have eye health screenings and visual acuity tests (tests that measure sharpness of vision) ... eye rubbing extreme light sensitivity poor focusing poor visual tracking (following an object) abnormal alignment or movement ...</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://eric.ed.gov/?q=BDI&pg=2&id=EJ866812','ERIC'); return false;" href="https://eric.ed.gov/?q=BDI&pg=2&id=EJ866812"><span>The Sensitivity and Specificity of Depression Screening Tools among Adults with Intellectual Disabilities</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Ailey, Sarah H.</p> <p>2009-01-01</p> <p>This study describes the validity and the sensitivity and specificity of depression screening tools among adults with intellectual disabilities (ID). 
Subjects (N = 75) were interviewed with the Beck Depression Inventory II (BDI-II) and the Glasgow Depression Scale for People with a Learning Disability (GDS-LD) and also completed a clinical…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/28159441','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/28159441"><span>Identifying substance misuse in primary care: TAPS Tool compared to the WHO ASSIST.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Schwartz, R P; McNeely, J; Wu, L T; Sharma, G; Wahle, A; Cushing, C; Nordeck, C D; Sharma, A; O'Grady, K E; Gryczynski, J; Mitchell, S G; Ali, R L; Marsden, J; Subramaniam, G A</p> <p>2017-05-01</p> <p>There is a need for screening and brief assessment instruments to identify primary care patients with substance use problems. This study's aim was to examine the performance of a two-step screening and brief assessment instrument, the TAPS Tool, compared to the WHO ASSIST. Two thousand adult primary care patients recruited from five primary care clinics in four Eastern US states completed the TAPS Tool followed by the ASSIST. The ability of the TAPS Tool to identify moderate- and high-risk use scores on the ASSIST was examined using sensitivity and specificity analyses. The interviewer and self-administered computer tablet versions of the TAPS Tool generated similar results. The interviewer-administered version (at cut-off of 2), had acceptable sensitivity and specificity for high-risk tobacco (0.90 and 0.77) and alcohol (0.87 and 0.80) use. For illicit drugs, sensitivities were >0.82 and specificities >0.92. The TAPS (at a cut-off of 1) had good sensitivity and specificity for moderate-risk tobacco use (0.83 and 0.97) and alcohol (0.83 and 0.74). 
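Cut-off-based sensitivity and specificity values like those above follow from a 2×2 comparison of the screener against the reference standard; a minimal sketch with toy data (not the study's records):

```python
def sens_spec(scores, reference, cutoff):
    """Sensitivity and specificity of the rule `score >= cutoff`
    against a boolean reference standard (e.g. ASSIST moderate-risk)."""
    tp = sum(s >= cutoff and r for s, r in zip(scores, reference))
    fn = sum(s < cutoff and r for s, r in zip(scores, reference))
    tn = sum(s < cutoff and not r for s, r in zip(scores, reference))
    fp = sum(s >= cutoff and not r for s, r in zip(scores, reference))
    return tp / (tp + fn), tn / (tn + fp)

# Toy screener scores and reference labels, for illustration only.
scores = [0, 1, 2, 3, 0, 2, 1, 3]
labels = [False, True, True, True, False, False, False, True]
sensitivity, specificity = sens_spec(scores, labels, cutoff=1)
```

Raising the cutoff trades sensitivity for specificity, which is why the TAPS study reports separate cutoffs of 1 and 2 for moderate- and high-risk use.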
Among illicit drugs, sensitivity was acceptable for moderate-risk marijuana use (0.71), while it was low for all other illicit drugs and non-medical use of prescription medications. Specificities were 0.97 or higher for all illicit drugs and prescription medications. The TAPS Tool identified adult primary care patients with high-risk ASSIST scores for all substances as well as moderate-risk users of tobacco, alcohol, and marijuana, although it did not perform well in identifying patients with moderate-risk use of other drugs or non-medical use of prescription medications. The advantages of the TAPS Tool over the ASSIST are its more limited number of items and focus solely on substance use in the past 3 months. Copyright © 2017 Elsevier Inc. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20100012809','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20100012809"><span>Sideband-Separating, Millimeter-Wave Heterodyne Receiver</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Ward, John S.; Bumble, Bruce; Lee, Karen A.; Kawamura, Jonathan H.; Chattopadhyay, Goutam; Stek, Paul</p> <p>2010-01-01</p> <p>Researchers have demonstrated a submillimeter-wave spectrometer that combines extremely broad bandwidth with extremely high sensitivity and spectral resolution to enable future spacecraft to measure the composition of the Earth's troposphere in three dimensions many times per day at spatial resolutions as high as a few kilometers. 
Microwave limb sounding is a proven remote-sensing technique that measures thermal emission spectra from molecular gases along limb views of the Earth's atmosphere against a cold space background.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17502314','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17502314"><span>Feasibility study of extremity dosemeter based on polyallyldiglycolcarbonate (CR-39) for neutron exposure.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Chau, Q; Bruguier, P</p> <p>2007-01-01</p> <p>In nuclear facilities, some activities such as reprocessing, recycling and production of bare fuel rods expose the workers to mixed neutron-photon fields. For several workplaces, particularly in glove boxes, some workers expose their hands to mixed fields. Photon extremity dosimetry is relatively well mastered, whereas neutron dosimetry still raises difficulties. In this context, the Institute for Radiological Protection and Nuclear Safety (IRSN) has proposed a study on a passive neutron extremity dosemeter based on chemically etched CR-39 (PADC: polyallyldiglycolcarbonate), named PN-3, already used in routine practice for whole body dosimetry. This dosemeter is a chip of plastic sensitive to recoil protons. The chemical etching process amplifies the size of the impact. The reading system for track counting is composed of a microscope, a video camera and an image analyser. This system is combined with the dose evaluation algorithm. The performance of the PN-3 has been extensively studied and proven by several laboratories as a passive individual neutron dosemeter, which is used in routine production by different companies. 
This study focuses on the sensitivity of the extremity dosemeter, as well as its performance as a function of neutron energy. The dosemeter was exposed to monoenergetic neutron fields in laboratory conditions and to mixed fields in glove boxes at workplaces.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/25967940','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/25967940"><span>Assessment of the predictive accuracy of five in silico prediction tools, alone or in combination, and two metaservers to classify long QT syndrome gene mutations.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Leong, Ivone U S; Stuckey, Alexander; Lai, Daniel; Skinner, Jonathan R; Love, Donald R</p> <p>2015-05-13</p> <p>Long QT syndrome (LQTS) is an autosomal dominant condition predisposing to sudden death from malignant arrhythmia. Genetic testing identifies many missense single nucleotide variants of uncertain pathogenicity. Establishing genetic pathogenicity is an essential prerequisite to family cascade screening. Many laboratories use in silico prediction tools, either alone or in combination, or metaservers, in order to predict pathogenicity; however, their accuracy in the context of LQTS is unknown. We evaluated the accuracy of five in silico programs and two metaservers in the analysis of LQTS 1-3 gene variants. The in silico tools SIFT, PolyPhen-2, PROVEAN, SNPs&GO and SNAP, either alone or in all possible combinations, and the metaservers Meta-SNP and PredictSNP, were tested on 312 KCNQ1, KCNH2 and SCN5A gene variants that have previously been characterised by either in vitro or co-segregation studies as either "pathogenic" (283) or "benign" (29). 
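Performance metrics of the kind used to rank such pathogenic/benign classifiers, including the Matthews Correlation Coefficient, are computed from the 2×2 confusion matrix; a minimal sketch (the counts below are hypothetical, not the paper's):

```python
import math

def mcc(tp, tn, fp, fn):
    """Matthews Correlation Coefficient from confusion-matrix counts;
    returns 0.0 when any marginal is empty (undefined denominator)."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0

# Hypothetical counts for a pathogenic/benign classification.
score = mcc(tp=260, tn=20, fp=9, fn=23)
```

MCC ranges from −1 to 1, with 0 at chance level; unlike raw accuracy it is not inflated by the heavy class imbalance (283 pathogenic vs. 29 benign) in a data set like this one.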
The accuracy, sensitivity, specificity and Matthews Correlation Coefficient (MCC) were calculated to determine the best combination of in silico tools for each LQTS gene, and when all genes are combined. The best combination of in silico tools for KCNQ1 is PROVEAN, SNPs&GO and SIFT (accuracy 92.7%, sensitivity 93.1%, specificity 100% and MCC 0.70). The best combination of in silico tools for KCNH2 is SIFT and PROVEAN or PROVEAN, SNPs&GO and SIFT. Both combinations have the same scores for accuracy (91.1%), sensitivity (91.5%), specificity (87.5%) and MCC (0.62). In the case of SCN5A, SNAP and PROVEAN provided the best combination (accuracy 81.4%, sensitivity 86.9%, specificity 50.0%, and MCC 0.32). When all three LQT genes are combined, SIFT, PROVEAN and SNAP is the combination with the best performance (accuracy 82.7%, sensitivity 83.0%, specificity 80.0%, and MCC 0.44). Both metaservers performed better than the single in silico tools; however, they did not perform better than the best performing combination of in silico tools. The combination of in silico tools with the best performance is gene-dependent. The in silico tools reported here may have some value in assessing variants in the KCNQ1 and KCNH2 genes, but caution should be taken when the analysis is applied to SCN5A gene variants.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19990049209','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19990049209"><span>High-Speed TCP Testing</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Brooks, David E.; Gassman, Holly; Beering, Dave R.; Welch, Arun; Hoder, Douglas J.; Ivancic, William D.</p> <p>1999-01-01</p> <p>Transmission Control Protocol (TCP) is the underlying protocol used within the Internet for reliable information transfer. 
As such, there is great interest in having all implementations of TCP interoperate efficiently. This is particularly important for links exhibiting large bandwidth-delay products. The tools exist to perform TCP analysis at low rates and low delays. However, for extremely high-rate and long-delay links such as 622 Mbps over geosynchronous satellites, new tools and testing techniques are required. This paper describes the tools and techniques used to analyze and debug various TCP implementations over high-speed, long-delay links.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24462937','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24462937"><span>Does oxytocin modulate variation in maternal caregiving in healthy new mothers?</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Elmadih, Alya; Wan, Ming Wai; Numan, Michael; Elliott, Rebecca; Downey, Darragh; Abel, Kathryn M</p> <p>2014-09-11</p> <p>Maternal sensitivity to infant cues and developmental needs may be pivotal for social and cognitive development. Animal and recent human studies emphasise a major role for Oxytocin (OT) in mediating sensitive caregiving but no study has examined the relationship between OT and extreme variation in human maternal sensitivity. From 105 expectant mothers, 80 underwent blind-rating of maternal sensitivity at 4-6 months postpartum through free-play interaction with their infants. At 7-9 months postpartum, 30 mothers at extremes of maternal sensitivity: 15 'sensitive mothers' (high sensitivity mothers - HSMs, mean=4.47; SD=0.74) and 15 'less sensitive mothers' (low sensitivity mothers - LSMs, mean=2.13; SD=0.52) underwent plasma OT measurements before and after 10 min infant play. Baseline and post-interaction plasma OT was higher in LSMs than HSMs [F(1, 26)=8.42; p=0.01]. 
HSMs showed a trend towards significant reduction in plasma OT [t(14)=2.01; p=0.06] following play-interaction; no change was shown by LSMs [t(13)=-0.14; p=0.89]. In conclusion, higher baseline OT levels in healthy LSMs may imply greater stress responses to the demands of caring for an infant, or past deficiencies in their own parenting relationship, and may act as a biomarker for poor parental sensitivity. OT may be acting to reduce stress and anxiety in LSMs, consistent with studies of plasma OT and stress in women. By contrast, in HSMs, play interaction with their infants may be relaxing, as indicated by the significant reduction in plasma OT from baseline. Ascertainment of mothers in well-defined sensitivity groups might facilitate examination of distinct coping strategies in parents and better understanding of variation in parental caregiving behaviour and its potential for modulation by OT. This article is part of a Special Issue entitled Oxytocin and Social Behav. Copyright © 2014 Elsevier B.V. All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1041442-benchmark-sensitivity-calculation-phase-iii','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1041442-benchmark-sensitivity-calculation-phase-iii"><span>Benchmark On Sensitivity Calculation (Phase III)</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Ivanova, Tatiana; Laville, Cedric; Dyrda, James</p> <p>2012-01-01</p> <p>The sensitivities of the keff eigenvalue to neutron cross sections have become commonly used in similarity studies and as part of the validation algorithm for criticality safety assessments. 
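The keff sensitivity coefficients in question are conventionally defined as relative changes, so that results from different codes and cross-section scales are directly comparable (a standard definition, not quoted from the paper):

```latex
S_{k,\sigma} \;=\; \frac{\sigma}{k_{\mathrm{eff}}}\,
\frac{\partial k_{\mathrm{eff}}}{\partial \sigma}
\;=\; \frac{\partial k_{\mathrm{eff}} / k_{\mathrm{eff}}}{\partial \sigma / \sigma}
```

A coefficient of 0.1 thus means a 1% increase in the cross section σ raises keff by roughly 0.1%; the "implicit" contributions mentioned below arise because perturbing σ also changes the self-shielded cross sections that enter the transport calculation.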
To test calculations of the sensitivity coefficients, a benchmark study (Phase III) has been established by the OECD-NEA/WPNCS/EG UACSA (Expert Group on Uncertainty Analysis for Criticality Safety Assessment). This paper presents some sensitivity results generated by the benchmark participants using various computational tools based upon different computational methods: SCALE/TSUNAMI-3D and -1D, MONK, APOLLO2-MORET 5, DRAGON-SUSD3D and MMKKENO. The study demonstrates the performance of the tools. It also illustrates how model simplifications impact the sensitivity results and demonstrates the importance of 'implicit' (self-shielding) sensitivities. This work has been a useful step towards verification of the existing and developed sensitivity analysis methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2012AGUFM.U22A..02F','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2012AGUFM.U22A..02F"><span>Climate change impacts: The challenge of quantifying multi-factor causation, multi-component responses, and leveraging from extremes</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Field, C. B.</p> <p>2012-12-01</p> <p>Modeling climate change impacts is challenging for a variety of reasons. Some of these are related to causation. A weather or climate event is rarely the sole cause of an impact, and, for many impacts, social, economic, cultural, or ecological factors may play a larger role than climate. Other challenges are related to outcomes. Consequences of an event are often most severe when several kinds of responses interact, typically in unexpected ways. Many kinds of consequences are difficult to quantify, especially when they include a mix of market, cultural, personal, and ecological values. In addition, scale can be tremendously important. 
Modest impacts over large areas present very different challenges than severe but very local impacts. Finally, impacts may respond non-linearly to forcing, with behavior that changes qualitatively at one or more thresholds and with unexpected outcomes in extremes. Modeling these potentially complex interactions between drivers and impacts presents one set of challenges. Evaluating the models presents another. At least five kinds of approaches can contribute to the evaluation of impact models designed to provide insights in multi-driver, multi-responder, multi-scale, and extreme-driven contexts, even though none of these approaches is a complete or "silver-bullet" solution. The starting point for much of the evaluation in this space is case studies. Case studies can help illustrate links between processes and scales. They can highlight factors that amplify or suppress sensitivity to climate drivers, and they can suggest the consequences of intervening at different points. While case studies rarely provide concrete evidence about mechanisms, they can help move a mechanistic case from circumstantial to sound. Novel approaches to data collection, including crowd sourcing, can potentially provide tools and the number of relevant examples to develop case studies as statistically robust data sources. A critical condition for progress in this area is the ability to utilize data of uneven quality and standards. Novel approaches to meta-analysis provide other options for taking advantage of diverse case studies. Techniques for summarizing responses across impacts, drivers, and scales can play a huge role in increasing the value of information from case studies. In some cases, expert elicitation may provide alternatives for identifying mechanisms or for interpreting multi-factor drivers or responses. 
Especially when designed to focus on a well-defined set of observations, a sophisticated elicitation can establish formal confidence limits on responses that are otherwise difficult to constrain. A final possible approach involves a focus on the mechanisms contributing to an impact, rather than the impact itself. Approaches based on quantified mechanisms are especially appealing in the context of models where the number of interactions makes it difficult to intuitively understand the chain of connections from cause to effect, when actors differ in goals or sensitivities, or when scale affects parts of the system differently. With all of these approaches, useful evidence may not conform to traditional levels of statistical confidence. Some of the biggest challenges in taking advantage of the potential tools will involve defining what constitutes a meaningful evaluation.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://ntrs.nasa.gov/search.jsp?R=19860049876&hterms=singularities&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D50%26Ntt%3Dsingularities','NASA-TRS'); return false;" href="https://ntrs.nasa.gov/search.jsp?R=19860049876&hterms=singularities&qs=Ntx%3Dmode%2Bmatchall%26Ntk%3DAll%26N%3D0%26No%3D50%26Ntt%3Dsingularities"><span>Singularity problems of the power law for modeling creep compliance</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Dillard, D. A.; Hiel, C.</p> <p>1985-01-01</p> <p>An explanation is offered for the extreme sensitivity that has been observed in the power law parameters of the T300/934 graphite epoxy material systems during experiments to evaluate the system's viscoelastic response. It is shown that the singularity associated with the power law can explain the sensitivity as well as the observed variability in the calculated parameters. 
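The power law referred to in this creep-compliance study is the standard viscoelastic form; its time derivative diverges as t → 0, which is the singularity tied to the observed parameter sensitivity (standard background, not reproduced from the paper):

```latex
D(t) = D_{0} + D_{1}\,t^{n}, \qquad
\frac{\mathrm{d}D}{\mathrm{d}t} = n\,D_{1}\,t^{\,n-1}
\;\longrightarrow\; \infty
\quad \text{as } t \to 0^{+} \quad (0 < n < 1)
```

Because the fitted curve changes fastest near t = 0, small timing or measurement errors in the earliest data points translate into large swings in the estimated D₁ and n, consistent with the variability the abstract describes.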
Techniques for minimizing errors are suggested.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://files.eric.ed.gov/fulltext/EJ1098614.pdf','ERIC'); return false;" href="http://files.eric.ed.gov/fulltext/EJ1098614.pdf"><span>A Study to Assess the Achievement Motivation of Higher Secondary Students in Relation to Their Noise Sensitivity</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.eric.ed.gov/ERICWebPortal/search/extended.jsp?_pageLabel=advanced">ERIC Educational Resources Information Center</a></p> <p>Latha, Prema</p> <p>2014-01-01</p> <p>Disturbing sounds are often referred to as noise, and if extreme enough in degree, intensity or frequency, it is referred to as noise pollution. Achievement refers to a change in study behavior in relation to their noise sensitivity and learning in the educational sense by achieving results in changed responses to certain types of stimuli like…</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA544874','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA544874"><span>Development of a Micro-Fabricated Total-Field Magnetometer</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2011-03-01</p> <p>are made with fluxgate technologies. Fluxgates have lower sensitivity than Cs magnetometers, yet they continue to be used in small wands simply...extraction process by providing the sensitivity of a Cs magnetometer with the convenience and low cost of a fluxgate wand. 
Extremely small and low cost...FINAL REPORT Development of a Micro-Fabricated Total-Field Magnetometer SERDP Project MR-1512 MARCH 2011 Mark Prouty Geometrics, Inc</p> </li> </ol> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li class="active"><span>24</span></li> <li><a href="#" onclick='return showDiv("page_25");'>25</a></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div><!-- col-sm-12 --> </div><!-- row --> </div><!-- page_24 --> <div id="page_25" class="hiddenDiv"> <div class="row"> <div class="col-sm-12"> <div class="pull-right"> <ul class="pagination"> <li><a href="#" onclick='return showDiv("page_1");'>«</a></li> <li><a href="#" onclick='return showDiv("page_21");'>21</a></li> <li><a href="#" onclick='return showDiv("page_22");'>22</a></li> <li><a href="#" onclick='return showDiv("page_23");'>23</a></li> <li><a href="#" onclick='return showDiv("page_24");'>24</a></li> <li class="active"><span>25</span></li> <li><a href="#" onclick='return showDiv("page_25");'>»</a></li> </ul> </div> </div> </div> <div class="row"> <div class="col-sm-12"> <ol class="result-class" start="481"> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/AD1029811','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/AD1029811"><span>Analysis of Sensitivity Experiments - An Expanded Primer</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>2017-03-08</p> <p>diehard practitioners. The difficulty associated with mastering statistical inference presents a true dilemma. Statistics is an extremely applied...lost, perhaps forever. 
In other words, when on this safari, you need a guide. This report is designed to be a guide, of sorts. It focuses on analytical...estimated accurately if our analysis is to have real meaning. For this reason, the sensitivity test procedure is designed to concentrate measurements</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/24515004','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/24515004"><span>25-Gbit/s burst-mode optical receiver using high-speed avalanche photodiode for 100-Gbit/s optical packet switching.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Nada, Masahiro; Nakamura, Makoto; Matsuzaki, Hideaki</p> <p>2014-01-13</p> <p>25-Gbit/s error-free operation of an optical receiver is successfully demonstrated against burst-mode optical input signals without preambles. The receiver, with a high-sensitivity avalanche photodiode and burst-mode transimpedance amplifier, exhibits sufficient receiver sensitivity and an extremely quick response suitable for burst-mode operation in 100-Gbit/s optical packet switching.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/servlets/purl/1419717','SCIGOV-STC'); return false;" href="https://www.osti.gov/servlets/purl/1419717"><span>Sensitivity Analysis and Requirements for Temporally and Spatially Resolved Thermometry Using Neutron Resonance Spectroscopy</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Fernandez, Juan Carlos; Barnes, Cris William; Mocko, Michael Jeffrey</p> <p></p> <p>This report is intended to examine the use of neutron resonance spectroscopy (NRS) to make time-dependent and spatially-resolved temperature measurements of materials 
in extreme conditions. Specifically, the sensitivity of the temperature estimate to neutron-beam and diagnostic parameters is examined. Based on that examination, requirements are set on a pulsed neutron-source and diagnostics to make a meaningful measurement.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/20423404','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/20423404"><span>Accidental falls in hospital inpatients: evaluation of sensitivity and specificity of two risk assessment tools.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Lovallo, Carmela; Rolandi, Stefano; Rossetti, Anna Maria; Lusignani, Maura</p> <p>2010-03-01</p> <p>This paper is a report of a study comparing the effectiveness of two falls risk assessment tools (Conley Scale and Hendrich Risk Model) by using them simultaneously with the same sample of hospital inpatients. Different risk assessment tools are available in the literature. However, neither recent critical reviews nor international guidelines on fall prevention have identified tools that can be generalized to all categories of hospitalized patients. A prospective observational study was carried out in acute medical, surgical wards and rehabilitation units. From October 2007 to January 2008, 1148 patients were assessed with both instruments, subsequently noting the occurrence of falls. The sensitivity, specificity, positive and negative predictive values, and Receiver Operating Characteristics curves were calculated. The number of patients correctly identified with the Conley Scale (n = 41) was higher than with the Hendrich Model (n = 27). The Conley Scale gave sensitivity and specificity values of 69.49% and 61% respectively. The Hendrich Model gave a sensitivity value of 45.76% and a specificity value of 71%. 
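The sensitivity, specificity and predictive values reported by screening studies such as this one all derive from a single 2×2 confusion table. A minimal sketch in Python; the counts below are hypothetical illustrations, not the study's actual table:

```python
# Screening-test metrics from a 2x2 confusion table.
# The counts used in the example call are hypothetical, for illustration only.
def screening_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity, PPV and NPV as fractions."""
    sensitivity = tp / (tp + fn)   # true positives among those who fell
    specificity = tn / (tn + fp)   # true negatives among those who did not
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

sens, spec, ppv, npv = screening_metrics(tp=41, fp=430, fn=18, tn=659)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} ppv={ppv:.1%} npv={npv:.1%}")
```

Sweeping the tool's cut-off score and recomputing these metrics at each threshold is exactly what generates the Receiver Operating Characteristics curves mentioned above.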
Positive and negative predictive values were comparable. The Conley Scale is indicated for use in the medical sector, on the strength of its high sensitivity. However, since its specificity is very low, it is deemed useful to submit individual patients giving positive results to more in-depth clinical evaluation in order to decide whether preventive measures need to be taken. In surgical sectors, the low sensitivity values given by both scales suggest that further studies are warranted.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/1987SPIE..851..173M','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/1987SPIE..851..173M"><span>Telescience - Concepts And Contributions To The Extreme Ultraviolet Explorer Mission</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Marchant, Will; Dobson, Carl; Chakrabarti, Supriya; Malina, Roger F.</p> <p>1987-10-01</p> <p>A goal of the telescience concept is to allow scientists to use remotely located instruments as they would in their laboratory. Another goal is to increase reliability and scientific return of these instruments. In this paper we discuss the role of transparent software tools in development, integration, and postlaunch environments to achieve hands-on access to the instrument. The use of transparent tools helps to reduce the parallel development of capability and to assure that valuable pre-launch experience is not lost in the operations phase. We also discuss the use of simulation as a rapid prototyping technique. Rapid prototyping provides a cost-effective means of using an iterative approach to instrument design. By allowing inexpensive production of testbeds, scientists can quickly tune the instrument to produce the desired scientific data. 
Using portions of the Extreme Ultraviolet Explorer (EUVE) system, we examine some of the results of preliminary tests in the use of simulation and transparent tools. Additionally, we discuss our efforts to upgrade our software "EUVE electronics" simulator to emulate a full instrument, and give the pros and cons of the simulation facilities we have developed.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29393698','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29393698"><span>Assessment of olfactory function after traumatic brain injury: comparison of single odour tool with detailed assessment tool.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Humphries, Thomas; Singh, Rajiv</p> <p>2018-01-01</p> <p>Olfactory disturbance (OD) is common after traumatic brain injury (TBI). Screening for OD can be performed by several different methods. While odour identification tools are considered more accurate, they are time consuming. The predominant method in clinical practice remains the use of a single odour. This study aimed to compare a brief single-odour identification tool (BSOIT) with a more detailed 12-odour assessment tool. One hundred seventy consecutive patients with TBI had their olfaction assessed using BSOIT and a 12-item tool at a single time point. The sensitivity and specificity of the BSOIT were calculated. The sensitivity and specificity of the BSOIT as compared to the Burghart tool were 57.5% and 100%, respectively, for all ODs (anosmia and hyposmia). The sensitivity and specificity for anosmia only were 93.5% and 96.7%, respectively. For the two tools, the Cohen's kappa coefficient showed moderate agreement when both anosmia and hyposmia were considered (k = 0.619, p < 0.001) but a very strong agreement when only anosmia was considered (k = 0.844, p < 0.001). 
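Cohen's kappa, the agreement statistic quoted here, corrects the observed agreement between two raters or tools for the agreement expected by chance alone. A minimal sketch; the contingency counts below are hypothetical, not the study's data:

```python
# Cohen's kappa for agreement between two tools rating the same cases.
# The contingency counts in the example are hypothetical, for illustration only.
def cohens_kappa(table):
    """table[i][j] = number of cases tool A put in class i and tool B in class j."""
    n = sum(sum(row) for row in table)
    p_observed = sum(table[i][i] for i in range(len(table))) / n
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    # chance agreement from the marginal class frequencies of each tool
    p_expected = sum(r * c for r, c in zip(row_tot, col_tot)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# e.g. tools agree on 20 anosmic and 140 non-anosmic cases, disagree on 10
print(round(cohens_kappa([[20, 4], [6, 140]]), 3))
```

Kappa of 1 means perfect agreement and 0 means agreement no better than chance, which is why a k of 0.844 is read as very strong agreement.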
For both tools, anosmia had a significant association with TBI severity (p < 0.001). However, hyposmia showed no such association. The BSOIT is very effective at identifying anosmia but not hyposmia, producing comparable results to a more detailed test. It can be effective in clinical practice and takes considerably less time.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2014AGUFMNH32A..01H','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2014AGUFMNH32A..01H"><span>The Engineering for Climate Extremes Partnership</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Holland, G. J.; Tye, M. R.</p> <p>2014-12-01</p> <p>Hurricane Sandy and the recent floods in Thailand have demonstrated not only how sensitive the urban environment is to the impact of severe weather, but also the associated global reach of the ramifications. These, together with other growing extreme weather impacts and the increasing interdependence of global commercial activities point towards a growing vulnerability to weather and climate extremes. The Engineering for Climate Extremes Partnership brings academia, industry and government together with the goal of encouraging joint activities aimed at developing new, robust, and well-communicated responses to this increasing vulnerability. Integral to the approach is the concept of 'graceful failure' in which flexible designs are adopted that protect against failure by combining engineering or network strengths with a plan for efficient and rapid recovery if and when they fail. 
Such an approach enables optimal planning for both known future scenarios and their assessed uncertainty.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/22140309-noise-adaptive-fuzzy-equalization-method-processing-solar-extreme-ultraviolet-images','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/22140309-noise-adaptive-fuzzy-equalization-method-processing-solar-extreme-ultraviolet-images"><span>A NOISE ADAPTIVE FUZZY EQUALIZATION METHOD FOR PROCESSING SOLAR EXTREME ULTRAVIOLET IMAGES</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Druckmueller, M., E-mail: druckmuller@fme.vutbr.cz</p> <p></p> <p>A new image enhancement tool ideally suited for the visualization of fine structures in extreme ultraviolet images of the corona is presented in this paper. The Noise Adaptive Fuzzy Equalization method is particularly suited for the exceptionally high dynamic range images from the Atmospheric Imaging Assembly instrument on the Solar Dynamics Observatory. This method produces artifact-free images and gives significantly better results than methods based on convolution or Fourier transform which are often used for that purpose.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://www.dtic.mil/docs/citations/ADA079654','DTIC-ST'); return false;" href="http://www.dtic.mil/docs/citations/ADA079654"><span>Wear in Fluid Power Systems.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.dtic.mil/">DTIC Science & Technology</a></p> <p></p> <p>1979-11-30</p> <p>the detection and analysis of this wear is extremely important. In this study, it was determined that ferrography is an effective tool for this...dealt with the practical applications of ferrography to fluid power systems. 
The first two phases were investigations of the life improvements of...damning evidence that ferrography is not the beneficial tool it was originally thought to be. However, a further analysis of the entire program and the</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.osti.gov/biblio/1132230-atmospheric-moisture-budget-spatial-resolution-dependence-precipitation-extremes-aquaplanet-simulations','SCIGOV-STC'); return false;" href="https://www.osti.gov/biblio/1132230-atmospheric-moisture-budget-spatial-resolution-dependence-precipitation-extremes-aquaplanet-simulations"><span>Atmospheric Moisture Budget and Spatial Resolution Dependence of Precipitation Extremes in Aquaplanet Simulations</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://www.osti.gov/search">DOE Office of Scientific and Technical Information (OSTI.GOV)</a></p> <p>Yang, Qing; Leung, Lai-Yung R.; Rauscher, Sara</p> <p></p> <p>This study investigates the resolution dependency of precipitation extremes in an aqua-planet framework. Strong resolution dependency of precipitation extremes is seen over both tropics and extra-tropics, and the magnitude of this dependency also varies with dynamical cores. Moisture budget analyses based on aqua-planet simulations with the Community Atmosphere Model (CAM) using the Model for Prediction Across Scales (MPAS) and High Order Method Modeling Environment (HOMME) dynamical cores but the same physics parameterizations suggest that during precipitation extremes moisture supply for surface precipitation is mainly derived from advective moisture convergence. The resolution dependency of precipitation extremes mainly originates from advective moisture transport in the vertical direction. 
At most vertical levels over the tropics and in the lower atmosphere over the subtropics, the vertical eddy transport of the mean moisture field dominates the contribution to precipitation extremes and its resolution dependency. Over the subtropics, the source of moisture, its associated energy, and the resolution dependency during extremes are dominated by eddy transport of eddy moisture at the mid- and upper-troposphere. With both MPAS and HOMME dynamical cores, the resolution dependency of the vertical advective moisture convergence is mainly explained by dynamical changes (related to vertical velocity or omega), although the vertical gradients of moisture act like averaging kernels to determine the sensitivity of the overall resolution dependency to the changes in omega at different vertical levels. The natural reduction of variability with coarser resolution, represented by areal data averaging (aggregation) effect, largely explains the resolution dependency in omega. The thermodynamic changes, which likely result from non-linear feedback in response to the large dynamical changes, are small compared to the overall changes in dynamics (omega). However, after excluding the data aggregation effect in omega, thermodynamic changes become relatively significant in offsetting the effect of dynamics, leading to reduced differences between the simulated and aggregated results. Compared to MPAS, the simulated stronger vertical motion with HOMME also results in larger resolution dependency. Compared to the simulation at fine resolution, the vertical motion during extremes is insufficiently resolved/parameterized at the coarser resolution even after accounting for the natural reduction in variability with coarser resolution, and this is more distinct in the simulation with HOMME. 
To reduce uncertainties in simulated precipitation extremes, future development in cloud parameterizations must address their sensitivity to spatial resolution as well as dynamical cores.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://pubs.er.usgs.gov/publication/70027952','USGSPUBS'); return false;" href="https://pubs.er.usgs.gov/publication/70027952"><span>Analysis of real-time vibration data</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://pubs.er.usgs.gov/pubs/index.jsp?view=adv">USGS Publications Warehouse</a></p> <p>Safak, E.</p> <p>2005-01-01</p> <p>In recent years, a few structures have been instrumented to provide continuous vibration data in real time, recording not only large-amplitude motions generated by extreme loads, but also small-amplitude motions generated by ambient loads. The main objective in continuous recording is to track any changes in structural characteristics, and to detect damage after an extreme event, such as an earthquake or explosion. The Fourier-based spectral analysis methods have been the primary tool to analyze vibration data from structures. In general, such methods do not work well for real-time data, because real-time data are mainly composed of ambient vibrations with very low amplitudes and signal-to-noise ratios. The long duration, linearity, and the stationarity of ambient data, however, allow us to utilize statistical signal processing tools, which can compensate for the adverse effects of low amplitudes and high noise. The analysis of real-time data requires tools and techniques that can be applied in real-time; i.e., data are processed and analyzed while being acquired. This paper presents some of the basic tools and techniques for processing and analyzing real-time vibration data. 
The topics discussed include utilization of running time windows, tracking mean and mean-square values, filtering, system identification, and damage detection.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://adsabs.harvard.edu/abs/2017EGUGA..1910480G','NASAADS'); return false;" href="http://adsabs.harvard.edu/abs/2017EGUGA..1910480G"><span>Meteorological risks are drivers of environmental innovation in agro-ecosystem management</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://adsabs.harvard.edu/abstract_service.html">NASA Astrophysics Data System (ADS)</a></p> <p>Gobin, Anne; Van de Vijver, Hans; Vanwindekens, Frédéric; de Frutos Cachorro, Julia; Verspecht, Ann; Planchon, Viviane; Buyse, Jeroen</p> <p>2017-04-01</p> <p>Agricultural crop production is to a great extent determined by weather conditions. The research hypothesis is that meteorological risks act as drivers of environmental innovation in agro-ecosystem management. The methodology comprised five major parts: the hazard, its impact on different agro-ecosystems, vulnerability, risk management and risk communication. Generalized Extreme Value (GEV) theory was used to model annual maxima of meteorological variables based on location, scale and shape parameters that determine the center of the distribution, the spread about the location parameter and the upper tail decay, respectively. Spatial interpolation of GEV-derived return levels resulted in spatial temperature extremes, precipitation deficits and wet periods. The temporal overlap between extreme weather conditions and sensitive periods in the agro-ecosystem was realised using a bio-physically based modelling framework that couples phenology, a soil water balance and crop growth. 20-year return values for drought and waterlogging during different crop stages were related to arable yields. 
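A GEV return level of the kind used here (e.g. a 20-year return value) follows in closed form from the fitted location, scale and shape parameters. A minimal sketch; the parameter values in the example are hypothetical, for illustration only:

```python
import math

# Return level from fitted GEV parameters (location mu, scale sigma, shape xi).
# The parameter values in the example call are hypothetical, for illustration only.
def gev_return_level(mu, sigma, xi, T):
    """Level exceeded on average once every T years by an annual maximum."""
    y = -math.log(1.0 - 1.0 / T)       # -log of the non-exceedance probability
    if abs(xi) < 1e-12:                # Gumbel limit as the shape xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)

# e.g. annual-maximum daily rainfall with mu=45 mm, sigma=12 mm, xi=0.1
print(round(gev_return_level(45.0, 12.0, 0.1, 20), 1))
```

Evaluating this at each interpolation grid point is what turns a fitted GEV surface into maps of 20-year temperature extremes, precipitation deficits and wet periods.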
The method helped quantify agricultural production risks and rate both weather and crop-based agricultural insurance. The spatial extent of vulnerability is developed on different layers of geo-information to include meteorology, soil-landscapes, crop cover and management. Vulnerability of agroecosystems was mapped based on rules set by experts' knowledge and implemented by Fuzzy Inference System modelling and Geographical Information System tools. The approach was applied for cropland vulnerability to heavy rain and grassland vulnerability to drought. The level of vulnerability and resilience of an agro-ecosystem was also determined by risk management which differed across sectors and farm types. A calibrated agro-economic model demonstrated a marked influence of climate adapted land allocation and crop management on individual utility. The "chain of risk" approach allowed for investigating the hypothesis that meteorological risks act as drivers for agricultural innovation. Risk types were quantified in terms of probability and distribution, and further distinguished according to production type. Examples of strategies and options were provided at field, farm and policy level using different modelling methods.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/29212828','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/29212828"><span>Detecting anxiety in individuals with Parkinson disease: A systematic review.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Mele, Bria; Holroyd-Leduc, Jayna; Smith, Eric E; Pringsheim, Tamara; Ismail, Zahinoor; Goodarzi, Zahra</p> <p>2018-01-02</p> <p>To examine diagnostic accuracy of anxiety detection tools compared with a gold standard in outpatient settings among adults with Parkinson disease (PD). A systematic review was conducted. 
MEDLINE, EMBASE, PsycINFO, and Cochrane Database of Systematic Reviews were searched to April 7, 2017. Prevalence of anxiety and diagnostic accuracy measures including sensitivity, specificity, and likelihood ratios were gathered. Pooled prevalence of anxiety was calculated using Mantel-Haenszel-weighted DerSimonian and Laird models. A total of 6,300 citations were reviewed with 6 full-text articles included for synthesis. Tools included within this study were the Beck Anxiety Inventory, Geriatric Anxiety Inventory (GAI), Hamilton Anxiety Rating Scale, Hospital Anxiety and Depression Scale-Anxiety, Parkinson's Anxiety Scale (PAS), and Mini-Social Phobia Inventory. Anxiety diagnoses made included generalized anxiety disorder, social phobia, and any anxiety type. Pooled prevalence of anxiety was 30.1% (95% confidence interval 26.1%-34.0%). The GAI had the best-reported sensitivity of 0.86 and specificity of 0.88. The observer-rated PAS had a sensitivity of 0.71 and the highest specificity of 0.91. While there are 6 tools validated for anxiety screening in PD populations, most tools are only validated in single studies. The GAI is brief and easy to use, with a good balance of sensitivity and specificity. The PAS was specifically developed for PD, is brief, and has self-/observer-rated scales, but with lower sensitivity. Health care practitioners involved in PD care need to be aware of available validated tools and choose one that fits their practice. Copyright © 2017 American Academy of Neurology.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/19670000389','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/19670000389"><span>Rugged switch responds to minute pressure differentials</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Friend, L. C.; Shaub, K. 
D.</p> <p>1967-01-01</p> <p>Pressure responsive switching device exhibits high sensitivity but is extremely rugged and resistant to large amplitude shock and velocity loading. This snap-action, single pole-double throw switch operates over a wide temperature range.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('http://hdl.handle.net/2060/20140007349','NASA-TRS'); return false;" href="http://hdl.handle.net/2060/20140007349"><span>The NASA Langley Multidisciplinary Uncertainty Quantification Challenge</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="http://ntrs.nasa.gov/search.jsp">NASA Technical Reports Server (NTRS)</a></p> <p>Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.</p> <p>2014-01-01</p> <p>This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26653268','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26653268"><span>Performance characteristics of five triage tools for major incidents involving traumatic injuries to children.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Price, C L; Brace-McDonnell, S J; Stallard, N; Bleetman, A; Maconochie, I; Perkins, G D</p> <p>2016-05-01</p> <p>Context: Triage tools are an essential component of the emergency response to a major incident. Although fortunately rare, mass casualty incidents involving children are possible, which mandates reliable triage tools to determine the priority of treatment. 
To determine the performance characteristics of five major incident triage tools amongst paediatric casualties who have sustained traumatic injuries. Retrospective observational cohort study using data from 31,292 patients aged less than 16 years who sustained a traumatic injury. Data were obtained from the UK Trauma Audit and Research Network (TARN) database. Interventions: Statistical evaluation of five triage tools (JumpSTART, START, CareFlight, Paediatric Triage Tape/Sieve and Triage Sort) to predict death or severe traumatic injury (injury severity score >15). Main outcome measures: Performance characteristics of triage tools (sensitivity, specificity and level of agreement between triage tools) to identify patients at high risk of death or severe injury. Of the 31,292 cases, 1029 died (3.3%), 6842 (21.9%) had major trauma (defined by an injury severity score >15) and 14,711 (47%) were aged 8 years or younger. There was variation in the performance accuracy of the tools to predict major trauma or death (sensitivities ranging between 36.4 and 96.2%; specificities 66.0-89.8%). Performance characteristics varied with the age of the child. CareFlight had the best overall performance at predicting death, with the following sensitivity and specificity (95% CI) respectively: 95.3% (93.8-96.8) and 80.4% (80.0-80.9). JumpSTART was superior for the triaging of children under 8 years; sensitivity and specificity (95% CI) respectively: 86.3% (83.1-89.5) and 84.8% (84.2-85.5). The triage tools were generally better at identifying patients who would die than those with non-fatal severe injury. This statistical evaluation has demonstrated variability in the accuracy of triage tools at predicting outcomes for children who sustain traumatic injuries. No single tool performed consistently well across all evaluated scenarios. Copyright © 2015 Elsevier Ltd. 
All rights reserved.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/26189574','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/26189574"><span>A web-based study of bipolarity and impulsivity in athletes engaging in extreme and high-risk sports.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Dudek, Dominika; Siwek, Marcin; Jaeschke, Rafał; Drozdowicz, Katarzyna; Styczeń, Krzysztof; Arciszewska, Aleksandra; Chrobak, Adrian A; Rybakowski, Janusz K</p> <p>2016-06-01</p> <p>We hypothesised that men and women who engage in extreme or high-risk sports would score higher on standardised measures of bipolarity and impulsivity compared to age and gender matched controls. Four-hundred and eighty extreme or high-risk athletes (255 males and 225 females) and 235 age-matched control persons (107 males and 128 females) were enrolled into the web-based case-control study. The Mood Disorder Questionnaire (MDQ) and Barratt Impulsiveness Scale (BIS-11) were administered to screen for bipolarity and impulsive behaviours, respectively. Results indicated that extreme or high-risk athletes had significantly higher scores of bipolarity and impulsivity, and lower scores on cognitive complexity of the BIS-11, compared to controls. Further, there were positive correlations between the MDQ and BIS-11 scores. 
These results showed greater rates of bipolarity and impulsivity in the extreme or high-risk athletes, suggesting these measures are sensitive to high-risk behaviours.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/22909185','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/22909185"><span>Which screening tools can predict injury to the lower extremities in team sports?: a systematic review.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Dallinga, Joan M; Benjaminse, Anne; Lemmink, Koen A P M</p> <p>2012-09-01</p> <p>Injuries to lower extremities are common in team sports such as soccer, basketball, volleyball, football and field hockey. Considering personal grief, disabling consequences and high costs caused by injuries to lower extremities, the importance for the prevention of these injuries is evident. From this point of view it is important to know which screening tools can identify athletes who are at risk of injury to their lower extremities. The aim of this article is to determine the predictive values of anthropometric and/or physical screening tests for injuries to the leg, anterior cruciate ligament (ACL), knee, hamstring, groin and ankle in team sports. A systematic review was conducted in MEDLINE (1966 to September 2011), EMBASE (1989 to September 2011) and CINAHL (1982 to September 2011). Based on inclusion criteria defined a priori, titles, abstracts and full texts were analysed to find relevant studies. The analysis showed that different screening tools can be predictive for injuries to the knee, ACL, hamstring, groin and ankle. For injuries in general there is some support in the literature to suggest that general joint laxity is a predictive measure for leg injuries. 
An anterior right/left reach distance difference >4 cm and a composite reach distance <94.0% of limb length in girls, measured with the star excursion balance test (SEBT), may predict leg injuries. Furthermore, an increasing age, a lower hamstring/quadriceps (H : Q) ratio and a decreased range of motion (ROM) of hip abduction may predict the occurrence of leg injuries. Hyperextension of the knee, side-to-side differences in anterior-posterior knee laxity and differences in knee abduction moment between both legs are suggested to be predictive tests for sustaining an ACL injury and height was a predictive screening tool for knee ligament injuries. There is some evidence that when age increases, the probability of sustaining a hamstring injury increases. Debate exists in the analysed literature regarding measurement of the flexibility of the hamstring as a predictive screening tool, as well as using the H : Q ratio. Hip-adduction-to-abduction strength is a predictive test for hip adductor muscle strain. Studies do not agree on whether ROM of the hamstring is a predictive screening tool for groin injury. Body mass index and the age of an athlete could contribute to an ankle sprain. There is support in the literature to suggest that greater strength of the plantar flexors may be a predictive measure for sustaining an ankle injury. Furthermore, there is some agreement that the measurement of postural sway is a predictive test for an ankle injury. The screening tools mentioned above can be recommended to medical staff and coaches for screening their athletes. 
Future research should focus on prospective studies in larger groups and should follow athletes over several seasons.</p> </li> <li> <p><a target="_blank" rel="noopener noreferrer" onclick="trackOutboundLink('https://www.ncbi.nlm.nih.gov/pubmed/17448357','PUBMED'); return false;" href="https://www.ncbi.nlm.nih.gov/pubmed/17448357"><span>Climate change and children.</span></a></p> <p><a target="_blank" rel="noopener noreferrer" href="https://www.ncbi.nlm.nih.gov/entrez/query.fcgi?DB=pubmed">PubMed</a></p> <p>Ebi, Kristie L; Paulson, Jerome A</p> <p>2007-04-01</p> <p>Climate change is increasing the burden of climate-sensitive health determinants and outcomes worldwide. Acting through increasing temperature, changes in the hydrologic cycle, and sea level rise, climate change is projected to increase the frequency and intensity of heat events and extreme events (floods and droughts), change the geographic range and incidence of climate-sensitive vector-, food-, and waterborne diseases, and increase diseases associated with air pollution and aeroallergens. 
Children are particularly vulnerable to these health outcomes because of their potentially greater exposures, greater sensitivity to certain exposures, and their dependence on caregivers.

  5. Mean annual precipitation predicts primary production resistance and resilience to extreme drought

    DOE PAGES

    Stuart-Haëntjens, Ellen; De Boeck, Hans J.; Lemoine, Nathan P.; ...

    2018-09-01

    Extreme drought is increasing in frequency and intensity in many regions globally, with uncertain consequences for the resistance and resilience of ecosystem functions, including primary production. Primary production resistance, the capacity to withstand change during extreme drought, and resilience, the degree to which production recovers, vary among and within ecosystem types, obscuring generalized patterns of ecological stability. Theory and many observations suggest forest production is more resistant but less resilient than grassland production to extreme drought; however, studies of production sensitivity to precipitation variability indicate that the processes controlling resistance and resilience may be influenced more by mean annual precipitation (MAP) than by ecosystem type. Here, we conducted a global meta-analysis to investigate primary production resistance and resilience to extreme drought in 64 forests and grasslands across a broad MAP gradient.
We found that resistance to extreme drought was predicted by MAP; however, grasslands (positive) and forests (negative) exhibited opposing resilience relationships with MAP. Our findings indicate that common plant physiological mechanisms may determine grassland and forest resistance to extreme drought, whereas differences among plants in turnover time, architecture, and drought adaptive strategies likely underlie the divergent resilience patterns. The low resistance and resilience of dry grasslands suggests that these ecosystems are the most vulnerable to extreme drought – a vulnerability that is expected to compound as extreme drought frequency increases in the future.
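Resistance and resilience, as used in this abstract, are quantitative metrics. A minimal sketch of one common way to compute them is shown below, using log response ratios of production; the exact formulas in the meta-analysis may differ, and the function names and production values are illustrative assumptions:

```python
import math

# Hedged sketch of resistance/resilience metrics as commonly used in drought
# meta-analyses (log response ratios). The paper's exact definitions may differ.

def resistance(prod_drought, prod_normal):
    """Log response ratio of production in the drought year vs. normal years.
    0 = fully resistant; more negative = larger drought-year decline."""
    return math.log(prod_drought / prod_normal)

def resilience(prod_post, prod_normal):
    """Log response ratio of post-drought production vs. normal years.
    0 = fully recovered; negative = incomplete recovery."""
    return math.log(prod_post / prod_normal)

# Illustrative numbers only: a dry grassland losing half its production
# (g m^-2 yr^-1) in the drought year and recovering to 80% the year after.
res = resistance(prod_drought=150.0, prod_normal=300.0)
rec = resilience(prod_post=240.0, prod_normal=300.0)
print(round(res, 3), round(rec, 3))  # prints: -0.693 -0.223
```

On this scale a value near 0 on both metrics would describe a resistant, resilient ecosystem, while the strongly negative pair above matches the abstract's picture of a vulnerable dry grassland.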