Sample records for extremely large number

  1. Data Mining of Extremely Large Ad Hoc Data Sets to Produce Inverted Indices

    DTIC Science & Technology

    2016-06-01

    NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Master's thesis: "Data Mining of Extremely Large Ad Hoc Data Sets to Produce Inverted Indices." Approved for public release; distribution is unlimited.

  2. Heavy Tail Behavior of Rainfall Extremes across Germany

    NASA Astrophysics Data System (ADS)

    Castellarin, A.; Kreibich, H.; Vorogushyn, S.; Merz, B.

    2017-12-01

    Distributions are termed heavy-tailed if extreme values are more likely than would be predicted by probability distributions that have exponential asymptotic behavior. Heavy-tail behavior often leads to surprise, because historical observations can be a poor guide for the future. Heavy-tail behavior seems to be widespread for hydro-meteorological extremes, such as extreme rainfall and flood events. To date, there have been only vague hints to explain under which conditions these extremes show heavy-tail behavior. We use an observational data set consisting of 11 climate variables at 1,440 stations across Germany. This homogenized, gap-free data set covers 110 years (1901-2010) at daily resolution. We estimate the upper tail behavior, including its uncertainty interval, of daily precipitation extremes for the 1,440 stations at the annual and seasonal time scales. Different tail indicators are tested, including the shape parameter of the Generalized Extreme Value distribution, the upper tail ratio and the obesity index. In a further step, we explore to which extent the tail behavior can be explained by geographical and climate factors. A large number of characteristics are derived, such as station elevation, degree of continentality, aridity, measures quantifying the variability of humidity and wind velocity, and the event-triggering large-scale atmospheric situation. The link between the upper tail behavior and these characteristics is investigated via data mining methods capable of detecting non-linear relationships in large data sets. This exceptionally rich observational data set, in terms of number of stations, length of time series and number of explaining variables, allows insights into the upper tail behavior that are rarely possible with typical observational data sets.
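
    One of the tail indicators named in the abstract, the upper tail ratio, can be sketched on synthetic data (the generators and parameters below are hypothetical, not the German station records): a heavy-tailed sample yields a markedly larger ratio of the sample maximum to a high quantile than an exponentially tailed one.

```python
import numpy as np

rng = np.random.default_rng(42)

def upper_tail_ratio(x):
    """Ratio of the sample maximum to a high quantile; larger values
    suggest a heavier upper tail (one of several possible indicators)."""
    return x.max() / np.quantile(x, 0.9)

n = 100_000
light = rng.exponential(scale=1.0, size=n)   # exponential (light) tail
heavy = rng.pareto(a=2.0, size=n) + 1.0      # power-law (heavy) tail

print(upper_tail_ratio(light), upper_tail_ratio(heavy))
```

    For the exponential sample the ratio stays around 5, while for the Pareto sample it reaches the hundreds, which is the qualitative separation such indicators exploit.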

  3. Characterizing differences in precipitation regimes of extreme wet and dry years: implications for climate change experiments.

    PubMed

    Knapp, Alan K; Hoover, David L; Wilcox, Kevin R; Avolio, Meghan L; Koerner, Sally E; La Pierre, Kimberly J; Loik, Michael E; Luo, Yiqi; Sala, Osvaldo E; Smith, Melinda D

    2015-02-03

    Climate change is intensifying the hydrologic cycle and is expected to increase the frequency of extreme wet and dry years. Beyond precipitation amount, extreme wet and dry years may differ in other ways, such as the number of precipitation events, event size, and the time between events. We assessed 1614 long-term (100 year) precipitation records from around the world to identify key attributes of precipitation regimes, besides amount, that distinguish statistically extreme wet from extreme dry years. In general, in regions where mean annual precipitation (MAP) exceeded 1000 mm, precipitation amounts in extreme wet and dry years differed from average years by ~40% and ~30%, respectively. The magnitude of these deviations increased to >60% for dry years and to >150% for wet years in arid regions (MAP<500 mm). Extreme wet years were primarily distinguished from average and extreme dry years by the presence of multiple extreme (large) daily precipitation events (events >99th percentile of all events); these occurred twice as often in extreme wet years compared to average years. In contrast, these large precipitation events were rare in extreme dry years. Less important for distinguishing extreme wet from dry years were mean event size and frequency, or the number of dry days between events. However, extreme dry years were distinguished from average years by an increase in the number of dry days between events. These precipitation regime attributes consistently differed between extreme wet and dry years across 12 major terrestrial ecoregions from around the world, from deserts to the tropics. Thus, we recommend that climate change experiments and model simulations incorporate these differences in key precipitation regime attributes, as well as amount, into treatments. This will allow experiments to more realistically simulate extreme precipitation years and more accurately assess the ecological consequences. © 2015 John Wiley & Sons Ltd.
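
    The event-counting idea in the abstract can be illustrated with a minimal sketch on synthetic data (the gamma rain generator and all parameters are hypothetical, not from the study): extreme daily events are defined as wet-day amounts above the 99th percentile of all events, then counted per year.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 100-year daily precipitation record (mm);
# roughly 30% of days are wet, the rest are set to zero.
years = np.repeat(np.arange(100), 365)
daily = rng.gamma(shape=0.3, scale=8.0, size=years.size)
daily[rng.random(daily.size) > 0.3] = 0.0

events = daily[daily > 0]
threshold = np.quantile(events, 0.99)   # "extreme event" cutoff

# Number of extreme (>99th percentile) events in each year
counts = np.array([np.sum(daily[years == y] > threshold)
                   for y in range(100)])
print(counts.mean())
```

    By construction about 1% of all wet-day events exceed the threshold, so an average year contains roughly one such event; years with several of them are the analogue of the "extreme wet years" described above.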

  4. Googols and Infinity

    ERIC Educational Resources Information Center

    Gough, John

    2005-01-01

    In this article, the author presents his tales of very large numbers. He discusses the concept of infinity and extremely large numbers such as "googol" and "googolplex". "Googol", which could be written as 1 followed by one hundred zeros, was popularized by Edward Kasner and James Newman. Moreover, "googol" was coined by Kasner's nine-year-old…

  5. A Projection of Changes in Landfalling Atmospheric River Frequency and Extreme Precipitation over Western North America from the Large Ensemble CESM Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hagos, Samson M.; Leung, Lai-Yung R.; Yoon, Jin-Ho

    Simulations from the Community Earth System Model Large Ensemble project are analyzed to investigate the impact of global warming on atmospheric rivers (ARs). The model has notable biases in simulating the subtropical jet position and the relationship between extreme precipitation and moisture transport. After accounting for these biases, the model projects an ensemble-mean increase of 35% in the number of landfalling AR days between the last twenty years of the 20th and 21st centuries. However, the number of AR-associated extreme precipitation days increases by only 28%, because the moisture transport required to produce extreme precipitation also increases with warming. Internal variability introduces an uncertainty of ±8% and ±7% in the projected changes in AR days and associated extreme precipitation days. In contrast, accounting for model biases changes the projections by only about 1%. The significantly larger mean changes compared to internal variability and to the effects of model biases highlight the robustness of AR responses to global warming.

  6. Contribution of large-scale circulation anomalies to changes in extreme precipitation frequency in the United States

    Treesearch

    Lejiang Yu; Shiyuan Zhong; Lisi Pei; Xindi (Randy) Bian; Warren E. Heilman

    2016-01-01

    The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for...

  7. Role of quasiresonant planetary wave dynamics in recent boreal spring-to-autumn extreme events

    PubMed Central

    Petoukhov, Vladimir; Petri, Stefan; Rahmstorf, Stefan; Coumou, Dim; Kornhuber, Kai; Schellnhuber, Hans Joachim

    2016-01-01

    In boreal spring-to-autumn (May-to-September) 2012 and 2013, the Northern Hemisphere (NH) has experienced a large number of severe midlatitude regional weather extremes. Here we show that a considerable part of these extremes were accompanied by highly magnified quasistationary midlatitude planetary waves with zonal wave numbers m = 6, 7, and 8. We further show that resonance conditions for these planetary waves were, in many cases, present before the onset of high-amplitude wave events, with a lead time up to 2 wk, suggesting that quasiresonant amplification (QRA) of these waves had occurred. Our results support earlier findings of an important role of the QRA mechanism in amplifying planetary waves, favoring recent NH weather extremes. PMID:27274064

  8. Investigating the Randomness of Numbers

    ERIC Educational Resources Information Center

    Pendleton, Kenn L.

    2009-01-01

    The use of random numbers is pervasive in today's world. Random numbers have practical applications in such far-flung arenas as computer simulations, cryptography, gambling, the legal system, statistical sampling, and even the war on terrorism. Evaluating the randomness of extremely large samples is a complex, intricate process. However, the…

  9. Characterization and prediction of extreme events in turbulence

    NASA Astrophysics Data System (ADS)

    Fonda, Enrico; Iyer, Kartik P.; Sreenivasan, Katepalli R.

    2017-11-01

    Extreme events in Nature such as tornadoes, large floods and strong earthquakes are rare but can have devastating consequences. The predictability of these events is very limited at present. Extreme events in turbulence are the very large events in small scales that are intermittent in character. We examine events in energy dissipation rate and enstrophy which are several tens to hundreds to thousands of times the mean value. To this end we use our DNS database of homogeneous and isotropic turbulence with Taylor Reynolds numbers spanning a decade, computed with different small scale resolutions and different box sizes, and study the predictability of these events using machine learning. We start with an aggressive data augmentation to virtually increase the number of these rare events by two orders of magnitude and train a deep convolutional neural network to predict their occurrence in an independent data set. The goal of the work is to explore whether extreme events can be predicted with greater assurance than can be done by conventional methods (e.g., D.A. Donzis & K.R. Sreenivasan, J. Fluid Mech. 647, 13-26, 2010).

  10. Evaluation of Genetic Algorithm Concepts using Model Problems. Part 1; Single-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

    A genetic-algorithm-based optimization approach is described and evaluated using a simple hill-climbing model problem. The model problem utilized herein allows for the broad specification of a large number of search spaces, including spaces with an arbitrary number of genes or decision variables and an arbitrary number of hills or modes. In the present study, only single-objective problems are considered. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all problems attempted. The most difficult problems - those with large hyper-volumes and multi-mode search spaces containing a large number of genes - require a large number of function evaluations for GA convergence, but they always converge.

  11. Low Reynolds number numerical solutions of chaotic flow

    NASA Technical Reports Server (NTRS)

    Pulliam, Thomas H.

    1989-01-01

    Numerical computations of two-dimensional flow past an airfoil at low Mach number, large angle of attack, and low Reynolds number are reported which show a sequence of flow states leading from single-period vortex shedding to chaos via the period-doubling mechanism. Analysis of the flow in terms of phase diagrams, Poincare sections, and flowfield variables are used to substantiate these results. The critical Reynolds number for the period-doubling bifurcations is shown to be sensitive to mesh refinement and the influence of large amounts of numerical dissipation. In extreme cases, large amounts of added dissipation can delay or completely eliminate the chaotic response. The effect of artificial dissipation at these low Reynolds numbers is to produce a new effective Reynolds number for the computations.

  12. Atomic and electronic structures of an extremely fragile liquid.

    PubMed

    Kohara, Shinji; Akola, Jaakko; Patrikeev, Leonid; Ropo, Matti; Ohara, Koji; Itou, Masayoshi; Fujiwara, Akihiko; Yahiro, Jumpei; Okada, Junpei T; Ishikawa, Takehiko; Mizuno, Akitoshi; Masuno, Atsunobu; Watanabe, Yasuhiro; Usuki, Takeshi

    2014-12-18

    The structure of high-temperature liquids is an important topic for understanding the fragility of liquids. Here we report the structure of a high-temperature non-glass-forming oxide liquid, ZrO2, at an atomistic and electronic level. The Bhatia-Thornton number-number structure factor of ZrO2 does not show a first sharp diffraction peak. The atomic structure comprises ZrO5, ZrO6 and ZrO7 polyhedra with a significant contribution of edge sharing of oxygen in addition to corner sharing. The variety of large oxygen coordination and polyhedral connections with short Zr-O bond lifetimes, induced by the relatively large ionic radius of zirconium, disturbs the evolution of intermediate-range ordering, which leads to a reduced electronic band gap and increased delocalization in the ionic Zr-O bonding. The details of the chemical bonding explain the extremely low viscosity of the liquid and the absence of a first sharp diffraction peak, and indicate that liquid ZrO2 is an extremely fragile liquid.

  13. Improving the performance of extreme learning machine for hyperspectral image classification

    NASA Astrophysics Data System (ADS)

    Li, Jiaojiao; Du, Qian; Li, Wei; Li, Yunsong

    2015-05-01

    Extreme learning machine (ELM) and kernel ELM (KELM) can offer performance comparable to the standard powerful classifier, the support vector machine (SVM), but with much lower computational cost due to an extremely simple training step. However, their performance may be sensitive to several parameters, such as the number of hidden neurons. An empirical linear relationship between the number of training samples and the number of hidden neurons is proposed. Such a relationship can be easily estimated with two small training sets and extended to large training sets so as to greatly reduce computational cost. Other parameters, such as the steepness parameter in the sigmoidal activation function and the regularization parameter in the KELM, are also investigated. The experimental results show that classification performance is sensitive to these parameters; fortunately, simple parameter selections still yield near-optimal performance.
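
    A minimal ELM sketch (toy regression data standing in for the authors' hyperspectral pipeline; all names and parameters here are illustrative) shows why the training step is so cheap: the input weights are random and only the output weights are solved, in closed form, by regularized least squares.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, y, n_hidden, reg=1e-3):
    """Extreme learning machine: random input weights, sigmoid hidden
    layer, output weights from regularized least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random, never trained
    b = rng.normal(size=n_hidden)
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # sigmoidal activation
    beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ y)
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy problem: 200 samples, 5 features, smooth target plus noise
X = rng.normal(size=(200, 5))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
W, b, beta = elm_train(X, y, n_hidden=50)
pred = elm_predict(X, W, b, beta)
print(np.mean((pred - y) ** 2))
```

    The only capacity knob is `n_hidden`, which is exactly the parameter the abstract proposes to tie linearly to the number of training samples.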

  14. TECA: A Parallel Toolkit for Extreme Climate Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.

  15. Intervention for First Graders with Limited Number Knowledge: Large-Scale Replication of a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Gersten, Russell; Rolfhus, Eric; Clarke, Ben; Decker, Lauren E.; Wilkins, Chuck; Dimino, Joseph

    2015-01-01

    Replication studies are extremely rare in education. This randomized controlled trial (RCT) is a scale-up replication of Fuchs et al., which in a sample of 139 found a statistically significant positive impact for Number Rockets, a small-group intervention for at-risk first graders that focused on building understanding of number operations. The…

  16. Pulsed beam of extremely large helium droplets

    NASA Astrophysics Data System (ADS)

    Kuma, Susumu; Azuma, Toshiyuki

    2017-12-01

    We generated a pulsed helium droplet beam with average droplet diameters of up to 2 μm using a solenoid pulsed valve operated at temperatures as low as 7 K. The droplet diameter was controllable over two orders of magnitude, or six orders of magnitude in the number of atoms per droplet, by lowering the valve temperature from 21 to 7 K. A sudden change in droplet size attributed to the so-called "supercritical expansion", which is necessary to obtain micrometer-scale droplets, was observed in pulsed mode for the first time. This beam source is beneficial for experiments that require extremely large helium droplets in intense, pulsed form.
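
    The quoted equivalence between the diameter range and the atom-number range follows from volume scaling, N ∝ d³; a one-line check with illustrative numbers:

```python
# Atom number per droplet scales with droplet volume, N ∝ d**3, so the
# reported two orders of magnitude in diameter correspond to six orders
# of magnitude in the number of atoms per droplet.
diameter_ratio = 100.0                     # two orders of magnitude
atom_number_ratio = diameter_ratio ** 3
print(atom_number_ratio)                   # 1e6: six orders of magnitude
```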

  17. United States Geological Survey fire science: fire danger monitoring and forecasting

    USGS Publications Warehouse

    Eidenshink, Jeff C.; Howard, Stephen M.

    2012-01-01

    Each day, the U.S. Geological Survey produces 7-day forecasts for all Federal lands of the distributions of the number of ignitions, the number of fires above a given size, and the conditional probabilities of fires growing larger than a specified size. The large fire probability map estimates the likelihood that ignitions will become large fires. The large fire forecast map is a probability estimate of the number of fires on Federal lands exceeding 100 acres in the forthcoming week. The ignition forecast map is a probability estimate of the number of fires on Federal land greater than 1 acre in the forthcoming week. The extreme event forecast is a probability estimate of the number of fires on Federal land that may exceed 5,000 acres in the forthcoming week.

  18. Global Weirding? - Using Very Large Ensembles and Extreme Value Theory to assess Changes in Extreme Weather Events Today

    NASA Astrophysics Data System (ADS)

    Otto, F. E. L.; Mitchell, D.; Sippel, S.; Black, M. T.; Dittus, A. J.; Harrington, L. J.; Mohd Saleh, N. H.

    2014-12-01

    A shift in the distribution of socially relevant climate variables, such as daily minimum winter temperatures and daily precipitation extremes, has been attributed to anthropogenic climate change for various mid-latitude regions. However, while there are many process-based arguments also suggesting a change in the shape of these distributions, attribution studies demonstrating this have not yet been undertaken. Here we use a very large initial-condition ensemble of ~40,000 members simulating the European winter 2013/2014 using the distributed computing infrastructure of the weather@home project. Two separate scenarios are used: (1) current climate conditions, and (2) a counterfactual scenario of a "world that might have been" without anthropogenic forcing. Focusing specifically on extreme events, we assess how the estimated parameters of the Generalized Extreme Value (GEV) distribution vary depending on variable type, sampling frequency (daily, monthly, …) and geographical region. We find that the location parameter changes for most variables but, depending on the region and variables, we also find significant changes in the scale and shape parameters. The very large ensemble furthermore allows us to assess whether such findings in the fitted GEV distributions are consistent with an empirical analysis of the model data, and whether the most extreme data still follow a known underlying distribution that in a small sample might otherwise be considered an outlier. The ~40,000-member ensemble is simulated using 12 different SST patterns (1 observed, and 11 best guesses of SSTs with no anthropogenic warming). The range in SSTs, along with the corresponding changes in the NAO and high-latitude blocking, informs on the dynamics governing some of these extreme events.
    While strong tele-connection patterns are not found in this particular experiment, the high number of simulated extreme events allows for a more thorough analysis of the dynamics than has been performed before. Combining extreme value theory with very large ensemble simulations thus allows us to understand the dynamics of changes in extreme events in a way that is not possible using extreme value theory alone, and also shows in which cases statistics combined with smaller ensembles give results as valid as very large initial-condition ensembles.

  19. Extreme-phenotype genome-wide association study (XP-GWAS): a method for identifying trait-associated variants by sequencing pools of individuals selected from a diversity panel.

    PubMed

    Yang, Jinliang; Jiang, Haiying; Yeh, Cheng-Ting; Yu, Jianming; Jeddeloh, Jeffrey A; Nettleton, Dan; Schnable, Patrick S

    2015-11-01

    Although approaches for performing genome-wide association studies (GWAS) are well developed, conventional GWAS requires high-density genotyping of large numbers of individuals from a diversity panel. Here we report a method for performing GWAS that does not require genotyping of large numbers of individuals. Instead XP-GWAS (extreme-phenotype GWAS) relies on genotyping pools of individuals from a diversity panel that have extreme phenotypes. This analysis measures allele frequencies in the extreme pools, enabling discovery of associations between genetic variants and traits of interest. This method was evaluated in maize (Zea mays) using the well-characterized kernel row number trait, which was selected to enable comparisons between the results of XP-GWAS and conventional GWAS. An exome-sequencing strategy was used to focus sequencing resources on genes and their flanking regions. A total of 0.94 million variants were identified and served as evaluation markers; comparisons among pools showed that 145 of these variants were statistically associated with the kernel row number phenotype. These trait-associated variants were significantly enriched in regions identified by conventional GWAS. XP-GWAS was able to resolve several linked QTL and detect trait-associated variants within a single gene under a QTL peak. XP-GWAS is expected to be particularly valuable for detecting genes or alleles responsible for quantitative variation in species for which extensive genotyping resources are not available, such as wild progenitors of crops, orphan crops, and other poorly characterized species such as those of ecological interest. © 2015 The Authors The Plant Journal published by Society for Experimental Biology and John Wiley & Sons Ltd.

  20. Impact of a Single Unusually Large Rainfall Event on the Level of Risk Used for Infrastructure Design

    NASA Astrophysics Data System (ADS)

    Dhakal, N.; Jain, S.

    2013-12-01

    Rare and unusually large events (such as hurricanes and floods) can create unusual and interesting trends in statistics. The Generalized Extreme Value (GEV) distribution is commonly used to statistically describe extreme rainfall events. A number of recent studies have shown that the frequency of extreme rainfall events has increased over the last century and, as a result, the parameters of the GEV distribution have changed over time (non-stationarity). But what impact does a single unusually large rainfall event (e.g., Hurricane Irene) have on the GEV parameters, and consequently on the levels of risk or the return periods used in designing civil infrastructure? In other words, if such a large event occurs today, how will it influence the levels of risk (estimated from past rainfall records) for civil infrastructure? To answer these questions, we performed a sensitivity analysis of the GEV distribution parameters, as well as the return periods, to unusually large outlier events. Long-term precipitation records over the period 1981-2010 from 12 USHCN stations across the state of Maine were used for the analysis. For most of the stations, the addition of each outlier event caused an increase in the shape parameter and a large decrease in the corresponding return period. This is a key consideration for time-varying engineering design. Such isolated extreme weather events should be considered alongside traditional extreme-event statistical methodology when designing civil infrastructure (such as dams, bridges, and culverts). Such analysis is also useful in understanding the statistical uncertainty of projecting extreme events into the future.
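
    The sensitivity the abstract describes can be sketched with a simplified stand-in for the full GEV analysis: a method-of-moments Gumbel fit (GEV with shape parameter zero) on a synthetic 30-year record, refit after appending one huge outlier event. All numbers are hypothetical, not the Maine station data.

```python
import numpy as np

rng = np.random.default_rng(7)

def gumbel_return_level(maxima, T):
    """Method-of-moments Gumbel fit (GEV with zero shape) and the
    T-year return level x with F(x) = 1 - 1/T."""
    scale = np.sqrt(6.0) * maxima.std(ddof=1) / np.pi
    loc = maxima.mean() - 0.5772 * scale          # Euler-Mascheroni const
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

annual_max = rng.gumbel(loc=50.0, scale=15.0, size=30)  # 30-yr record (mm)
before = gumbel_return_level(annual_max, T=100)
# One unusually large event (e.g. a hurricane-scale rainfall) is appended
after = gumbel_return_level(np.append(annual_max, 250.0), T=100)
print(before, after)
```

    A single outlier inflates both the fitted scale and the estimated 100-year return level, which is the same direction of effect the study reports for its GEV parameters and return periods.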

  1. Turbulent pipe flow at extreme Reynolds numbers.

    PubMed

    Hultmark, M; Vallikivi, M; Bailey, S C C; Smits, A J

    2012-03-02

    Both the inherent intractability and complex beauty of turbulence reside in its large range of physical and temporal scales. This range of scales is captured by the Reynolds number, which in nature and in many engineering applications can be as large as 10^5-10^6. Here, we report turbulence measurements over an unprecedented range of Reynolds numbers using a unique combination of a high-pressure air facility and a new nanoscale anemometry probe. The results reveal previously unknown universal scaling behavior for the turbulent velocity fluctuations, which is remarkably similar to the well-known scaling behavior of the mean velocity distribution.

  2. Contribution of large-scale circulation anomalies to changes in extreme precipitation frequency in the United States

    NASA Astrophysics Data System (ADS)

    Yu, Lejiang; Zhong, Shiyuan; Pei, Lisi; Bian, Xindi; Heilman, Warren E.

    2016-04-01

    The mean global climate has warmed as a result of the increasing emission of greenhouse gases induced by human activities. This warming is considered the main reason for the increasing number of extreme precipitation events in the US. While much attention has been given to extreme precipitation events occurring over several days, which are usually responsible for severe flooding over a large region, little is known about how extreme precipitation events that cause flash flooding and occur at sub-daily time scales have changed over time. Here we use the observed hourly precipitation from the North American Land Data Assimilation System Phase 2 forcing datasets to determine trends in the frequency of extreme precipitation events of short (1 h, 3 h, 6 h, 12 h and 24 h) duration for the period 1979-2013. The results indicate an increasing trend in the central and eastern US. Over most of the western US, especially the Southwest and the Intermountain West, the trends are generally negative. These trends can be largely explained by the interdecadal variability of the Pacific Decadal Oscillation and Atlantic Multidecadal Oscillation (AMO), with the AMO making a greater contribution to the trends in both warm and cold seasons.

  3. Climate Extreme Events over Northern Eurasia in Changing Climate

    NASA Astrophysics Data System (ADS)

    Bulygina, O.; Korshunova, N. N.; Razuvaev, V. N.; Groisman, P. Y.

    2014-12-01

    During the period of widespread instrumental observations in Northern Eurasia, the annual surface air temperature has increased by 1.5°C. Close to the north in the Arctic Ocean, the late-summer sea ice extent has decreased by 40%, providing a near-infinite source of water vapor for the dry Arctic atmosphere in the early cold-season months. The contemporary sea ice changes are especially visible in the Eastern Hemisphere. All these factors affect the occurrence of extreme events. Daily and sub-daily data from 940 stations were used to analyze variations in the space-time distribution of extreme temperatures, precipitation, and wind over Russia. Changes in the number of days with thaw over Russia are described. The total seasonal numbers of days when daily surface air temperatures (wind, precipitation) were above (below) selected thresholds were used as indices of climate extremes. Changes in the difference between maximum and minimum temperature (the diurnal temperature range, DTR) may produce a variety of effects on biological systems. All values falling within the intervals from the lowest percentile to the 5th percentile and from the 95th percentile to the highest percentile for the time period of interest were considered daily extremes. The number of days, N, when daily temperatures (wind, precipitation, DTR) fell within the above-mentioned intervals was determined for the seasons of each year. Linear trends in the number of days were calculated for each station and for quasi-homogeneous climatic regions, and regional analysis of extreme events was carried out using these regions. Maps (climatology, trends) are presented mostly for visualization purposes. Differences in the regional characteristics of extreme events reflect the large extent of the Russian territory and the variety of its physical and geographical conditions.
    The number of days with maximum temperatures higher than the 95th percentile has increased in most of Russia but decreased in Siberia in spring and autumn. A reduction in the number of days with extremely low air temperatures dominated in all seasons. At the same time, the number of days with abnormally low air temperatures has increased in the Middle Volga region and the south of Western Siberia. In most parts of European Russia, an increase in the number of days with heavy snowfalls was observed.

  4. Criminal Intent with Property: A Study of Real Estate Fraud Prediction and Detection

    ERIC Educational Resources Information Center

    Blackman, David H.

    2013-01-01

    The large number of real estate transactions across the United States, combined with closing process complexity, creates extremely large data sets that conceal anomalies indicative of fraud. The quantitative amount of damage due to fraud is immeasurable to the lives of individuals who are victims, not to mention the financial impact to…

  5. 76 FR 20595 - Special Local Regulation; Extreme Sailing Series Boston; Boston Harbor, Boston, MA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ... rule, the combination of a large number of recreational vessels due to spectators, sailboats traveling... vessels involved directly with the event such as: sailboat race participants, event safety vessels, on...

  6. The nonequilibrium quantum many-body problem as a paradigm for extreme data science

    NASA Astrophysics Data System (ADS)

    Freericks, J. K.; Nikolić, B. K.; Frieder, O.

    2014-12-01

    Generating big data pervades much of physics. But some problems, which we call extreme data problems, are too large to be treated within big data science. The nonequilibrium quantum many-body problem on a lattice is just such a problem, where the Hilbert space grows exponentially with system size and rapidly becomes too large to fit on any computer (and can effectively be thought of as an infinite-sized data set). Nevertheless, much progress has been made with computational methods on this problem, which serve as a paradigm for how one can approach and attack extreme data problems. In addition, viewing these physics problems from a computer-science perspective leads to new approaches that can be tried to solve them more accurately and for longer times. We review a number of these different ideas here.

  7. The waviness of the extratropical jet and daily weather extremes

    NASA Astrophysics Data System (ADS)

    Röthlisberger, Matthias; Martius, Olivia; Pfahl, Stephan

    2016-04-01

    In recent years the Northern Hemisphere mid-latitudes have experienced a large number of weather extremes with substantial socio-economic impact, such as the European and Russian heat waves in 2003 and 2010, severe winter floods in the United Kingdom in 2013/2014 and devastating winter storms such as Lothar (1999) and Xynthia (2010) in Central Europe. These have triggered an engaged debate within the scientific community on the role of human-induced climate change in the occurrence of such extremes. A key element of this debate is the hypothesis that the waviness of the extratropical jet is linked to the occurrence of weather extremes, with a wavier jet stream favouring more extremes. Previous work on this topic is expanded in this study by analyzing the linkage between a regional measure of jet waviness and daily temperature, precipitation and wind gust extremes. We show that such a linkage indeed exists in many regions of the world; however, this waviness-extremes linkage varies spatially in strength and sign. Locally, it is strong only where the relevant weather systems, in which the extremes occur, are affected by the jet waviness. Its sign depends on how the frequency of occurrence of the relevant weather systems is correlated with the occurrence of high and low jet waviness. These results go beyond previous studies by noting that a decrease in waviness could also be associated with an enhanced number of some weather extremes, especially wind gust and precipitation extremes over western Europe.

  8. 76 FR 36311 - Special Local Regulation; Extreme Sailing Series Boston; Boston Harbor, Boston, MA

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-22

    .... Without the rule, the combination of a large number of recreational vessels due to spectators, sailboats... directly with the event such as: sailboat race participants, event safety vessels, on-scene patrol and law...

  9. Restoration of pitcher plant bogs in eastern Texas, USA

    Treesearch

    Ronald Mize; Robert E. Evans; Barbara R. MacRoberts; Michael H. MacRoberts; D. Craig Rudolph

    2005-01-01

    Pitcher plant bogs, also referred to as hillside seepage bogs or hillside bogs, are extremely restricted on the West Gulf Coastal Plain. The number and extent of extant bogs is in the low hundreds, comprising no more than a few thousand hectares of habitat. These bogs support a large number of plant species of significant conservation concern. Threats to existing bogs...

  10. Lepton number violation in theories with a large number of standard model copies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovalenko, Sergey; Schmidt, Ivan; Paes, Heinrich

    2011-03-01

    We examine lepton number violation (LNV) in theories with a saturated black hole bound on a large number of species. Such theories have been advocated recently as a possible solution to the hierarchy problem and an explanation of the smallness of neutrino masses. On the other hand, violation of lepton number can be a potential phenomenological problem of this N-copy extension of the standard model, since, due to the low quantum gravity scale, black holes may induce TeV-scale LNV operators generating unacceptably large rates of LNV processes. We show, however, that this issue can be avoided by introducing a spontaneously broken U(1)_{B-L}. Then, due to a specific compensation mechanism between the contributions of different Majorana neutrino states, LNV processes in the standard model copy become extremely suppressed, with rates far beyond experimental reach.

  11. Extreme Wildlife Declines and Concurrent Increase in Livestock Numbers in Kenya: What Are the Causes?

    PubMed Central

    Ogutu, Joseph O.; Piepho, Hans-Peter; Said, Mohamed Y.; Ojwang, Gordon O.; Njino, Lucy W.; Kifugo, Shem C.; Wargute, Patrick W.

    2016-01-01

    There is growing evidence of escalating wildlife losses worldwide. Extreme wildlife losses have recently been documented for large parts of Africa, including western, Central and Eastern Africa. Here, we report extreme declines in wildlife and a contemporaneous increase in livestock numbers in Kenya rangelands between 1977 and 2016. Our analysis uses systematic aerial monitoring survey data collected in rangelands that collectively cover 88% of Kenya’s land surface. Our results show that wildlife numbers declined on average by 68% between 1977 and 2016. The magnitude of decline varied among species but was most extreme (72–88%) and now severely threatens the population viability and persistence of warthog, lesser kudu, Thomson’s gazelle, eland, oryx, topi, hartebeest, impala, Grevy’s zebra and waterbuck in Kenya’s rangelands. The declines were widespread and occurred in most of the 21 rangeland counties. Like wildlife, cattle numbers decreased (25.2%), but numbers of sheep and goats (76.3%), camels (13.1%) and donkeys (6.7%) evidently increased in the same period. As a result, livestock biomass was 8.1 times greater than that of wildlife in 2011–2013 compared to 3.5 times in 1977–1980. Most of Kenya’s wildlife (ca. 30%) occurred in Narok County alone. The proportion of the total “national” wildlife population found in each county increased between 1977 and 2016 substantially only in Taita Taveta and Laikipia but marginally in Garissa and Wajir counties, largely reflecting greater wildlife losses elsewhere. The declines raise very grave concerns about the future of wildlife and the effectiveness of wildlife conservation policies, strategies and practices in Kenya. Causes of the wildlife declines include exponential human population growth, increasing livestock numbers, declining rainfall and a striking rise in temperatures, but the fundamental cause seems to be policy, institutional and market failures.
Accordingly, we thoroughly evaluate wildlife conservation policy in Kenya. We suggest policy, institutional and management interventions likely to succeed in reducing the declines and restoring rangeland health, most notably through strengthening and investing in community and private wildlife conservancies in the rangelands. PMID:27676077

  13. Stevens Institute SYS-625 Final Paper: Busy Parents Need Extremely Fast, Quality Home-Cooked Dinners That Their Kids Will Eat

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meyers, Carol A.

    This study provides a modern take on an age-old need: busy parents need extremely fast, high quality home-cooked dinners that their kids will eat. In the past decade, the number of choices that parents have for filling this need has proliferated, largely due to technological advances. Our study proposes to leverage this technology in building a system geared toward decreasing whining in kids and increasing the sanity of their parents.

  14. Review of space radiation interaction with ZERODUR

    NASA Astrophysics Data System (ADS)

    Carré, Antoine; Westerhoff, Thomas; Hull, Tony; Doyle, D.

    2017-09-01

    ZERODUR has been, and is still being, successfully used as a mirror substrate material for a large number of space missions. Improvements in CNC machining at SCHOTT make it possible to achieve extremely lightweight substrates incorporating very thin ribs and face sheets. This paper reviews published data on the interaction of space radiation with ZERODUR. Additionally, it reports on the considerations and experiments needed to confidently apply an updated model of ZERODUR behavior under space radiation to extremely lightweight ZERODUR substrates.

  15. Hologram recording tubes

    NASA Technical Reports Server (NTRS)

    Rajchman, J. H.

    1973-01-01

    Optical memories allow extremely large numbers of bits to be stored and recalled in a matter of microseconds. Two recording tubes, similar to conventional image-converting tubes, but having a soft-glass surface on which hologram is recorded, do not degrade under repeated hologram read/write cycles.

  16. Beyond Traditional Extreme Value Theory Through a Metastatistical Approach: Lessons Learned from Precipitation, Hurricanes, and Storm Surges

    NASA Astrophysics Data System (ADS)

    Marani, M.; Zorzetto, E.; Hosseini, S. R.; Miniussi, A.; Scaioni, M.

    2017-12-01

    The Generalized Extreme Value (GEV) distribution is widely adopted irrespective of the properties of the stochastic process generating the extreme events. However, GEV presents several limitations, both theoretical (asymptotic validity for a large number of events/year, or the hypothesis of Poisson occurrences of Generalized Pareto events) and practical (fitting uses just yearly maxima or a few values above a high threshold). Here we describe the Metastatistical Extreme Value Distribution (MEVD, Marani & Ignaccolo, 2015), which relaxes the asymptotic or Poisson/GPD assumptions and makes use of all available observations. We then illustrate the flexibility of the MEVD by applying it to daily precipitation, hurricane intensity, and storm surge magnitude. Application to daily rainfall from a global raingauge network shows that MEVD estimates are 50% more accurate than those from GEV when the recurrence interval of interest is much greater than the observational period. This makes MEVD suited for application to satellite rainfall observations (~20-year record length). Use of MEVD on TRMM data yields extreme event patterns that are in better agreement with surface observations than corresponding GEV estimates. Applied to the HURDAT2 Atlantic hurricane intensity dataset, MEVD significantly outperforms GEV estimates of extreme hurricanes. Interestingly, the Generalized Pareto distribution used for "ordinary" hurricane intensity points to the existence of a maximum limit wind speed that is significantly smaller than corresponding physically-based estimates. Finally, we applied the MEVD approach to water levels generated by tidal fluctuations and storm surges at a set of coastal sites spanning different storm-surge regimes. MEVD yields accurate estimates of large quantiles and inferences on the tail thickness (fat vs. thin) of the underlying distribution of "ordinary" surges.
In summary, the MEVD approach presents a number of theoretical and practical advantages, and outperforms traditional approaches in several applications. We conclude that the MEVD is a significant contribution to further generalize extreme value theory, with implications for a broad range of Earth Sciences.
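
    To make the contrast with GEV concrete, the MEVD construction can be sketched in a few lines: fit a parametric distribution to each year's "ordinary" daily events, then compound the yearly fits, weighting each year by its number of events. The sketch below is a simplified illustration, not the authors' implementation: it uses a one-parameter exponential as a stand-in for the Weibull distribution employed in the MEVD literature, and all data are synthetic.

```python
import math, random

def mevd_cdf(x, yearly_events):
    # yearly_events: one list of "ordinary" daily event magnitudes per year.
    # Each year j contributes F_j(x)^n_j, where F_j is a distribution fitted
    # to that year's events (here an exponential, as a simplifying stand-in
    # for the Weibull used in the MEVD literature) and n_j the event count.
    total = 0.0
    for events in yearly_events:
        n = len(events)
        mu = sum(events) / n          # MLE scale of the exponential fit
        f = 1.0 - math.exp(-x / mu)   # fitted CDF of ordinary events
        total += f ** n
    return total / len(yearly_events)

# Synthetic record: 30 years, 150-250 exponential "wet day" totals per year.
random.seed(0)
years = [[random.expovariate(0.1) for _ in range(random.randint(150, 250))]
         for _ in range(30)]

# 100-year return level: solve F(x) = 1 - 1/100 by bisection (F is monotone).
target = 1.0 - 1.0 / 100.0
lo, hi = 0.0, 500.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if mevd_cdf(mid, years) < target:
        lo = mid
    else:
        hi = mid
x100 = 0.5 * (lo + hi)
```

    Note how every observed event enters the fit, rather than only the annual maxima used by a GEV block-maxima analysis.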

  17. Explicit Computations of Instantons and Large Deviations in Beta-Plane Turbulence

    NASA Astrophysics Data System (ADS)

    Laurie, J.; Bouchet, F.; Zaboronski, O.

    2012-12-01

    We use a path integral formalism and instanton theory to make explicit analytical predictions about large deviations and rare events in beta-plane turbulence. The path integral formalism is a concise way to obtain large deviation results in dynamical systems forced by random noise. In the simplest cases it leads to the same results as the Freidlin-Wentzell theory, but it has a wider range of applicability. This approach is, however, usually extremely limited by the complexity of the theoretical problems. As a consequence, it provides explicit results in a fairly limited number of models, often extremely simple ones with only a few degrees of freedom. Few exceptions exist outside the realm of equilibrium statistical physics. We will show that the barotropic model of beta-plane turbulence is one of these non-equilibrium exceptions. We describe sets of explicit solutions to the instanton equation, and precise derivations of the action functional (or large deviation rate function). The reason why such exact computations are possible is related to the existence of hidden symmetries and conservation laws for the instanton dynamics. We outline several applications of this approach. For instance, we compute explicitly the very low probability of observing flows with an energy much larger or smaller than the typical one. Moreover, we consider regimes for which the system has multiple attractors (corresponding to different numbers of alternating jets), and discuss the computation of transition probabilities between two such attractors. These extremely rare events are of the utmost importance because the dynamics undergo qualitative macroscopic changes during such transitions.

  18. Dolphin social intelligence: complex alliance relationships in bottlenose dolphins and a consideration of selective environments for extreme brain size evolution in mammals.

    PubMed

    Connor, Richard C

    2007-04-29

    Bottlenose dolphins in Shark Bay, Australia, live in a large, unbounded society with a fission-fusion grouping pattern. Potential cognitive demands include the need to develop social strategies involving the recognition of a large number of individuals and their relationships with others. Patterns of alliance affiliation among males may be more complex than are currently known for any non-human, with individuals participating in 2-3 levels of shifting alliances. Males mediate alliance relationships with gentle contact behaviours such as petting, but synchrony also plays an important role in affiliative interactions. In general, selection for social intelligence in the context of shifting alliances will depend on the extent to which there are strategic options and risk. Extreme brain size evolution may have occurred more than once in the toothed whales, reaching peaks in the dolphin family and the sperm whale. All three 'peaks' of large brain size evolution in mammals (odontocetes, humans and elephants) shared a common selective environment: extreme mutual dependence based on external threats from predators or conspecific groups. In this context, social competition, and consequently selection for greater cognitive abilities and large brain size, was intense.

  19. Rainfall extremes from TRMM data and the Metastatistical Extreme Value Distribution

    NASA Astrophysics Data System (ADS)

    Zorzetto, Enrico; Marani, Marco

    2017-04-01

    A reliable quantification of the probability of weather extremes occurrence is essential for designing resilient water infrastructures and hazard mitigation measures. However, it is increasingly clear that the presence of inter-annual climatic fluctuations determines a substantial long-term variability in the frequency of occurrence of extreme events. This circumstance questions the foundation of the traditional extreme value theory, hinged on stationary Poisson processes or on asymptotic assumptions to derive the Generalized Extreme Value (GEV) distribution. We illustrate here, with application to daily rainfall, a new approach to extreme value analysis, the Metastatistical Extreme Value Distribution (MEVD). The MEVD relaxes the above assumptions and is based on the whole distribution of daily rainfall events, thus allowing optimal use of all available observations. Using a global dataset of rain gauge observations, we show that the MEVD significantly outperforms the Generalized Extreme Value distribution, particularly for long average recurrence intervals and when small samples are available. The latter property suggests MEVD to be particularly suited for applications to satellite rainfall estimates, which only cover two decades, thus making extreme value estimation extremely challenging. Here we apply MEVD to the TRMM TMPA 3B42 product, an 18-year dataset of remotely-sensed daily rainfall providing a quasi-global coverage. Our analyses yield a global scale mapping of daily rainfall extremes and of their distributional tail properties, bridging the existing large gaps in ground-based networks. Finally, we illustrate how our global-scale analysis can provide insight into how properties of local rainfall regimes affect tail estimation uncertainty when using the GEV or MEVD approach. We find a dependence of the estimation uncertainty, for both the GEV- and MEV-based approaches, on the average annual number and on the inter-annual variability of rainy days. 
In particular, estimation uncertainty decreases 1) as the mean annual number of wet days increases, and 2) as the variability in the number of rainy days, expressed by its coefficient of variation, decreases. We tentatively explain this behavior in terms of the assumptions underlying the two approaches.

  20. Disordered quivers and cold horizons

    DOE PAGES

    Anninos, Dionysios; Anous, Tarek; Denef, Frederik

    2016-12-15

    We analyze the low temperature structure of a supersymmetric quiver quantum mechanics with randomized superpotential coefficients, treating them as quenched disorder. These theories describe features of the low energy dynamics of wrapped branes, which in large numbers backreact into extremal black holes. We show that the low temperature theory, in the limit of a large number of bifundamentals, exhibits a time reparametrization symmetry as well as a specific heat linear in the temperature. Both of these features resemble the behavior of black hole horizons in the zero temperature limit. We demonstrate similarities between the low temperature physics of the random quiver model and a theory of large N free fermions with random masses.

  1. Extreme cyclone events in the Arctic during wintertime: Variability and Trends

    NASA Astrophysics Data System (ADS)

    Rinke, Annette; Maturilli, Marion; Graham, Robert; Matthes, Heidrun; Handorf, Doerthe; Cohen, Lana; Hudson, Stephen; Moore, John

    2017-04-01

    Extreme cyclone events are of significant interest as they can transport much heat, moisture, and momentum poleward; associated impacts include warming and sea-ice breakup. Recently, several examples of such extreme weather events occurred in winter (e.g. during the N-ICE2015 campaign north of Svalbard, and the North Atlantic storm Frank at the end of December 2015). With Arctic amplification and the associated reduced sea-ice cover and warmer sea surface temperatures, an increased occurrence of extreme cyclone events is a plausible scenario. We calculate the spatial patterns, changes, and trends of the number of extreme cyclone events in the Arctic based on ERA-Interim six-hourly sea level pressure (SLP) data for winter (November-February) 1979-2015. Further, we analyze the SLP data from the Ny Alesund station for the same 37-year period. We define an extreme cyclone event by an extremely low central pressure (SLP below 985 hPa, which is the 5th percentile of the Ny Alesund/N-ICE2015 SLP data) and a deepening of at least 6 hPa/6 hours. The areas with the highest frequency of occurrence of extreme cyclones are south/southeast of Greenland (corresponding to the Icelandic low), between Norway and Svalbard, and in the Barents/Kara Seas. The time series of the number of extreme cyclone events for Ny Alesund/N-ICE shows considerable interannual variability. The trend is not consistent through the winter, but we detect an increase in early winter and a slight decrease in late winter; the former is due to the increased occurrence of longer events at the expense of short events. Furthermore, the difference patterns of the frequency of events for months following Septembers with high and low Arctic sea-ice extent ("low minus high sea ice") conform with the change patterns of extreme cyclone numbers (frequency of events "2000-2015 minus 1979-1994") and with the trend patterns.
This indicates that the changes in extreme cyclone occurrence in early winter are associated with sea-ice changes (regional feedback). In contrast, different mechanisms via large-scale circulation changes/teleconnections seem to play a role in late winter.
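
    The event definition quoted above lends itself to a direct implementation. The sketch below flags extreme-cyclone time steps in a 6-hourly SLP series using the stated criteria (central pressure below 985 hPa and a deepening of at least 6 hPa per 6 hours); the pressure series is invented for illustration, and cyclone-center tracking is deliberately omitted.

```python
def extreme_cyclone_events(slp):
    # slp: 6-hourly sea level pressure series in hPa at a fixed location.
    # Returns the indices of time steps meeting the two stated criteria:
    # pressure below 985 hPa and deepening of at least 6 hPa per 6 hours.
    events = []
    for i in range(1, len(slp)):
        deepening = slp[i - 1] - slp[i]       # positive when pressure falls
        if slp[i] < 985.0 and deepening >= 6.0:
            events.append(i)
    return events

# Invented 48-hour pressure trace with a rapid deepening phase.
series = [1002, 996, 989, 982, 975, 974, 980, 991]
hits = extreme_cyclone_events(series)
# Steps 3 and 4 (982 and 975 hPa, each a 7 hPa/6 h drop) qualify.
```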

  2. Statistical complexity without explicit reference to underlying probabilities

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2018-06-01

    We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the statistical complexity's notion to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  3. Extreme Task-Specificity in Writer’s Cramp

    PubMed Central

    Shamim, Ejaz A.; Chu, Jason; Scheider, Linda H.; Savitt, Joseph; Jinnah, H. A.; Hallett, Mark

    2011-01-01

    Background Focal hand dystonia may be task-specific as is the case with writer’s cramp (WC). In early stages, the task-specificity can be so specific that it may be mistaken for a psychogenic movement disorder. Methods We describe four patients who showed extreme task specificity in WC. They initially only had problems writing either a single letter or number. Although they were largely thought to be psychogenic, they progressed to typical WC. Conclusions Early recognition of this condition may provide an opportunity for early initiation of treatment. PMID:21714006

  4. Atomic and electronic structures of an extremely fragile liquid

    PubMed Central

    Kohara, Shinji; Akola, Jaakko; Patrikeev, Leonid; Ropo, Matti; Ohara, Koji; Itou, Masayoshi; Fujiwara, Akihiko; Yahiro, Jumpei; Okada, Junpei T.; Ishikawa, Takehiko; Mizuno, Akitoshi; Masuno, Atsunobu; Watanabe, Yasuhiro; Usuki, Takeshi

    2014-01-01

    The structure of high-temperature liquids is an important topic for understanding the fragility of liquids. Here we report the structure of a high-temperature non-glass-forming oxide liquid, ZrO2, at an atomistic and electronic level. The Bhatia–Thornton number–number structure factor of ZrO2 does not show a first sharp diffraction peak. The atomic structure comprises ZrO5, ZrO6 and ZrO7 polyhedra with a significant contribution of edge sharing of oxygen in addition to corner sharing. The variety of large oxygen coordination and polyhedral connections with short Zr–O bond lifetimes, induced by the relatively large ionic radius of zirconium, disturbs the evolution of intermediate-range ordering, which leads to a reduced electronic band gap and increased delocalization in the ionic Zr–O bonding. The details of the chemical bonding explain the extremely low viscosity of the liquid and the absence of a first sharp diffraction peak, and indicate that liquid ZrO2 is an extremely fragile liquid. PMID:25520236

  5. Extremely large nonsaturating magnetoresistance and ultrahigh mobility due to topological surface states in the metallic Bi2Te3 topological insulator

    NASA Astrophysics Data System (ADS)

    Shrestha, K.; Chou, M.; Graf, D.; Yang, H. D.; Lorenz, B.; Chu, C. W.

    2017-05-01

    Weak antilocalization (WAL) effects in Bi2Te3 single crystals have been investigated at high and low bulk charge-carrier concentrations. At low charge-carrier density the WAL curves scale with the normal component of the magnetic field, demonstrating the dominance of topological surface states in magnetoconductivity. At high charge-carrier density the WAL curves scale with neither the applied field nor its normal component, implying a mixture of bulk and surface conduction. WAL due to topological surface states shows no dependence on the nature (electrons or holes) of the bulk charge carriers. The observations of an extremely large nonsaturating magnetoresistance and ultrahigh mobility in the samples with lower carrier density further support the presence of surface states. The physical parameters characterizing the WAL effects are calculated using the Hikami-Larkin-Nagaoka formula. At high charge-carrier concentrations, there is a greater number of conduction channels and a decrease in the phase coherence length compared to low charge-carrier concentrations. The extremely large magnetoresistance and high mobility of topological insulators have great technological value and can be exploited in magnetoelectric sensors and memory devices.
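
    A minimal sketch of evaluating the Hikami-Larkin-Nagaoka magnetoconductivity correction mentioned in the abstract is given below, using the common form Δσ(B) = α e²/(2π²ℏ)[ψ(1/2 + B_φ/B) − ln(B_φ/B)] with B_φ = ℏ/(4 e l_φ²). A hand-rolled digamma approximation keeps it dependency-free; the phase coherence length and the channel prefactor α = −1/2 are illustrative values, not the paper's fitted parameters.

```python
import math

def digamma(x):
    # Digamma via upward recurrence plus the asymptotic series (x > 0).
    r = 0.0
    while x < 6.0:
        r -= 1.0 / x
        x += 1.0
    inv2 = 1.0 / (x * x)
    return r + math.log(x) - 0.5 / x - inv2 * (1/12 - inv2 * (1/120 - inv2 / 252))

E = 1.602176634e-19      # elementary charge (C)
HBAR = 1.054571817e-34   # reduced Planck constant (J s)

def hln_correction(B, l_phi, alpha=-0.5):
    # HLN weak-antilocalization correction to the sheet conductivity (S),
    # for perpendicular field B (T) and phase coherence length l_phi (m).
    # alpha = -1/2 is the value expected for a single WAL channel.
    B_phi = HBAR / (4.0 * E * l_phi ** 2)
    return (alpha * E ** 2 / (2.0 * math.pi ** 2 * HBAR)
            * (digamma(0.5 + B_phi / B) - math.log(B_phi / B)))
```

    The correction vanishes as B → 0 and becomes increasingly negative with field, which is the cusp-like negative magnetoconductivity signature used to extract l_phi and the number of conduction channels.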

  6. Squeezing and its graphical representations in the anharmonic oscillator model

    NASA Astrophysics Data System (ADS)

    Tanaś, R.; Miranowicz, A.; Kielich, S.

    1991-04-01

    The problem of squeezing and its graphical representations in the anharmonic oscillator model is considered. Explicit formulas for squeezing, principal squeezing, and the quasiprobability distribution (QPD) function are given and illustrated graphically. Approximate analytical formulas for the variances, extremal variances, and QPD are obtained for the case of small nonlinearities and large numbers of photons. The possibility of almost perfect squeezing in the model is demonstrated and its graphical representations in the form of variance lemniscates and QPD contours are plotted. For large numbers of photons the crescent shape of the QPD contours is hardly visible and quite regular ellipses are obtained.

  7. Dynamics of molecules in extreme rotational states

    PubMed Central

    Yuan, Liwei; Teitelbaum, Samuel W.; Robinson, Allison; Mullin, Amy S.

    2011-01-01

    We have constructed an optical centrifuge with a pulse energy that is more than 2 orders of magnitude larger than previously reported instruments. This high pulse energy enables us to create large enough number densities of molecules in extreme rotational states to perform high-resolution state-resolved transient IR absorption measurements. Here we report the first studies of energy transfer dynamics involving molecules in extreme rotational states. In these studies, the optical centrifuge drives CO2 molecules into states with J ∼ 220 and we use transient IR probing to monitor the subsequent rotational, translational, and vibrational energy flow dynamics. The results reported here provide the first molecular insights into the relaxation of molecules with rotational energy that is comparable to that of a chemical bond.

  8. Is there an association between astrological data and personality?

    PubMed

    Hume, N

    1977-07-01

    A test was made of the hypothesis that personality characteristics can be predicted on the basis of various features of the individual's astrological chart. Astrological charts were prepared for 196 college-age Ss who also were administered the MMPI and the Leary Interpersonal Check List. Ss were divided into those who had extreme scores on any of the 13 personality variables studied and those who did not. For each personality variable, comparisons were made on a large number of astrological dimensions between the distributions of Ss with and without extreme test scores. Six hundred thirty-two such comparisons were made and evaluated with chi-square tests. Since the obtained number of statistically significant chi-square values was less than would be expected on a chance basis, the hypothesis was rejected.
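
    The chance-level argument in this abstract can be made concrete: with 632 tests at a significance level of 0.05, a substantial number of spuriously "significant" chi-squares is expected under the null hypothesis. A quick back-of-the-envelope calculation (treating the tests as independent, which is an idealization):

```python
import math

# Under the null, each of the 632 chi-square comparisons has a 5% chance
# of reaching significance at alpha = 0.05, so the count of significant
# results is approximately Binomial(632, 0.05).
n_tests, alpha = 632, 0.05
expected = n_tests * alpha                       # mean spurious hits: 31.6
sd = math.sqrt(n_tests * alpha * (1 - alpha))    # binomial standard deviation
```

    An observed count at or below roughly 31.6 significant tests is thus fully consistent with chance, which is exactly the basis on which the hypothesis was rejected.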

  9. Fast detection of the fuzzy communities based on leader-driven algorithm

    NASA Astrophysics Data System (ADS)

    Fang, Changjian; Mu, Dejun; Deng, Zhenghong; Hu, Jun; Yi, Chen-He

    2018-03-01

    In this paper, we present the leader-driven algorithm (LDA) for learning community structure in networks. The algorithm allows one to find overlapping clusters in a network, an important aspect of real networks, especially social networks. The algorithm requires no input parameters and learns the number of clusters naturally from the network. It accomplishes this using leadership centrality in a clever manner. It identifies local minima of leadership centrality as followers which belong only to one cluster, and the remaining nodes are leaders which connect clusters. In this way, the number of clusters can be learned using only the network structure. The LDA is also an extremely fast algorithm, having runtime linear in the network size. Thus, this algorithm can be used to efficiently cluster extremely large networks.
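
    The leader/follower split described in the abstract can be sketched mechanically. The sketch below is a hedged illustration only: it uses degree as a crude stand-in for the paper's (unspecified) leadership centrality, marks local minima of centrality as followers, and does not reproduce the full LDA clustering step. The toy network is invented.

```python
# Toy undirected network as adjacency lists (invented for illustration).
graph = {
    'a': ['b', 'c'], 'b': ['a', 'c'], 'c': ['a', 'b', 'd'],
    'd': ['c', 'e'], 'e': ['d', 'f', 'g'], 'f': ['e', 'g'], 'g': ['e', 'f'],
}

# Degree centrality as a stand-in for leadership centrality.
centrality = {n: len(nbrs) for n, nbrs in graph.items()}

# A node is a "follower" if no neighbor has lower centrality (local minimum);
# all remaining nodes are "leaders", from which clusters would be grown.
followers = {n for n in graph
             if all(centrality[n] <= centrality[m] for m in graph[n])}
leaders = set(graph) - followers
```

    On this toy graph the two triangle hubs come out as leaders, and the cluster count then emerges from the network structure alone, with no input parameters, which is the property the abstract emphasizes.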

  10. Very high-resolution spectroscopy for extremely large telescopes using pupil slicing and adaptive optics.

    PubMed

    Beckers, Jacques M; Andersen, Torben E; Owner-Petersen, Mette

    2007-03-05

    Under seeing-limited conditions, very high resolution spectroscopy becomes very difficult for extremely large telescopes (ELTs). Using adaptive optics (AO), the stellar image size decreases in proportion to the telescope diameter. This makes the spectrograph optics, and hence its resolution, independent of the telescope diameter. However, AO for use with ELTs at visible wavelengths requires deformable mirrors with many elements, which are not likely to be available for quite some time. We propose to use the pupil slicing technique to create a number of sub-pupils, each of which has its own deformable mirror. The images from all sub-pupils are combined incoherently, with a diameter corresponding to the diffraction limit of the sub-pupil. The technique is referred to as "Pupil Slicing Adaptive Optics" or PSAO.

  11. A fast time-difference inverse solver for 3D EIT with application to lung imaging.

    PubMed

    Javaherian, Ashkan; Soleimani, Manuchehr; Moeller, Knut

    2016-08-01

    A class of sparse optimization techniques that require solely matrix-vector products, rather than explicit access to the forward matrix and its transpose, has received much attention in the recent decade for dealing with large-scale inverse problems. This study tailors the application of the so-called Gradient Projection for Sparse Reconstruction (GPSR) to large-scale time-difference three-dimensional electrical impedance tomography (3D EIT). 3D EIT typically suffers from the need for a large number of voxels to cover the whole domain, so its application to real-time imaging, for example monitoring of lung function, remains scarce, since the large number of degrees of freedom of the problem greatly increases storage space and reconstruction time. This study shows the great potential of the GPSR for large-size time-difference 3D EIT. Further studies are needed to improve its accuracy for imaging small-size anomalies.
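
    The matrix-vector-products-only property the abstract highlights can be illustrated with a toy projected-gradient solver in the spirit of GPSR: minimize ½‖Ax − b‖² + τ‖x‖₁ by splitting x = u − v with u, v ≥ 0 and projecting each gradient step onto the nonnegative orthant. This is a hedged sketch, not the paper's EIT implementation; the matrix, step size, and regularization weight are arbitrary illustrative choices.

```python
def matvec(A, x):
    # A @ x (product with the forward matrix).
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def rmatvec(A, y):
    # A^T @ y (product with the transpose).
    return [sum(A[i][j] * y[i] for i in range(len(A))) for j in range(len(A[0]))]

def gpsr(A, b, tau=0.1, step=0.05, iters=500):
    # GPSR-style projected gradient: only matvec/rmatvec touch A.
    n = len(A[0])
    u = [0.0] * n
    v = [0.0] * n
    for _ in range(iters):
        x = [ui - vi for ui, vi in zip(u, v)]
        r = [yi - bi for yi, bi in zip(matvec(A, x), b)]   # residual A x - b
        g = rmatvec(A, r)                 # gradient of the quadratic term
        u = [max(0.0, ui - step * (gi + tau)) for ui, gi in zip(u, g)]
        v = [max(0.0, vi - step * (-gi + tau)) for vi, gi in zip(v, g)]
    return [ui - vi for ui, vi in zip(u, v)]

# Tiny well-conditioned toy system consistent with a sparse solution ~(0.9, 0, 0).
A = [[1.0, 0.0, 0.2],
     [0.0, 1.0, 0.1],
     [0.1, 0.0, 1.0]]
b = [1.0, 0.0, 0.0]
x = gpsr(A, b)
```

    In a real 3D EIT problem A is never formed explicitly; the two product routines would be replaced by forward and adjoint operator evaluations, which is what makes the approach viable at large voxel counts.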

  12. Towards large scale multi-target tracking

    NASA Astrophysics Data System (ADS)

    Vo, Ba-Ngu; Vo, Ba-Tuong; Reuter, Stephan; Lam, Quang; Dietmayer, Klaus

    2014-06-01

    Multi-target tracking is intrinsically an NP-hard problem, and the complexity of multi-target tracking solutions usually does not scale gracefully with problem size. Multi-target tracking for on-line applications involving a large number of targets is extremely challenging. This article demonstrates the capability of the random finite set approach to provide large scale multi-target tracking algorithms. In particular, it is shown that an approximate filter known as the labeled multi-Bernoulli filter can simultaneously track one thousand five hundred targets in clutter on a standard laptop computer.

  13. Gas-Centered Swirl Coaxial Liquid Injector Evaluations

    NASA Technical Reports Server (NTRS)

    Cohn, A. K.; Strakey, P. A.; Talley, D. G.

    2005-01-01

    Development of liquid rocket engines is expensive, and extensive testing at large scales is usually required; verifying engine lifetime demands a large number of tests, while only limited resources are available for development. Sub-scale cold-flow and hot-fire testing is extremely cost effective and could provide a necessary (but not sufficient) condition for long engine lifetime, reducing the overall costs and risk of large-scale testing. The goal of this work is to determine what knowledge can be gained from sub-scale cold-flow and hot-fire evaluations of LRE injectors, and to determine the relationships between cold-flow and hot-fire data.

  14. Evaluation of Genetic Algorithm Concepts Using Model Problems. Part 2; Multi-Objective Optimization

    NASA Technical Reports Server (NTRS)

    Holst, Terry L.; Pulliam, Thomas H.

    2003-01-01

    A genetic algorithm approach suitable for solving multi-objective optimization problems is described and evaluated using a series of simple model problems. Several new features, including a binning selection algorithm and a gene-space transformation procedure, are included. The genetic algorithm is suitable for finding Pareto optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. Results indicate that the genetic algorithm optimization approach is flexible in application and extremely reliable, providing optimal results for all optimization problems attempted. The binning algorithm generally provides Pareto front quality enhancements and moderate convergence efficiency improvements for most of the model problems. The gene-space transformation procedure provides a large convergence efficiency enhancement for problems with non-convoluted Pareto fronts and a degradation in efficiency for problems with convoluted Pareto fronts. The most difficult problems (multi-mode search spaces with a large number of genes and convoluted Pareto fronts) require a large number of function evaluations for GA convergence, but always converge.
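The Pareto-optimality criterion that such a selection step operates on can be made concrete with a short, generic sketch (minimization convention; this is an illustrative helper, not the authors' implementation):

```python
def pareto_front(points):
    """Return the non-dominated subset of a list of objective tuples
    (minimization): q dominates p if q <= p in every objective and
    q < p in at least one."""
    front = []
    for p in points:
        dominated = any(
            all(qi <= pi for qi, pi in zip(q, p)) and
            any(qi < pi for qi, pi in zip(q, p))
            for q in points if q is not p)
        if not dominated:
            front.append(p)
    return front
```

A multi-objective GA keeps and refines exactly this non-dominated set generation after generation; binning strategies then spread the retained points along the front.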

  15. Risk and dynamics of unprecedented hot months in South East China

    NASA Astrophysics Data System (ADS)

    Thompson, Vikki; Dunstone, Nick J.; Scaife, Adam A.; Smith, Doug M.; Hardiman, Steven C.; Ren, Hong-Li; Lu, Bo; Belcher, Stephen E.

    2018-06-01

    The Yangtze region of South East China has experienced several extreme hot summer months in recent years. Such events can have devastating socio-economic impacts. We use a large ensemble of initialised climate simulations to assess the current chance of unprecedented hot summer months in the Yangtze River region. We find a 10% chance of an unprecedented hot summer month each year. Our simulations suggest that monthly mean temperatures up to 3 °C hotter than the current record are possible. The dynamics of these unprecedented extremes highlights the occurrence of a stationary atmospheric wave, the Silk Road Pattern, in a significant number of extreme hot events. We present evidence that this atmospheric wave is driven by variability in the Indian summer monsoon. Other extreme events are associated with a westward shift in the western North Pacific subtropical high. The most extreme simulated events exhibit combined characteristics of both the Silk Road Pattern and the shifted western North Pacific subtropical high.

  16. The University in Crisis.

    ERIC Educational Resources Information Center

    Abram, Morris B.

    The university reflects the revolution in the world. Large numbers of "find out" students are not goal oriented and are affected by malaise; many approve of the use of violence in certain situations. Part of the revolution must be accepted and part rejected. The university is extremely vulnerable to violence and, unless it is contained, American…

  17. The Role of Social Context in Terrorist Attacks

    ERIC Educational Resources Information Center

    Argo, Nichole

    2006-01-01

    An increasing number of studies on suicide bombing suggest that terrorism is not necessarily bound to religious extremism. The authors of this body of work, primarily drawn from political science and social psychology, agree that suicide bombings, with or without the trappings of religion, are largely a response to occupation, or, since September…

  18. One dimensional Linescan x-ray detection of pits in fresh cherries

    USDA-ARS?s Scientific Manuscript database

    The presence of pits in processed cherries is a concern for both processors and consumers, in many cases causing injury and potential lawsuits. While machines used for pitting cherries are extremely efficient, if one or more plungers in a pitting head become misaligned, a large number of pits may p...

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chow, Edmond

    Solving sparse problems is at the core of many DOE computational science applications. We focus on the challenge of developing sparse algorithms that can fully exploit the parallelism in extreme-scale computing systems, in particular systems with massive numbers of cores per node. Our approach is to express a sparse matrix factorization as a large number of bilinear constraint equations, and then to solve these equations via an asynchronous iterative method. The unknowns in these equations are the entries of the desired factorization.
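A minimal sequential sketch of that idea treats the entries of an LU factorization as the unknowns of the bilinear equations (LU)_ij = a_ij and sweeps over them one at a time; in the asynchronous method the updates would run concurrently without ordering guarantees. The code below is a dense, single-threaded illustration under those assumptions, not the actual DOE software:

```python
import numpy as np

def fixed_point_lu(A, sweeps=5):
    """Compute unit-lower L and upper U with L @ U ~= A by repeatedly
    solving each bilinear constraint (LU)_{ij} = a_{ij} for one entry,
    using only the current values of the other entries. Because each
    update is local, the sweeps could in principle run asynchronously
    in parallel; here they run as plain nested loops."""
    n = A.shape[0]
    L = np.eye(n)
    U = np.triu(A).astype(float).copy()   # initial guess
    for _ in range(sweeps):
        for i in range(n):
            for j in range(n):
                s = sum(L[i, k] * U[k, j] for k in range(min(i, j)))
                if i > j:
                    L[i, j] = (A[i, j] - s) / U[j, j]   # strictly lower entry
                else:
                    U[i, j] = A[i, j] - s               # upper entry
    return L, U
```

With this particular sequential ordering one sweep already reproduces Gaussian elimination; the point of the formulation is that arbitrary (asynchronous) orderings still converge toward the same fixed point for well-behaved matrices.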

  20. Do regional methods really help reduce uncertainties in flood frequency analyses?

    NASA Astrophysics Data System (ADS)

    Cong Nguyen, Chi; Payrastre, Olivier; Gaume, Eric

    2013-04-01

    Flood frequency analyses are often based on continuous measured series at gauge sites. However, the length of the available data sets is usually too short to provide reliable estimates of extreme design floods. To reduce the estimation uncertainties, the analyzed data sets have to be extended either in time, making use of historical and paleoflood data, or in space, merging data sets considered statistically homogeneous to build large regional data samples. Nevertheless, the advantage of regional analyses, the important increase in the size of the studied data sets, may be counterbalanced by possible heterogeneities of the merged sets. The application and comparison of four different flood frequency analysis methods in two regions affected by flash floods in the south of France (Ardèche and Var) illustrates how this balance between the number of records and possible heterogeneities plays out in real-world applications. The four tested methods are: (1) a local statistical analysis based on the existing series of measured discharges, (2) a local analysis that also exploits the existing information on historical floods, (3) a standard regional flood frequency analysis based on existing measured series at gauged sites, and (4) a modified regional analysis including estimated extreme peak discharges at ungauged sites. Monte Carlo simulations are conducted to generate a large number of discharge series with characteristics similar to the observed ones (type of statistical distribution, number of sites and records) in order to evaluate to what extent the results obtained in these case studies can be generalized. The two case studies indicate that even small statistical heterogeneities, which are not detected by the standard homogeneity tests implemented in regional flood frequency studies, may drastically limit the usefulness of such approaches. On the other hand, the results show that exploiting information on extreme events, either historical flood events at gauged sites or estimated extremes at ungauged sites in the considered region, is an efficient way to reduce uncertainties in flood frequency studies.
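The flavor of such a Monte Carlo comparison can be sketched for an idealized, perfectly homogeneous region: pooling identically distributed annual-maximum series across sites shrinks the error of a 100-year quantile estimate relative to a single short record. Real heterogeneity, the paper's focus, erodes exactly this benefit. All distributions, parameter values, and names below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
EULER = 0.5772156649  # Euler-Mascheroni constant

def gumbel_quantile(loc, scale, T):
    # T-year return level of a Gumbel(loc, scale) distribution
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

def fit_gumbel(x):
    # Method-of-moments estimates of (loc, scale)
    scale = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    return x.mean() - EULER * scale, scale

loc, scale, T = 100.0, 30.0, 100
n_sites, n_years, n_reps = 15, 40, 500
true_q = gumbel_quantile(loc, scale, T)

local_se, regional_se = [], []
for _ in range(n_reps):
    # Identically distributed annual maxima at every site (homogeneous region)
    data = loc + scale * rng.gumbel(size=(n_sites, n_years))
    l_loc, l_sc = fit_gumbel(data[0])        # local: one short gauged series
    local_se.append((gumbel_quantile(l_loc, l_sc, T) - true_q) ** 2)
    r_loc, r_sc = fit_gumbel(data.ravel())   # regional: pooled sample
    regional_se.append((gumbel_quantile(r_loc, r_sc, T) - true_q) ** 2)

rmse_local = float(np.sqrt(np.mean(local_se)))
rmse_regional = float(np.sqrt(np.mean(regional_se)))
```

Under perfect homogeneity the pooled estimate is markedly more accurate; injecting small between-site differences in `loc` or `scale` is the natural way to reproduce the paper's cautionary result.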

  1. Some issues related to the novel spectral acceleration method for the fast computation of radiation/scattering from one-dimensional extremely large scale quasi-planar structures

    NASA Astrophysics Data System (ADS)

    Torrungrueng, Danai; Johnson, Joel T.; Chou, Hsi-Tseng

    2002-03-01

    The novel spectral acceleration (NSA) algorithm has been shown to produce an O(Ntot) efficient iterative method of moments for the computation of radiation/scattering from both one-dimensional (1-D) and two-dimensional large-scale quasi-planar structures, where Ntot is the total number of unknowns to be solved. This method accelerates the matrix-vector multiplication in an iterative method of moments solution and divides contributions between points into "strong" (exact matrix elements) and "weak" (NSA algorithm) regions. The NSA method is based on a spectral representation of the electromagnetic Green's function and appropriate contour deformation, resulting in a fast multipole-like formulation in which contributions from large numbers of points to a single point are evaluated simultaneously. In the standard NSA algorithm the NSA parameters are derived on the basis of the assumption that the outermost possible saddle point, φ_s,max, along the real axis in the complex angular domain is small. For given height variations of quasi-planar structures, this assumption can be satisfied by adjusting the size of the strong region Ls. However, for quasi-planar structures with large height variations, the adjusted size of the strong region is typically large, resulting in significant increases in computational time for the computation of the strong-region contribution and degrading the overall efficiency of the NSA algorithm. In addition, for the case of extremely large scale structures, studies based on the physical optics approximation and a flat-surface assumption show that the given NSA parameters in the standard NSA algorithm may yield inaccurate results. In this paper, analytical formulas associated with the NSA parameters for an arbitrary value of φ_s,max are presented, resulting in more flexibility in selecting Ls to compromise between the computation of the contributions of the strong and weak regions. In addition, a "multilevel" algorithm, which decomposes 1-D extremely large scale quasi-planar structures into more than one weak region and appropriately chooses the NSA parameters for each weak region, is incorporated into the original NSA method to improve its accuracy.

  2. HAlign-II: efficient ultra-large multiple sequence alignment and phylogenetic tree reconstruction with distributed and parallel computing.

    PubMed

    Wan, Shixiang; Zou, Quan

    2017-01-01

    Multiple sequence alignment (MSA) plays a key role in biological sequence analyses, especially in phylogenetic tree construction. The extreme growth of next-generation sequencing output has created a shortage of efficient ultra-large biological sequence alignment approaches that can cope with different sequence types. Distributed and parallel computing represents a crucial technique for accelerating ultra-large (e.g. files of more than 1 GB) sequence analyses. Based on HAlign and the Spark distributed computing system, we implement a highly cost-efficient and time-efficient HAlign-II tool to address ultra-large multiple biological sequence alignment and phylogenetic tree construction. Experiments on large-scale DNA and protein data sets, with files of more than 1 GB, showed that HAlign-II saves both time and space and outperforms current software tools. HAlign-II can efficiently carry out MSA and construct phylogenetic trees with ultra-large numbers of biological sequences, shows extremely high memory efficiency, and scales well with increases in computing resources. HAlign-II provides a user-friendly web server based on our distributed computing infrastructure. HAlign-II with open-source code and datasets is available at http://lab.malab.cn/soft/halign.

  3. Climate extremes in urban area and their impact on human health: the summer heat waves

    NASA Astrophysics Data System (ADS)

    Baldi, Marina

    2014-05-01

    In the period 1951-2012 the average global land and ocean temperature increased by approximately 0.72°C [0.49-0.89] when described by a linear trend, and it is projected to increase rapidly. Each of the past three decades has been warmer than all previous decades, with the 2000s the warmest; since 1880, nine of the ten warmest years have occurred in the 21st century, the only exception being 1998, which was warmed by the strongest El Niño event of the past century. In parallel, an increase in the frequency and intensity of extremely hot days is detected, with differences at different scales, which represents a health risk especially in densely populated areas, as documented for several regions of the world including the Euro-Mediterranean region. Although it is still under discussion whether heat wave episodes are a direct result of the warming of the lower troposphere or, more likely, a regional climate event, heat episodes have been studied in order to define their correlation with large-scale atmospheric patterns and with changes in the regional circulation. Whatever the causes and the spatio-temporal extension of the episodes, epidemiological studies show that these conditions pose increasing health risks, inducing heat-related diseases including hyperthermia and heat stress as well as cardiovascular and respiratory illnesses in susceptible individuals, with a significant increase in morbidity and mortality especially in densely populated urban areas. In several Mediterranean cities, peaks of mortality associated with extremely high temperature (with simultaneously high humidity levels) have been documented, showing that in some cases daily mortality greatly exceeded the average for the period. The number of fatalities during the summer 2003 heat wave in Europe was estimated at somewhere between 22,000 and 50,000. In the same summer it was also unusually hot across much of Asia, and Shanghai, which is particularly prone to heat waves, recorded its hottest summer in over 50 years. During that event the maximum number of daily deaths was 317, 42% above the non-heat-day average, even though a heat warning system was in operation. In this study, results from the analysis of heat wave events in Italian cities are presented. Indices representative of extremely hot conditions are taken into account, including the number of summer days (SU), the number of tropical nights (TR), the maxima and minima of daily maximum and minimum temperatures (TXx, TXn, TNx, TNn, respectively), and exceedances over fixed thresholds. Results show a clear increase over the past decades in the number of days affected by heat events. Some considerations are also presented about the impact on human health of the longest events that occurred in the country.

  4. The influence of mid-latitude storm tracks on hot, cold, dry and wet extremes

    PubMed Central

    Lehmann, Jascha; Coumou, Dim

    2015-01-01

    Changes in mid-latitude circulation can strongly affect the number and intensity of extreme weather events. In particular, high-amplitude quasi-stationary planetary waves have been linked to prolonged weather extremes at the surface. In contrast, analyses of fast-traveling synoptic-scale waves and their direct influence on heat and cold extremes are scarce though changes in such waves have been detected and are projected for the 21st century. Here we apply regression analyses of synoptic activity with surface temperature and precipitation in monthly gridded observational data. We show that over large parts of mid-latitude continental regions, summer heat extremes are associated with low storm track activity. In winter, the occurrence of cold spells is related to low storm track activity over parts of eastern North America, Europe, and central- to eastern Asia. Storm tracks thus have a moderating effect on continental temperatures. Pronounced storm track activity favors monthly rainfall extremes throughout the year, whereas dry spells are associated with a lack thereof. Trend analyses reveal significant regional changes in recent decades favoring the occurrence of cold spells in the eastern US, droughts in California and heat extremes over Eurasia. PMID:26657163

  5. Evaluation of extreme temperature events in northern Spain based on process control charts

    NASA Astrophysics Data System (ADS)

    Villeta, M.; Valencia, J. L.; Saá, A.; Tarquis, A. M.

    2018-02-01

    Extreme climate events have recently attracted the attention of a growing number of researchers because these events impose a large cost on agriculture and associated insurance planning. This study focuses on extreme temperature events and proposes a new method for their evaluation based on statistical process control tools, which are unusual in climate studies. Series of minimum and maximum daily temperatures for 12 geographical areas of a Spanish region between 1931 and 2009 were evaluated by applying statistical process control charts to test whether evidence existed for an increase or a decrease in extreme temperature events. Specification limits were determined for each geographical area and used to define four types of extreme anomalies: lower and upper extremes for the minimum and maximum anomalies. A new binomial Markov extended process that accounts for the autocorrelation between extreme temperature events was generated for each geographical area and extreme anomaly type to establish the attribute control charts for the annual fraction of extreme days and to monitor the occurrence of annual extreme days. This method was used to assess the significance of changes and trends of extreme temperature events in the analysed region. The results demonstrate the effectiveness of an attribute control chart for evaluating extreme temperature events. For example, the evaluation of extreme maximum temperature events using the proposed statistical process control charts was consistent with the evidence of an increase in maximum temperatures during the last decades of the last century.
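A bare-bones version of an attribute (p-type) control chart for the annual fraction of extreme days might look like the sketch below. It omits the binomial Markov extension the authors introduce to handle autocorrelation between extreme days, and all names and thresholds are illustrative:

```python
import numpy as np

def p_chart_limits(p_bar, n):
    """3-sigma control limits for a fraction of 'extreme' days, given a
    baseline mean fraction p_bar and n observations (days) per year."""
    sigma = np.sqrt(p_bar * (1.0 - p_bar) / n)
    return max(0.0, p_bar - 3.0 * sigma), min(1.0, p_bar + 3.0 * sigma)

def out_of_control_years(fractions, n, baseline):
    """Flag (by index) years whose annual fraction of extreme days falls
    outside control limits derived from a baseline period."""
    p_bar = float(np.mean(baseline))
    lcl, ucl = p_chart_limits(p_bar, n)
    return [i for i, p in enumerate(fractions) if p < lcl or p > ucl]
```

Years flagged above the upper limit would then be read as evidence of a significant increase in extreme days relative to the baseline, which is the logic the abstract describes.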

  6. Effect of steady and time-harmonic magnetic fields on macrosegregation in alloy solidification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Incropera, F.P.; Prescott, P.J.

    Buoyancy-induced convection during the solidification of alloys can contribute significantly to the redistribution of alloy constituents, thereby creating large composition gradients in the final ingot. Termed macrosegregation, the condition diminishes the quality of the casting and, in the extreme, may require that the casting be remelted. The deleterious effects of buoyancy-driven flows may be suppressed through application of an external magnetic field, and in this study the effects of both steady and time-harmonic fields have been considered. For a steady magnetic field, extremely large field strengths would be required to effectively dampen convection patterns that contribute to macrosegregation. However, by reducing spatial variations in temperature and composition, turbulent mixing induced by a time-harmonic field reduces the number and severity of segregates in the final casting.

  7. Urban Form and Extreme Heat Events: Are Sprawling Cities More Vulnerable to Climate Change Than Compact Cities?

    PubMed Central

    Stone, Brian; Hess, Jeremy J.; Frumkin, Howard

    2010-01-01

    Background Extreme heat events (EHEs) are increasing in frequency in large U.S. cities and are responsible for a greater annual number of climate-related fatalities, on average, than any other form of extreme weather. In addition, low-density, sprawling patterns of urban development have been associated with enhanced surface temperatures in urbanized areas. Objectives In this study, we examined the association between urban form at the level of the metropolitan region and the frequency of EHEs over a five-decade period. Methods We employed a widely published sprawl index to measure the association between urban form in 2000 and the mean annual rate of change in EHEs between 1956 and 2005. Results We found that the rate of increase in the annual number of EHEs between 1956 and 2005 in the most sprawling metropolitan regions was more than double the rate of increase observed in the most compact metropolitan regions. Conclusions The design and management of land use in metropolitan regions may offer an important tool for adapting to the heat-related health effects associated with ongoing climate change. PMID:21114000

  8. Extreme value statistics and finite-size scaling at the ecological extinction/laminar-turbulence transition

    NASA Astrophysics Data System (ADS)

    Shih, Hong-Yan; Goldenfeld, Nigel

    Experiments on transitional turbulence in pipe flow seem to show that turbulence is a transient metastable state since the measured mean lifetime of turbulence puffs does not diverge asymptotically at a critical Reynolds number. Yet measurements reveal that the lifetime scales with Reynolds number in a super-exponential way reminiscent of extreme value statistics, and simulations and experiments in Couette and channel flow exhibit directed percolation type scaling phenomena near a well-defined transition. This universality class arises from the interplay between small-scale turbulence and a large-scale collective zonal flow, which exhibit predator-prey behavior. Why is asymptotically divergent behavior not observed? Using directed percolation and a stochastic individual level model of predator-prey dynamics related to transitional turbulence, we investigate the relation between extreme value statistics and power law critical behavior, and show that the paradox is resolved by carefully defining what is measured in the experiments. We theoretically derive the super-exponential scaling law, and using finite-size scaling, show how the same data can give both super-exponential behavior and power-law critical scaling.

  9. An emerging population of BL Lacs with extreme properties: towards a class of EBL and cosmic magnetic field probes?

    NASA Astrophysics Data System (ADS)

    Bonnoli, G.; Tavecchio, F.; Ghisellini, G.; Sbarrato, T.

    2015-07-01

    High-energy observations of extreme BL Lac objects, such as 1ES 0229+200 or 1ES 0347-121, have recently attracted interest both for blazar and jet physics and for their implications for estimates of the extragalactic background light and the intergalactic magnetic field. However, the number of these extreme highly peaked BL Lac objects (EHBL) is still rather small. Aiming to increase their number, we selected a group of EHBL candidates starting from the BL Lac sample of Plotkin et al. (2011), considering those undetected (or only barely detected) by the Large Area Telescope onboard Fermi and characterized by a high X-ray versus radio flux ratio. We assembled the multiwavelength spectral energy distribution of the resulting nine sources, profiting from publicly available archival observations performed by the Swift, GALEX, and Fermi satellites, and confirmed their nature. Through a simple one-zone synchrotron self-Compton model we estimate the expected very high energy flux, finding that in the majority of cases it is within the reach of the present generation of Cherenkov arrays or of the forthcoming Cherenkov Telescope Array.

  10. Eating Disorder Symptomotology: The Role of Ethnic Identity in Caucasian and Hispanic College Women

    ERIC Educational Resources Information Center

    Avina, Vanessa

    2011-01-01

    A relatively large number of women on college campuses report experiencing eating afflictions. About 61% of college women indicated that they either occasionally or regularly used extreme measures to control their weight (Mintz & Betz, 1988). No clear consensus on the relative prevalence of eating disorder symptoms across ethnic groups has…

  11. The Family in the Structure of Values of Young People

    ERIC Educational Resources Information Center

    Rean, A. A.

    2018-01-01

    Despite the fact that the family is extremely significant in the system of values of young people (in Russia), the number of divorces is increasing in this population group. Our analysis of this contradiction establishes that young people need to be specially prepared for family life. The paper presents the results of a large empirical study…

  12. The two-phase method for finding a great number of eigenpairs of the symmetric or weakly non-symmetric large eigenvalue problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dul, F.A.; Arczewski, K.

    1994-03-01

    Although it has been stated that "an attempt to solve (very large problems) by subspace iterations seems futile," we will show that the statement is not true, especially for extremely large eigenproblems. In this paper a new two-phase subspace iteration/Rayleigh quotient/conjugate gradient method for generalized, large, symmetric eigenproblems Ax = λBx is presented. It is able to solve extremely large eigenproblems (N = 216,000, for example) and to find a large number of leftmost or rightmost eigenpairs, up to 1000 or more. Multiple eigenpairs, even those with multiplicity 100, can be found easily. The use of the proposed method for solving big full eigenproblems (N ≈ 10^3), as well as large weakly non-symmetric eigenproblems, has also been considered. The proposed method is fully iterative; thus the factorization of matrices is avoided. The key idea consists in joining two methods: subspace and Rayleigh quotient iterations. The systems of indefinite and almost singular linear equations (A − σB)x = By are solved iteratively; the conjugate gradient method can be used without danger of breaking down, owing to a property that may be called "self-correction towards the eigenvector," discovered recently by us. The use of various preconditioners (SSOR and IC) has also been considered. The main features of the proposed method are analyzed in detail. Comparisons with other methods, such as accelerated subspace iteration, Lanczos, Davidson, TLIME, TRACMN, and SRQMCG, are presented. The results of numerical tests for various physical problems (acoustics, vibrations of structures, quantum chemistry) are presented as well. 40 refs., 12 figs., 2 tabs.

  13. Unidirectional trends in annual and seasonal climate and extremes in Egypt

    NASA Astrophysics Data System (ADS)

    Nashwan, Mohamed Salem; Shahid, Shamsuddin; Abd Rahim, Norhan

    2018-05-01

    The presence of short- and long-term autocorrelation can lead to considerable change in the significance of trends in hydro-climatic time series. Therefore, past findings of climatic trend studies that did not consider autocorrelation have become questionable. The spatial patterns in the trends of annual and seasonal temperature, rainfall, and related extremes in Egypt are assessed in this paper using the modified Mann-Kendall (MMK) trend test, which can detect unidirectional trends in time series in the presence of short- and long-term autocorrelation. The trends obtained using the MMK test were compared with those obtained using the standard Mann-Kendall (MK) test to show how natural variability in climate affects the trends. The daily rainfall and temperature data of the Princeton Global Meteorological Forcing for the period 1948-2010, having a spatial resolution of 0.25° × 0.25°, were used for this purpose. The results showed a large difference between the trends obtained using the MMK and MK tests. The MMK test showed increasing trends in temperature and a number of temperature extremes in Egypt, but almost no change in rainfall and rainfall extremes. The minimum temperature was found to increase (0.08-0.29 °C/decade) much faster than the maximum temperature (0.07-0.24 °C/decade), and therefore the diurnal temperature range is decreasing (-0.01 to -0.16 °C/decade) in most parts of Egypt. The numbers of winter hot days and nights are increasing, while the number of cold days is decreasing in most parts of the country. The study provides a more realistic scenario of the changes in climate and weather extremes of Egypt.
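For reference, the classical Mann-Kendall statistic that the MMK test builds on can be sketched as follows; the MMK correction additionally rescales var(S) for short- and long-term autocorrelation, which is not implemented here:

```python
import math

def mann_kendall_z(x):
    """Standard Mann-Kendall trend statistic Z for a series x (no-ties
    variance formula). |Z| > 1.96 indicates a significant monotonic
    trend at the 5% level, positive Z an upward trend."""
    n = len(x)
    # S counts concordant minus discordant pairs over all i < j
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0
```

Because autocorrelation inflates the effective value of var(S), applying this uncorrected Z to persistent climate series overstates significance, which is exactly the issue the abstract raises.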

  14. Prospectus: towards the development of high-fidelity models of wall turbulence at large Reynolds number

    NASA Astrophysics Data System (ADS)

    Klewicki, J. C.; Chini, G. P.; Gibson, J. F.

    2017-03-01

    Recent and on-going advances in mathematical methods and analysis techniques, coupled with the experimental and computational capacity to capture detailed flow structure at increasingly large Reynolds numbers, afford an unprecedented opportunity to develop realistic models of high Reynolds number turbulent wall-flow dynamics. A distinctive attribute of this new generation of models is their grounding in the Navier-Stokes equations. By adhering to this challenging constraint, high-fidelity models ultimately can be developed that not only predict flow properties at high Reynolds numbers, but that possess a mathematical structure that faithfully captures the underlying flow physics. These first-principles models are needed, for example, to reliably manipulate flow behaviours at extreme Reynolds numbers. This theme issue of Philosophical Transactions of the Royal Society A provides a selection of contributions from the community of researchers who are working towards the development of such models. Broadly speaking, the research topics represented herein report on dynamical structure, mechanisms and transport; scale interactions and self-similarity; model reductions that restrict nonlinear interactions; and modern asymptotic theories. In this prospectus, the challenges associated with modelling turbulent wall-flows at large Reynolds numbers are briefly outlined, and the connections between the contributing papers are highlighted.

  15. Climate Variability and Weather Extremes: Model-Simulated and Historical Data. Chapter 9

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried D.; Lim, Young-Kwon

    2012-01-01

    Extremes in weather and climate encompass a wide array of phenomena including tropical storms, mesoscale convective systems, snowstorms, floods, heat waves, and drought. Understanding how such extremes might change in the future requires an understanding of their past behavior including their connections to large-scale climate variability and trends. Previous studies suggest that the most robust findings concerning changes in short-term extremes are those that can be most directly (though not completely) tied to the increase in the global mean temperatures. These include the findings that (IPCC 2007): There has been a widespread reduction in the number of frost days in mid-latitude regions in recent decades, an increase in the number of warm extremes, particularly warm nights, and a reduction in the number of cold extremes, particularly cold nights. For North America in particular (CCSP SAP 3.3, 2008): There are fewer unusually cold days during the last few decades. The last 10 years have seen a lower number of severe cold waves than for any other 10-year period in the historical record that dates back to 1895. There has been a decrease in the number of frost days and a lengthening of the frost-free season, particularly in the western part of North America. Other aspects of extremes such as the changes in storminess have a less clear signature of long term change, with considerable interannual, and decadal variability that can obscure any climate change signal. Nevertheless, regarding extratropical storms (CCSP SAP 3.3, 2008): The balance of evidence suggests that there has been a northward shift in the tracks of strong low pressure systems (storms) in both the North Atlantic and North Pacific basins. For North America: Regional analyses suggest that there has been a decrease in snowstorms in the South and lower Midwest of the United States, and an increase in snowstorms in the upper Midwest and Northeast. 
Despite the progress already made, our understanding of the basic mechanisms by which extremes vary is incomplete. As noted in IPCC (2007), incomplete global data sets and remaining model uncertainties still restrict understanding of changes in extremes and attribution of changes to causes, although understanding of changes in the intensity, frequency and risk of extremes has improved. Separating decadal and other shorter-term variability from climate change impacts on extremes requires a better understanding of the processes responsible for the changes. In particular, the physical processes linking sea surface temperature changes to regional climate changes, and a basic understanding of the inherent variability in weather extremes and how that is impacted by atmospheric circulation changes at subseasonal to decadal and longer time scales, are still inadequately understood. Given the fundamental limitations in the time span and quality of global observations, substantial progress on these issues will rely increasingly on improvements in models, with observations continuing to play a critical role, though less as a detection tool and more as a tool for addressing physical processes and for ensuring the quality of the climate models and the verisimilitude of the simulations (CCSP SAP 1.3, 2008).

  16. Prospectus: towards the development of high-fidelity models of wall turbulence at large Reynolds number.

    PubMed

    Klewicki, J C; Chini, G P; Gibson, J F

    2017-03-13

    Recent and on-going advances in mathematical methods and analysis techniques, coupled with the experimental and computational capacity to capture detailed flow structure at increasingly large Reynolds numbers, afford an unprecedented opportunity to develop realistic models of high Reynolds number turbulent wall-flow dynamics. A distinctive attribute of this new generation of models is their grounding in the Navier-Stokes equations. By adhering to this challenging constraint, high-fidelity models ultimately can be developed that not only predict flow properties at high Reynolds numbers, but that possess a mathematical structure that faithfully captures the underlying flow physics. These first-principles models are needed, for example, to reliably manipulate flow behaviours at extreme Reynolds numbers. This theme issue of Philosophical Transactions of the Royal Society A provides a selection of contributions from the community of researchers who are working towards the development of such models. Broadly speaking, the research topics represented herein report on dynamical structure, mechanisms and transport; scale interactions and self-similarity; model reductions that restrict nonlinear interactions; and modern asymptotic theories. In this prospectus, the challenges associated with modelling turbulent wall-flows at large Reynolds numbers are briefly outlined, and the connections between the contributing papers are highlighted. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).

  17. Viscous versus inviscid exact coherent states in high Reynolds number wall flows

    NASA Astrophysics Data System (ADS)

    Montemuro, Brandon; Klewicki, Joe; White, Chris; Chini, Greg

    2017-11-01

    Streamwise-averaged motions consisting of streamwise-oriented streaks and vortices are key components of exact coherent states (ECS) arising in incompressible wall-bounded shear flows. These invariant solutions are believed to provide a scaffold in phase space for the turbulent dynamics realized at large Reynolds number Re . Nevertheless, many ECS, including upper-branch states, have a large- Re asymptotic structure in which the effective Reynolds number governing the streak and roll dynamics is order unity. Although these viscous ECS very likely play a role in the dynamics of the near-wall region, they cannot be relevant to the inertial layer, where the leading-order mean dynamics are known to be inviscid. In particular, viscous ECS cannot account for the observed regions of quasi-uniform streamwise momentum and interlaced internal shear layers (or `vortical fissures') within the inertial layer. In this work, a large- Re asymptotic analysis is performed to extend the existing self-sustaining-process/vortex-wave-interaction theory to account for largely inviscid ECS. The analysis highlights feedback mechanisms between the fissures and uniform momentum zones that can enable their self-sustenance at extreme Reynolds number. NSF CBET Award 1437851.

  18. Multi-floor cascading ferroelectric nanostructures: multiple data writing-based multi-level non-volatile memory devices

    NASA Astrophysics Data System (ADS)

    Hyun, Seung; Kwon, Owoong; Lee, Bom-Yi; Seol, Daehee; Park, Beomjin; Lee, Jae Yong; Lee, Ju Hyun; Kim, Yunseok; Kim, Jin Kon

    2016-01-01

    Multiple data writing-based multi-level non-volatile memory has gained strong attention for next-generation memory devices to quickly accommodate an extremely large number of data bits because it is capable of storing multiple data bits in a single memory cell at once. However, all previously reported devices have failed to store a large number of data bits due to the macroscale cell size and have not allowed fast access to the stored data due to slow single data writing. Here, we introduce a novel three-dimensional multi-floor cascading polymeric ferroelectric nanostructure, successfully operating as an individual cell. In one cell, each floor has its own piezoresponse, and the piezoresponse of one floor can be modulated by the bias voltage applied to the other floor, which means simultaneously written data bits in both floors can be identified. This could achieve multi-level memory through a multiple data writing process. Electronic supplementary information (ESI) available. See DOI: 10.1039/c5nr07377d

  19. Efficient Ab initio Modeling of Random Multicomponent Alloys

    DOE PAGES

    Jiang, Chao; Uberuaga, Blas P.

    2016-03-08

    We present in this Letter a novel small set of ordered structures (SSOS) method that allows extremely efficient ab initio modeling of random multi-component alloys. Using inverse II-III spinel oxides and equiatomic quinary bcc (so-called high entropy) alloys as examples, we demonstrate that a SSOS can achieve the same accuracy as a large supercell or a well-converged cluster expansion, but with significantly reduced computational cost. In particular, because of this efficiency, a large number of quinary alloy compositions can be quickly screened, leading to the identification of several new possible high entropy alloy chemistries. Furthermore, the SSOS method developed here can be broadly useful for the rapid computational design of multi-component materials, especially those with a large number of alloying elements, a challenging problem for other approaches.

  20. Tempest: Tools for Addressing the Needs of Next-Generation Climate Models

    NASA Astrophysics Data System (ADS)

    Ullrich, P. A.; Guerra, J. E.; Pinheiro, M. C.; Fong, J.

    2015-12-01

    Tempest is a comprehensive simulation-to-science infrastructure that tackles the needs of next-generation, high-resolution, data intensive climate modeling activities. This project incorporates three key components: TempestDynamics, a global modeling framework for experimental numerical methods and high-performance computing; TempestRemap, a toolset for arbitrary-order conservative and consistent remapping between unstructured grids; and TempestExtremes, a suite of detection and characterization tools for identifying weather extremes in large climate datasets. In this presentation, the latest advances with the implementation of this framework will be discussed, and a number of projects now utilizing these tools will be featured.

  1. Focus issue on the Study of Matter at Extreme Conditions

    NASA Astrophysics Data System (ADS)

    Saini, Naurang L.; Saxena, Surendra K.; Bansil, Arun

    2015-09-01

    Study of matter at extreme conditions encompasses many different approaches for understanding the physics, chemistry and materials science underlying processes, products and technologies important for society. Although extreme conditions have been associated traditionally with research in areas of geology, mineral and earth sciences, the field has expanded in the recent years to include work on energy related materials and quantum functional materials from hard to soft matter. With the motivation to engage a large number of scientists with various disciplinary interests, ranging from physics, chemistry, geophysics to materials science, the study of matter at extreme conditions has been the theme of a series of conferences hosted by the High Pressure Science Society of America (HiPSSA) and the Center for the Study of Matter at Extreme Conditions (CeSMEC) of Florida International University (FIU), Miami. These SMEC (Study of Matter at Extreme Conditions) conferences are aimed at providing a unique platform for leading researchers to meet and share cutting-edge developments, and to bridge established fields under this interdisciplinary umbrella for research on materials. The seventh meeting in the SMEC series was held during March 23-30, 2013, while sailing from Miami to the Caribbean Islands, and concluded with great enthusiasm.

  2. The Characteristics of Extreme Erosion Events in a Small Mountainous Watershed

    PubMed Central

    Fang, Nu-Fang; Shi, Zhi-Hua; Yue, Ben-Jiang; Wang, Ling

    2013-01-01

    A large amount of soil loss is caused by a small number of extreme events that are mainly responsible for the time compression of geomorphic processes. The aim of this study was to analyze suspended sediment transport during extreme erosion events in a mountainous watershed. Field measurements were conducted in Wangjiaqiao, a small agricultural watershed (16.7 km2) in the Three Gorges Area (TGA) of China. Continuous records were used to analyze suspended sediment transport regimes and assess the sediment loads of 205 rainfall–runoff events during a period of 16 hydrological years (1989–2004). Extreme events were defined as the largest events, ranked in order of their absolute magnitude (representing the 95th percentile). Ten extreme erosion events from 205 erosion events, representing 83.8% of the total suspended sediment load, were selected for study. The results of canonical discriminant analysis indicated that extreme erosion events are characterized by high maximum flood-suspended sediment concentrations, high runoff coefficients, and high flood peak discharge, which could possibly be explained by the transport of deposited sediment within the stream bed during previous events or bank collapses. PMID:24146898
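    The event-selection rule used above (extreme events as the ranked top 5% of event magnitudes) can be sketched on synthetic data; the lognormal loads, seed, and sample size below are illustrative stand-ins, not the Wangjiaqiao measurements:

```python
import numpy as np

rng = np.random.default_rng(42)
# Hypothetical sediment loads for 205 rainfall-runoff events; a
# heavy-tailed lognormal stands in for the real watershed record.
loads = rng.lognormal(mean=4.0, sigma=1.5, size=205)

# Extreme events: those at or above the 95th percentile of magnitude.
threshold = np.percentile(loads, 95)
extreme = np.sort(loads[loads >= threshold])[::-1]   # ranked, largest first

share = extreme.sum() / loads.sum()                  # fraction of total load
print(f"{len(extreme)} extreme events carry {share:.1%} of the total load")
```

    With a heavy-tailed load distribution, the handful of events above the threshold carries a disproportionate share of the total load, which is the "time compression" effect the abstract describes.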

  3. The Microphysical Structure of Extreme Precipitation as Inferred from Ground-Based Raindrop Spectra.

    NASA Astrophysics Data System (ADS)

    Uijlenhoet, Remko; Smith, James A.; Steiner, Matthias

    2003-05-01

    The controls on the variability of raindrop size distributions in extreme rainfall and the associated radar reflectivity-rain rate relationships are studied using a scaling-law formalism for the description of raindrop size distributions and their properties. This scaling-law formalism enables a separation of the effects of changes in the scale of the raindrop size distribution from those in its shape. Parameters controlling the scale and shape of the scaled raindrop size distribution may be related to the microphysical processes generating extreme rainfall. A global scaling analysis of raindrop size distributions corresponding to rain rates exceeding 100 mm h⁻¹, collected during the 1950s with the Illinois State Water Survey raindrop camera in Miami, Florida, reveals that extreme rain rates tend to be associated with conditions in which the variability of the raindrop size distribution is strongly number controlled (i.e., characteristic drop sizes are roughly constant). This means that changes in properties of raindrop size distributions in extreme rainfall are largely produced by varying raindrop concentrations. As a result, rainfall integral variables (such as radar reflectivity and rain rate) are roughly proportional to each other, which is consistent with the concept of the so-called equilibrium raindrop size distribution and has profound implications for radar measurement of extreme rainfall. A time series analysis for two contrasting extreme rainfall events supports the hypothesis that the variability of raindrop size distributions for extreme rain rates is strongly number controlled. However, this analysis also reveals that the actual shapes of the (measured and scaled) spectra may differ significantly from storm to storm. This implies that the exponents of power-law radar reflectivity-rain rate relationships may be similar, and close to unity, for different extreme rainfall events, but their prefactors may differ substantially. Consequently, there is no unique radar reflectivity-rain rate relationship for extreme rain rates, but the variability is essentially reduced to one free parameter (i.e., the prefactor). It is suggested that this free parameter may be estimated on the basis of differential reflectivity measurements in extreme rainfall.
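    The near-unity exponent claim can be illustrated with a toy power-law fit: in a number-controlled regime, reflectivity Z and rain rate R both scale with drop concentration, so a log-log regression of Z = a R^b recovers b ≈ 1 while the prefactor a carries the storm-to-storm variability. All numbers below are hypothetical, not the Miami raindrop-camera data:

```python
import numpy as np

rng = np.random.default_rng(0)
# Number-controlled regime: Z proportional to R, prefactor a_true set
# arbitrarily for this sketch, with modest multiplicative scatter.
rain_rate = rng.uniform(100.0, 300.0, size=50)     # mm/h, extreme rates
a_true = 250.0
reflectivity = a_true * rain_rate * rng.lognormal(0.0, 0.05, size=50)

# Fit Z = a * R**b by least squares in log-log space.
b, log_a = np.polyfit(np.log(rain_rate), np.log(reflectivity), deg=1)
a = np.exp(log_a)
print(f"Z = {a:.0f} * R^{b:.2f}")
```

    Different storms would shift a_true (and hence the fitted a) while leaving b close to one, which is exactly the "one free parameter" picture of the abstract.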

  4. A frequency dependent preconditioned wavelet method for atmospheric tomography

    NASA Astrophysics Data System (ADS)

    Yudytskiy, Mykhaylo; Helin, Tapio; Ramlau, Ronny

    2013-12-01

    Atmospheric tomography, i.e. the reconstruction of the turbulence in the atmosphere, is a main task for the adaptive optics systems of the next generation telescopes. For extremely large telescopes, such as the European Extremely Large Telescope, this problem becomes overly complex and an efficient algorithm is needed to reduce numerical costs. Recently, a conjugate gradient method based on wavelet parametrization of turbulence layers was introduced [5]. An iterative algorithm can only be numerically efficient when the number of iterations required for a sufficient reconstruction is low. A way to achieve this is to design an efficient preconditioner. In this paper we propose a new frequency-dependent preconditioner for the wavelet method. In the context of a multi conjugate adaptive optics (MCAO) system simulated on the official end-to-end simulation tool OCTOPUS of the European Southern Observatory we demonstrate robustness and speed of the preconditioned algorithm. We show that three iterations are sufficient for a good reconstruction.
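    The role a preconditioner plays in cutting the iteration count can be sketched with a generic preconditioned conjugate gradient loop. The simple diagonal (Jacobi) preconditioner and the test matrix below are illustrative stand-ins for the paper's frequency-dependent wavelet preconditioner and the atmospheric tomography operator:

```python
import numpy as np

def pcg(A, rhs, M_inv, tol=1e-8, max_iter=200):
    """Preconditioned conjugate gradient for a symmetric positive
    definite system; M_inv applies the preconditioner to a residual."""
    x = np.zeros_like(rhs)
    r = rhs - A @ x
    z = M_inv(r)
    p = z.copy()
    rz = r @ z
    for k in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            return x, k
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# Ill-conditioned SPD test system (condition number ~ 1e4). Jacobi
# scaling stands in for the wavelet preconditioner; the point is the
# small number of iterations needed after preconditioning.
rng = np.random.default_rng(1)
n = 200
Q = 0.01 * rng.standard_normal((n, n))
A = np.diag(np.logspace(0, 4, n)) + Q + Q.T
b_rhs = rng.standard_normal(n)
x, iters = pcg(A, b_rhs, M_inv=lambda r: r / np.diag(A))
print(f"converged in {iters} iterations")
```

    Without the preconditioner (M_inv = identity) the same loop needs far more iterations on this matrix, mirroring the paper's motivation for designing a good preconditioner.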

  5. Spacecraft Dynamics and Control Program at AFRPL

    NASA Technical Reports Server (NTRS)

    Das, A.; Slimak, L. K. S.; Schloegel, W. T.

    1986-01-01

    A number of future DOD and NASA spacecraft such as the space based radar will be not only an order of magnitude larger in dimension than the current spacecraft, but will exhibit extreme structural flexibility with very low structural vibration frequencies. Another class of spacecraft (such as the space defense platforms) will combine large physical size with extremely precise pointing requirement. Such problems require a total departure from the traditional methods of modeling and control system design of spacecraft where structural flexibility is treated as a secondary effect. With these problems in mind, the Air Force Rocket Propulsion Laboratory (AFRPL) initiated research to develop dynamics and control technology so as to enable the future large space structures (LSS). AFRPL's effort in this area can be subdivided into the following three overlapping areas: (1) ground experiments, (2) spacecraft modeling and control, and (3) sensors and actuators. Both the in-house and contractual efforts of the AFRPL in LSS are summarized.

  6. An Update on Experimental Climate Prediction and Analysis Products Being Developed at NASA's Global Modeling and Assimilation Office

    NASA Technical Reports Server (NTRS)

    Schubert, Siegfried

    2011-01-01

    The Global Modeling and Assimilation Office at NASA's Goddard Space Flight Center is developing a number of experimental prediction and analysis products suitable for research and applications. The prediction products include a large suite of subseasonal and seasonal hindcasts and forecasts (as a contribution to the US National MME), a suite of decadal (10-year) hindcasts (as a contribution to the IPCC decadal prediction project), and a series of large ensemble and high resolution simulations of selected extreme events, including the 2010 Russian and 2011 US heat waves. The analysis products include an experimental atlas of climate (in particular drought) and weather extremes. This talk will provide an update on those activities, and discuss recent efforts by WCRP to leverage off these and similar efforts at other institutions throughout the world to develop an experimental global drought early warning system.

  7. Opportunities to Develop Mathematical Proficiency: How Teachers Structure Participation in the Elementary Mathematics Classroom

    ERIC Educational Resources Information Center

    Freund, Deanna Patrice Nichols

    2011-01-01

    The opportunity to learn for African American and Latino children is extremely limited in a large number of US classrooms. Many societal issues are to blame, but high-stakes testing has exacerbated this problem. The pressure to increase test scores has caused a narrowing of the curriculum, particularly in low-performing schools, most of which are…

  8. Pinon pine mortality event in the Southwest: An update for 2005

    Treesearch

    D. Allen-Reid; J. Anhold; D. Cluck; T. Eager; R. Mask; J. McMillin; S. Munson; J. Negron; T. Rogers; D. Ryerson; E. Smith; S. Smith; B. Steed; R. Thier

    2008-01-01

    (Please note, this is an abstract only) Drought conditions in the Southwest have persisted for a number of years resulting in large areas of pinon pine mortality. In 2002 drought conditions became extreme, facilitating an outbreak of pinon ips beetles (Ips confusus, Coleoptera: Scolytidae) that killed many millions of pinon pines over a six-state region by 2003. In...

  9. Extreme between-study homogeneity in meta-analyses could offer useful insights.

    PubMed

    Ioannidis, John P A; Trikalinos, Thomas A; Zintzaras, Elias

    2006-10-01

    Meta-analyses are routinely evaluated for the presence of large between-study heterogeneity. We examined whether it is also important to probe whether there is extreme between-study homogeneity. We used heterogeneity tests with left-sided statistical significance for inference and developed a Monte Carlo simulation test for testing extreme homogeneity in risk ratios across studies, using the empiric distribution of the summary risk ratio and heterogeneity statistic. A left-sided P=0.01 threshold was set for claiming extreme homogeneity to minimize type I error. Among 11,803 meta-analyses with binary contrasts from the Cochrane Library, 143 (1.21%) had left-sided P-value <0.01 for the asymptotic Q statistic and 1,004 (8.50%) had left-sided P-value <0.10. The frequency of extreme between-study homogeneity did not depend on the number of studies in the meta-analyses. We identified examples where extreme between-study homogeneity (left-sided P-value <0.01) could result from various possibilities beyond chance. These included inappropriate statistical inference (asymptotic vs. Monte Carlo), use of a specific effect metric, correlated data or stratification using strong predictors of outcome, and biases and potential fraud. Extreme between-study homogeneity may provide useful insights about a meta-analysis and its constituent studies.
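    The left-sided Monte Carlo idea can be sketched as follows: compute Cochran's Q for the observed effects, simulate Q under exact homogeneity, and report the fraction of simulated values at or below the observed one. This is a simplified fixed-effect sketch on hypothetical log risk ratios, not the authors' exact procedure on the Cochrane data:

```python
import numpy as np

def q_statistic(effects, variances):
    """Cochran's Q for study-level effects (e.g. log risk ratios)."""
    w = 1.0 / variances
    mu = np.sum(w * effects) / np.sum(w)
    return np.sum(w * (effects - mu) ** 2)

def left_sided_p(effects, variances, n_sim=2000, seed=0):
    """Monte Carlo left-sided p-value: the fraction of Q values simulated
    under exact homogeneity that fall at or below the observed Q.
    A small value flags extreme between-study homogeneity."""
    rng = np.random.default_rng(seed)
    q_obs = q_statistic(effects, variances)
    w = 1.0 / variances
    mu = np.sum(w * effects) / np.sum(w)
    se = np.sqrt(variances)
    sims = np.array([q_statistic(rng.normal(mu, se), variances)
                     for _ in range(n_sim)])
    return q_obs, float(np.mean(sims <= q_obs))

# Hypothetical meta-analysis: eight studies reporting identical effects,
# far more alike than sampling error alone should allow.
effects = np.full(8, 0.30)
variances = np.linspace(0.02, 0.10, 8)
q_obs, p_left = left_sided_p(effects, variances)
print(f"Q = {q_obs:.3g}, left-sided p = {p_left:.3f}")
```

    Identical effects give Q near zero, and essentially no homogeneous simulation produces a Q that small, so the left-sided p-value is tiny, which is the signal the authors probe for.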

  10. Sampling large random knots in a confined space

    NASA Astrophysics Data System (ADS)

    Arsuaga, J.; Blackstone, T.; Diao, Y.; Hinson, K.; Karadayi, E.; Saito, M.

    2007-09-01

    DNA knots formed under extreme conditions of condensation, as in bacteriophage P4, are difficult to analyze experimentally and theoretically. In this paper, we propose to use the uniform random polygon model as a supplementary method to the existing methods for generating random knots in confinement. The uniform random polygon model allows us to sample knots with large crossing numbers and also to generate large diagrammatically prime knot diagrams. We show numerically that uniform random polygons sample knots with large minimum crossing numbers and certain complicated knot invariants (such as those observed experimentally). We do this in terms of the knot determinants or colorings. Our numerical results suggest that the average determinant of a uniform random polygon of n vertices grows faster than O(e^{n^2}). We also investigate the complexity of prime knot diagrams. We show rigorously that the probability that a randomly selected 2D uniform random polygon of n vertices is almost diagrammatically prime goes to 1 as n goes to infinity. Furthermore, the average number of crossings in such a diagram is of order O(n^2). Therefore, two-dimensional uniform random polygons offer an effective way of sampling large (prime) knots, which can be useful in various applications.
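    The uniform random polygon model itself is simple to implement: draw n vertices i.i.d. uniformly in a cube, join them in order, and close the loop. The sketch below generates such polygons and counts crossings in their planar projections, illustrating the roughly quadratic growth in diagram complexity; the sample counts and seed are arbitrary:

```python
import numpy as np

def uniform_random_polygon(n, rng):
    """n vertices drawn i.i.d. uniformly in the unit cube and joined
    in order (closing back to the first) -- the URP model."""
    return rng.random((n, 3))

def _ccw(a, b, c):
    return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])

def _cross(p1, p2, p3, p4):
    """Proper-crossing test for 2D segments (general position assumed)."""
    return (_ccw(p1, p3, p4) != _ccw(p2, p3, p4)
            and _ccw(p1, p2, p3) != _ccw(p1, p2, p4))

def projected_crossings(poly):
    """Crossing count of the polygon's xy-projection (its knot diagram)."""
    pts = poly[:, :2]
    n = len(pts)
    edges = [(pts[i], pts[(i + 1) % n]) for i in range(n)]
    count = 0
    for i in range(n):
        for j in range(i + 2, n):
            if i == 0 and j == n - 1:     # edges adjacent via closure
                continue
            if _cross(*edges[i], *edges[j]):
                count += 1
    return count

rng = np.random.default_rng(7)
mean_crossings = {n: np.mean([projected_crossings(uniform_random_polygon(n, rng))
                              for _ in range(20)])
                  for n in (20, 40, 80)}
print(mean_crossings)   # average crossings grow roughly quadratically in n
```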

  11. Vacuum statistics and stability in axionic landscapes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Masoumi, Ali; Vilenkin, Alexander, E-mail: ali@cosmos.phy.tufts.edu, E-mail: vilenkin@cosmos.phy.tufts.edu

    2016-03-01

    We investigate vacuum statistics and stability in random axionic landscapes. For this purpose we developed an algorithm for a quick evaluation of the tunneling action, which in most cases is accurate within 10%. We find that stability of a vacuum is strongly correlated with its energy density, with lifetime rapidly growing as the energy density is decreased. On the other hand, the probability P(B) for a vacuum to have a tunneling action B greater than a given value declines as a slow power law in B. This is in sharp contrast with the studies of random quartic potentials, which found a fast exponential decline of P(B). Our results suggest that the total number of relatively stable vacua (say, with B>100) grows exponentially with the number of fields N and can get extremely large for N ≳ 100. The problem with this kind of model is that the stable vacua are concentrated near the absolute minimum of the potential, so the observed value of the cosmological constant cannot be explained without fine-tuning. To address this difficulty, we consider a modification of the model, where the axions acquire a quadratic mass term due to their mixing with 4-form fields. This results in a larger landscape with a much broader distribution of vacuum energies. The number of relatively stable vacua in such models can still be extremely large.

  12. Prospectus: towards the development of high-fidelity models of wall turbulence at large Reynolds number

    PubMed Central

    Klewicki, J. C.; Chini, G. P.; Gibson, J. F.

    2017-01-01

    Recent and on-going advances in mathematical methods and analysis techniques, coupled with the experimental and computational capacity to capture detailed flow structure at increasingly large Reynolds numbers, afford an unprecedented opportunity to develop realistic models of high Reynolds number turbulent wall-flow dynamics. A distinctive attribute of this new generation of models is their grounding in the Navier–Stokes equations. By adhering to this challenging constraint, high-fidelity models ultimately can be developed that not only predict flow properties at high Reynolds numbers, but that possess a mathematical structure that faithfully captures the underlying flow physics. These first-principles models are needed, for example, to reliably manipulate flow behaviours at extreme Reynolds numbers. This theme issue of Philosophical Transactions of the Royal Society A provides a selection of contributions from the community of researchers who are working towards the development of such models. Broadly speaking, the research topics represented herein report on dynamical structure, mechanisms and transport; scale interactions and self-similarity; model reductions that restrict nonlinear interactions; and modern asymptotic theories. In this prospectus, the challenges associated with modelling turbulent wall-flows at large Reynolds numbers are briefly outlined, and the connections between the contributing papers are highlighted. This article is part of the themed issue ‘Toward the development of high-fidelity models of wall turbulence at large Reynolds number’. PMID:28167585

  13. FINITE ELEMENT MODEL FOR TIDAL AND RESIDUAL CIRCULATION.

    USGS Publications Warehouse

    Walters, Roy A.

    1986-01-01

    Harmonic decomposition is applied to the shallow water equations, thereby creating a system of equations for the amplitudes of the various tidal constituents and for the residual motions. The resulting equations are elliptic in nature, are well posed, and in practice are shown to be numerically well-behaved. There are a number of strategies for choosing elements: the two extremes are to use a few high-order elements with continuous derivatives, or to use a large number of simpler linear elements. In this paper, simple linear elements are used and prove effective.
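    As a minimal illustration of the "many simple linear elements" strategy, the sketch below assembles piecewise-linear (hat) elements for a one-dimensional model elliptic problem, -u'' = f with homogeneous Dirichlet conditions. This is only the element-choice idea; the tidal harmonic equations themselves are far richer:

```python
import numpy as np

# Model elliptic problem: -u'' = pi^2 sin(pi x) on (0, 1), u(0) = u(1) = 0,
# with exact solution u(x) = sin(pi x).
n_el = 64                               # many simple linear elements
x = np.linspace(0.0, 1.0, n_el + 1)
h = x[1] - x[0]

# Assemble the global stiffness matrix from the 2x2 element matrix of
# piecewise-linear hat basis functions.
K = np.zeros((n_el + 1, n_el + 1))
k_el = (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
for e in range(n_el):
    K[e:e + 2, e:e + 2] += k_el

F = h * np.pi**2 * np.sin(np.pi * x)    # lumped (trapezoidal) load vector

# Impose u(0) = u(1) = 0 by solving on the interior nodes only.
u = np.zeros(n_el + 1)
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], F[1:-1])

err = np.max(np.abs(u - np.sin(np.pi * x)))
print(f"max nodal error with {n_el} linear elements: {err:.2e}")
```

    Even with this crude lumped load, 64 linear elements resolve the smooth solution to a few parts in ten thousand, which is why "a large number of simpler linear elements" is a viable strategy.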

  14. Oxygen concentration dependence of silicon oxide dynamical properties

    NASA Astrophysics Data System (ADS)

    Yajima, Yuji; Shiraishi, Kenji; Endoh, Tetsuo; Kageshima, Hiroyuki

    2018-06-01

    To understand oxidation in three-dimensional silicon, the dynamic characteristics of a SiOx system with various stoichiometries were investigated. The calculated results show that the self-diffusion coefficient increases as the oxygen density decreases, and that the increase is large when the temperature is low. They also show that the self-diffusion coefficient saturates when the number of removed oxygen atoms is sufficiently large. Approximate analytical equations are then derived from the calculated results, and the previously reported expression is confirmed in the extremely low-SiO-density range.

  15. Changing Global Risk Landscape - Challenges for Risk Management (Invited)

    NASA Astrophysics Data System (ADS)

    Wenzel, F.

    2009-12-01

    The exponentially growing losses related to natural disasters on a global scale reflect a changing risk landscape that is characterized by the influence of climate change and a growing population, particularly in urban agglomerations and coastal zones. In consequence of these trends we witness (a) new hazards, such as landslides due to dwindling permafrost, new patterns of strong precipitation and related floods, the potential for tropical cyclones in the Mediterranean, sea level rise, and others; (b) new risks related to large numbers of people in very dense urban areas, and risks related to the vulnerability of infrastructure such as energy supply, water supply, transportation, and communication; and (c) extreme events of unprecedented size and implications. An appropriate answer to these challenges goes beyond classical views of risk assessment and protection. It must include an understanding of risk as changing with time, so that risk assessment needs to be supplemented by risk monitoring. It requires decision making under high uncertainty. The risks (i.e., potentials for future losses) of extreme events are not only high but also very difficult to quantify, as they are characterized by high levels of uncertainty. Uncertainties relate to the frequency, time of occurrence, strength and impact of extreme events, but also to the coping capacities of society in response to them. The characterization, quantification and, to the extent possible, reduction of the uncertainties is an inherent topic of extreme event research. However, the uncertainties will not disappear, so a rational approach to extreme events must include more than reducing them. It requires us to assess and rate the irreducible uncertainties, to evaluate options for mitigation under large uncertainties, and to communicate them to societal sectors. Thus scientists need to develop methodologies that aim at a rational approach to extreme events associated with high levels of uncertainty.

  16. Assessing wood quality of borer-infested red oak logs with a resonance acoustic technique

    Treesearch

    Xiping Wang; Henry E. Stelzer; Jan Wiedenbeck; Patricia K. Lebow; Robert J. Ross

    2009-01-01

    Large numbers of black oak (Quercus velutina Lam.) and scarlet oak (Quercus coccinea Muenchh.) trees are declining and dying in the Missouri Ozark forest as a result of oak decline. Red oak borer-infested trees produce low-grade logs that become extremely difficult to merchandize as the level of insect attack increases. The objective of this study was to investigate...

  17. Extreme Value Analysis of hydro meteorological extremes in the ClimEx Large-Ensemble

    NASA Astrophysics Data System (ADS)

    Wood, R. R.; Martel, J. L.; Willkofer, F.; von Trentini, F.; Schmid, F. J.; Leduc, M.; Frigon, A.; Ludwig, R.

    2017-12-01

    Many studies show an increase in the magnitude and frequency of hydrological extreme events in the course of climate change. However, the contribution of natural variability to the magnitude and frequency of hydrological extreme events is not yet settled. A reliable estimate of extreme events is of great interest for water management and public safety. In the course of the ClimEx Project (www.climex-project.org), a new single-model large ensemble was created by dynamically downscaling the CanESM2 large ensemble with the Canadian Regional Climate Model version 5 (CRCM5) for a European domain and a northeastern North American domain. The ClimEx 50-member large ensemble (CRCM5 driven by the CanESM2 large ensemble) makes a thorough analysis of natural variability in extreme events possible. Are current extreme value statistical methods able to account for natural variability? How large is the natural variability for, e.g., a 1/100-year return period derived from a 50-member large ensemble for Europe and northeastern North America? These questions are addressed by applying various generalized extreme value (GEV) distributions to the ClimEx large ensemble. Various return levels (5, 10, 20, 30, 60 and 100 years), estimated from time series of various lengths (20, 30, 50, 100 and 1500 years), are analyzed for the maximum one-day precipitation (RX1d), the maximum three-hourly precipitation (RX3h), and the streamflow of selected catchments in Europe. The long time series of the ClimEx ensemble (7500 years) allows us to give a first reliable estimate of the magnitude and frequency of certain extreme events.
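    The return-level estimation described above can be sketched with a method-of-moments Gumbel fit (the GEV with zero shape parameter), a deliberate simplification of the full GEV fits applied to the ensemble. The 50-member, 30-year synthetic annual maxima below are hypothetical, not ClimEx output:

```python
import numpy as np

def gumbel_return_level(annual_maxima, T):
    """Method-of-moments Gumbel fit and the T-year return level.
    The Gumbel is the GEV with shape 0; a simplified stand-in for
    the full GEV analysis."""
    x = np.asarray(annual_maxima, dtype=float)
    scale = x.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = x.mean() - 0.5772 * scale          # Euler-Mascheroni constant
    return loc - scale * np.log(-np.log(1.0 - 1.0 / T))

rng = np.random.default_rng(3)
# Hypothetical RX1d annual maxima (mm): 50 members x 30 years each.
maxima = rng.gumbel(loc=40.0, scale=8.0, size=(50, 30))

# Natural variability: the 100-year return level estimated separately
# from each member's short record spans a wide range, while pooling
# all members tightens the estimate considerably.
levels = np.array([gumbel_return_level(m, T=100) for m in maxima])
pooled = gumbel_return_level(maxima.ravel(), T=100)
print(f"member spread: {levels.min():.1f}-{levels.max():.1f} mm, "
      f"pooled: {pooled:.1f} mm")
```

    The spread across members is exactly the natural-variability envelope a single-model large ensemble is designed to expose.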

  18. Reaction factoring and bipartite update graphs accelerate the Gillespie Algorithm for large-scale biochemical systems.

    PubMed

    Indurkhya, Sagar; Beal, Jacob

    2010-01-06

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires storage that grows only linearly with the number of reactions, rather than the quadratic storage required for a dependency graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models.
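    The bipartite dependency idea can be shown in miniature: keep a species-to-reactions map so that after each firing only the propensities that read a changed species are recomputed. This sketch uses plain mass-action rates and omits the paper's propensity factoring; the toy reaction system is hypothetical:

```python
import random
from collections import defaultdict

def gillespie(reactions, state, t_end, seed=0):
    """Gillespie SSA with a species -> dependent-reactions map.
    Each reaction is (rate constant, reactants, products), with
    reactants/products as {species: count} dicts."""
    rng = random.Random(seed)

    def propensity(i):
        k, reac, _ = reactions[i]
        a = k
        for s, n in reac.items():
            for j in range(n):          # falling factorial for order n
                a *= max(state[s] - j, 0)
        return a

    deps = defaultdict(set)             # species -> reactions that read it
    for i, (_, reac, _) in enumerate(reactions):
        for s in reac:
            deps[s].add(i)

    props = [propensity(i) for i in range(len(reactions))]
    t, fired = 0.0, 0
    while True:
        total = sum(props)
        if total <= 0.0:
            break
        t += rng.expovariate(total)
        if t > t_end:
            break
        u = rng.random() * total        # pick reaction i w.p. props[i]/total
        for i, a in enumerate(props):
            if u < a:
                break
            u -= a
        _, reac, prod = reactions[i]
        changed = set(reac) | set(prod)
        for s, n in reac.items():
            state[s] -= n
        for s, n in prod.items():
            state[s] = state.get(s, 0) + n
        for s in changed:               # update only dependent propensities
            for j in deps[s]:
                props[j] = propensity(j)
        fired += 1
    return state, fired

# Hypothetical toy system: A + B -> C at k=0.01, C -> A + B at k=0.1.
reactions = [(0.01, {"A": 1, "B": 1}, {"C": 1}),
             (0.1, {"C": 1}, {"A": 1, "B": 1})]
final, fired = gillespie(reactions, {"A": 100, "B": 100, "C": 0}, t_end=5.0)
print(final, fired)
```

    In a large network most reactions share no species with the one that just fired, so this localized update is what turns each step from a full propensity sweep into a handful of recomputations.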

  19. Reaction Factoring and Bipartite Update Graphs Accelerate the Gillespie Algorithm for Large-Scale Biochemical Systems

    PubMed Central

    Indurkhya, Sagar; Beal, Jacob

    2010-01-01

    ODE simulations of chemical systems perform poorly when some of the species have extremely low concentrations. Stochastic simulation methods, which can handle this case, have been impractical for large systems due to computational complexity. We observe, however, that when modeling complex biological systems: (1) a small number of reactions tend to occur a disproportionately large percentage of the time, and (2) a small number of species tend to participate in a disproportionately large percentage of reactions. We exploit these properties in LOLCAT Method, a new implementation of the Gillespie Algorithm. First, factoring reaction propensities allows many propensities dependent on a single species to be updated in a single operation. Second, representing dependencies between reactions with a bipartite graph of reactions and species requires storage that grows only linearly with the number of reactions, rather than the quadratic storage required for a dependency graph that includes only reactions. Together, these improvements allow our implementation of LOLCAT Method to execute orders of magnitude faster than currently existing Gillespie Algorithm variants when simulating several yeast MAPK cascade models. PMID:20066048

  20. Computerized Dental Comparison: A Critical Review of Dental Coding and Ranking Algorithms Used in Victim Identification.

    PubMed

    Adams, Bradley J; Aschheim, Kenneth W

    2016-01-01

    Comparison of antemortem and postmortem dental records is a leading method of victim identification, especially for incidents involving a large number of decedents. This process may be expedited with computer software that provides a ranked list of best possible matches. This study compares the conventional coding and sorting algorithms most commonly used in the United States (WinID3) with a simplified coding format that uses an optimized sorting algorithm. The simplified system consists of seven basic codes and an optimized algorithm based largely on the percentage of matches. To perform this research, a large reference database of approximately 50,000 antemortem and postmortem records was created. For most disaster scenarios, the proposed simplified codes, paired with the optimized algorithm, performed better than WinID3, which uses more complex codes. The detailed coding system does show better performance with extremely large numbers of records and/or significant body fragmentation. © 2015 American Academy of Forensic Sciences.
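The percentage-of-matches ranking can be illustrated with a small sketch. The tooth codes, record IDs, and scoring below are hypothetical, simplified stand-ins (not WinID3's actual coding scheme): each antemortem record is scored by the percentage of mutually recorded teeth whose codes agree with the postmortem record, then the database is sorted by score.

```python
# Hypothetical simplified dental codes for a few teeth (tooth number ->
# one-letter code, e.g. V=virgin, M=missing, R=restored, C=crown).
postmortem = {1: "V", 2: "M", 3: "R", 4: "C", 5: "V"}

antemortem_db = {
    "AM-001": {1: "V", 2: "M", 3: "R", 4: "C", 5: "V"},
    "AM-002": {1: "V", 2: "V", 3: "R", 4: "C", 5: "M"},
    "AM-003": {1: "M", 2: "M", 3: "V", 4: "V", 5: "V"},
}

def match_percentage(pm, am):
    """Percentage of teeth recorded in both records with identical codes."""
    shared = set(pm) & set(am)
    if not shared:
        return 0.0
    hits = sum(1 for tooth in shared if pm[tooth] == am[tooth])
    return 100.0 * hits / len(shared)

def rank_matches(pm, db):
    """Ranked list of (score, record id), best candidates first."""
    scores = [(match_percentage(pm, am), rid) for rid, am in db.items()]
    return sorted(scores, reverse=True)

ranking = rank_matches(postmortem, antemortem_db)
```

A real system would also handle explainable discrepancies (e.g. a virgin tooth later restored); this sketch only counts exact code agreement.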

  1. Taxonomic study of extreme halophilic archaea isolated from the "Salar de Atacama", Chile.

    PubMed

    Lizama, C; Monteoliva-Sánchez, M; Prado, B; Ramos-Cormenzana, A; Weckesser, J; Campos, V

    2001-11-01

    A large number of halophilic bacteria were isolated in 1984-1992 from the Atacama Saltern (northern Chile). For this study, 82 strains of extremely halophilic archaea were selected. The characterization was performed using phenotypic characters, including morphological, physiological, biochemical, nutritional and antimicrobial susceptibility tests. The results, together with those from reference strains, were subjected to numerical analysis using the Simple Matching (S(SM)) coefficient and clustered by the unweighted pair group method of association (UPGMA). Fifteen phena were obtained at a 70% similarity level. The results reveal a high diversity among the halophilic archaea isolated. Representative strains from the phena were chosen to determine their DNA base composition and their percentage of DNA-DNA similarity relative to reference strains. The 16S rRNA studies showed that some of these strains constitute new taxa of extremely halophilic archaea.
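The numerical-taxonomy pipeline described above (Simple Matching coefficient followed by UPGMA clustering) can be sketched as follows. The binary test results and strain names are invented for illustration; a real study would use the full battery of phenotypic characters.

```python
# Binary phenotypic test results (1 = positive) for four hypothetical strains.
strains = {
    "S1": [1, 1, 0, 1, 0, 1],
    "S2": [1, 1, 0, 1, 1, 1],
    "S3": [0, 0, 1, 0, 1, 0],
    "S4": [0, 0, 1, 0, 0, 0],
}

def s_sm(a, b):
    """Simple Matching coefficient: fraction of tests with the same result
    (positive matches and negative matches both count)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def upgma(items, sim):
    """UPGMA on a similarity function: repeatedly merge the most similar
    pair of clusters, averaging similarity over all member pairs."""
    clusters = {name: [name] for name in items}
    merges = []
    while len(clusters) > 1:
        best = None
        for a in clusters:
            for b in clusters:
                if a < b:
                    s = sum(sim(items[x], items[y])
                            for x in clusters[a] for y in clusters[b])
                    s /= len(clusters[a]) * len(clusters[b])
                    if best is None or s > best[0]:
                        best = (s, a, b)
        s, a, b = best
        merges.append((round(s, 3), a, b))
        clusters[a + "+" + b] = clusters.pop(a) + clusters.pop(b)
    return merges

merges = upgma(strains, s_sm)
```

Cutting the resulting merge tree at a chosen similarity level (e.g. 70%) yields the phena.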

  2. Prandtl-number Effects in High-Rayleigh-number Spherical Convection

    NASA Astrophysics Data System (ADS)

    Orvedahl, Ryan J.; Calkins, Michael A.; Featherstone, Nicholas A.; Hindman, Bradley W.

    2018-03-01

    Convection is the predominant mechanism by which energy and angular momentum are transported in the outer portion of the Sun. The resulting overturning motions are also the primary energy source for the solar magnetic field. An accurate solar dynamo model therefore requires a complete description of the convective motions, but these motions remain poorly understood. Studying stellar convection numerically remains challenging; it occurs within a parameter regime that is extreme by computational standards. The fluid properties of the convection zone are characterized in part by the Prandtl number Pr = ν/κ, where ν is the kinematic viscosity and κ is the thermal diffusivity; in stars, Pr is extremely low, Pr ≈ 10⁻⁷. The influence of Pr on the convective motions at the heart of the dynamo is not well understood, since most numerical studies are limited to Pr ≈ 1. We systematically vary Pr and the degree of thermal forcing, characterized through a Rayleigh number, to explore their influence on the convective dynamics. For sufficiently large thermal driving, the simulations reach a so-called convective free-fall state where diffusion no longer plays an important role in the interior dynamics. Simulations with a lower Pr generate faster convective flows and broader ranges of scales for equivalent levels of thermal forcing. Characteristics of the spectral distribution of the velocity remain largely insensitive to changes in Pr. Importantly, we find that Pr plays a key role in determining when the free-fall regime is reached by controlling the thickness of the thermal boundary layer.
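As a minimal illustration of the two nondimensional parameters named above, with entirely made-up SI values (the diffusivities below are placeholders, not solar measurements):

```python
def prandtl(nu, kappa):
    """Pr = nu / kappa: ratio of momentum to thermal diffusivity."""
    return nu / kappa

def rayleigh(g, alpha, dT, L, nu, kappa):
    """Ra = g * alpha * dT * L**3 / (nu * kappa): strength of thermal
    driving relative to both diffusive processes."""
    return g * alpha * dT * L ** 3 / (nu * kappa)

# Illustrative values: tiny nu relative to kappa gives a tiny Pr, as in
# stellar interiors; a deep layer gives an enormous Ra.
pr = prandtl(nu=1e-2, kappa=1e5)
ra = rayleigh(g=274.0, alpha=1e-5, dT=1.0, L=2e8, nu=1e-2, kappa=1e5)
```

The computational difficulty mentioned in the abstract is precisely this combination: Pr far below one and Ra far above what resolved simulations can reach.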

  3. Order flow dynamics around extreme price changes on an emerging stock market

    NASA Astrophysics Data System (ADS)

    Mu, Guo-Hua; Zhou, Wei-Xing; Chen, Wei; Kertész, János

    2010-07-01

    We study the dynamics of order flows around large intraday price changes using ultra-high-frequency data from the Shenzhen Stock Exchange. We find a significant reversal of price for both intraday price decreases and increases with a permanent price impact. The volatility, the volume of different types of orders, the bid-ask spread and the volume imbalance increase before the extreme events and decay slowly as a power law, which forms a well-established peak. The volume of buy market orders increases faster and the corresponding peak appears earlier than for sell market orders around positive events, while around negative events the volume peak of sell market orders leads that of buy market orders in both magnitude and timing. When orders are divided into four groups according to their aggressiveness, we find that the behaviors of order volume and order number are similar, except for buy limit orders and canceled orders, whose order-number peak occurs about 2 min after the order-volume peak, implying that investors placing large orders are more informed and play a central role in large price fluctuations. We also study the relative rates of different types of orders and find differences in the dynamics of relative rates between buy orders and sell orders and between individual investors and institutional investors. There is evidence that institutions behave very differently from individuals and that they have more aggressive strategies. Combining these findings, we conclude that institutional investors are better informed and play a more influential role in driving large price fluctuations.

  4. Analysis of long-term changes in extreme climatic indices: a case study of the Mediterranean climate, Marmara Region, Turkey

    NASA Astrophysics Data System (ADS)

    Abbasnia, Mohsen; Toros, Hüseyin

    2018-05-01

    This study aimed to analyze extreme temperature and precipitation indices at seven stations in the Marmara Region of Turkey for the period 1961-2016. The trend of temperature indices showed that the warm-spell duration and the numbers of summer days, tropical nights, warm nights, and warm days have increased, while the cold-spell duration and the numbers of ice days, cool nights, and cool days have decreased across the Marmara Region. Additionally, the diurnal temperature range has slightly increased at most of the stations. A majority of stations have shown significant warming trends for warm days and warm nights throughout the study area, and warm extremes and night-time temperature indices have shown stronger trends than cold extremes and day-time indices. The analysis of precipitation indices has mostly shown increasing trends in consecutive dry days and in annual rainfall and rainfall intensity for inland and urban stations, especially for the stations in Sariyer and Edirne, which are affected by a fast rate of urbanization. Overall, a large proportion of the study stations have experienced an increase in annual precipitation and heavy precipitation events, although only a low percentage of the results was significant. Therefore, it is expected that rainfall events will tend to become shorter and more intense, the occurrence of temperature extremes will become more pronounced in favor of hotter events, and the atmospheric moisture content over the Marmara Region will increase. This provides regional evidence for the importance of ongoing research on climate change.

  5. A New Sample of Cool Subdwarfs from SDSS: Properties and Kinematics

    NASA Astrophysics Data System (ADS)

    Savcheva, Antonia; West, Andrew A.; Bochanski, John J.

    2014-06-01

    We present a new sample of M subdwarfs compiled from the 7th data release of the Sloan Digital Sky Survey. With 3517 new subdwarfs, this catalog significantly enlarges the existing sample of low-mass subdwarfs and includes unprecedentedly large numbers of extreme and ultra subdwarfs. Here, we present the catalog and the statistical analysis we perform. Subdwarf template spectra are derived. We show color-color and reduced proper motion diagrams of the three metallicity classes, which are shown to separate from the disk dwarf population. The extreme and ultra subdwarfs are seen at larger values of reduced proper motion, as expected for more dynamically heated populations. We determine 3D kinematics for all of the stars with proper motions. The color-magnitude diagrams show a clear separation of the three metallicity classes, with the ultra and extreme subdwarfs lying significantly closer to the main sequence than the ordinary subdwarfs. All subdwarfs lie below and to the blue of the main sequence. Based on the average (U, V, W) velocities and their dispersions, the extreme and ultra subdwarfs likely belong to the Galactic halo, while the ordinary subdwarfs are likely part of the old Galactic (or thick) disk. An extensive activity analysis of the subdwarfs is performed using chromospheric Hα emission, and 208 active subdwarfs are found. We show that while the activity fraction of subdwarfs rises with spectral class and levels off at the latest spectral classes, consistent with the behavior of M dwarfs, the activity fractions of the extreme and ultra subdwarfs are basically flat.
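Reduced proper motion, used in the diagrams above, combines apparent magnitude with proper motion as a stand-in for distance; the standard definition is H = m + 5 log10(μ) + 5, with μ in arcsec/yr. A small sketch with hypothetical stars (invented magnitudes and motions, not catalog values):

```python
import math

def reduced_proper_motion(m, mu_arcsec_per_yr):
    """H = m + 5*log10(mu) + 5: analogous to an absolute magnitude,
    with proper motion mu (arcsec/yr) standing in for distance."""
    return m + 5.0 * math.log10(mu_arcsec_per_yr) + 5.0

# Two hypothetical stars with the same apparent magnitude: the faster
# mover (kinematically hotter, e.g. a halo subdwarf) gets a larger H,
# placing it lower in a reduced proper motion diagram.
h_dwarf = reduced_proper_motion(m=15.0, mu_arcsec_per_yr=0.05)
h_subdwarf = reduced_proper_motion(m=15.0, mu_arcsec_per_yr=0.5)
```

A factor of 10 in proper motion shifts H by exactly 5 magnitudes, which is why dynamically heated populations separate cleanly in such diagrams.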

  6. The association between preceding drought occurrence and heat waves in the Mediterranean

    NASA Astrophysics Data System (ADS)

    Russo, Ana; Gouveia, Célia M.; Ramos, Alexandre M.; Páscoa, Patricia; Trigo, Ricardo M.

    2017-04-01

    A large number of weather-driven extreme events have occurred worldwide in the last decade, notably in Europe, which has been struck by record-breaking extremes with unprecedented socio-economic impacts, including the mega-heatwaves of 2003 in Europe and 2010 in Russia, and the large droughts in southwestern Europe in 2005 and 2012. The last IPCC report on extreme events points out that a changing climate can lead to changes in the frequency, intensity, spatial extent, duration, and timing of weather and climate extremes. These, combined with larger exposure, can result in unprecedented risk to humans and ecosystems. In this context it is becoming increasingly relevant to improve the early identification and predictability of such events, as they negatively affect several socio-economic activities. Moreover, recent diagnostic and modelling experiments have confirmed that hot extremes are often preceded by surface moisture deficits in some regions throughout the world. In this study we analyze whether the occurrence of hot extreme months is enhanced by the occurrence of preceding drought events throughout the Mediterranean area. To this end, the number of hot days in each region's hottest month is associated with a drought indicator. The evolution and characterization of drought was analyzed using both the Standardized Precipitation Evapotranspiration Index (SPEI) and the Standardized Precipitation Index (SPI), as obtained from the CRU TS3.23 database for the period 1950-2014. We used both SPI and SPEI for time scales between 3 and 9 months with a spatial resolution of 0.5°. The number of hot days and nights per month (NHD and NHN) was determined using the ECAD-EOBS daily dataset for the same period and spatial resolution (dataset v14). The NHD and NHN were computed, respectively, as the number of days with a maximum or minimum temperature exceeding the 90th percentile.
Results show that the most frequent hottest months for the Mediterranean region occur in July and August. Moreover, the magnitudes of the correlations between detrended NHD/NHN and the preceding 6- and 9-month SPEI/SPI are usually weaker than for the 3-month time scale. Most regions exhibit significantly negative correlations, i.e. high (low) NHD/NHN following negative (positive) SPEI/SPI values, and thus a potential for NHD/NHN early warning. Finally, correlations of NHD/NHN with SPI and SPEI differ, with SPEI showing slightly higher values, mainly for the 3-month time scale. Acknowledgments: This work was partially supported by national funds through FCT (Fundação para a Ciência e a Tecnologia, Portugal) under project IMDROFLOOD (WaterJPI/0004/2014). Ana Russo thanks FCT for granted support (SFRH/BPD/99757/2014). A. M. Ramos was also supported by a FCT postdoctoral grant (FCT/DFRH/SFRH/BPD/84328/2012).
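The NHD computation and its correlation with a drought index can be sketched on synthetic data. Everything below is a stand-in (random numbers, not CRU/EOBS observations, and the drought-heat coupling is assumed purely for illustration): hot days are counted against the pooled 90th-percentile threshold, then correlated against the per-year drought index.

```python
import random
import statistics

rng = random.Random(42)

# Synthetic stand-in data: 30 "years" of a standardized 3-month drought
# index (SPEI-like) and 31 daily max temperatures for each hottest month.
spei = [rng.gauss(0.0, 1.0) for _ in range(30)]
# Assumed coupling for illustration: drier years (negative index) are hotter.
temps = [[rng.gauss(30.0 - 2.0 * s, 2.0) for _ in range(31)] for s in spei]

# 90th-percentile threshold pooled over the whole period.
all_days = [t for year in temps for t in year]
threshold = statistics.quantiles(all_days, n=10)[-1]

# NHD: number of hot days (above the 90th percentile) per year.
nhd = [sum(t > threshold for t in year) for year in temps]

def pearson(x, y):
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

r = pearson(spei, nhd)  # negative: drought precedes more hot days
```

The negative sign of r is the signature the study looks for: moisture deficit (negative SPEI/SPI) followed by an elevated count of hot days.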

  7. Inter-annual Variability of Temperature and Extreme Heat Events during the Nairobi Warm Season

    NASA Astrophysics Data System (ADS)

    Scott, A.; Misiani, H. O.; Zaitchik, B. F.; Ouma, G. O.; Anyah, R. O.; Jordan, A.

    2016-12-01

    Extreme heat events significantly stress all organisms in an ecosystem, and are likely to be amplified in peri-urban and urban areas. Understanding the variability and drivers behind these events is key to generating early warnings, yet in Equatorial East Africa this information is currently unavailable. This study uses daily maximum and minimum temperature records from weather stations within Nairobi and its surroundings to characterize variability in daily minimum temperatures and the number of extreme heat events. ERA-Interim reanalysis is applied to assess the drivers of these events at event and seasonal time scales. At seasonal time scales, high temperatures in Nairobi are a function of large-scale climate variability associated with the Atlantic Multi-decadal Oscillation (AMO) and Global Mean Sea Surface Temperature (GMSST). Extreme heat events, however, are more strongly associated with the El Nino Southern Oscillation (ENSO). The persistence of the AMO and ENSO, in particular, provides a basis for seasonal prediction of extreme heat events/days in Nairobi. It is also apparent that the temporal signal from extreme heat events in the tropics differs from classic heat wave definitions developed in the mid-latitudes, which suggests that a new approach for defining these events is necessary for tropical regions.

  8. A basis set for exploration of sensitivity to prescribed ocean conditions for estimating human contributions to extreme weather in CAM5.1-1degree

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stone, Dáithí A.; Risser, Mark D.; Angélil, Oliver M.

    This paper presents two contributions for research into better understanding the role of anthropogenic warming in extreme weather. The first contribution is the generation of a large number of multi-decadal simulations using a medium-resolution atmospheric climate model, CAM5.1-1degree, under two scenarios of historical climate following the protocols of the C20C+ Detection and Attribution project: the one we have experienced (All-Hist), and one that might have been experienced in the absence of human interference with the climate system (Nat-Hist). These simulations are also specifically designed for understanding extreme weather and atmospheric variability in the context of anthropogenic climate change. The second contribution takes advantage of the duration and size of these simulations in order to identify features of variability in the prescribed ocean conditions that may strongly influence calculated estimates of the role of anthropogenic emissions on extreme weather frequency (event attribution). There is a large amount of uncertainty in how much anthropogenic emissions should warm regional ocean surface temperatures, yet contributions to the C20C+ Detection and Attribution project and similar efforts so far use only one or a limited number of possible estimates of the ocean warming attributable to anthropogenic emissions when generating their Nat-Hist simulations. Thus, the importance of the uncertainty in regional attributable warming estimates to the results of event attribution studies is poorly understood. The identification of features of the anomalous ocean state that seem to strongly influence event attribution estimates should therefore be able to serve as a basis set for effective sampling of other plausible attributable warming patterns.
The identification performed in this paper examines monthly temperature and precipitation output from the CAM5.1-1degree simulations averaged over 237 land regions, and compares interannual anomalous variations in the ratio between the frequencies of extremes in the All-Hist and Nat-Hist simulations against variations in ocean temperatures.

  9. A basis set for exploration of sensitivity to prescribed ocean conditions for estimating human contributions to extreme weather in CAM5.1-1degree

    DOE PAGES

    Stone, Dáithí A.; Risser, Mark D.; Angélil, Oliver M.; ...

    2018-03-01

    This paper presents two contributions for research into better understanding the role of anthropogenic warming in extreme weather. The first contribution is the generation of a large number of multi-decadal simulations using a medium-resolution atmospheric climate model, CAM5.1-1degree, under two scenarios of historical climate following the protocols of the C20C+ Detection and Attribution project: the one we have experienced (All-Hist), and one that might have been experienced in the absence of human interference with the climate system (Nat-Hist). These simulations are also specifically designed for understanding extreme weather and atmospheric variability in the context of anthropogenic climate change. The second contribution takes advantage of the duration and size of these simulations in order to identify features of variability in the prescribed ocean conditions that may strongly influence calculated estimates of the role of anthropogenic emissions on extreme weather frequency (event attribution). There is a large amount of uncertainty in how much anthropogenic emissions should warm regional ocean surface temperatures, yet contributions to the C20C+ Detection and Attribution project and similar efforts so far use only one or a limited number of possible estimates of the ocean warming attributable to anthropogenic emissions when generating their Nat-Hist simulations. Thus, the importance of the uncertainty in regional attributable warming estimates to the results of event attribution studies is poorly understood. The identification of features of the anomalous ocean state that seem to strongly influence event attribution estimates should therefore be able to serve as a basis set for effective sampling of other plausible attributable warming patterns.
The identification performed in this paper examines monthly temperature and precipitation output from the CAM5.1-1degree simulations averaged over 237 land regions, and compares interannual anomalous variations in the ratio between the frequencies of extremes in the All-Hist and Nat-Hist simulations against variations in ocean temperatures.
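The frequency ratio at the heart of event attribution can be illustrated with a minimal sketch. The exceedance counts below are invented, not values from these simulations: the risk ratio compares how often a threshold is exceeded in the factual (All-Hist) versus counterfactual (Nat-Hist) ensemble.

```python
def risk_ratio(exceed_all_hist, n_all_hist, exceed_nat_hist, n_nat_hist):
    """RR = p1 / p0: how much more frequent an extreme is in the
    factual (All-Hist) ensemble than in the counterfactual (Nat-Hist)."""
    p1 = exceed_all_hist / n_all_hist
    p0 = exceed_nat_hist / n_nat_hist
    return p1 / p0

# Illustrative counts: 90 of 400 simulated months exceed the extreme
# threshold with human influence, 30 of 400 without.
rr = risk_ratio(90, 400, 30, 400)
far = 1.0 - 1.0 / rr  # fraction of attributable risk
```

The paper's point is that p0, and hence RR, depends on which estimate of attributable ocean warming was used to build the Nat-Hist boundary conditions.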

  10. North-East monsoon rainfall extremes over the southern peninsular India and their association with El Niño

    NASA Astrophysics Data System (ADS)

    Singh, Prem; Gnanaseelan, C.; Chowdary, J. S.

    2017-12-01

    The present study investigates the relationship between extreme north-east (NE) monsoon rainfall (NEMR) over the Indian peninsula region and El Niño forcing. This has become a critical science issue, especially after the 2015 Chennai flood. The puzzle is that while most El Niños favour a good NE monsoon, some do not; indeed, some El Niño years have witnessed a deficit NE monsoon. Therefore two different cases (or classes) of El Niños are considered for analysis based on the standardized NEMR index and the Niño 3.4 index, with case-1 having both Niño-3.4 and NEMR indices greater than +1, and case-2 having the Niño-3.4 index greater than +1 and the NEMR index less than -1. Composite analysis suggests that SST anomalies in the central and eastern Pacific are strong in both cases, but large differences are noted in the spatial distribution of SST over the Indo-western Pacific region. This questions our understanding of NEMR as a mirror image of El Niño conditions in the Pacific. It is noted that the favourable excess NEMR in case-1 is due to anomalous moisture transport from the Bay of Bengal and the equatorial Indian Ocean to southern peninsular India. A strong SST gradient between the warm western Indian Ocean (and Bay of Bengal) and the cool western Pacific induces strong easterly wind anomalies during the NE monsoon season, favouring moisture transport towards the core NE monsoon region. Further, anomalous moisture convergence and convection over the core NE monsoon region supported positive rainfall anomalies in case-1. In case-2, by contrast, weak SST gradients over the Indo-western Pacific and the absence of local low-level convergence over the NE monsoon region are mainly responsible for deficit rainfall. Ocean dynamics in the Indian Ocean displayed large differences between case-1 and case-2, suggesting a key role for Rossby wave dynamics in the Indian Ocean in NE monsoon extremes.
Apart from the large-scale circulation differences, the number of cyclonic systems making landfall in case-1 and case-2 has also contributed to variations in NE monsoon rainfall extremes during El Niño years. This study indicates that despite strong warming in the central and eastern Pacific, NE monsoon rainfall variations over southern peninsular India are mostly determined by the SST gradient over the Indo-western Pacific region and by the number of systems forming in the Bay of Bengal and making landfall. The paper concludes that though the favourable large-scale circulation induced by the Pacific is important in modulating NE monsoon rainfall, local air-sea interaction plays a key role in driving the rainfall extremes associated with El Niño.
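The two-case classification described above translates directly into code. The standardized index values below are hypothetical placeholders attached to example years, not observed values:

```python
def classify(nino34, nemr):
    """Classify a year by its standardized Nino-3.4 and NE-monsoon-rainfall
    indices, following the two cases defined in the study."""
    if nino34 > 1 and nemr > 1:
        return "case-1"  # El Nino with excess NE monsoon rainfall
    if nino34 > 1 and nemr < -1:
        return "case-2"  # El Nino with deficit NE monsoon rainfall
    return "other"

# Hypothetical standardized index values (nino34, nemr) for a few years.
years = {
    1997: (2.3, 1.4),
    2015: (2.2, 1.8),
    2002: (1.1, -1.3),
    2000: (-1.5, 0.2),
}
labels = {y: classify(n, r) for y, (n, r) in years.items()}
```

Composites over the two resulting year sets are what the study compares to isolate the SST-gradient differences.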

  11. Low Vision Aids in Glaucoma

    PubMed Central

    Khanna, Anjani

    2012-01-01

    A large number of glaucoma patients suffer from vision impairments that qualify as low vision. Additional difficulties associated with low vision include problems with glare, lighting, and contrast, which can make daily activities extremely challenging. This article elaborates on how low vision aids can help with various tasks that visually impaired glaucoma patients need to do each day, to take care of themselves and to lead an independent life. PMID:27990068

  12. Beneficial effects of restoration practices can be thwarted by climate extremes.

    PubMed

    Maccherini, Simona; Bacaro, Giovanni; Marignani, Michela

    2018-06-01

    The impacts of climate extremes on species, communities and ecosystems have become critical concerns to science and society. Under a changing climate, how restoration outcomes are affected by extreme climate variables remains largely unknown. We analyzed the effects of experimental factors (grazing and sowing of native species), extreme climate events (intense precipitation and extreme temperature indices) and their combination on the restoration progress of a dry, calcareous grassland in Tuscany (Italy) with a 1-year-before/15-year-after continuous annual monitoring, control/impact (BACI) experiment. Grazing had a beneficial effect on the diversity of the grassland, while sowing had a limited impact. The climatic index that most affected the entire plant community composition was the number of very heavy precipitation days. The interaction of grazing and extreme climatic indices had a significant detrimental effect on restoration outcomes, increasing the cover of synanthropic and Cosmopolitan-Subcosmopolitan generalist species and decreasing the cover of more valuable species such as endemic species. In the richest grazed plots, species richness showed a lower sensitivity to the average precipitation per wet day, but in grazed sites restoration outcomes can be negatively influenced by the intensification of precipitation and temperature extremes. In a context of progressive tropicalization of the Mediterranean area, and to help managers set achievable restoration goals, restoration practitioners should consider that climate extremes might interfere with the beneficial effects of restoration practices. Copyright © 2018 Elsevier B.V. All rights reserved.

  13. Tree-Ring Dating of Extreme Lake Levels at the Subarctic-Boreal Interface

    NASA Astrophysics Data System (ADS)

    Bégin, Yves

    2001-03-01

    The dates of extreme water levels of two large lakes in northern Quebec have been recorded over the last century by ice scars on shoreline trees and sequences of reaction wood in shore trees tilted by wave erosion. Ice-scar chronologies indicate high water levels in spring, whereas tree-tilting by waves is caused by summer high waters. A major increase in both the amplitude and frequency of ice floods occurred in the 1930s. No such change was indicated by the tree-tilting chronologies, but wave erosion occurred in exceptionally rainy years. According to the modern record, spring lake-level rise is due to increased snowfalls since the 1930s. However, the absence of erosional marks in a large number of years since 1930 suggests a high frequency of low-water-level years resulting from dry conditions. Intercalary years with very large numbers of marked trees (e.g., 1935) indicate that the interannual range of summer lake levels has increased since the 1930s. Increased lake-flood frequency is postulated to be related to a slower expansion of arctic anticyclones, favoring the passage of cyclonic air masses over the area and resulting in abundant snowfall in early winter. Conditions in summer are due to the rate of weakening of the anticyclones controlling the position of the arctic front in summer. This position influences the path of the cyclonic air masses, which control summer precipitation and, consequently, summer lake levels in the area.

  14. Energy balance constraints on gravity wave induced eddy diffusion in the mesosphere and lower thermosphere

    NASA Technical Reports Server (NTRS)

    Strobel, D. F.; Apruzese, J. P.; Schoeberl, M. R.

    1985-01-01

    The constraints on turbulence imposed by the mesospheric heat budget are reexamined, and the sufficiency of the theoretical evidence to support the hypothesis that the eddy Prandtl number is greater than one in the mesosphere is considered. The mesopause thermal structure is calculated with turbulent diffusion coefficients commonly used in chemical models and deduced from mean zonal wind deceleration. It is shown that extreme mesopause temperatures of less than 100 K are produced by the large net cooling. The results demonstrate the importance of the Prandtl number for mesospheric turbulence.

  15. Epidemic failure detection and consensus for extreme parallelism

    DOE PAGES

    Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas; ...

    2017-02-01

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using Gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of Gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and a perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm that provides consistency guarantees even in very large and extreme-scale systems while remaining memory and bandwidth efficient.
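A generic push-gossip dissemination sketch (not any of the paper's three algorithms) illustrates why the number of Gossip cycles to global consensus scales roughly logarithmically with system size: the set of processes holding the failed-process list can at most double each cycle.

```python
import random

def gossip_consensus(n_alive, failed, seed=0):
    """Synchronous push gossip: each cycle, every process forwards its
    current failed-process list to one random peer. Returns the number
    of cycles until all alive processes hold the full list."""
    rng = random.Random(seed)
    target = set(failed)
    known = [set() for _ in range(n_alive)]
    known[0] = set(failed)        # one process detects the failures
    cycles = 0
    while any(k != target for k in known):
        cycles += 1
        nxt = [set(k) for k in known]
        for i in range(n_alive):
            j = rng.randrange(n_alive)
            nxt[j] |= known[i]    # push my list to a random peer
        known = nxt
    return cycles

c_small = gossip_consensus(32, failed={"p7"})
c_large = gossip_consensus(1024, failed={"p7"})
```

Growing the system 32-fold adds only a handful of cycles, which is the scaling behavior the paper reports for all three of its algorithms.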

  16. Magnetocrystalline anisotropy in UMn2Ge2 and related Mn-based actinide ferromagnets

    DOE PAGES

    Parker, David S.; Ghimire, Nirmal; Singleton, John; ...

    2015-05-04

    We present magnetization isotherms in pulsed magnetic fields up to 62 Tesla, supported by first-principles calculations, demonstrating a huge uniaxial magnetocrystalline anisotropy energy - approximately 20 MJ/m³ - in UMn2Ge2. This large anisotropy results from the extremely strong spin-orbit coupling affecting the uranium 5f electrons, which in the calculations exhibit a substantial orbital moment exceeding 2 μB. Finally, we also find from theoretical calculations that a number of isostructural Mn-actinide compounds are expected to have similarly large anisotropy.

  17. Large-Scale Meteorological Patterns Associated with Extreme Precipitation in the US Northeast

    NASA Astrophysics Data System (ADS)

    Agel, L. A.; Barlow, M. A.

    2016-12-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. Tropopause height provides a compact representation of large-scale circulation patterns, as it is linked to mid-level circulation, low-level thermal contrasts and low-level diabatic heating. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into a larger context. Six tropopause patterns are identified on extreme days: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns are identified for all days. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns shows various degrees of anomalously strong upward motion during, and moisture transport preceding, extreme precipitation events.
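The KMC step can be sketched with a minimal Lloyd's k-means on synthetic "maps" (three grid points per day, two artificial regimes standing in for trough-like and ridge-like tropopause-height fields; all numbers invented):

```python
import random

def kmeans(points, k, iters=50, seed=0):
    """Minimal Lloyd's k-means: cluster daily circulation 'maps'
    (flattened vectors) into k patterns; returns centroids and labels."""
    rng = random.Random(seed)
    centroids = [list(p) for p in rng.sample(points, k)]
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each day to its nearest centroid (squared distance).
        for i, p in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centroids[c])),
            )
        # Move each centroid to the mean of its assigned days.
        for c in range(k):
            members = [p for p, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return centroids, labels

# Synthetic days: a "trough" regime near 9 km and a "ridge" regime near
# 12 km tropopause height at three grid points.
trough_days = [[9.0 + 0.05 * i, 9.1, 8.9] for i in range(10)]
ridge_days = [[12.0 - 0.05 * i, 12.1, 11.9] for i in range(10)]
centroids, labels = kmeans(trough_days + ridge_days, k=2)
```

In the study the same idea is applied to full gridded tropopause-height fields, and the centroids are the circulation patterns reported.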

  18. Spatially inhomogeneous electron state deep in the extreme quantum limit of strontium titanate

    DOE PAGES

    Bhattacharya, Anand; Skinner, Brian; Khalsa, Guru; ...

    2016-09-29

    When an electronic system is subjected to a magnetic field sufficiently strong that the cyclotron energy is much larger than the Fermi energy, the system enters the extreme quantum limit (EQL) and becomes susceptible to a number of instabilities. Bringing a three-dimensional electronic system deeply into the EQL can be difficult, however, since it requires a small Fermi energy, a large magnetic field, and low disorder. Here we present an experimental study of the EQL in lightly-doped single crystals of strontium titanate. Our experiments probe deeply into the regime where theory has long predicted an interaction-driven charge density wave or Wigner crystal state. A number of interesting features arise in the transport in this regime, including a striking re-entrant nonlinearity in the current-voltage characteristics. We discuss these features in the context of possible correlated electron states, and present an alternative picture based on magnetic-field-induced puddling of electrons.

  19. State of the art and future perspectives of thermophilic anaerobic digestion.

    PubMed

    Ahring, B K; Mladenovska, Z; Iranpour, R; Westermann, P

    2002-01-01

    The state of the art of thermophilic digestion is discussed. Thermophilic digestion is a well-established technology in Europe for treating mixtures of waste in common large-scale biogas plants or for treating the organic fraction of municipal solid waste. Due to a large number of failures over time with thermophilic digestion of sewage sludge, this process has lost its appeal in the USA. New demands on the sanitation of biosolids before land use will, however, bring attention back to the use of elevated temperatures during sludge stabilization. In this paper we show how a start-up strategy based on the actual activity of key microbes can be used to ensure a proper and fast transfer of mesophilic digesters into thermophilic operation. Extreme thermophilic temperatures of 65 degrees C or more may be necessary in the future to meet the demands for full sanitation of the waste material before final disposal. We show data on anaerobic digestion at extreme thermophilic temperatures.

  20. An Efficient Pipeline Wavefront Phase Recovery for the CAFADIS Camera for Extremely Large Telescopes

    PubMed Central

    Magdaleno, Eduardo; Rodríguez, Manuel; Rodríguez-Ramos, José Manuel

    2010-01-01

    In this paper we show a fast, specialized hardware implementation of the wavefront phase recovery algorithm using the CAFADIS camera. The CAFADIS camera is a new plenoptic sensor patented by the Universidad de La Laguna (Canary Islands, Spain): international patent PCT/ES2007/000046 (WIPO publication number WO/2007/082975). It can simultaneously measure the wavefront phase and the distance to the light source in a real-time process. The pipeline algorithm is implemented using Field Programmable Gate Arrays (FPGAs). These devices present an architecture capable of handling the sensor output stream using a massively parallel approach, and they are efficient enough to resolve several Adaptive Optics (AO) problems in Extremely Large Telescopes (ELTs) in terms of processing-time requirements. The FPGA implementation of the wavefront phase recovery algorithm using the CAFADIS camera is based on the very fast computation of two-dimensional fast Fourier transforms (FFTs). Thus we have carried out a comparison between our novel FPGA 2D-FFT and other implementations. PMID:22315523
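A hedged software sketch of the general FFT-based wavefront-reconstruction technique the paper implements in hardware (this is not the CAFADIS/FPGA code): recover a phase map from its x/y slopes by inverting the derivative operators in the Fourier domain. The grid size and synthetic wavefront are invented for illustration.

```python
import numpy as np

n = 64
y, x = np.mgrid[0:n, 0:n] / n                 # unit-square grid
phase = np.sin(2 * np.pi * x) + np.cos(4 * np.pi * y)   # synthetic wavefront

k1 = 2 * np.pi * np.fft.fftfreq(n, d=1.0 / n)
KX, KY = np.meshgrid(k1, k1, indexing="xy")   # KX varies along x (columns)

# "Measured" slopes (here generated spectrally from the known phase)
gx = np.fft.ifft2(1j * KX * np.fft.fft2(phase)).real
gy = np.fft.ifft2(1j * KY * np.fft.fft2(phase)).real

# Reconstruction: phi_hat = -i (KX*Gx_hat + KY*Gy_hat) / (KX^2 + KY^2)
K2 = KX**2 + KY**2
K2[0, 0] = 1.0                                # avoid 0/0 at the DC term
phi_hat = -1j * (KX * np.fft.fft2(gx) + KY * np.fft.fft2(gy)) / K2
phi_hat[0, 0] = 0.0                           # mean level (piston) unrecoverable
recon = np.fft.ifft2(phi_hat).real

err = np.abs((recon - recon.mean()) - (phase - phase.mean())).max()
print(err < 1e-10)
```

The two forward FFTs plus one inverse FFT per frame are the dominant cost, which is why a fast 2D-FFT core is the heart of such a pipeline.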

  2. Evolution of precipitation extremes in two large ensembles of climate simulations

    NASA Astrophysics Data System (ADS)

    Martel, Jean-Luc; Mailhot, Alain; Talbot, Guillaume; Brissette, François; Ludwig, Ralf; Frigon, Anne; Leduc, Martin; Turcotte, Richard

    2017-04-01

    Recent studies project significant changes in the future distribution of precipitation extremes due to global warming. It is likely that extreme precipitation intensity will increase in a future climate and that extreme events will be more frequent. In this work, annual maxima daily precipitation series from the Canadian Earth System Model (CanESM2) 50-member large ensemble (spatial resolution of 2.8°x2.8°) and the Community Earth System Model (CESM1) 40-member large ensemble (spatial resolution of 1°x1°) are used to investigate extreme precipitation over the historical (1980-2010) and future (2070-2100) periods. The use of these ensembles results in 1,500 (30 years x 50 members) and 1,200 (30 years x 40 members) simulated years, respectively, over both the historical and future periods. These large datasets allow the computation of empirical daily extreme precipitation quantiles for large return periods. Using the CanESM2 and CESM1 large ensembles, extreme daily precipitation for return periods ranging from 2 to 100 years is computed in the historical and future periods to assess the impact of climate change. Results indicate that daily precipitation extremes generally increase in the future over most land grid points and that these increases will also impact the 100-year extreme daily precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety. Estimated increases in precipitation associated with very extreme events (e.g., 100-year return periods) will drastically change the likelihood of flooding and its extent in a future climate. These results, although interesting, need to be extended to sub-daily durations, relevant for urban flooding protection and urban infrastructure design (e.g. sewer networks, culverts). Models and simulations at finer spatial and temporal resolution are therefore needed.
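A minimal sketch of the empirical-quantile computation described above: pool annual precipitation maxima across all ensemble members (synthetic Gumbel-distributed values here, 30 years x 50 members = 1,500 maxima, not model output) and take the T-year return level as the empirical quantile with non-exceedance probability 1 - 1/T.

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic pooled annual maxima, mm/day (placeholder for ensemble output)
annual_maxima = rng.gumbel(loc=40.0, scale=10.0, size=30 * 50)

def return_level(maxima, T):
    """Empirical T-year return level: quantile at probability 1 - 1/T."""
    return np.quantile(maxima, 1.0 - 1.0 / T)

for T in (2, 10, 100):
    print(T, round(float(return_level(annual_maxima, T)), 1))
```

With 1,500 pooled values, even the 100-year level (the 99th percentile of annual maxima) is estimated from ~15 exceedances, which is what makes the large-ensemble approach attractive compared with a single 30-year series.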

  3. Feasibility of ultra-wideband SAW RFID tags meeting FCC rules.

    PubMed

    Härmä, Sanna; Plessky, Victor P; Li, Xianyi; Hartogh, Paul

    2009-04-01

    We discuss the feasibility of surface acoustic wave (SAW) radio-frequency identification (RFID) tags that rely on ultra-wideband (UWB) technology. We propose a design of a UWB SAW tag, carry out numerical experiments on the device performance, and study signal processing in the system. We also present experimental results for the proposed device and estimate the potentially achievable reading distance. UWB SAW tags will have an extremely small chip size (<0.5 x 1 mm²) and a low cost. They can also provide a large number of different codes. The estimated read range for UWB SAW tags is about 2 m with a reader radiating power levels as low as 0.1 mW with an extremely low duty factor.

  4. [TREATMENT OF EXTREMELY PREMATURE NEWBORN INFANT WITH INO. CLINICAL CASE].

    PubMed

    Radulova, P; Slancheva, B; Marinov, R

    2015-01-01

    Prolonged inhaled nitric oxide (iNO) from birth in preterm neonates with BPD improves endogenous surfactant function as well as lung growth, angiogenesis, and alveologenesis. As a result there is a reduction in the frequency of the "new" form of BPD in neonates under 28 weeks of gestation and birth weight under 1000 g. Delivery of inhaled nitric oxide is a new method of prevention of chronic lung disease. According to a large number of randomized trials, iNO in premature neonates reduces pulmonary morbidity and leads to a reduction of mortality in this population of patients. This new therapy does not have serious side effects. We present a clinical case of an extremely premature newborn infant with BPD treated with iNO.

  5. Instructive Biologic Scaffold for Functional Tissue Regeneration Following Trauma to the Extremities

    DTIC Science & Technology

    2016-10-01

    Award Number: W81XWH-12-2-0128. Title: Instructive Biologic Scaffold for Functional Tissue Regeneration Following Trauma to the Extremities. … identification of cell phenotype, extracellular matrix characterization, and histomorphometric analysis. The main endpoint of this study was to …

  6. Evolution of Precipitation Extremes in Three Large Ensembles of Climate Simulations - Impact of Spatial and Temporal Resolutions

    NASA Astrophysics Data System (ADS)

    Martel, J. L.; Brissette, F.; Mailhot, A.; Wood, R. R.; Ludwig, R.; Frigon, A.; Leduc, M.; Turcotte, R.

    2017-12-01

    Recent studies indicate that the frequency and intensity of extreme precipitation will increase in a future climate due to global warming. In this study, we compare annual maxima precipitation series from three large ensembles of climate simulations at various spatial and temporal resolutions. The first two are at the global scale: the Canadian Earth System Model (CanESM2) 50-member large ensemble (CanESM2-LE) at a 2.8° resolution and the Community Earth System Model (CESM1) 40-member large ensemble (CESM1-LE) at a 1° resolution. The third ensemble is at the regional scale over both Eastern North America and Europe: the Canadian Regional Climate Model (CRCM5) 50-member large ensemble (CRCM5-LE) at a 0.11° resolution, driven at its boundaries by the CanESM2-LE. The CRCM5-LE is a new ensemble issued from the ClimEx project (http://www.climex-project.org), a Québec-Bavaria collaboration. Using these three large ensembles, changes in extreme precipitation over the historical (1980-2010) and future (2070-2100) periods are investigated. This results in 1,500 (30 years x 50 members for CanESM2-LE and CRCM5-LE) and 1,200 (30 years x 40 members for CESM1-LE) simulated years over both the historical and future periods. Using these large datasets, the empirical daily (and sub-daily for CRCM5-LE) extreme precipitation quantiles for large return periods ranging from 2 to 100 years are computed. Results indicate that daily extreme precipitation will generally increase over most land grid points of both domains according to the three large ensembles. In the CRCM5-LE, the increase in sub-daily extreme precipitation is even larger than that in daily extreme precipitation. Considering that many public infrastructures have lifespans exceeding 75 years, the increase in extremes has important implications for service levels of water infrastructures and public safety.

  7. Regression and Data Mining Methods for Analyses of Multiple Rare Variants in the Genetic Analysis Workshop 17 Mini-Exome Data

    PubMed Central

    Bailey-Wilson, Joan E.; Brennan, Jennifer S.; Bull, Shelley B; Culverhouse, Robert; Kim, Yoonhee; Jiang, Yuan; Jung, Jeesun; Li, Qing; Lamina, Claudia; Liu, Ying; Mägi, Reedik; Niu, Yue S.; Simpson, Claire L.; Wang, Libo; Yilmaz, Yildiz E.; Zhang, Heping; Zhang, Zhaogong

    2012-01-01

    Group 14 of Genetic Analysis Workshop 17 examined several issues related to analysis of complex traits using DNA sequence data. These issues included novel methods for analyzing rare genetic variants in an aggregated manner (often termed collapsing rare variants), evaluation of various study designs to increase power to detect effects of rare variants, and the use of machine learning approaches to model highly complex heterogeneous traits. Various published and novel methods for analyzing traits with extreme locus and allelic heterogeneity were applied to the simulated quantitative and disease phenotypes. Overall, we conclude that power is (as expected) dependent on locus-specific heritability or contribution to disease risk, large samples will be required to detect rare causal variants with small effect sizes, extreme phenotype sampling designs may increase power for smaller laboratory costs, methods that allow joint analysis of multiple variants per gene or pathway are more powerful in general than analyses of individual rare variants, population-specific analyses can be optimal when different subpopulations harbor private causal mutations, and machine learning methods may be useful for selecting subsets of predictors for follow-up in the presence of extreme locus heterogeneity and large numbers of potential predictors. PMID:22128066

  8. Influence of hurricane-related activity on North American extreme precipitation

    NASA Astrophysics Data System (ADS)

    Barlow, Mathew

    2010-05-01

    Individual hurricanes and their remnants can produce exceptionally intense rainfall, and the associated flooding, even independent of storm surge, is one of the leading causes of hurricane-related death in the U.S. Despite the catastrophic societal costs of hurricanes and the considerable recent attention to possible trends in strength and number, little is known about the general contribution of hurricane-related activity to extreme precipitation over North America and the underlying dynamical mechanisms. Here we show, based on a 25-year observational analysis, that there are important contributions to the occurrence of extreme precipitation events over more than half of North America, including a pronounced signal over northern and inland areas, associated with an average span of influence that extends to several hundred kilometers. Large-scale vertical velocity, maximum wind speed, and tropical/extratropical character are important factors in the strength and range of influence, and the pattern of influence depends on whether an absolute or relative measure of precipitation is considered. Associated changes in stability, moisture, and vertical motion are analyzed to investigate the dynamics of the influence: the largest changes are in vertical motion, with the hurricane-related activity bringing deep tropical values even to inland and high latitude areas, consistent with the occurrence of very heavy, tropical-like precipitation. While the maximum contribution of hurricane-related activity to mean precipitation is generally less than 25% even for the most-affected coastal regions, the contribution to extreme events is much larger: well over 50% for several regions and exceeding 25% for large swaths of the continent. Typical track density plots do not capture the activity's influence on extreme precipitation.

  9. Extreme-value dependence: An application to exchange rate markets

    NASA Astrophysics Data System (ADS)

    Fernandez, Viviana

    2007-04-01

    Extreme value theory (EVT) focuses on modeling the tail behavior of a loss distribution using only extreme values rather than the whole data set. For a sample of 10 countries with dirty/free float regimes, we investigate whether paired currencies exhibit a pattern of asymptotic dependence. That is, whether an extremely large appreciation or depreciation in the nominal exchange rate of one country might transmit to another. In general, after controlling for volatility clustering and inertia in returns, we do not find evidence of extreme-value dependence between paired exchange rates. However, for asymptotic-independent paired returns, we find that tail dependency of exchange rates is stronger under large appreciations than under large depreciations.
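A hedged illustration of the block-maxima EVT step described above, using synthetic data rather than exchange-rate returns: fit a Generalized Extreme Value distribution to annual maxima of heavy-tailed "daily returns" and read off the shape parameter, whose sign signals tail heaviness. Note that scipy's `genextreme` uses the sign convention c = -xi.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(2)
# 100 synthetic "years" of 250 heavy-tailed daily returns (Student-t, df=4)
returns = rng.standard_t(df=4, size=(100, 250))
maxima = returns.max(axis=1)                  # annual block maxima

c, loc, scale = genextreme.fit(maxima)        # maximum-likelihood GEV fit
xi = -c                                       # xi > 0: heavy (Frechet-type) tail
print(f"xi={xi:.2f} loc={loc:.2f} scale={scale:.2f}")
```

Only the maxima enter the fit, which is the point of EVT: the tail model does not depend on getting the bulk of the return distribution right.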

  10. Experimental determination of Ramsey numbers.

    PubMed

    Bian, Zhengbing; Chudak, Fabian; Macready, William G; Clark, Lane; Gaitan, Frank

    2013-09-27

    Ramsey theory is a highly active research area in mathematics that studies the emergence of order in large disordered structures. Ramsey numbers mark the threshold at which order first appears and are extremely difficult to calculate due to their explosive rate of growth. Recently, an algorithm that can be implemented using adiabatic quantum evolution has been proposed that calculates the two-color Ramsey numbers R(m,n). Here we present results of an experimental implementation of this algorithm and show that it correctly determines the Ramsey numbers R(3,3) and R(m,2) for 4≤m≤8. The R(8,2) computation used 84 qubits of which 28 were computational qubits. This computation is the largest experimental implementation of a scientifically meaningful adiabatic evolution algorithm that has been done to date.
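An independent classical brute-force check (not the adiabatic quantum algorithm of the paper) of the smallest case reported above, R(3,3) = 6: the pentagon/pentagram colouring of K_5 avoids monochromatic triangles, while all 2^15 edge 2-colourings of K_6 contain one.

```python
from itertools import combinations

def mono_triangle_free(n, red):
    """True if colouring K_n with `red` edges (rest blue) has no
    monochromatic triangle. Edges are sorted vertex pairs."""
    for tri in combinations(range(n), 3):
        es = list(combinations(tri, 2))
        if all(e in red for e in es) or all(e not in red for e in es):
            return False
    return True

# R(3,3) > 5: pentagon red / pentagram blue has no monochromatic triangle
pentagon = {tuple(sorted((i, (i + 1) % 5))) for i in range(5)}
print(mono_triangle_free(5, pentagon))        # True

# R(3,3) <= 6: every 2-colouring of K_6's 15 edges, encoded as a bitmask,
# contains a monochromatic triangle
edges = list(combinations(range(6), 2))
tri_masks = [sum(1 << edges.index(e) for e in combinations(t, 2))
             for t in combinations(range(6), 3)]
assert all(any(c & m in (0, m) for m in tri_masks) for c in range(1 << 15))
print("R(3,3) = 6 confirmed")
```

The exhaustive search over 32,768 colourings is trivial here; the "explosive rate of growth" the abstract mentions is exactly why such enumeration becomes hopeless for larger R(m,n).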

  12. Widely tunable 1.94-μm Tm:BaY2F8 laser

    NASA Astrophysics Data System (ADS)

    Galzerano, Gianluca; Cornacchia, Francesco; Parisi, Daniela; Toncelli, Alessandra; Tonelli, Mauro; Laporta, Paolo

    2005-04-01

    A novel BaY2F8 crystal doped with thulium ions is grown and extensively investigated. Owing to the large number of vibronic levels and to a favorable electron-phonon coupling, extremely wide absorption and emission bands around 1.9 μm are observed. A room-temperature Tm:BaY2F8 laser tunable over a 210-nm interval, from 1849 to 2059 nm, is demonstrated.

  13. Classical Gradual-Channel Modeling of Graphene Field-Effect Transistors (FETs)

    DTIC Science & Technology

    2010-08-01

    Over the past 60 years, a large number of papers have been written about the properties of graphene (1) … strongly covalent features of its atomic bonding, properties that it inherits from the semimetal graphite. Indeed, the earliest papers on graphene (2) … simplicity of this "single-layer graphite" model, the picture of graphene that evolved from these papers is one of extreme complexity. This is …

  14. In Harmony with the Population: Ethnomusicology as a Framework for Countering Violent Extremism in the Sahel

    DTIC Science & Technology

    2016-12-01

    digital media, art, multiculturalism, communication flow theory … "and Chad as a Case Study" … methods), and mass communication (communication to a large audience via mass media). According to a 2007 … proliferation of digital technology for at least the foreseeable future. Early communication theorists considered mass-media communication flow to be a …

  15. Work-Related Musculoskeletal Symptoms and Job Factors Among Large-Herd Dairy Milkers.

    PubMed

    Douphrate, David I; Nonnenmann, Matthew W; Hagevoort, Robert; Gimeno Ruiz de Porras, David

    2016-01-01

    Dairy production in the United States is moving towards large-herd milking operations, resulting in an increase in task specialization and work demands. The objective of this project was to provide preliminary evidence of the association of a number of specific job conditions that commonly characterize large-herd parlor milking operations with work-related musculoskeletal symptoms (MSS). A modified version of the Standardized Nordic Questionnaire was administered to assess MSS prevalence among 450 US large-herd parlor workers. Worker demographics and MSS prevalences were generated. Prevalence ratios were also generated to determine associations of a number of specific job conditions that commonly characterize large-herd parlor milking operations with work-related MSS. Work-related MSS are prevalent among large-herd parlor workers, since nearly 80% report 12-month prevalences of one or more symptoms, which are primarily located in the upper extremities, specifically shoulders and wrist/hand. Specific large-herd milking parlor job conditions are associated with MSS in multiple body regions, including performing the same task repeatedly, insufficient rest breaks, working when injured, static postures, adverse environmental conditions, and reaching overhead. These findings support the need for administrative and engineering solutions aimed at reducing exposure to job risk factors for work-related MSS among large-herd parlor workers.
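A toy illustration (invented counts, not the study's data) of the prevalence-ratio measure used above: the symptom prevalence among workers reporting a given job condition divided by the prevalence among workers without it.

```python
def prevalence_ratio(cases_exposed, n_exposed, cases_unexposed, n_unexposed):
    """PR = prevalence in the exposed group / prevalence in the unexposed."""
    return (cases_exposed / n_exposed) / (cases_unexposed / n_unexposed)

# Hypothetical example: 160 of 200 exposed vs 120 of 250 unexposed workers
# report musculoskeletal symptoms
pr = prevalence_ratio(160, 200, 120, 250)
print(round(pr, 2))   # 1.67
```

A PR above 1 indicates the symptom is more prevalent among workers with the job condition; here 0.80 / 0.48 ≈ 1.67.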

  16. The Number Density of Quiescent Compact Galaxies at Intermediate Redshift

    NASA Astrophysics Data System (ADS)

    Damjanov, Ivana; Hwang, Ho Seong; Geller, Margaret J.; Chilingarian, Igor

    2014-09-01

    Massive compact systems at 0.2 < z < 0.6 are the missing link between the predominantly compact population of massive quiescent galaxies at high redshift and their analogs and relics in the local volume. The evolution in number density of these extreme objects over cosmic time is the crucial constraining factor for the models of massive galaxy assembly. We select a large sample of ~200 intermediate-redshift massive compacts from the Baryon Oscillation Spectroscopic Survey (BOSS) spectroscopy by identifying point-like Sloan Digital Sky Survey photometric sources with spectroscopic signatures of evolved redshifted galaxies. A subset of our targets have publicly available high-resolution ground-based images that we use to augment the dynamical and stellar population properties of these systems by their structural parameters. We confirm that all BOSS compact candidates are as compact as their high-redshift massive counterparts and less than half the size of similarly massive systems at z ~ 0. We use the completeness-corrected numbers of BOSS compacts to compute lower limits on their number densities in narrow redshift bins spanning the range of our sample. The abundance of extremely dense quiescent galaxies at 0.2 < z < 0.6 is in excellent agreement with the number densities of these systems at high redshift. Our lower limits support the models of massive galaxy assembly through a series of minor mergers over the redshift range 0 < z < 2.

  17. Extreme seismicity and disaster risks: Hazard versus vulnerability (Invited)

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2013-12-01

    Although the extreme nature of earthquakes has been known for millennia due to the resultant devastation from many of them, the vulnerability of our civilization to extreme seismic events is still growing. This is partly because of the increase in the number of high-risk objects and the clustering of populations and infrastructure in areas prone to seismic hazards. Today an earthquake may affect several hundred thousand lives and cause damage of up to a hundred billion dollars; it can trigger an ecological catastrophe if it occurs in close vicinity to a nuclear power plant. Two types of extreme natural events can be distinguished: (i) large-magnitude, low-probability events, and (ii) events leading to disasters. Although the first type may affect earthquake-prone countries directly or indirectly (as tsunamis, landslides, etc.), the second type occurs mainly in economically less-developed countries where vulnerability is high and resilience is low. Although earthquake hazards cannot be reduced, vulnerability to extreme events can be diminished by monitoring human systems and by relevant laws preventing an increase in vulnerability. Significant new knowledge should be gained on extreme seismicity through observations, monitoring, analysis, modeling, comprehensive hazard assessment, prediction, and interpretation to assist in disaster risk analysis. Advanced disaster-risk communication skills should be developed to link scientists, emergency management authorities, and the public. Natural, social, economic, and political reasons leading to earthquake disasters will be discussed.

  18. Improved belief propagation algorithm finds many Bethe states in the random-field Ising model on random graphs

    NASA Astrophysics Data System (ADS)

    Perugini, G.; Ricci-Tersenghi, F.

    2018-01-01

    We first present an empirical study of the Belief Propagation (BP) algorithm when run on the random-field Ising model defined on random regular graphs in the zero-temperature limit. We introduce the notion of extremal solutions for the BP equations, and we use them to fix a fraction of spins in their ground-state configuration. At the phase transition point the fraction of unconstrained spins percolates and their number diverges with the system size. This in turn makes the associated optimization problem highly non-trivial in the critical region. Using the bounds on the BP messages provided by the extremal solutions, we design a new and very easy-to-implement BP scheme that is able to output a large number of stable fixed points. On the one hand, this new algorithm is able to provide the minimum-energy configuration with high probability in a competitive time. On the other hand, we found that the number of fixed points of the BP algorithm grows with the system size in the critical region. This unexpected feature poses new relevant questions about the physics of this class of models.

  19. Magnetic and velocity fields in a dynamo operating at extremely small Ekman and magnetic Prandtl numbers

    NASA Astrophysics Data System (ADS)

    Šimkanin, Ján; Kyselica, Juraj

    2017-12-01

    Numerical simulations of the geodynamo are becoming more realistic because of advances in computer technology. Here, the geodynamo model is investigated numerically at extremely low Ekman and magnetic Prandtl numbers using the PARODY dynamo code. These parameters are more realistic than those used in previous numerical studies of the geodynamo. Our model is based on the Boussinesq approximation, and the temperature gradient between the upper and lower boundaries is the source of convection. This study attempts to answer the question of how realistic geodynamo models are. Numerical results show that our dynamo belongs to the strong-field dynamos. The generated magnetic field is dipolar and large-scale, while convection is small-scale and sheet-like flows (plumes) are preferred to columnar convection. The scales of the magnetic and velocity fields are separated, which enables hydromagnetic dynamos to maintain the magnetic field at low magnetic Prandtl numbers. The inner-core rotation rate is lower than in previous geodynamo models. On the other hand, the dimensional magnitudes of the velocity and magnetic fields, and those of the magnetic and viscous dissipation, are larger than those expected in the Earth's core due to our chosen parameter range.

  20. Character expansion methods for matrix models of dually weighted graphs

    NASA Astrophysics Data System (ADS)

    Kazakov, Vladimir A.; Staudacher, Matthias; Wynter, Thomas

    1996-04-01

    We consider generalized one-matrix models in which external fields allow control over the coordination numbers on both the original and dual lattices. We rederive in a simple fashion a character expansion formula for these models originally due to Itzykson and Di Francesco, and then demonstrate how to take the large N limit of this expansion. The relationship to the usual matrix model resolvent is elucidated. Our methods give as a by-product an extremely simple derivation of the Migdal integral equation describing the large N limit of the Itzykson-Zuber formula. We illustrate and check our methods by analysing a number of models solvable by traditional means. We then proceed to solve a new model: a sum over planar graphs possessing even coordination numbers on both the original and the dual lattice. We conclude by formulating the equations for the case of arbitrary sets of even, self-dual coupling constants. This opens the way for studying the deep problems of phase transitions from random to flat lattices.

  1. Numerical computation of spherical harmonics of arbitrary degree and order by extending exponent of floating point numbers

    NASA Astrophysics Data System (ADS)

    Fukushima, Toshio

    2012-04-01

    By extending the exponent of floating point numbers with an additional integer as the power index of a large radix, we compute fully normalized associated Legendre functions (ALF) by recursion without the underflow problem. The new method enables us to evaluate ALFs of degree as high as 2^32 = 4,294,967,296, which corresponds to around 1 cm resolution on the Earth's surface. By limiting the application of the exponent extension to a few working variables in the recursion, choosing a suitable large power of 2 as the radix, and embedding the basic arithmetic procedures for exponent-extended floating point numbers directly in the program computing the recurrence formulas, we achieve the evaluation of ALFs in the double-precision environment at the cost of around a 10% increase in computational time per single ALF. This formulation enables meaningful execution of spherical harmonic synthesis and/or analysis of arbitrary degree and order.
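A minimal sketch of the exponent-extension idea described above (not Fukushima's implementation, and in Python rather than Fortran/C): store a value as f * BIG**e with a float mantissa f and an integer auxiliary exponent e, where BIG = 2**960 is the large radix, renormalizing so that a recursion can run far below the double-precision underflow limit (~1e-308).

```python
import math

BIG, BIGI = 2.0**960, 2.0**-960   # large radix and its reciprocal
RADIX_LOG10 = math.log10(BIG)

def norm(f, e):
    """Keep |f| within [2**-480, 2**480) by shifting into the big exponent."""
    while abs(f) >= 2.0**480:
        f, e = f * BIGI, e + 1
    while 0.0 < abs(f) < 2.0**-480:
        f, e = f * BIG, e - 1
    return f, e

def mul(a, b):
    """Multiply two extended numbers represented as (mantissa, big-exponent)."""
    return norm(a[0] * b[0], a[1] + b[1])

# 1e-30 raised to the 100th power underflows an ordinary double to 0.0,
# but survives here as roughly 1e-3000:
x = (1.0, 0)
for _ in range(100):
    x = mul(x, (1e-30, 0))
f, e = x
print(round(math.log10(abs(f)) + e * RADIX_LOG10))   # -3000
```

In an ALF recursion the same trick is applied only to the few running terms, so almost all arithmetic stays ordinary double-precision, which is how the paper keeps the overhead to ~10%.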

  2. Large number limit of multifield inflation

    NASA Astrophysics Data System (ADS)

    Guo, Zhong-Kai

    2017-12-01

    We compute the tensor and scalar spectral indices n_t and n_s, the tensor-to-scalar ratio r, and the consistency relation n_t/r in general monomial multifield slow-roll inflation models with potentials V ~ sum_i lambda_i |phi_i|^(p_i). The general models give a novel relation: n_t, n_s and n_t/r are all proportional to the logarithm of the number of fields N_f when N_f becomes extremely large, of order O(10^40). An upper bound N_f <~ N_* e^(Z N_*) is obtained by requiring the slow-variation parameter to be small enough, where N_* is the e-folding number and Z is a function of the distributions of lambda_i and p_i. Besides, n_t/r differs from the single-field result -1/8 with substantial probability except in a few very special cases. Finally, we derive the theoretical bound r > 2/N_* (r >~ 0.03) and a corresponding bound for n_t, which can be tested by observation in the near future.

  3. Plastic Surgery Challenges in War Wounded I: Flap-Based Extremity Reconstruction

    PubMed Central

    Sabino, Jennifer M.; Slater, Julia; Valerio, Ian L.

    2016-01-01

    Scope and Significance: Reconstruction of traumatic injuries requiring tissue transfer begins with aggressive resuscitation and stabilization. Systematic advances in acute casualty care at the point of injury have improved survival and allowed for increasingly complex treatment before definitive reconstruction at tertiary medical facilities outside the combat zone. As a result, the complexity of the limb salvage algorithm has increased over 14 years of combat activities in Iraq and Afghanistan. Problem: Severe poly-extremity trauma in combat casualties has led to a large number of extremity salvage cases. Advanced reconstructive techniques coupled with regenerative medicine applications have played a critical role in the restoration, recovery, and rehabilitation of functional limb salvage. Translational Relevance: The past 14 years of war trauma have increased our understanding of tissue transfer for extremity reconstruction in the treatment of combat casualties. Injury patterns, flap choice, and reconstruction timing are critical variables to consider for optimal outcomes. Clinical Relevance: Subacute reconstruction with specifically chosen flap tissue and donor site location based on individual injuries result in successful tissue transfer, even in critically injured patients. These considerations can be combined with regenerative therapies to optimize massive wound coverage and limb salvage form and function in previously active patients. Summary: Traditional soft tissue reconstruction is integral in the treatment of war extremity trauma. Pedicle and free flaps are a critically important part of the reconstructive ladder for salvaging extreme extremity injuries that are seen as a result of the current practice of war. PMID:27679751

  4. OCTOCAM: A Workhorse Instrument for the Gemini Telescopes During the Era of LSST

    NASA Astrophysics Data System (ADS)

    Roming, Peter; van der Horst, Alexander; OCTOCAM Team

    2018-01-01

    The decade of the 2020s is planned to be an era of large surveys and giant telescopes. A trademark of this era will be the large number of interesting objects observed daily by high-cadence surveys, such as the LSST. Because of the sheer numbers, only a very small fraction of these interesting objects will be observed with extremely large telescopes. The follow-up workhorses during this era will be the 8-meter class telescopes and corresponding instruments that are prepared to pursue these interesting objects. One such workhorse instrument is OCTOCAM, a highly efficient instrument designed to probe the time domain window with simultaneous broad-wavelength coverage. OCTOCAM optimizes the use of Gemini for broadband imaging and spectroscopic single-target observations. The instrument is designed for high temporal resolution, broad spectral coverage, and moderate spectral resolution. OCTOCAM was selected as part of the Gemini instrumentation program in early 2017. Here we provide a description of the science cases to be addressed, the overall instrument design, and the current status.

  5. Changing Pattern of Indian Monsoon Extremes: Global and Local Factors

    NASA Astrophysics Data System (ADS)

    Ghosh, Subimal; Shastri, Hiteshri; Pathak, Amey; Paul, Supantha

    2017-04-01

    Indian Summer Monsoon Rainfall (ISMR) extremes have remained a major topic of discussion in global change and hydro-climatology over the last decade. This is attributable to multiple, conflicting conclusions on the changing pattern of extremes, along with a poor understanding of the processes at global and local scales associated with monsoon extremes. At a spatially aggregate scale, when the numbers of extremes in the grids are summed, a statistically significant increasing trend is observed for both Central India (Goswami et al., 2006) and all India (Rajeevan et al., 2008). However, the result over Central India does not satisfy a field significance test of increase and no decrease (Krishnamurthy et al., 2009). Statistically rigorous extreme value analysis that deals with the tail of the distribution reveals a spatially non-uniform trend of extremes over India (Ghosh et al., 2012). This results in a statistically significant increasing trend in spatial variability. Such an increase in spatial variability points to the importance of local factors such as deforestation and urbanization. We hypothesize that the increase in the spatial average of extremes is associated with an increase in events occurring over large regions, while the increase in spatial variability is attributable to local factors. A Lagrangian dynamic recycling model reveals that the major contributor of moisture to widespread extremes is the Western Indian Ocean, while the land surface also contributes around 25-30% of moisture during extremes in Central India. We further test the impacts of local urbanization on extremes and find that the impacts are most visible over West Central, Southern and North East India. Regional atmospheric simulations coupled with an Urban Canopy Model (UCM) show that urbanization intensifies extremes in city areas, but not uniformly across the city. 
The intensification occurs over specific pockets of the urban region, resulting in an increase in spatial variability even within the city. This also points to the need for multiple weather stations over the city at a finer resolution for a better understanding of urban extremes. We conclude that the conventional method of considering only large-scale factors is not sufficient for analysing monsoon extremes, and their characterization needs a blending of both global and local factors. References: Ghosh, S., Das, D., Kao, S-C. & Ganguly, A. R. Lack of uniform trends but increasing spatial variability in observed Indian rainfall extremes. Nature Clim. Change 2, 86-91 (2012). Goswami, B. N., Venugopal, V., Sengupta, D., Madhusoodanan, M. S. & Xavier, P. K. Increasing trend of extreme rain events over India in a warming environment. Science 314, 1442-1445 (2006). Krishnamurthy, C. K. B., Lall, U. & Kwon, H-H. Changing frequency and intensity of rainfall extremes over India from 1951 to 2003. J. Clim. 22, 4737-4746 (2009). Rajeevan, M., Bhate, J. & Jaswal, A. K. Analysis of variability and trends of extreme rainfall events over India using 104 years of gridded daily rainfall data. Geophys. Res. Lett. 35, L18707 (2008).

  6. Analysis of extreme rain and flood events using a regional hydrologically enhanced hydrometeorological system

    NASA Astrophysics Data System (ADS)

    Yucel, Ismail; Onen, Alper

    2013-04-01

    Evidence shows that global warming or climate change has a direct influence on changes in precipitation and the hydrological cycle. Extreme weather events such as heavy rainfall and flooding are projected to become much more frequent as the climate warms. Regional hydrometeorological system models, which couple the atmosphere with physically based, gridded surface hydrology, provide efficient predictions for extreme hydrological events. Such a modeling system can be used for flood forecasting and warning because it provides continuous monitoring of precipitation over large areas at high spatial resolution. This study examines the performance of the Weather Research and Forecasting (WRF-Hydro) model, which performs terrain, sub-terrain, and channel routing, in producing streamflow from WRF-derived forcing of extreme precipitation events. The capability of the system with different options, such as data assimilation, is tested for a number of flood events observed in basins of the western Black Sea Region in Turkey. Rainfall event structures and the associated flood responses are evaluated against gauge and satellite-derived precipitation and measured streamflow values. The modeling system shows skill in capturing the spatial and temporal structure of extreme rainfall events and the resulting flood hydrographs. High-resolution routing modules activated in the model enhance the simulated discharges.

  7. Extreme storm activity in North Atlantic and European region

    NASA Astrophysics Data System (ADS)

    Vyazilova, N.

    2010-09-01

    The extreme storm activity study over the North Atlantic and Europe includes analyses of extreme cyclones (track number, integral cyclonic intensity) and extreme storms (track number) during winter and summer seasons in two regions: 1) 55°N-80°N, 50°W-70°E; 2) 30°N-55°N, 50°W-70°E. Extreme cyclones were selected based on cyclone centre pressure (P <= 970 mbar). Extreme storms were selected from the extreme cyclones based on wind velocity at 925 mbar, using the Beaufort scale. Integral cyclonic intensity (per region) comprises the number of cyclone centers and the sum of MSLP anomalies at cyclone centers. The analyses are based on an automated cyclone tracking algorithm and 6-hourly MSLP and wind data (u and v at 925 hPa) from the NCEP/NCAR reanalysis from January 1948 to March 2010. A comparison of means calculated for each decade showed that in the polar region the numbers of extreme cyclone and storm tracks, as well as the integral cyclonic intensity, gradually increase and reach a maximum in recent years (for both summer and winter seasons). Decadal means for the summer season are larger than for the winter season, in both the polar and the tropical region. Decadal means for the tropical region are significantly smaller than for the polar region.

  8. Recent and future warm extreme events and high-mountain slope stability.

    PubMed

    Huggel, C; Salzmann, N; Allen, S; Caplan-Auerbach, J; Fischer, L; Haeberli, W; Larsen, C; Schneider, D; Wessels, R

    2010-05-28

    The number of large slope failures in some high-mountain regions such as the European Alps has increased during the past two to three decades. There is concern that recent climate change is driving this increase in slope failures, thus possibly further exacerbating the hazard in the future. Although the effects of a gradual temperature rise on glaciers and permafrost have been extensively studied, the impacts of short-term, unusually warm temperature increases on slope stability in high mountains remain largely unexplored. We describe several large slope failures in rock and ice in recent years in Alaska, New Zealand and the European Alps, and analyse weather patterns in the days and weeks before the failures. Although we did not find one general temperature pattern, all the failures were preceded by unusually warm periods; some happened immediately after temperatures suddenly dropped to freezing. We assessed the frequency of warm extremes in the future by analysing eight regional climate models from the recently completed European Union programme ENSEMBLES for the central Swiss Alps. The models show an increase in the frequency of high-temperature events for the period 2001-2050 compared with a 1951-2000 reference period. Warm events lasting 5, 10 and 30 days are projected to increase by about 1.5-4 times by 2050 and in some models by up to 10 times. Warm extremes can trigger large landslides in temperature-sensitive high mountains by enhancing the production of water by melt of snow and ice, and by rapid thaw. Although these processes reduce slope strength, they must be considered within the local geological, glaciological and topographic context of a slope.

  9. GROMACS 4:  Algorithms for Highly Efficient, Load-Balanced, and Scalable Molecular Simulation.

    PubMed

    Hess, Berk; Kutzner, Carsten; van der Spoel, David; Lindahl, Erik

    2008-03-01

    Molecular simulation is an extremely useful, but computationally very expensive tool for studies of chemical and biomolecular systems. Here, we present a new implementation of our molecular simulation toolkit GROMACS which now both achieves extremely high performance on single processors, from algorithmic optimizations and hand-coded routines, and simultaneously scales very well on parallel machines. The code encompasses a minimal-communication domain decomposition algorithm, full dynamic load balancing, a state-of-the-art parallel constraint solver, and efficient virtual site algorithms that allow removal of hydrogen atom degrees of freedom to enable integration time steps up to 5 fs for atomistic simulations, also in parallel. To improve the scaling properties of the common particle mesh Ewald electrostatics algorithms, we have in addition used a Multiple-Program, Multiple-Data approach, with separate node domains responsible for direct and reciprocal space interactions. Not only does this combination of algorithms enable extremely long simulations of large systems, but it also delivers high simulation performance on quite modest numbers of standard cluster nodes.

  10. Exact extreme-value statistics at mixed-order transitions.

    PubMed

    Bar, Amir; Majumdar, Satya N; Schehr, Grégory; Mukamel, David

    2016-05-01

    We study extreme-value statistics for spatially extended models exhibiting mixed-order phase transitions (MOT). These are phase transitions that exhibit features common to both first-order (discontinuity of the order parameter) and second-order (diverging correlation length) transitions. We consider here the truncated inverse distance squared Ising model, which is a prototypical model exhibiting MOT, and study analytically the extreme-value statistics of the domain lengths. The lengths of the domains are identically distributed random variables, except for the global constraint that their sum equals the total system size L. In addition, the number of such domains is also a fluctuating variable, not fixed. In the paramagnetic phase, we show that the distribution of the largest domain length l_{max} converges, in the large-L limit, to a Gumbel distribution. However, at the critical point (for a certain range of parameters) and in the ferromagnetic phase, we show that the fluctuations of l_{max} are governed by novel distributions, which we compute exactly. Our main analytical results are verified by numerical simulations.
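The Gumbel limit for the largest of many lengths can be illustrated with a toy simulation (a sketch only: i.i.d. Exp(1) lengths, ignoring the global sum constraint and the model's specifics). The maximum of n i.i.d. exponentials, centered by log n, converges to a standard Gumbel law, whose mean is the Euler-Mascheroni constant γ ≈ 0.5772:

```python
import math
import random

random.seed(7)

def centered_max(n):
    """Max of n i.i.d. Exp(1) lengths, centered by log n.
    Extreme-value theory predicts a standard Gumbel limit."""
    return max(random.expovariate(1.0) for _ in range(n)) - math.log(n)

# Sample the centered maximum many times; its mean should be near
# the standard Gumbel mean, gamma ~ 0.5772
samples = [centered_max(500) for _ in range(2000)]
mean = sum(samples) / len(samples)
print(round(mean, 3))
```

The empirical mean lands close to γ, illustrating the universality behind the paramagnetic-phase result.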

  11. Exploring the Early Chemical Evolution of the Milky Way with LAMOST and Subaru

    NASA Astrophysics Data System (ADS)

    Li, Haining; Aoki, Wako; Honda, Satoshi; Zhao, Gang; Suda, Takuma; Christlieb, Norbert

    Extremely Metal-Poor (EMP) stars ([Fe/H] < -3.0) provide fundamental evidence on the nucleosynthesis and enrichment of the first stars and supernovae. LAMOST will observe 6 million Galactic stars through a 5-year spectroscopic survey, and thus provide an unprecedented chance to enlarge the EMP star sample. In 2014, a joint project on EMP stars was initiated with the LAMOST survey and Subaru follow-up observation. So far, more than 70 EMP stars have been found and confirmed, including identifications of a number of chemically interesting objects: three UMP (ultra metal-poor) stars with [Fe/H] ˜ -4.0, including the second UMP turnoff star with Li detection; a super Li-rich (A(Li) = +3) EMP giant, which is the most extreme example of Li enhancement in red giants known to date; a few EMP stars showing extreme enhancements in neutron-capture elements. Statistics of a large sample of EMP stars will constrain formation of the Milky Way halo.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Katti, Amogh; Di Fatta, Giuseppe; Naughton, Thomas

    Future extreme-scale high-performance computing systems will be required to work under frequent component failures. The MPI Forum's User Level Failure Mitigation proposal has introduced an operation, MPI_Comm_shrink, to synchronize the alive processes on the list of failed processes, so that applications can continue to execute even in the presence of failures by adopting algorithm-based fault tolerance techniques. This MPI_Comm_shrink operation requires a failure detection and consensus algorithm. This paper presents three novel failure detection and consensus algorithms using gossiping. The proposed algorithms were implemented and tested using the Extreme-scale Simulator. The results show that in all algorithms the number of gossip cycles to achieve global consensus scales logarithmically with system size. The second algorithm also shows better scalability in terms of memory and network bandwidth usage and perfect synchronization in achieving global consensus. The third approach is a three-phase distributed failure detection and consensus algorithm and provides consistency guarantees even in very large and extreme-scale systems while at the same time being memory and bandwidth efficient.
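The logarithmic scaling of gossip cycles can be seen in a minimal push-gossip sketch (an illustration of the general mechanism, not the paper's algorithms): each node that knows about a failure forwards the notice to one random peer per cycle, so the informed set roughly doubles each cycle until saturation.

```python
import random

random.seed(42)

def gossip_cycles(n):
    """Push-gossip: each informed node contacts one random peer per
    cycle. Returns the number of cycles until a failure notice
    detected by node 0 has reached all n nodes."""
    informed = {0}            # node 0 detects the failure first
    cycles = 0
    while len(informed) < n:
        # every informed node picks one random target this cycle
        targets = {random.randrange(n) for _ in informed}
        informed |= targets
        cycles += 1
    return cycles

# Dissemination time grows roughly logarithmically with system size
for n in (64, 1024, 16384):
    print(n, gossip_cycles(n))
```

Since the informed set can at most double per cycle, at least log2(n) cycles are always needed; in practice the count stays within a small constant factor of that, matching the logarithmic scaling reported above.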

  13. Staging memory for massively parallel processor

    NASA Technical Reports Server (NTRS)

    Batcher, Kenneth E. (Inventor)

    1988-01-01

    The invention herein relates to a computer organization capable of rapidly processing extremely large volumes of data. A staging memory is provided having a main stager portion consisting of a large number of memory banks which are accessed in parallel to receive, store, and transfer data words simultaneously with each other. Substager portions interconnect with the main stager portion to match input and output data formats with the data format of the main stager portion. An address generator is coded for accessing the data banks for receiving or transferring the appropriate words. Input and output permutation networks arrange the lineal order of data into and out of the memory banks.

  14. Advanced spacecraft: What will they look like and why

    NASA Technical Reports Server (NTRS)

    Price, Humphrey W.

    1990-01-01

    The next century of spaceflight will witness an expansion in the physical scale of spacecraft, from the extreme of the microspacecraft to the very large megaspacecraft. This will respectively spawn advances in highly integrated and miniaturized components, and also advances in lightweight structures, space fabrication, and exotic control systems. Challenges are also presented by the advent of advanced propulsion systems, many of which require controlling and directing hot plasma, dissipating large amounts of waste heat, and handling very high radiation sources. Vehicle configuration studies for a number of these types of advanced spacecraft were performed, and some of them are presented along with the rationale for their physical layouts.

  15. Vertical interferometer workstation for testing large spherical optics

    NASA Astrophysics Data System (ADS)

    Truax, B.

    2013-09-01

    The design of an interferometer workstation for the testing of large concave and convex spherical optics is presented. The workstation handles optical components and mounts up to 425 mm in diameter and up to 40 kg in mass, with six axes of adjustment. A unique method for the implementation of focus, roll and pitch was used, allowing for extremely precise adjustment. The completed system includes transmission spheres with f-numbers from f/1.6 to f/0.82, incorporating reference surface diameters of up to 306 mm and surface accuracies of better than 63 nm PVr. The design challenges and resulting solutions are discussed. System performance results are presented.

  16. Circular on controlling the outflow of labourers, March 1989.

    PubMed

    1989-01-01

    In early March 1989, China's General Office of the State Council issued an urgent Circular demanding that various local governmental bodies "do a good job in strictly controlling the blind outflow of laborers." The circular pointed out that "since the Spring Festival, large numbers of laborers from Sichuan, Henan, Hubei, Shandong, Shaanxi, Jiangsu, Zhejiang, Anhui, and other provinces have concentrated in large numbers in regions such as the northwest and Guangdong Province, causing a huge increase in railroad passenger traffic. There has been a large pile-up of passengers on some main railroad lines and stations, and trains have been seriously overcrowded. This has put tremendous pressure on railroad transport. After arriving in the above-mentioned regions, some of these laborers hang around the streets because they cannot find work, and their life is extremely difficult. The large influx of laborers into these regions has caused confusion in local social order." In order to resolve this problem satisfactorily, the circular makes the following demands: "The people's government at all levels must rapidly get under control the blind outflow of laborers and their assembly in large numbers for moving elsewhere. It is necessary to organize forces to admonish and stop those laborers who have already assembled at the railroad stations, so that they will not blindly move elsewhere. They should also be mobilized to return to their home towns."

  17. Geochemistry of extremely alkaline (pH>12) ground water in slag-fill aquifers.

    PubMed

    Roadcap, George S; Kelly, Walton R; Bethke, Craig M

    2005-01-01

    Extremely alkaline ground water has been found underneath many shuttered steel mills and slag dumps and has been an impediment to the cleanup and economic redevelopment of these sites because little is known about the geochemistry. A large number of these sites occur in the Lake Calumet region of Chicago, Illinois, where large-scale infilling of the wetlands with steel slag has created an aquifer with pH values as high as 12.8. To understand the geochemistry of the alkaline ground water system, we analyzed samples of ground water and the associated slag and weathering products from four sites. We also considered several potential remediation schemes to lower the pH and toxicity of the water. The principal cause of the alkaline conditions is the weathering of calcium silicates within the slag. The resulting ground water at most of the sites is dominated by Ca2+ and OH- in equilibrium with Ca(OH)2. Where the alkaline ground water discharges in springs, atmospheric CO2 dissolves into the water and thick layers of calcite form. Iron, manganese, and other metals in the metallic portion of the slag have corroded to form more stable low-temperature oxides and sulfides and have not accumulated in large concentrations in the ground water. Calcite precipitated at the springs is rich in a number of heavy metals, suggesting that metals can move through the system as particulate matter. Air sparging appears to be an effective remediation strategy for reducing the toxicity of discharging alkaline water.
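The dominance of Ca2+ and OH- in water equilibrated with Ca(OH)2 (portlandite) can be checked with a back-of-the-envelope solubility calculation. This is only a sketch of the ideal system: the Ksp value of about 5.5e-6 at 25 °C is an assumed textbook figure, and real slag leachate departs from this ideal equilibrium.

```python
import math

# Dissolution: Ca(OH)2 -> Ca2+ + 2 OH-,  Ksp = [Ca2+][OH-]^2 = 4 s^3
KSP = 5.5e-6                      # assumed portlandite Ksp at 25 C
s = (KSP / 4.0) ** (1.0 / 3.0)    # molar solubility of Ca(OH)2
oh = 2.0 * s                      # [OH-] in mol/L
poh = -math.log10(oh)
ph = 14.0 - poh                   # pH = 14 - pOH at 25 C
print(round(ph, 2))               # -> ~12.35
```

The result sits squarely in the extremely alkaline range reported above, a little below the most extreme field value of 12.8.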

  18. Overview of the biology of extreme events

    NASA Astrophysics Data System (ADS)

    Gutschick, V. P.; Bassirirad, H.

    2008-12-01

    Extreme events have, variously, meteorological origins, as in heat waves or precipitation extremes, or biological origins, as in pest and disease eruptions (or tectonic, earth-orbital, or impact-body origins). Despite growing recognition that these events are changing in frequency and intensity, a universal model of ecological responses to these events has been slow to emerge. Extreme events, negative and positive, contrast with normal events in terms of their effects on the physiology, ecology, and evolution of organisms, hence also on water, carbon, and nutrient cycles. They structure biogeographic ranges and biomes, almost surely more than the mean values often used to define biogeography. They are challenging to study, for obvious reasons of field-readiness but also because they are defined by sequences of driving variables such as temperature, not point events. As sequences, their statistics (return times, for example) are challenging to develop, also because of the involvement of multiple environmental variables. These statistics are not captured well by climate models. They are expected to change with climate and land-use change, but our predictive capacity is currently limited. A number of tools for description and analysis of extreme events are available, if not widely applied to date. Extremes for organisms are defined by their fitness effects on those organisms, and are specific to genotypes, making them major agents of natural selection. There is evidence that the effects of extreme events may be concentrated in an extended recovery phase. We review selected events covering ranges of time and magnitude, from Snowball Earth to leaf functional loss in weather events. A number of events, such as the 2003 European heat wave, show effects on water and carbon cycles over large regions. 
Rising CO2 is the recent extreme of note, for its climatic effects and consequences for growing seasons, transpiration, etc., but also directly through its action as a substrate of photosynthesis. Effects on water and N cycles are already marked. Adaptive responses of plants are very irregularly distributed among species and genotypes, most adaptive responses having been lost over 20 My of minimal or virtually accidental genetic selection for correlated traits. Offsets of plant activity from those of pollinators and pests may amplify direct physiological effects on plants. Another extreme of interest is the insect-mediated mass die-off of conifers across western North America, tied to a rare combination of drought and year-long high temperatures.

  19. Dynamical systems proxies of atmospheric predictability and mid-latitude extremes

    NASA Astrophysics Data System (ADS)

    Messori, Gabriele; Faranda, Davide; Caballero, Rodrigo; Yiou, Pascal

    2017-04-01

    Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. Many extremes (e.g. storms, heatwaves, cold spells, heavy precipitation) are tied to specific patterns of midlatitude atmospheric circulation. The ability to identify these patterns and use them to enhance the predictability of the extremes is therefore of crucial societal and economic value. We propose a novel predictability pathway for extreme events, building upon recent advances in dynamical systems theory. We use two simple dynamical systems metrics - local dimension and persistence - to identify sets of similar large-scale atmospheric flow patterns which present a coherent temporal evolution. When these patterns correspond to weather extremes, they therefore afford particularly good forward predictability. We specifically test this technique on European winter temperatures, whose variability largely depends on the atmospheric circulation in the North Atlantic region. We find that our dynamical systems approach provides predictability of large-scale temperature extremes up to one week in advance.
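Local dimension is typically estimated in this literature with an extreme-value recipe: exceedances of the negative log-distance to a reference state are approximately exponential, with rate equal to the local dimension. A toy check on uniform points in a 3-D cube (a sketch, not the authors' pipeline; the variable names are illustrative) should recover a dimension near 3:

```python
import math
import random

random.seed(1)

# "States": points uniform in the unit 3-D cube; reference state at centre
pts = [tuple(random.random() for _ in range(3)) for _ in range(20000)]
ref = (0.5, 0.5, 0.5)

# Observable g = -log(distance to the reference state)
g = [-0.5 * math.log(sum((a - b) ** 2 for a, b in zip(p, ref)))
     for p in pts]

# Exceedances of g over a high quantile are ~exponential with rate d,
# so the reciprocal of their mean estimates the local dimension
u = sorted(g)[int(0.98 * len(g))]
exc = [x - u for x in g if x > u]
d_hat = 1.0 / (sum(exc) / len(exc))
print(round(d_hat, 1))
```

On genuine atmospheric fields the same estimator is applied with each daily flow map as a state and all other days as the point cloud, giving a per-day local dimension.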

  20. Studying the impact of climate change on flooding in large river basins

    NASA Astrophysics Data System (ADS)

    Thiele-Eich, I.; Hopson, T.; Gilleland, E.; Lamarque, J.-F.; Hu, A.; Simmer, C.

    2012-04-01

    Assessing the potential impact of global climate change on hydrological extremes becomes crucial for regions such as Bangladesh, where a high population density results in a large exposure to risks associated with extreme flooding. In addition, low-lying countries such as Bangladesh are especially vulnerable to sea-level rise and its influence on present-day flood characteristics. By combining the impact of climate change on upper catchment precipitation as well as on sea-level rise at the river mouths, we analyze the development of flood characteristics such as frequency and magnitude in large river basins. Since flood duration is also of great importance to people exposed to flooding, the development of the number of days with extreme flooding is evaluated for possible future trends. Data used include historical observations from the Global Runoff Data Centre, while recently released model output for upper catchment precipitation and annual mean thermosteric sea-level rise is taken from the four CCSM4 1° 20th Century ensemble members, as well as from six CCSM4 1° ensemble members for the representative concentration pathway scenarios RCP8.5, 6.0, 4.5 and 2.6. A peak-over-threshold approach is used to quantify the expected future changes in flood return levels, where discharge exceedances over a certain threshold are fit to a Generalized Pareto Distribution. Return levels are compared from both 20th century and future model simulations for time slices at 2030, 2050, 2070 and 2090. Return periods of flood events decrease as the 21st century progresses in all RCP scenarios, with the shift most pronounced in RCP 8.5. Flood duration, i.e. the number of days with discharges above a certain threshold, increases in all RCP scenarios, with the largest increase at the end of the 21st century, although this increase is statistically significant only for RCP 8.5. 
Finally, we study how sea-level rise governs the flooding behavior further upstream by calculating the effective additional discharge due to the backwater effect of sea-level rise. Sea-level rise anomalies for the 21st century are taken from CCSM4 model output at each of the river mouths. Judging from our work, the increase in effective discharge due to sea-level rise cannot be neglected when discussing flooding in the respective river basins. Impact of sea-level rise on changes in return levels will be investigated further by using extreme-value theory to calculate how the tails of the current river discharge distribution will be shifted by changing climate.
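The peak-over-threshold step described above can be sketched in a few lines. This is a toy with synthetic "daily discharge" and a method-of-moments fit of the Generalized Pareto Distribution; the actual study fits observed and simulated discharge, and GPD fitting is more commonly done by maximum likelihood.

```python
import math
import random

random.seed(3)

# Synthetic daily discharge for 30 "years" (lognormal stand-in)
years = 30
q = [random.lognormvariate(5.0, 0.6) for _ in range(years * 365)]

# Peak-over-threshold: exceedances over the 95th percentile
u = sorted(q)[int(0.95 * len(q))]
exc = [x - u for x in q if x > u]
lam = len(exc) / years                 # exceedances per year

# Method-of-moments GPD fit: mean = sigma/(1-xi),
# variance = sigma^2 / ((1-xi)^2 (1-2 xi))
m = sum(exc) / len(exc)
v = sum((e - m) ** 2 for e in exc) / (len(exc) - 1)
xi = 0.5 * (1.0 - m * m / v)           # shape parameter
sigma = m * (1.0 - xi)                 # scale parameter

def return_level(T):
    """Discharge exceeded on average once every T years."""
    if abs(xi) < 1e-6:                 # xi -> 0 (exponential tail)
        return u + sigma * math.log(lam * T)
    return u + sigma / xi * ((lam * T) ** xi - 1.0)

print(round(return_level(10)), round(return_level(100)))
```

Comparing return levels fitted to 20th-century and future simulated discharge, as the study does, amounts to running this fit on each period and reading off how the T-year level shifts.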

  1. Gigavolt Bound free Transitions Driven by Extreme Light

    DTIC Science & Technology

    2016-05-12

    negligible role in near-term scenarios, but become interesting in the multi-exawatt regime. A significant advance in numerical particle tracking is... negligible, the total momentum distribution is f(p) = Σ_i |S_rip|^2 (14), where i indexes each ion. By loading a large number of ions into any given simulation... spectrum of tunnel-ionized electrons. RR is the force acting on an electron due to its own fields. This force is normally negligible, only becoming

  2. Attributing Historical Changes in Probabilities of Record-Breaking Daily Temperature and Precipitation Extreme Events

    DOE PAGES

    Shiogama, Hideo; Imada, Yukiko; Mori, Masato; ...

    2016-08-07

    Here, we describe two unprecedentedly large (100-member), long-term (61-year) ensembles based on MRI-AGCM3.2, which were driven by historical and non-warming climate forcings. These ensembles comprise the "Database for Policy Decision making for Future climate change (d4PDF)". We compare these ensembles to large ensembles based on another climate model, as well as to observed data, to investigate the influence of anthropogenic activities on historical changes in the numbers of record-breaking events, including: the annual coldest daily minimum temperature (TNn), the annual warmest daily maximum temperature (TXx) and the annual most intense daily precipitation event (Rx1day). These two climate model ensembles indicate that human activity has already had statistically significant impacts on the number of record-breaking extreme events worldwide, mainly over Northern Hemisphere land. Specifically, human activities have altered the likelihood that a wider area globally would suffer record-breaking TNn, TXx and Rx1day events than that observed over the 2001-2010 period by a factor of at least 0.6, 5.4 and 1.3, respectively. However, we also find that the estimated spatial patterns and amplitudes of anthropogenic impacts on the probabilities of record-breaking events are sensitive to the climate model and/or natural-world boundary conditions used in the attribution studies.
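A useful stationary-climate baseline for such record counts (a standard i.i.d. result, not taken from the paper): in an unchanging climate the probability that year n sets a new record is 1/n, so the expected number of records over n years is the harmonic number H_n. Attribution studies compare observed and simulated record counts against departures from this kind of baseline.

```python
import random

random.seed(11)

def record_count(n):
    """Number of record-breaking values in an i.i.d. series of length n."""
    best, count = float("-inf"), 0
    for _ in range(n):
        x = random.random()
        if x > best:
            best, count = x, count + 1
    return count

# Over 61 "years" (the ensemble length above) the stationary expectation
# is the harmonic number H_61 ~ 4.7 records
n, trials = 61, 2000
mean = sum(record_count(n) for _ in range(trials)) / trials
harmonic = sum(1.0 / k for k in range(1, n + 1))
print(round(mean, 2), round(harmonic, 2))
```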

  3. Have Large Dams Altered Extreme Precipitation Patterns?

    NASA Astrophysics Data System (ADS)

    Hossain, Faisal; Jeyachandran, Indumathi; Pielke, Roger

    2009-12-01

    Dams and their impounded waters are among the most common civil infrastructures, with a long heritage of modern design and operations experience. In particular, large dams, defined by the International Commission on Large Dams (ICOLD) as having a height greater than 15 meters from the foundation and holding a reservoir volume of more than 3 million cubic meters, have the potential to vastly transform local climate, landscapes, regional economics, and urbanization patterns. In the United States alone, about 75,000 dams are capable of storing a volume of water equaling almost 1 year's mean runoff of the nation [Graf, 1999]. The World Commission on Dams (WCD) reports that at least 45,000 large dams have been built worldwide since the 1930s. These sheer numbers raise the question of the extent to which large dams and their impounded waters alter patterns that would have been pervasive had the dams not been built.

  4. Large-scale data analysis of power grid resilience across multiple US service regions

    NASA Astrophysics Data System (ADS)

    Ji, Chuanyi; Wei, Yun; Mei, Henry; Calzada, Jorge; Carey, Matthew; Church, Steve; Hayes, Timothy; Nugent, Brian; Stella, Gregory; Wallace, Matthew; White, Joe; Wilcox, Robert

    2016-05-01

    Severe weather events frequently result in large-scale power failures, affecting millions of people for extended durations. However, the lack of comprehensive, detailed failure and recovery data has impeded large-scale resilience studies. Here, we analyse data from four major service regions representing Upstate New York during Super Storm Sandy and daily operations. Using non-stationary spatiotemporal random processes that relate infrastructural failures to recoveries and cost, our data analysis shows that local power failures have a disproportionally large non-local impact on people (that is, the top 20% of failures interrupted 84% of services to customers). A large number (89%) of small failures, represented by the bottom 34% of customers and commonplace devices, resulted in 56% of the total cost of 28 million customer interruption hours. Our study shows that extreme weather does not cause, but rather exacerbates, existing vulnerabilities, which are obscured in daily operations.

  5. Tomlinson-Harashima Precoding for Multiuser MIMO Systems With Quantized CSI Feedback and User Scheduling

    NASA Astrophysics Data System (ADS)

    Sun, Liang; McKay, Matthew R.

    2014-08-01

    This paper studies the sum rate performance of a low complexity quantized CSI-based Tomlinson-Harashima (TH) precoding scheme for downlink multiuser MIMO transmission, employing greedy user selection. The asymptotic distribution of the output signal-to-interference-plus-noise ratio of each selected user and the asymptotic sum rate as the number of users K grows large are derived by using extreme value theory. For fixed finite signal-to-noise ratios and a finite number of transmit antennas $n_T$, we prove that as K grows large, the proposed approach can achieve the optimal sum rate scaling of the MIMO broadcast channel. We also prove that, if we ignore the precoding loss, the average sum rate of this approach converges to the average sum capacity of the MIMO broadcast channel. Our results provide insights into the effect of multiuser interference caused by quantized CSI on the multiuser diversity gain.
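
A small illustration (an assumed toy setup, not the paper's system model) of the extreme-value behavior such analyses rest on: with K i.i.d. unit-mean exponential fading SNRs, the strongest user's SNR grows like log K, the effect that underlies multiuser-diversity scaling results.

```python
# Simulate the mean of the maximum of K Exp(1) SNRs and compare it with
# the extreme-value prediction log K + Euler-Mascheroni constant.
import math
import random

random.seed(0)

def mean_max_snr(K, trials=2000):
    """Monte Carlo estimate of E[max of K i.i.d. Exp(1) variables]."""
    return sum(max(random.expovariate(1.0) for _ in range(K))
               for _ in range(trials)) / trials

for K in (10, 100, 1000):
    print(K, round(mean_max_snr(K), 2), round(math.log(K) + 0.5772, 2))
```

The simulated means track log K + 0.5772 closely, showing why the selected user's SNR, and hence the sum rate, keeps improving (slowly) as the user pool grows.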

  6. THE LARGE HIGH PRESSURE ARC PLASMA GENERATOR: A FACILITY FOR SIMULATING MISSILE AND SATELLITE RE-ENTRY. Research Report 56

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rose, P.; Powers, W.; Hritzay, D.

    1959-06-01

    The development of an arc wind tunnel capable of stagnation pressures in excess of twenty atmospheres and using as much as fifteen megawatts of electrical power is described. The calibration of this facility shows that it is capable of reproducing the aerodynamic environment encountered by vehicles flying at velocities as great as satellite velocity. Its use as a missile re-entry material test facility is described. The large power capacity of this facility allows one to make material tests on specimens of a size sufficient to be useful for material development, yet at realistic energy and Reynolds number values. By the addition of a high-capacity vacuum system, this facility can be used to produce the low density, high Mach number environment needed for simulating satellite re-entry, as well as hypersonic flight at extreme altitudes. (auth)

  7. Multi-floor cascading ferroelectric nanostructures: multiple data writing-based multi-level non-volatile memory devices.

    PubMed

    Hyun, Seung; Kwon, Owoong; Lee, Bom-Yi; Seol, Daehee; Park, Beomjin; Lee, Jae Yong; Lee, Ju Hyun; Kim, Yunseok; Kim, Jin Kon

    2016-01-21

    Multiple data writing-based multi-level non-volatile memory has gained strong attention for next-generation memory devices to quickly accommodate an extremely large number of data bits because it is capable of storing multiple data bits in a single memory cell at once. However, all previously reported devices have failed to store a large number of data bits due to the macroscale cell size and have not allowed fast access to the stored data due to slow single data writing. Here, we introduce a novel three-dimensional multi-floor cascading polymeric ferroelectric nanostructure, successfully operating as an individual cell. In one cell, each floor has its own piezoresponse and the piezoresponse of one floor can be modulated by the bias voltage applied to the other floor, which means simultaneously written data bits in both floors can be identified. This could achieve multi-level memory through a multiple data writing process.

  8. Small vs. Large Convective Cloud Objects from CERES Aqua Observations: Where are the Intraseasonal Variation Signals?

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2016-01-01

    During inactive phases of the Madden-Julian oscillation (MJO), there are plenty of deep but small convective systems and far fewer deep and large ones. During active phases of the MJO, an increase in the occurrence of large and deep cloud clusters results from an amplification of large-scale motions by stronger convective heating. This study is designed to quantitatively examine the roles of small and large cloud clusters during the MJO life cycle. We analyze the cloud object data from Aqua CERES observations for tropical deep convective (DC) and cirrostratus (CS) cloud object types according to the real-time multivariate MJO index. A cloud object is a contiguous region of the earth with a single dominant cloud-system type. The size distributions, defined as the footprint numbers as a function of cloud object diameter, for particular MJO phases depart greatly from the combined (8-phase) distribution at large cloud-object diameters due to the reduced/increased numbers of cloud objects related to changes in the large-scale environments. The median diameter of the combined distribution is determined and used to partition all cloud objects of a particular phase into "small" and "large" groups. The two groups corresponding to the combined distribution have nearly equal numbers of footprints. The median diameters are 502 km for DC and 310 km for CS. The range of the variation between two extreme phases (typically, the most active and most depressed phases) for the small group is 6-11% in terms of the numbers of cloud objects and the total footprint numbers. The corresponding range for the large group is 19-44%. In terms of the probability density functions of radiative and cloud physical properties, there are virtually no differences between the MJO phases for the small group, but there are significant differences for the large group for both DC and CS types. 
These results suggest that the intraseasonal variation signals reside in the large cloud clusters, while the small cloud clusters represent background noise resulting from various types of tropical waves with different wavenumbers and propagation directions/speeds.

  9. Evaluating the Large-Scale Environment of Extreme Events Using Reanalyses

    NASA Astrophysics Data System (ADS)

    Bosilovich, M. G.; Schubert, S. D.; Koster, R. D.; da Silva, A. M., Jr.; Eichmann, A.

    2014-12-01

    Extreme conditions and events have always been a long-standing concern in weather forecasting and national security. While some evidence indicates that extreme weather will become more common under global change scenarios, extremes are often tied to the large-scale atmospheric circulation, yet occur infrequently. Reanalyses assimilate substantial amounts of weather data, and a primary strength of reanalysis data is the representation of the large-scale atmospheric environment. In this effort, we link the occurrences of extreme events or climate indicators to the underlying regional and global weather patterns. Now, with more than 30 years of data, reanalyses can include multiple cases of extreme events, and thereby identify commonality among the weather patterns to better characterize the large-scale to global environment linked to the indicator or extreme event. Since these features are certainly regionally dependent, and the indicators of climate are continually being developed, we outline various methods to analyze the reanalysis data and the development of tools to support regional evaluation of the data. Here, we provide some examples of both individual case studies and composite studies of similar events. For example, we compare the large-scale environment for Northeastern US extreme precipitation with that of the highest mean precipitation seasons. Likewise, southerly winds can be shown to be a major contributor to very warm days in the Northeast winter. While most of our development has involved NASA's MERRA reanalysis, we are also looking forward to MERRA-2, which includes several new features that greatly improve the representation of weather and climate, especially for the regions and sectors involved in the National Climate Assessment.

  10. Extreme cyclone events in the Arctic: Wintertime variability and trends

    NASA Astrophysics Data System (ADS)

    Rinke, A.; Maturilli, M.; Graham, R. M.; Matthes, H.; Handorf, D.; Cohen, L.; Hudson, S. R.; Moore, J. C.

    2017-12-01

    Extreme cyclone events often occur during Arctic winters, and are of concern because they transport heat and moisture into the Arctic, which is associated with mixed-phase clouds and increased longwave downward radiation, and can cause temperatures to rise above freezing, resulting in wintertime sea-ice melting or retarded sea-ice growth. With Arctic amplification and the associated reduced sea-ice cover and warmer sea surface temperatures, an increase in the occurrence of extreme cyclone events is a plausible scenario. We calculate the spatial patterns, changes, and trends of the number of extreme cyclone events in the Arctic based on ERA-Interim six-hourly sea level pressure (SLP) data for winter (November-February) 1979-2015. Further, we analyze the SLP data from the Ny-Ålesund station for the same 37-year period. We define an extreme cyclone event by an extremely low central pressure (SLP below 985 hPa, which is the 5th percentile of the Ny-Ålesund/N-ICE2015 SLP data). Typically 20-40 extreme cyclone events (sometimes called `weather bombs') occur in the Arctic North Atlantic per winter season, with an increasing trend of 6 events/decade, according to the Ny-Ålesund data. This increased frequency of extreme cyclones drives considerable warming in that region, consistent with the observed significant winter warming of 3 K/decade. The positive winter trend in extreme cyclones is dominated by a positive monthly trend of about 3-4 events/decade in November-December, due mainly to an increasing persistence of extreme cyclone events. A negative trend in January opposes this, while there is no significant trend in February. We relate the regional patterns of the trend in extreme cyclones to anomalously low sea-ice conditions in recent years, together with associated large-scale atmospheric circulation changes such as "blocking-like" circulation patterns (e.g. Scandinavian blocking in December and Ural blocking during January-February).
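
The event-counting step can be sketched as follows. The 985 hPa threshold comes from the abstract; treating each contiguous excursion below the threshold as one event is an assumption about the grouping rule, and the SLP values are hypothetical.

```python
# Count "extreme cyclone events" in a six-hourly SLP series: each
# contiguous run of values below the threshold counts as one event.

THRESHOLD_HPA = 985.0

def count_events(slp_series, threshold=THRESHOLD_HPA):
    events, below = 0, False
    for slp in slp_series:
        if slp < threshold and not below:
            events += 1  # a new excursion below the threshold begins
        below = slp < threshold
    return events

# Hypothetical six-hourly SLP values (hPa) spanning a few winter days:
slp = [1002, 996, 984, 979, 983, 991, 1001, 988, 982, 980, 992]
n = count_events(slp)  # two separate excursions below 985 hPa
```

Run-grouping matters here: counting individual six-hourly readings below the threshold would inflate the event count whenever a deep low persists, which is exactly the persistence effect the abstract discusses.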

  11. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    NASA Astrophysics Data System (ADS)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by whether and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. 
Where model skill in reproducing these patterns is high, it can be inferred that extremes are being simulated for plausible physical reasons, boosting confidence in future projections of temperature extremes. Conversely, where model skill is identified to be lower, caution should be exercised in interpreting future projections.

  12. Historical Time Series of Extreme Convective Weather in Finland

    NASA Astrophysics Data System (ADS)

    Laurila, T. K.; Mäkelä, A.; Rauhala, J.; Olsson, T.; Jylhä, K.

    2016-12-01

    Thunderstorms, lightning, tornadoes, downbursts, large hail and heavy precipitation are well known for their impacts on human life. At high latitudes, as in Finland, these hazardous warm-season convective weather events are concentrated in the summer season, roughly from May to September, with a peak in midsummer. The position of Finland between the maritime Atlantic and continental Asian climate zones permits large variability in weather in general, which is also reflected in the occurrence of severe weather; hot, moist and extremely unstable air masses sometimes reach Finland and enable extreme and devastating weather events. Compared to lower latitudes, the Finnish climate of severe convection is "moderate" and shows large year-to-year variation; however, behind the modest annual average lies a climate of severe weather events that practically every year cause large economic losses and sometimes even loss of life. Because of the increased vulnerability of our modern society, these episodes have recently gained plenty of interest. Over the decades, the Finnish Meteorological Institute (FMI) has collected observations and damage descriptions of severe weather episodes in Finland: thunderstorm days (1887-present), annual number of lightning flashes (1960-present), tornadoes (1796-present), large hail (1930-present), and heavy rainfall (1922-present). The research findings show, e.g., that a severe weather event may occur practically anywhere in the country, although in general the probability of occurrence is smaller in Northern Finland. This study, funded by the Finnish Research Programme on Nuclear Power Plant Safety (SAFIR), combines the individual Finnish severe weather time series and examines their trends, cross-correlations and correlations with other atmospheric parameters. 
Furthermore, a numerical weather model (HARMONIE) simulation is performed for a historical severe weather case for analyzing how well the present state-of-the-art models grasp these small-scale weather phenomena. Our results give important background for estimating the Finnish severe weather climate in the future.

  13. Identification of large-scale meteorological patterns associated with extreme precipitation in the US northeast

    NASA Astrophysics Data System (ADS)

    Agel, Laurie; Barlow, Mathew; Feldstein, Steven B.; Gutowski, William J.

    2018-03-01

    Patterns of daily large-scale circulation associated with Northeast US extreme precipitation are identified using both k-means clustering (KMC) and Self-Organizing Maps (SOM) applied to tropopause height. The tropopause height provides a compact representation of the upper-tropospheric potential vorticity, which is closely related to the overall evolution and intensity of weather systems. Extreme precipitation is defined as the top 1% of daily wet-day observations at 35 Northeast stations, 1979-2008. KMC is applied on extreme precipitation days only, while the SOM algorithm is applied to all days in order to place the extreme results into the overall context of patterns for all days. Six tropopause patterns are identified through KMC for extreme precipitation days: a summertime tropopause ridge, a summertime shallow trough/ridge, a summertime shallow eastern US trough, a deeper wintertime eastern US trough, and two versions of a deep cold-weather trough located across the east-central US. Thirty SOM patterns for all days are identified. Results for all days show that 6 SOM patterns account for almost half of the extreme days, although extreme precipitation occurs in all SOM patterns. The same SOM patterns associated with extreme precipitation also routinely produce non-extreme precipitation; however, on extreme precipitation days the troughs, on average, are deeper and the downstream ridges more pronounced. Analysis of other fields associated with the large-scale patterns shows various degrees of anomalously strong moisture transport preceding, and upward motion during, extreme precipitation events.
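
The extreme-day definition can be sketched as below. The top-1%-of-wet-days rule is from the abstract; the 1 mm wet-day cutoff and the data are assumptions for illustration.

```python
# Compute the extreme-precipitation threshold as the 99th percentile of
# wet-day observations, then flag the days at or above it.
import random

def wet_day_threshold(daily_precip_mm, wet_min=1.0, pct=0.99):
    """Percentile threshold over wet days (>= wet_min mm), empirical."""
    wet = sorted(p for p in daily_precip_mm if p >= wet_min)
    idx = min(len(wet) - 1, int(pct * len(wet)))
    return wet[idx]

random.seed(1)
# Hypothetical daily record: many dry days plus an exponential wet tail.
precip = [0.0] * 4000 + [random.expovariate(0.2) for _ in range(7000)]
thr = wet_day_threshold(precip)
extreme_days = [p for p in precip if p >= thr]
```

Note that the percentile is taken over wet days only: including dry days would push the "top 1%" threshold far lower and label quite ordinary rain as extreme.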

  14. Extreme Quantum Memory Advantage for Rare-Event Sampling

    NASA Astrophysics Data System (ADS)

    Aghamohammadi, Cina; Loomis, Samuel P.; Mahoney, John R.; Crutchfield, James P.

    2018-02-01

    We introduce a quantum algorithm for memory-efficient biased sampling of rare events generated by classical memoryful stochastic processes. Two efficiency metrics are used to compare quantum and classical resources for rare-event sampling. For a fixed stochastic process, the first is the classical-to-quantum ratio of required memory. We show for two example processes that there exists an infinite number of rare-event classes for which the memory ratio for sampling is larger than r, for any large real number r. Then, for a sequence of processes each labeled by an integer size N, we compare how the classical and quantum required memories scale with N. In this setting, since both memories can diverge as N → ∞, the efficiency metric tracks how fast they diverge. An extreme quantum memory advantage exists when the classical memory diverges in the limit N → ∞, but the quantum memory has a finite bound. We then show that finite-state Markov processes and spin chains exhibit memory advantage for sampling of almost all of their rare-event classes.

  15. Distribution of correlated spiking events in a population-based approach for Integrate-and-Fire networks.

    PubMed

    Zhang, Jiwei; Newhall, Katherine; Zhou, Douglas; Rangan, Aaditya

    2014-04-01

    Randomly connected populations of spiking neurons display a rich variety of dynamics. However, much of the current modeling and theoretical work has focused on two dynamical extremes: on one hand homogeneous dynamics characterized by weak correlations between neurons, and on the other hand total synchrony characterized by large populations firing in unison. In this paper we address the conceptual issue of how to mathematically characterize the partially synchronous "multiple firing events" (MFEs) which manifest in between these two dynamical extremes. We further develop a geometric method for obtaining the distribution of magnitudes of these MFEs by recasting the cascading firing event process as a first-passage time problem, and deriving an analytical approximation of the first passage time density valid for large neuron populations. Thus, we establish a direct link between the voltage distributions of excitatory and inhibitory neurons and the number of neurons firing in an MFE that can be easily integrated into population-based computational methods, thereby bridging the gap between homogeneous firing regimes and total synchrony.

  16. Extreme Mean and Its Applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.

    1979-01-01

    Extreme value statistics obtained from normally distributed data are considered. An extreme mean is defined as the mean of the p-th probability truncated normal distribution. An unbiased estimate of this extreme mean and its large-sample distribution are derived. The distribution of this estimate, even for very large samples, is found to be non-normal. Further, as the sample size increases, the variance of the unbiased estimate converges to the Cramer-Rao lower bound. The computer program used to obtain the density and distribution functions of the standardized unbiased estimate, and the confidence intervals of the extreme mean for any data, is included for ready application. An example is included to demonstrate the usefulness of the extreme mean application.
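
A worked sketch of the quantity involved (the notation and the upper-tail convention are assumptions, not the report's exact definition): for a standard normal truncated to its upper tail beyond a point a, the mean is the inverse Mills ratio, E[X | X > a] = φ(a) / (1 − Φ(a)).

```python
# Mean of an upper-tail truncated standard normal, via the inverse
# Mills ratio, using only the standard library.
import math

def phi(x):
    """Standard normal pdf."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal cdf."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def extreme_mean(a):
    """Mean of a standard normal truncated below at a: phi(a)/(1 - Phi(a))."""
    return phi(a) / (1.0 - Phi(a))

# e.g. the mean of the top 5% tail (truncation point a = 1.645):
m = extreme_mean(1.645)
```

A useful check: the mean of the upper half (a = 0) is √(2/π) ≈ 0.798, and the truncated mean always exceeds the truncation point a.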

  17. Evaluating the ClimEx Single Model large ensemble in comparison with EURO-CORDEX results of heatwave and drought indicators

    NASA Astrophysics Data System (ADS)

    von Trentini, F.; Schmid, F. J.; Braun, M.; Frigon, A.; Leduc, M.; Martel, J. L.; Willkofer, F.; Wood, R. R.; Ludwig, R.

    2017-12-01

    Meteorological extreme events seem to be becoming more frequent in the present and future climate, and separating natural climate variability from a clear climate change effect on these extreme events is gaining more and more interest. Since there is only one realisation of historical events, observation data cannot provide the very long time series needed for a robust statistical analysis of natural variability. A new single model large ensemble (SMLE), developed for the ClimEx project (Climate change and hydrological extreme events - risks and perspectives for water management in Bavaria and Québec), is designed to overcome this lack of data by downscaling 50 members of the CanESM2 (RCP 8.5) with the Canadian CRCM5 regional model (using the EURO-CORDEX grid specifications) for time series of 1950-2099 each, resulting in 7500 years of simulated climate. This allows for a better probabilistic analysis of rare and extreme events than any preceding dataset. Besides seasonal sums, several indicators concerning heatwave frequency, duration and mean temperature, as well as the number and maximum length of dry periods (consecutive days < 1 mm), are calculated for the ClimEx ensemble and several EURO-CORDEX runs. This enables us to investigate the interaction between natural variability (as it appears in the CanESM2-CRCM5 members) and the climate change signal of those members for past, present and future conditions. Adding the EURO-CORDEX results, we can also assess the role of internal model variability (or natural variability) in climate change simulations. A first comparison shows similar magnitudes of variability of climate change signals between the ClimEx large ensemble and the CORDEX runs for some indicators, while for most indicators the spread of the SMLE is smaller than the spread of the different CORDEX models.

  18. Hydroclimatic Extremes and Cholera Dynamics in the 21st Century

    NASA Astrophysics Data System (ADS)

    Akanda, A. S.; Jutla, A. S.; Islam, S.

    2012-12-01

    Cholera, an acute water-borne diarrheal illness, has reemerged as a significant health threat across much of the developing world. Despite major advances in the ecological and the microbiological understanding of the causative agent, V. cholerae, the role of the underlying climatic and environmental processes in propagating transmission is not adequately understood. Recent findings suggest a more prominent role of hydroclimatic extremes - droughts and floods - on the unique dual cholera peaks in the Bengal Delta region of South Asia, the native homeland of cholera. Increasing water scarcity and abundance, and coastal sea-level rise, influenced by changing climate patterns and large-scale climatic phenomena, are likely to adversely impact cholera transmission in South Asia. We focus on understanding how associated changes in macro-scale conditions in this region will impact micro-scale processes related to cholera in coming decades. We use the PRECIS Regional Climate Model over the Ganges-Brahmaputra-Meghna (GBM) basin region to simulate detailed high resolution projections of climate patterns for the 21st century. Precipitation outputs are analyzed for the 1980-2040 period to identify the trends and changes in hydroclimatic extremes and potential impacts on cholera dynamics over the next three decades (2010-2040), in relation to the cholera surveillance operations over the past three decades (1980-2010). We find that an increased number of extreme precipitation events with prolonged dry periods in the Ganges basin region will likely adversely affect dry season cholera outbreaks. Increased monsoon precipitation volumes in the Brahmaputra basin catchments are likely to cause record floods and subsequently trigger large epidemics in downstream areas. Our results provide new insight by identifying the changes in the two distinctly different, pre and post monsoon, cholera transmission mechanisms related to large-scale climatic controls that prevail in the region. 
A quantitative understanding of the changes in seasonal hydroclimatic controls and underlying dominant processes will form the basis for forecasting future epidemic cholera outbreaks in light of changing climate patterns.

  19. The association of extreme temperatures and the incidence of tuberculosis in Japan

    NASA Astrophysics Data System (ADS)

    Onozuka, Daisuke; Hagihara, Akihito

    2015-08-01

    Seasonal variation in the incidence of tuberculosis (TB) has been widely assumed. However, few studies have investigated the association between extreme temperatures and the incidence of TB. We collected data on TB cases and mean temperature in Fukuoka, Japan for 2008-2012 and used time-series analyses to assess the possible relationship of extreme temperatures with incident TB cases, adjusting for seasonal and interannual variation. Our analysis revealed that the occurrence of extreme heat events resulted in a significant increase in the number of TB cases (relative risk (RR) 1.20, 95% confidence interval (CI) 1.01-1.43). We also found that the occurrence of extreme cold events resulted in a significant increase in the number of TB cases (RR 1.23, 95% CI 1.05-1.45). Sex and age did not modify the effect of either heat or cold extremes. Our study provides quantitative evidence that the number of TB cases increased significantly with extreme heat and cold temperatures. The results may help public health officials predict extreme temperature-related TB incidence and prepare for the implementation of preventive public health interventions.
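
The reported statistic can be sketched as follows. The counts below are hypothetical, not the study's data, and the simple two-by-two relative risk shown here stands in for the study's adjusted time-series model.

```python
# Relative risk with a 95% confidence interval computed on the log scale.
import math

def relative_risk(a, n1, c, n0, z=1.96):
    """RR for a cases among n1 exposed vs. c cases among n0 unexposed."""
    rr = (a / n1) / (c / n0)
    se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)  # SE of log(RR)
    lo, hi = (rr * math.exp(s * z * se) for s in (-1, 1))
    return rr, lo, hi

# Hypothetical: TB cases on extreme-heat days vs. other days.
rr, lo, hi = relative_risk(a=60, n1=500, c=100, n0=1000)
```

An interval like (lo, hi) that excludes 1.0 is what licenses the "significant increase" language in the abstract.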

  20. Channels That Cooperatively Service a Data Stream and Voice Messages. II. Diffusion Approximations.

    DTIC Science & Technology

    1979-11-01

    Leoczky, Professor J., Carnegie-Mellon University; Gaver, Professor, Department of Operations Research. [Scanned report documentation page; front matter largely illegible.] ...approximations valid for the extreme but realistic case in which n/p is large, and none give a Wiener approximation. Rather than focus on this simple

  1. The thermolysin family (M4) of enzymes: therapeutic and biotechnological potential.

    PubMed

    Adekoya, Olayiwola A; Sylte, Ingebrigt

    2009-01-01

    Zinc-containing peptidases are widely distributed in nature and play important roles in many physiological processes. The M4 family comprises numerous zinc-dependent metallopeptidases that hydrolyze peptide bonds. A large number of these enzymes are implicated as virulence factors of the microorganisms that produce them and are therefore potential drug targets. Some enzymes of the family can function at extremes of temperature, and some function in organic solvents. Enzymes of the thermolysin family therefore hold innovative potential for biotechnological applications.

  2. Open clusters as laboratories: The angular momentum evolution of young stars

    NASA Technical Reports Server (NTRS)

    Stauffer, John R.

    1994-01-01

    This is the annual status report for the third year of our LTSA grant 'Open Clusters as Laboratories.' Because we have now had a few years to work on the project, we have started to produce and publish a large number of papers. We have been extremely successful in obtaining ROSAT observations of open clusters. With the demise of the PSPC on ROSAT, our main data source has come to an end and we will be able to concentrate on analyzing those data.

  3. Personal identification by eyes.

    PubMed

    Marinović, Dunja; Njirić, Sanja; Coklo, Miran; Muzić, Vedrana

    2011-09-01

    Identification of persons by their eyes falls within the field of biometric science. Many security systems are based on biometric methods of personal identification, to determine whether a person is who they claim to be. The human eye contains an extremely large number of individual characteristics that make it particularly suitable for the process of identifying a person. Today, the eye is considered one of the most reliable body parts for human identification. Systems using iris recognition are among the most secure biometric systems.

  4. Using ensembles in water management: forecasting dry and wet episodes

    NASA Astrophysics Data System (ADS)

    van het Schip-Haverkamp, Tessa; van den Berg, Wim; van de Beek, Remco

    2015-04-01

    Extreme weather situations such as droughts and extensive precipitation are becoming more frequent, which makes it more important to obtain accurate weather forecasts for the short and long term. Ensembles can provide a solution in terms of scenario forecasts. MeteoGroup uses ensembles in a new forecasting technique which presents a number of weather scenarios for a dynamic water management project, called Water-Rijk, in which water storage and water retention play a large role. The Water-Rijk is part of Park Lingezegen, which is located between Arnhem and Nijmegen in the Netherlands. In collaboration with the University of Wageningen, Alterra and Eijkelkamp, a forecasting system has been developed for this area which can provide water boards with a number of weather and hydrology scenarios in order to assist in the decision whether or not water retention or water storage is necessary in the near future. To forecast drought and extensive precipitation, the difference 'precipitation minus evaporation' is used as a measure of drought in the weather forecasts. In case of an upcoming drought this difference takes larger negative values; in case of a wet episode, the difference is positive. The Makkink potential evaporation is used, which gives the most accurate potential evaporation values during the summer, when evaporation plays an important role in the availability of surface water. Scenarios are determined by reducing the large number of forecasts in the ensemble to a number of averaged members, each with its own likelihood of occurrence. For the Water-Rijk project, five scenario forecasts are calculated: extreme dry, dry, normal, wet and extreme wet. These scenarios are constructed for two forecasting periods, each using its own ensemble technique: up to 48 hours ahead and up to 15 days ahead. 
The 48-hour forecast uses an ensemble constructed from forecasts of multiple high-resolution regional models: UKMO's Euro4 model, the ECMWF model, WRF and HIRLAM. Using multiple model runs and additional post-processing, an ensemble can be created from non-ensemble models. The 15-day forecast uses the ECMWF Ensemble Prediction System forecast, from which scenarios can be deduced directly. A combination of the ensembles from the two forecasting periods is used in order to have the highest possible resolution for the first 48 hours, followed by the lower-resolution long-term forecast.
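
One plausible way to reduce an ensemble to the five scenarios named above (this grouping rule is an assumption, not MeteoGroup's documented method) is to average quantile groups of the sorted members, so each scenario carries an equal 20% likelihood.

```python
# Reduce an ensemble of P-E forecasts to five averaged scenarios:
# extreme dry, dry, normal, wet, extreme wet (driest group first).

def five_scenarios(members):
    """Mean P-E (mm) per quantile group of the sorted ensemble."""
    m = sorted(members)
    n = len(m)
    bounds = [round(i * n / 5) for i in range(6)]  # group boundaries
    return [sum(m[bounds[i]:bounds[i + 1]]) / (bounds[i + 1] - bounds[i])
            for i in range(5)]

# Hypothetical 15-day P-E totals (mm) from a 20-member ensemble:
ens = [-35, -28, -20, -14, -9, -6, -4, -2, -1, 0,
       1, 2, 4, 6, 9, 13, 18, 24, 31, 40]
scenarios = five_scenarios(ens)  # driest to wettest group means
```

Unequal likelihoods per scenario, as the abstract implies, would instead come from clustering members rather than fixed quantile groups; the quantile version is simply the most transparent baseline.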

  5. Variability of extreme rainfall over La Plata Basin and Amazon Basin in South America in model simulations of the 20th century and projections under global warming

    NASA Astrophysics Data System (ADS)

    Cavalcanti, I. F.

    2011-12-01

    The two largest river basins in South America are the Amazon Basin (AMB) in the tropical region and La Plata Basin (LPB) in the subtropical and extratropical regions. Extreme droughts have occurred in the Amazon region during this decade, affecting transportation and fishing activities, with impacts on the local population and on the forest itself. Droughts or floods over LPB have impacts on agriculture, hydroelectric power and social life. Therefore, monthly wet and dry extremes in these two regions have a profound effect on the economy and society. Observed rainfall over the Amazon Basin (AMB) and La Plata Basin (LPB) is analyzed at the monthly timescale using the Standardized Precipitation Index (SPI), from 1979 to 1999. This period is chosen to compare GPCP data with HADCM3 (Hadley Centre) simulations of the 20th century and to analyze reanalysis data, which benefit from satellite information after 1979. HADCM3 projections under the SRES A2 scenario are analyzed for two periods, 2000-2020 and 2079-2099, to study the frequency of extremes in the near future and on a longer timescale. Extreme, severe and moderate cases are identified in the northern and southern sectors of LPB and in the western and eastern sectors of AMB. The main objective is to analyze changes in the frequency of cases, considering global warming and the associated mechanisms. In the observations for the 20th century, the number of extreme rainy cases is higher than the number of dry cases in both sectors of LPB and AMB. The model simulates this variability in the two sectors of LPB and in the western sector of AMB. In the near future (2000-2020) the frequency of wet and dry extremes does not change much in LPB or in the western sector of AMB, but wet cases increase in eastern AMB. However, in the period 2079-2099 the projections indicate an increase of wet cases in LPB and an increase of dry cases in AMB. 
The influence of large scale features related to Sea Surface Temperature Anomalies, Walker and Hadley circulations, teleconnections, as well as the regional features related to humidity flux are discussed. The extreme droughts of 2005 and 2010 in Amazonia are show to be related to these features.
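The SPI used above maps monthly precipitation totals onto a standard-normal scale by fitting a distribution to the climatology. A minimal sketch of that transformation, assuming a gamma fit with SciPy and synthetic rainfall (the station data and fitting details here are illustrative, not those of the study):

```python
import numpy as np
from scipy import stats

def spi(monthly_precip):
    """Standardized Precipitation Index for a 1-D series of monthly totals.

    Fits a gamma distribution (location fixed at 0) to the wet months and
    maps each observation through the fitted CDF to a standard-normal
    quantile. Zero-rain months get the usual mixed-distribution correction.
    """
    x = np.asarray(monthly_precip, dtype=float)
    q = (x == 0).mean()                       # probability of a zero-rain month
    shape, _, scale = stats.gamma.fit(x[x > 0], floc=0)
    cdf = np.where(x > 0, q + (1 - q) * stats.gamma.cdf(x, shape, scale=scale), q)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(0)
sample = rng.gamma(2.0, 50.0, size=240)       # 20 years of synthetic monthly rain
z = spi(sample)
# By convention, SPI <= -2 is "extremely dry" and SPI >= 2 "extremely wet".
```

In practice the fit is done separately for each calendar month (or accumulation period) so that the index measures anomalies relative to that month's own climatology.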

  6. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    NASA Astrophysics Data System (ADS)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of the large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and with associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of these large-scale meteorological patterns, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture the key patterns associated with extreme precipitation over Portland, and in better interpreting projections of future climate at impact-relevant scales.
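The composite step described here amounts to averaging a large-scale field over the days on which station precipitation exceeds an extreme threshold. A minimal sketch with entirely synthetic data (the station series, pressure "maps", and the 95th-percentile threshold are illustrative assumptions, not the study's choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n_days, ny, nx = 3650, 20, 30
precip = rng.gamma(0.8, 4.0, size=n_days)             # daily station precipitation
slp = rng.normal(1013.0, 8.0, size=(n_days, ny, nx))  # daily sea-level-pressure maps

# Define extreme-precipitation days as those above the 95th percentile.
threshold = np.percentile(precip, 95)
extreme_days = precip > threshold

# The composite is the mean field over the extreme days, shown here as an
# anomaly from the all-days climatology.
composite_anomaly = slp[extreme_days].mean(axis=0) - slp.mean(axis=0)
```

With real reanalysis fields, a coherent anomaly pattern (e.g. an offshore low) would emerge; with the random data above the anomaly is just noise, which is the point of comparing composites against a null.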

  7. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    NASA Astrophysics Data System (ADS)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  8. Climate Impacts on Extreme Energy Consumption of Different Types of Buildings

    PubMed Central

    Li, Mingcai; Shi, Jun; Guo, Jun; Cao, Jingfu; Niu, Jide; Xiong, Mingming

    2015-01-01

Exploring changes in building energy consumption and its relationship with climate can provide a basis for energy saving and carbon emission reduction. Heating and cooling energy consumption of different types of buildings in Tianjin city during 1981-2010 was simulated using TRNSYS software. Daily and hourly extreme energy consumption was determined by percentile methods, and the climate impact on extreme energy consumption was analyzed. The results showed that the number of days of extreme heating consumption decreased markedly over the recent 30 years for the residential and large venue buildings, whereas days of extreme cooling consumption increased for the large venue building. No significant variations were found in the days of extreme energy consumption for the commercial building, although its extreme heating energy consumption showed a decreasing trend. Daily extreme energy consumption for the large venue building had no relationship with climate parameters, whereas extreme energy consumption for the commercial and residential buildings was related to various climate parameters. Multiple regression analysis further suggested that heating energy consumption for the commercial building was affected by maximum temperature, dry bulb temperature, solar radiation and minimum temperature, which together explain 71.5% of the variation in daily extreme heating energy consumption. Daily extreme cooling energy consumption for the commercial building was related only to the wet bulb temperature (R2 = 0.382). Daily extreme heating energy consumption for the residential building was affected by four climate parameters, with dry bulb temperature having the main impact. The impact of climate on hourly extreme heating energy consumption had a 1-3 hour delay in all three building types, but no delay was found in the impact of climate on hourly extreme cooling energy consumption for the selected buildings. PMID:25923205
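The workflow described (percentile thresholds to pick out extreme days, then multiple regression of extreme-day consumption on climate variables with an R2) can be sketched as follows. All data, coefficients, and variable choices below are synthetic assumptions for illustration, not the paper's TRNSYS output:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 30 * 90   # thirty synthetic winters of ~90 heating days each

# Hypothetical climate drivers and daily heating demand.
tdry = rng.normal(-2.0, 5.0, n)               # dry-bulb temperature
tmax = tdry + rng.normal(5.0, 1.0, n)         # daily maximum temperature
tmin = tdry - rng.normal(5.0, 1.0, n)         # daily minimum temperature
solar = rng.normal(8.0, 3.0, n)               # solar radiation
heating = 50.0 - 2.5 * tdry - 0.8 * solar + rng.normal(0.0, 4.0, n)

# Extreme-consumption days: above the 95th percentile of daily demand.
extreme = heating > np.percentile(heating, 95)

# Ordinary least squares of extreme-day demand on the climate drivers.
X = np.column_stack([np.ones(extreme.sum()), tmax[extreme], tdry[extreme],
                     solar[extreme], tmin[extreme]])
y = heating[extreme]
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta
r2 = 1.0 - resid.var() / y.var()              # fraction of variance explained
```

Note that regressing only within the extreme tail restricts the predictor range, so R2 values there are typically lower than for the full sample.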

  10. Can custom-made biomechanic shoe orthoses prevent problems in the back and lower extremities? A randomized, controlled intervention trial of 146 military conscripts.

    PubMed

    Larsen, Kristian; Weidich, Flemming; Leboeuf-Yde, Charlotte

    2002-06-01

Shock-absorbing and biomechanic shoe orthoses are frequently used in the prevention and treatment of back and lower extremity problems. One review concludes that the former are clinically effective for prevention, whereas the latter have been tested in only 1 randomized clinical trial, which concluded that stress fractures could be prevented. To investigate whether biomechanic shoe orthoses can prevent problems in the back and lower extremities and reduce the number of days off-duty caused by such problems. Prospective, randomized, controlled intervention trial. One female and 145 male military conscripts (aged 18 to 24 years), representing 25% of all new conscripts in a Danish regiment. Health data were collected by questionnaires at initiation of the study and 3 months later. Custom-made biomechanic shoe orthoses to be worn in military boots were provided to all in the study group during the 3-month intervention period. No intervention was provided for the control group. Differences between the 2 groups were tested with the chi-square test, and statistical significance was accepted at P < .05. Risk ratio (RR), absolute risk reduction (ARR), numbers needed to prevent (NNP), and cost per successfully prevented case were calculated. Outcome variables included self-reported back and/or lower extremity problems; specific problems in the back or knees, shin splints, Achilles tendonitis, sprained ankle, or other problems in the lower extremity; the number of subjects with at least 1 day off-duty because of back or lower extremity problems; and the total number of days off-duty within the first 3 months of military service because of back or lower extremity problems.
Results were significantly better in an actual-use analysis in the intervention group for the total number of subjects with back or lower extremity problems (RR 0.7, ARR 19%, NNP 5, cost 98 US dollars); the number of subjects with shin splints (RR 0.2, ARR 19%, NNP 5, cost 101 US dollars); and the number of off-duty days because of back or lower extremity problems (RR 0.6, ARR < 1%, NNP 200, cost 3750 US dollars). In an intention-to-treat analysis, a significant difference was found only for the number of subjects with shin splints (RR 0.3, ARR 18%, NNP 6, cost 105 US dollars), whereas a worst-case analysis revealed no significant differences between the study groups. This study shows that it may be possible to prevent certain musculoskeletal problems in the back or lower extremities among military conscripts by using custom-made biomechanic shoe orthoses. However, because care-seeking for lower extremity problems is rare, using this method of prevention in military conscripts would be too costly. We also noted that the choice of statistical approach determined the outcome.
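The effect measures reported above follow directly from a 2x2 table: RR is the ratio of risks, ARR their difference, NNP the reciprocal of ARR, and cost per prevented case is NNP times the per-subject cost. A minimal sketch; the event counts and cost below are hypothetical numbers chosen only to mirror the reported magnitudes (the abstract does not give the raw table):

```python
def effect_measures(events_treated, n_treated, events_control, n_control,
                    cost_per_subject):
    """Standard trial effect measures from a 2x2 table."""
    risk_t = events_treated / n_treated
    risk_c = events_control / n_control
    rr = risk_t / risk_c                  # risk ratio
    arr = risk_c - risk_t                 # absolute risk reduction
    nnp = 1.0 / arr                       # numbers needed to prevent one case
    cost_per_prevented = nnp * cost_per_subject
    return rr, arr, nnp, cost_per_prevented

# Hypothetical counts giving RR ~ 0.7, ARR ~ 19%, NNP ~ 5, cost ~ 100 USD:
rr, arr, nnp, cost = effect_measures(32, 73, 46, 73, cost_per_subject=20.0)
```

The "choice of statistical approach determined the outcome" observation corresponds to feeding this function different numerators (actual-use vs intention-to-treat vs worst-case event counts).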

  11. Extremely large telescopes as a motor of socio-economic development and implications of their construction and installation

    NASA Astrophysics Data System (ADS)

    Burgos-Martin, J.; Sanchez-Padron, M.; Sanchez, F.; Martinez-Roger, Carlos

    2004-07-01

Large-scale observing facilities are scarce and costly. Even so, the prospects of enlarging existing facilities or increasing their number are quite real, and several projects are taking their first steps in this direction. These costly facilities require the cooperation of highly qualified institutions able to undertake the project from the scientific and technological point of view, as well as the vital collaboration and effective support, at the highest level, of several countries able to provide the necessary investment for their construction. Because of these technological implications and the financial magnitude of these projects, their impact goes well beyond the international astrophysical community. We propose to carry out a study of the socio-economic impact of the construction and operation of an Extremely Large Telescope of the 30-100 m class. We plan to address aspects such as its impact on the promotion of employment; the social, educational and cultural integration of the population; the stimulus to industry; its impact on national and international research policies; environmental issues; etc. We will also analyze the financial instruments available, and the special aids accessible only to some countries and regions to encourage their participation in projects of this magnitude.

  12. Heidelberg Retina Tomography Analysis in Optic Disks with Anatomic Particularities

    PubMed Central

    Alexandrescu, C; Pascu, R; Ilinca, R; Popescu, V; Ciuluvica, R; Voinea, L; Celea, C

    2010-01-01

Due to its objectivity, reproducibility and predictive value, confirmed by many large-scale statistical clinical studies, Heidelberg Retina Tomography has become one of the most widely used computerized image analysis methods for the optic disc in glaucoma. It has been noted, however, that the diagnostic value of the Moorfields Regression Analysis and the Glaucoma Probability Score decreases when analyzing optic discs of extreme sizes. The number of false positive results increases in cases of megalopapillae, and the number of false negative results increases in cases of small optic discs. The present paper is a review of the aspects one should take into account when analyzing an HRT result for an optic disc with anatomic particularities. PMID:21254731

  13. Practice makes perfect in memory recall

    PubMed Central

    Romani, Sandro; Katkov, Mikhail

    2016-01-01

    A large variability in performance is observed when participants recall briefly presented lists of words. The sources of such variability are not known. Our analysis of a large data set of free recall revealed a small fraction of participants that reached an extremely high performance, including many trials with the recall of complete lists. Moreover, some of them developed a number of consistent input-position-dependent recall strategies, in particular recalling words consecutively (“chaining”) or in groups of consecutively presented words (“chunking”). The time course of acquisition and particular choice of positional grouping were variable among participants. Our results show that acquiring positional strategies plays a crucial role in improvement of recall performance. PMID:26980785

  14. High Sensitive Scintillation Observations At Very Low Frequencies

    NASA Astrophysics Data System (ADS)

    Konovalenko, A. A.; Falkovich, I. S.; Kalinichenko, N. N.; Olyak, M. R.; Lecacheux, A.; Rosolen, C.; Bougeret, J.-L.; Rucker, H. O.; Tokarev, Yu.

The observation of interplanetary scintillations of compact radio sources is a powerful method of solar wind diagnostics. This method has been developed mainly at decimeter-meter wavelengths. New possibilities are opened at extremely low frequencies (decameter waves), especially at large elongations. This approach is now being actively developed using the highly effective decameter antennas UTR-2, URAN and the Nancay Decameter Array. A new class of back-end facilities, such as high-dynamic-range, high-resolution digital spectral processors, together with a dynamic-spectra measurement methodology, gives new opportunities for distinguishing ionospheric from interplanetary scintillations and for observing a large number of radio sources with different angular sizes and elongations, even for rather weak objects.

  15. SIGN SINGULARITY AND FLARES IN SOLAR ACTIVE REGION NOAA 11158

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sorriso-Valvo, L.; De Vita, G.; Kazachenko, M. D.

Solar Active Region NOAA 11158 has hosted a number of strong flares, including one X2.2 event. The complexity of the current density and current helicity is studied through cancellation analysis of their sign-singular measure, which features power-law scaling. Spectral analysis is also performed, revealing the presence of two separate scaling ranges with different spectral indices. The time evolution of the parameters is discussed. Sudden changes of the cancellation exponents at the time of large flares, and the presence of correlation with extreme-ultraviolet and X-ray flux, suggest that the eruption of large flares can be linked to the small-scale properties of the current structures.

  16. Theory and computation of optimal low- and medium-thrust transfers

    NASA Technical Reports Server (NTRS)

    Chuang, C.-H.

    1994-01-01

This report presents two numerical methods for the computation of fuel-optimal, low-thrust orbit transfers with large numbers of burns. The origins of these methods are observations made with the extremal solutions of transfers with small numbers of burns; there seems to be a trend that the longer the time allowed to perform an optimal transfer, the less fuel is used. These longer transfers are of obvious interest since they require a motor of low thrust; however, we also find a trend that the longer the time allowed for the optimal transfer, the more burns are required to satisfy optimality. Unfortunately, this usually increases the difficulty of computation. Both of the methods described use solutions with small numbers of burns to determine solutions with large numbers of burns. One method is a homotopy method that corrects for problems that arise when a solution requires a new burn or coast arc for optimality. The other method is simply to patch together long transfers from smaller ones. An orbit correction problem is solved to develop this method. This method may also lead to a good guidance law for transfer orbits with long transfer times.

  17. Uncovering Heavily Obscured AGN with WISE and NuSTAR

    NASA Astrophysics Data System (ADS)

    Hickox, Ryan C.; Carroll, Christopher M.; Yan, Wei; DiPompeo, Michael A.; Hainline, Kevin N.; NuSTAR Obscured AGN Team

    2018-01-01

Supermassive black holes gain their mass through accretion as active galactic nuclei (AGN), but it is now clear that a large fraction of this growth is "hidden" behind large columns of gas and dust. Of particular interest are Compton-thick (CT) AGN, with columns NH > 1024 cm-2, which have been difficult to identify using optical or soft X-ray surveys. We will present two studies of heavily obscured AGN that aim to uncover more of the full population of "hidden" growing black holes: (1) analysis of the spectral energy distributions of millions of galaxies with photometry from WISE (mid-IR), UKIDSS (near-IR), and SDSS (optical), which uncovers large populations of weak or heavily buried AGN, and (2) NuSTAR observations of a sample of candidate highly obscured AGN, selected from WISE and SDSS photometry, and confirmed using SALT and Keck spectroscopy. The NuSTAR data reveal the existence of powerful CT quasars with extremely large columns NH > 1025 cm-2, which may represent a significant fraction of previously hidden black hole growth. This work is supported by NASA grant numbers NNX16AN48G and NNX15AP24G, and the NSF through grant numbers 1515364 and 1554584.

  18. Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patchett, John M; Ahrens, James P; Lo, Li - Ta

    2010-10-15

Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on the scalability of rendering algorithms and architectures envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find that software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.

  19. Recent progress in 3-D imaging of sea freight containers

    NASA Astrophysics Data System (ADS)

    Fuchs, Theobald; Schön, Tobias; Dittmann, Jonas; Sukowski, Frank; Hanke, Randolf

    2015-03-01

The inspection of very large objects like sea freight containers with X-ray Computed Tomography (CT) is an emerging technology. A complete 3-D CT scan of a sea freight container takes several hours. Of course, this is too slow to apply to a large number of containers. However, the benefits of 3-D CT for sealed freight are obvious: detection of potential threats or illicit cargo without legal complications, high time consumption, or risks for the security personnel during a manual inspection. Recently, distinct progress has been made in the field of reconstruction from projections with only a relatively low number of angular positions. Instead of today's 500 to 1000 rotational steps, as needed for conventional CT reconstruction techniques, this new class of algorithms has the potential to reduce the number of projection angles by approximately a factor of 10. The main drawback of these advanced iterative methods is their high computational cost. But as computational power gets steadily cheaper, there will be practical applications of these complex algorithms in the foreseeable future. In this paper, we discuss the properties of iterative image reconstruction algorithms and show results of their application to CT of extremely large objects, scanning a sea freight container. A specific test specimen is used to quantitatively evaluate the image quality in terms of spatial and contrast resolution as a function of the number of projections.
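Iterative reconstruction methods of the kind discussed here solve the linear system A x = b (projection matrix times image equals measured sinogram) by repeated correction steps rather than analytic inversion, which is what lets them tolerate few projection angles. A minimal sketch of one classical member of this family, SIRT, on a toy system; the random matrix below merely stands in for a real scanner-geometry projection matrix, and the specific algorithm choice is ours, not necessarily the paper's:

```python
import numpy as np

rng = np.random.default_rng(3)
n_pix, n_rays = 64, 40                     # fewer measurements than unknowns
# Hypothetical sparse nonnegative "projection matrix" and consistent data.
A = rng.random((n_rays, n_pix)) * (rng.random((n_rays, n_pix)) < 0.2)
x_true = rng.random(n_pix)
b = A @ x_true

# SIRT update: x <- x + C * A^T (R * (b - A x)), where R and C are the
# inverse row and column sums of A (a standard weighting that keeps the
# iteration convergent for nonnegative A).
row = A.sum(axis=1)
col = A.sum(axis=0)
R = np.where(row > 0, 1.0 / row, 0.0)
C = np.where(col > 0, 1.0 / col, 0.0)

x = np.zeros(n_pix)
for _ in range(500):
    x = x + C * (A.T @ (R * (b - A @ x)))

residual = np.linalg.norm(A @ x - b) / np.linalg.norm(b)
```

Because the system is underdetermined, the iteration drives the data residual down without uniquely recovering `x_true`; practical few-view codes add regularization (e.g. total variation) to pick a plausible image from the solution set.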

  20. Characterization of Sound Radiation by Unresolved Scales of Motion in Computational Aeroacoustics

    NASA Technical Reports Server (NTRS)

    Rubinstein, Robert; Zhou, Ye

    1999-01-01

Evaluation of the sound sources in a high Reynolds number turbulent flow requires time-accurate resolution of an extremely large number of scales of motion. Direct numerical simulations will therefore remain infeasible for the foreseeable future: although current large eddy simulation methods can resolve the largest scales of motion accurately, they must leave some scales of motion unresolved. A priori studies show that acoustic power can be underestimated significantly if the contribution of these unresolved scales is simply neglected. In this paper, the problem of evaluating the sound radiation properties of the unresolved, subgrid-scale motions is approached in the spirit of the simplest subgrid stress models: the unresolved velocity field is treated as isotropic turbulence with statistical descriptors evaluated from the resolved field. The theory of isotropic turbulence is applied to derive formulas for the total power and the power spectral density of the sound radiated by a filtered velocity field. These quantities are compared with the corresponding quantities for the unfiltered field for a range of filter widths and Reynolds numbers.

  1. Adiabatic theory for the population distribution in the evolutionary minority game

    NASA Astrophysics Data System (ADS)

    Chen, Kan; Wang, Bing-Hong; Yuan, Baosheng

    2004-02-01

    We study the evolutionary minority game (EMG) using a statistical mechanics approach. We derive a theory for the steady-state population distribution of the agents. The theory is based on an “adiabatic approximation” in which short time fluctuations in the population distribution are integrated out to obtain an effective equation governing the steady-state distribution. We discover the mechanism for the transition from segregation (into opposing groups) to clustering (towards cautious behaviors). The transition is determined by two generic factors: the market impact (of the agents’ own actions) and the short time market inefficiency (arbitrage opportunities) due to fluctuations in the numbers of agents using opposite strategies. A large market impact favors “extreme” players who choose fixed opposite strategies, while large market inefficiency favors cautious players. The transition depends on the number of agents (N) and the effective rate of strategy switching. When N is small, the market impact is relatively large; this favors the extreme behaviors. Frequent strategy switching, on the other hand, leads to a clustering of the cautious agents.
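The population dynamics analyzed above can be reproduced in a few lines of simulation. A minimal sketch of the underlying evolutionary minority game (the simulation, not the adiabatic theory itself), with our own hypothetical parameter choices: each agent holds a "gene" p, the probability of following a common prediction; agents on the minority side gain a point, and an agent whose score falls below a threshold redraws its gene:

```python
import numpy as np

rng = np.random.default_rng(4)
N, d, rounds = 501, 4, 20000          # odd N avoids ties; d is the death threshold
p = rng.random(N)                     # each agent's probability of "following"
score = np.zeros(N)

for _ in range(rounds):
    follow = rng.random(N) < p        # who follows the common prediction this round
    n_follow = follow.sum()
    winners = follow if n_follow < N - n_follow else ~follow   # minority wins
    score += np.where(winners, 1.0, -1.0)
    # Evolution: agents whose score drops below -d redraw p and reset.
    dead = score < -d
    p[dead] = rng.random(dead.sum())
    score[dead] = 0.0

# Segregation: the steady-state distribution of p peaks near 0 and 1.
hist, _ = np.histogram(p, bins=10, range=(0.0, 1.0))
```

With these parameters the market-impact effect described in the abstract dominates, so the histogram of genes develops the U-shape of "extreme" players; larger N or faster strategy switching shifts the balance toward clustering of cautious agents.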

  2. Single-particle dispersion in compressible turbulence

    NASA Astrophysics Data System (ADS)

    Zhang, Qingqing; Xiao, Zuoli

    2018-04-01

Single-particle dispersion statistics in compressible box turbulence are studied using direct numerical simulation. Focus is placed on a detailed discussion of the effects of the particle Stokes number and turbulent Mach number, as well as the forcing type. When solenoidal forcing is adopted, it is found that the single-particle dispersion undergoes a transition from the ballistic regime at short times to the diffusive regime at long times, in agreement with Taylor's particle dispersion argument. The strongest dispersion of heavy particles is observed when the Stokes number is of order 1, which is similar to the scenario in incompressible turbulence. The dispersion tends to be suppressed as the Mach number increases. When hybrid solenoidal and compressive forcing at a ratio of 1/2 is employed, the flow field shows apparent anisotropy, characterized by the appearance of large shock wave structures. Accordingly, the single-particle dispersion shows extremely different behavior from the solenoidal forcing case.
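Taylor's ballistic-to-diffusive transition invoked above can be illustrated without a turbulence simulation: a stochastic velocity model with a finite correlation time already gives mean-square displacement growing as t^2 for t much less than the integral time and as t beyond it. A minimal sketch using an Ornstein-Uhlenbeck velocity process (a stand-in model, not the paper's DNS):

```python
import numpy as np

rng = np.random.default_rng(5)
n_particles, n_steps, dt = 2000, 4000, 0.01
tl, sigma = 1.0, 1.0                      # integral time scale, velocity rms

v = rng.normal(0.0, sigma, n_particles)   # start from the stationary velocity
x = np.zeros(n_particles)
msd = np.empty(n_steps)
for k in range(n_steps):
    # Euler-Maruyama step of the OU process with time scale tl.
    v += (-v / tl) * dt + sigma * np.sqrt(2.0 * dt / tl) * rng.normal(size=n_particles)
    x += v * dt
    msd[k] = np.mean(x ** 2)

t = dt * np.arange(1, n_steps + 1)
early = msd[9] / t[9] ** 2                          # ~ sigma^2 (ballistic range)
late_slope = (msd[-1] - msd[-401]) / (t[-1] - t[-401])   # ~ 2 sigma^2 tl (diffusive)
```

Heavy-particle inertia (Stokes number) and compressibility modify the velocity statistics feeding this picture, which is how the abstract's Mach- and Stokes-number dependences enter.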

  3. Gene-for-gene disease resistance: bridging insect pest and pathogen defense.

    PubMed

    Kaloshian, Isgouhi

    2004-12-01

Active plant defense, also known as gene-for-gene resistance, is triggered when a plant resistance (R) gene recognizes the intrusion of a specific insect pest or pathogen. Activation of plant defense includes an array of physiological and transcriptional reprogramming. During the past decade, a large number of plant R genes that confer resistance to a diverse group of pathogens have been cloned from a number of plant species. Based on predicted protein structures, these genes are classified into a small number of groups, indicating that structurally related R genes recognize phylogenetically distinct pathogens. An extreme example is the tomato Mi-1 gene, which confers resistance to the potato aphid (Macrosiphum euphorbiae), whitefly (Bemisia tabaci), and root-knot nematodes (Meloidogyne spp.). While Mi-1 remains the only cloned insect R gene, there is evidence that a gene-for-gene type of plant defense against piercing-sucking insects exists in a number of plant species.

  4. Concept definition study for an extremely large aerophysics range facility

    NASA Technical Reports Server (NTRS)

    Swift, H.; Witcofski, R.

    1992-01-01

The development of a large aerophysical ballistic range facility is considered for studying large-scale hypersonic flows at high Reynolds numbers over complex shapes. A two-stage light gas gun is considered for the hypervelocity launcher, and the extensive range tankage is discussed with respect to blast suppression, model disposition, and the sabot impact tank. A layout is given for the large aerophysics facility, and illustrations are provided for key elements such as the guide rail. The paper shows that such a facility could be used to launch models with diameters approaching 250 mm at velocities of 6.5 km/s with peak achievable accelerations of not more than 85.0 kgs. The envisioned range would provide gas-flow facilities capable of controlling the modeled quiescent atmospheric conditions. The facility is argued to be a feasible and important step in the investigation and testing of hypersonic vehicles such as the National Aerospace Plane.

  5. Credible occurrence probabilities for extreme geophysical events: earthquakes, volcanic eruptions, magnetic storms

    USGS Publications Warehouse

    Love, Jeffrey J.

    2012-01-01

Statistical analysis is made of rare, extreme geophysical events recorded in historical data -- counting the number of events $k$ with sizes that exceed chosen thresholds during specific durations of time $\tau$. Under transformations that stabilize data and model-parameter variances, the most likely Poisson-event occurrence rate, $k/\tau$, applies for frequentist inference and, also, for Bayesian inference with a Jeffreys prior that ensures posterior invariance under changes of variables. Frequentist confidence intervals and Bayesian (Jeffreys) credibility intervals are approximately the same and easy to calculate: $(1/\tau)[(\sqrt{k} - z/2)^{2},(\sqrt{k} + z/2)^{2}]$, where $z$ is a parameter that specifies the width, $z=1$ ($z=2$) corresponding to $1\sigma$, $68.3\%$ ($2\sigma$, $95.4\%$). If only a few events have been observed, as is usually the case for extreme events, then these "error-bar" intervals might be considered to be relatively wide. From historical records, we estimate most likely long-term occurrence rates, 10-yr occurrence probabilities, and intervals of frequentist confidence and Bayesian credibility for large earthquakes, explosive volcanic eruptions, and magnetic storms.
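The interval formula quoted in the abstract is simple enough to compute directly. A minimal sketch; the event count, record length, and 10-yr probability step below are a hypothetical example, not one of the paper's estimates:

```python
import math

def rate_interval(k, tau, z=2.0):
    """Most likely Poisson occurrence rate k/tau and the variance-stabilized
    interval (1/tau)[(sqrt(k) - z/2)^2, (sqrt(k) + z/2)^2] from the abstract
    (sensible for k at least (z/2)^2); z=1 gives ~68.3%, z=2 gives ~95.4%.
    """
    lo = (math.sqrt(k) - z / 2.0) ** 2 / tau
    hi = (math.sqrt(k) + z / 2.0) ** 2 / tau
    return k / tau, (lo, hi)

# Hypothetical example: 4 events above threshold in a 100-yr record, ~95% width.
rate, (lo, hi) = rate_interval(4, 100.0, z=2.0)
# rate = 0.04 per year; interval = (0.01, 0.09) per year.
# A 10-yr occurrence probability then follows from the Poisson model:
p10 = 1.0 - math.exp(-rate * 10.0)
```

The width of the interval relative to the rate (here a factor of nine between bounds) is the "relatively wide error bar" the abstract warns about for small $k$.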

  6. Methods comparison for microsatellite marker development: Different isolation methods, different yield efficiency

    NASA Astrophysics Data System (ADS)

    Zhan, Aibin; Bao, Zhenmin; Hu, Xiaoli; Lu, Wei; Hu, Jingjie

    2009-06-01

Microsatellite markers have become one of the most important molecular tools used in various research fields. A large number of microsatellite markers are required for whole-genome surveys in molecular ecology, quantitative genetics and genomics. It is therefore essential to select versatile, low-cost, efficient and time- and labor-saving methods to develop a large panel of microsatellite markers. In this study, we used the Zhikong scallop (Chlamys farreri) as the target species to compare the efficiency of five methods derived from three strategies for microsatellite marker development. The results showed that the strategy of constructing a small-insert genomic DNA library had poor efficiency, while the microsatellite-enriched strategy greatly improved isolation efficiency. Although the public-database mining strategy is time- and cost-saving, it is difficult to obtain a large number of microsatellite markers this way, mainly due to the limited sequence data for non-model species deposited in public databases. Based on the results of this study, we recommend two methods, microsatellite-enriched library construction and FIASCO-colony hybridization, for large-scale microsatellite marker development. Both methods derive from the microsatellite-enriched strategy. The experimental results obtained from the Zhikong scallop also provide a reference for microsatellite marker development in other species with large genomes.

  7. Prediction of Rare Transitions in Planetary Atmosphere Dynamics Between Attractors with Different Number of Zonal Jets

    NASA Astrophysics Data System (ADS)

    Bouchet, F.; Laurie, J.; Zaboronski, O.

    2012-12-01

    We describe transitions between attractors with one, two or more zonal jets in models of turbulent atmosphere dynamics. These transitions are extremely rare, occurring on time scales of centuries or millennia. They are extremely hard to observe in direct numerical simulations, because they require both very high resolution, in order to simulate the turbulence accurately, and extremely long integration times; these conditions are rarely met together in any realistic model. Nevertheless, many examples of transitions between turbulent attractors in geophysical flows are known to exist (paths of the Kuroshio, reversals of the Earth's magnetic field, atmospheric flows, and so on), and their study through conventional numerical computation is inaccessible. We present an alternative approach, based on instanton theory and large deviations. Instanton theory provides a way to compute (both numerically and theoretically) extremely rare transitions between turbulent attractors. This tool, developed in field theory and justified in some cases by large deviation theory in mathematics, can be applied to models of turbulent atmosphere dynamics. It provides both new theoretical insights and a new type of numerical algorithm. These algorithms can predict transition histories and transition rates using numerical simulations run over only hundreds of typical dynamical times of the model, several orders of magnitude shorter than the typical transition time. We illustrate the power of these tools in the framework of quasi-geostrophic models. We show regimes where two or more attractors coexist; these attractors correspond to turbulent flows dominated by one or more zonal jets similar to midlatitude atmospheric jets. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable ones. Moreover, we determine the transition rates; the corresponding transition times are several orders of magnitude larger than any typical time determined from the jet structure. We discuss the medium-term generalization of these results to more complex models, such as primitive equations or GCMs.

  8. Extreme Precipitation and High-Impact Landslides

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia; Adler, Robert; Huffman, George; Peters-Lidard, Christa

    2012-01-01

    It is well known that extreme or prolonged rainfall is the dominant trigger of landslides; however, there remain large uncertainties in characterizing the distribution of these hazards and meteorological triggers at the global scale. Researchers have evaluated the spatiotemporal distribution of extreme rainfall and landslides at local and regional scales primarily using in situ data, yet few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This research uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from Tropical Rainfall Measuring Mission (TRMM) data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and rainfall-triggered landslides globally. The GLC, available from 2007 to the present, contains information on reported rainfall-triggered landslide events around the world using online media reports, disaster databases, etc. When evaluating this database, we observed that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. 
This research also considers the sources for this extreme rainfall, citing teleconnections from ENSO as likely contributors to regional precipitation variability. This work demonstrates the potential for using satellite-based precipitation estimates to identify potentially active landslide areas at the global scale in order to improve landslide cataloging and quantify landslide triggering at daily, monthly and yearly time scales.

  9. Generating extreme weather event sets from very large ensembles of regional climate models

    NASA Astrophysics Data System (ADS)

    Massey, Neil; Guillod, Benoit; Otto, Friederike; Allen, Myles; Jones, Richard; Hall, Jim

    2015-04-01

    Generating extreme weather event sets from very large ensembles of regional climate models Neil Massey, Benoit P. Guillod, Friederike E. L. Otto, Myles R. Allen, Richard Jones, Jim W. Hall Environmental Change Institute, University of Oxford, Oxford, UK Extreme events can have large impacts on societies and are therefore being increasingly studied. In particular, climate change is expected to impact the frequency and intensity of these events. However, a major limitation when investigating extreme weather events is that, by definition, only a few events are present in observations. A way to overcome this issue is to use large ensembles of model simulations. Using the volunteer distributed computing (VDC) infrastructure of weather@home [1], we run a very large number (tens of thousands) of RCM simulations over the European domain at a resolution of 25 km, with an improved land-surface scheme, nested within a free-running GCM. Using VDC allows many thousands of climate model runs to be computed. Using observations for the GCM boundary forcings, we can run historical "hindcast" simulations over the past 100 to 150 years. This allows us, due to the chaotic variability of the atmosphere, to ascertain how likely an extreme event was, given the boundary forcings, and to derive synthetic event sets. The events in these sets did not actually occur in the observed record but could have occurred given the boundary forcings, with an associated probability. The event sets contain time series of fields of meteorological variables that allow impact modellers to assess the loss an event would incur. Projections of events into the future are achieved by modelling projections of the sea-surface temperature (SST) and sea-ice boundary forcings, combining the variability of the SST in the observed record with a range of warming signals derived from the varying responses of SSTs in the CMIP5 ensemble to elevated greenhouse gas (GHG) emissions in three RCP scenarios. 
Simulating the future with a range of SST responses, as well as a range of RCP scenarios, allows us to assess the uncertainty in the response to elevated GHG emissions that occurs in the CMIP5 ensemble. Numerous extreme weather events can be studied. Firstly, we analyse droughts in Europe with a focus on the UK in the context of the project MaRIUS (Managing the Risks, Impacts and Uncertainties of droughts and water Scarcity). We analyse the characteristics of the simulated droughts, the underlying physical mechanisms, and assess droughts observed in the recent past. Secondly, we analyse windstorms by applying an objective storm-identification and tracking algorithm to the ensemble output, isolating those storms that cause high loss and building a probabilistic storm catalogue, which can be used by impact modellers, insurance loss modellers, etc. Finally, we combine the model output with a heat-stress index to determine the detrimental effect on health of heat waves in Europe. [1] Massey, N. et al., 2014, Q. J. R. Meteorol. Soc.

  10. The cost of large numbers of hypothesis tests on power, effect size and sample size.

    PubMed

    Lazzeroni, L C; Ray, A

    2012-01-01

    Advances in high-throughput biology and computer science are driving an exponential increase in the number of hypothesis tests in genomics and other scientific disciplines. Studies using current genotyping platforms frequently include a million or more tests. In addition to the monetary cost, this increase imposes a statistical cost owing to the multiple testing corrections needed to avoid large numbers of false-positive results. To safeguard against the resulting loss of power, some have suggested sample sizes on the order of tens of thousands that can be impractical for many diseases or may lower the quality of phenotypic measurements. This study examines the relationship between the number of tests on the one hand and power, detectable effect size or required sample size on the other. We show that once the number of tests is large, power can be maintained at a constant level, with comparatively small increases in the effect size or sample size. For example at the 0.05 significance level, a 13% increase in sample size is needed to maintain 80% power for ten million tests compared with one million tests, whereas a 70% increase in sample size is needed for 10 tests compared with a single test. Relative costs are less when measured by increases in the detectable effect size. We provide an interactive Excel calculator to compute power, effect size or sample size when comparing study designs or genome platforms involving different numbers of hypothesis tests. The results are reassuring in an era of extreme multiple testing.
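    The scaling described above can be sketched with a standard normal-approximation power calculation under Bonferroni correction: the required sample size grows as (z_{1-alpha/(2m)} + z_power)^2 for a fixed effect size. This is an assumption about the underlying calculation, not the paper's exact calculator, but it reproduces the quoted figures (roughly a 70% sample-size increase for 10 tests versus 1, and roughly 13% for ten million versus one million):

```python
from statistics import NormalDist

def relative_sample_size(m, alpha=0.05, power=0.80):
    """Sample size needed to keep the given power across m
    Bonferroni-corrected tests, relative to a single test, under the
    usual normal approximation: n scales with (z_alpha + z_power)^2.
    """
    nd = NormalDist()
    z_power = nd.inv_cdf(power)

    def n_factor(num_tests):
        # Two-sided critical value at the corrected significance level
        z_alpha = nd.inv_cdf(1 - alpha / (2 * num_tests))
        return (z_alpha + z_power) ** 2

    return n_factor(m) / n_factor(1)

ten_vs_one = relative_sample_size(10)                    # ~1.70
many = relative_sample_size(10_000_000) / relative_sample_size(1_000_000)  # ~1.13
```

    The key point the abstract makes falls out directly: once m is large, the critical value z_alpha grows only slowly with m, so each further order of magnitude of tests costs comparatively little sample size.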

  11. bigSCale: an analytical framework for big-scale single-cell data.

    PubMed

    Iacono, Giovanni; Mereu, Elisabetta; Guillaumet-Adkins, Amy; Corominas, Roser; Cuscó, Ivon; Rodríguez-Esteban, Gustavo; Gut, Marta; Pérez-Jurado, Luis Alberto; Gut, Ivo; Heyn, Holger

    2018-06-01

    Single-cell RNA sequencing (scRNA-seq) has significantly deepened our insights into complex tissues, with the latest techniques capable of processing tens of thousands of cells simultaneously. Analyzing increasing numbers of cells, however, generates extremely large data sets, extending processing time and challenging computing resources. Current scRNA-seq analysis tools are not designed to interrogate large data sets and often lack sensitivity to identify marker genes. With bigSCale, we provide a scalable analytical framework to analyze millions of cells, which addresses the challenges associated with large data sets. To handle the noise and sparsity of scRNA-seq data, bigSCale uses large sample sizes to estimate an accurate numerical model of noise. The framework further includes modules for differential expression analysis, cell clustering, and marker identification. A directed convolution strategy allows processing of extremely large data sets, while preserving transcript information from individual cells. We evaluated the performance of bigSCale using both a biological model of aberrant gene expression in patient-derived neuronal progenitor cells and simulated data sets, which underlines its speed and accuracy in differential expression analysis. To test its applicability for large data sets, we applied bigSCale to assess 1.3 million cells from the mouse developing forebrain. Its directed down-sampling strategy accumulates information from single cells into index cell transcriptomes, thereby defining cellular clusters with improved resolution. Accordingly, index cell clusters identified rare populations, such as reelin (Reln)-positive Cajal-Retzius neurons, for which we report previously unrecognized heterogeneity associated with distinct differentiation stages, spatial organization, and cellular function. Together, bigSCale presents a solution to address future challenges of large single-cell data sets. 
© 2018 Iacono et al.; Published by Cold Spring Harbor Laboratory Press.

  12. Compound summer temperature and precipitation extremes over central Europe

    NASA Astrophysics Data System (ADS)

    Sedlmeier, Katrin; Feldmann, H.; Schädler, G.

    2018-02-01

    Reliable knowledge of the near-future climate change signal of extremes is important for adaptation and mitigation strategies. Compound extremes in particular, such as heat and drought occurring simultaneously, may have a greater impact on society than their univariate counterparts and have recently become an active field of study. In this paper, we use a 12-member ensemble of high-resolution (7 km) regional climate simulations with the regional climate model COSMO-CLM over central Europe to analyze the climate change signal and its uncertainty for compound heat and drought extremes in summer, using two different measures of compound extreme events: one absolute (the number of exceedances of absolute thresholds, such as hot days) and one relative (the number of exceedances of thresholds intrinsic to the time series). Changes are assessed between a reference period (1971-2000) and a projection period (2021-2050). Our findings show an increase in the number of absolute compound events for the whole investigation area. The change signal of relative extremes is more region-dependent, but there is a strong change signal in the southern and eastern parts of Germany and the neighboring countries. The Czech Republic in particular shows strong changes in both absolute and relative extreme events.
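    The two measures of compound extremes described above (exceedances of absolute versus series-intrinsic thresholds) can be illustrated on synthetic data. The thresholds here (30 °C and 1 mm for the absolute measure; 90th/10th percentiles for the relative one) and the synthetic series are assumptions for illustration, not the study's definitions:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic daily summer temperature (deg C) and precipitation (mm):
# 30 summers of 92 days each, generated independently for simplicity
n_days = 92 * 30
temp = rng.normal(22.0, 4.0, size=n_days)
precip = rng.gamma(0.5, 4.0, size=n_days)

# Absolute compound extreme: hot day (>= 30 deg C) AND dry day (< 1 mm)
absolute = int(np.sum((temp >= 30.0) & (precip < 1.0)))

# Relative compound extreme: both variables beyond percentiles intrinsic
# to the time series (90th for heat, 10th for precipitation deficit)
relative = int(np.sum((temp > np.percentile(temp, 90)) &
                      (precip < np.percentile(precip, 10))))
```

    With real data the two counts can diverge sharply: the absolute measure tracks fixed impact thresholds, while the relative measure adapts to each location's own climatology, which is why the abstract's regional patterns differ between them.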

  13. TP53 copy number expansion is associated with the evolution of increased body size and an enhanced DNA damage response in elephants

    PubMed Central

    Sulak, Michael; Fong, Lindsey; Mika, Katelyn; Chigurupati, Sravanthi; Yon, Lisa; Mongan, Nigel P; Emes, Richard D; Lynch, Vincent J

    2016-01-01

    A major constraint on the evolution of large body sizes in animals is an increased risk of developing cancer. There is no correlation, however, between body size and cancer risk. This lack of correlation is often referred to as 'Peto's Paradox'. Here, we show that the elephant genome encodes 20 copies of the tumor suppressor gene TP53 and that the increase in TP53 copy number occurred coincident with the evolution of large body sizes, the evolution of extreme sensitivity to genotoxic stress, and a hyperactive TP53 signaling pathway in the elephant (Proboscidean) lineage. Furthermore, we show that several of the TP53 retrogenes (TP53RTGs) are transcribed and likely translated. While TP53RTGs do not appear to directly function as transcription factors, they do contribute to the enhanced sensitivity of elephant cells to DNA damage and the induction of apoptosis by regulating activity of the TP53 signaling pathway. These results suggest that an increase in the copy number of TP53 may have played a direct role in the evolution of very large body sizes and the resolution of Peto's paradox in Proboscideans. DOI: http://dx.doi.org/10.7554/eLife.11994.001 PMID:27642012

  14. Compact high-resolution spectrographs for large and extremely large telescopes: using the diffraction limit

    NASA Astrophysics Data System (ADS)

    Robertson, J. Gordon; Bland-Hawthorn, Joss

    2012-09-01

    As telescopes get larger, the size of a seeing-limited spectrograph for a given resolving power becomes larger also, and for ELTs the size will be so great that high resolution instruments of simple design will be infeasible. Solutions include adaptive optics (but not providing full correction for short wavelengths) or image slicers (which give feasible but still large instruments). Here we develop the solution proposed by Bland-Hawthorn and Horton: the use of diffraction-limited spectrographs which are compact even for high resolving power. Their use is made possible by the photonic lantern, which splits a multi-mode optical fiber into a number of single-mode fibers. We describe preliminary designs for such spectrographs, at a resolving power of R ~ 50,000. While they are small and use relatively simple optics, the challenges are to accommodate the longest possible fiber slit (hence maximum number of single-mode fibers in one spectrograph) and to accept the beam from each fiber at a focal ratio considerably faster than for most spectrograph collimators, while maintaining diffraction-limited imaging quality. It is possible to obtain excellent performance despite these challenges. We also briefly consider the number of such spectrographs required, which can be reduced by full or partial adaptive optics correction, and/or moving towards longer wavelengths.
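    The scaling that drives this problem is that, for a slit-limited grating spectrograph, resolving power goes roughly as R ≈ 2 tan(θ_B) d_c / (χ D_tel), so the collimated beam diameter d_c (and hence the instrument size) grows linearly with telescope aperture at fixed R and slit size. A rough numerical illustration; the R2-echelle blaze angle and 0.7" slit are assumed values, not this paper's design parameters:

```python
import math

def collimated_beam_diameter(R, D_tel, slit_arcsec, theta_blaze_deg=63.4):
    """Collimated beam diameter (m) needed for resolving power R on a
    seeing-limited echelle spectrograph, from the first-order scaling
    R ~ 2 tan(theta_B) * d_c / (chi * D_tel), with chi the angular slit
    width in radians. Illustrative scaling only, not a design equation
    from the paper."""
    chi = slit_arcsec * math.pi / (180.0 * 3600.0)  # arcsec -> radians
    return R * chi * D_tel / (2.0 * math.tan(math.radians(theta_blaze_deg)))

# R = 50,000 with a 0.7" slit: a 39 m ELT versus an 8 m telescope
d_elt = collimated_beam_diameter(50_000, 39.0, 0.7)   # ~1.7 m beam
d_8m = collimated_beam_diameter(50_000, 8.0, 0.7)     # ~0.34 m beam
```

    A meter-class collimated beam is what makes simple seeing-limited designs infeasible on ELTs; feeding the spectrograph with diffraction-limited single-mode fibers (via the photonic lantern) removes the D_tel dependence entirely, which is the point of the approach developed here.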

  15. Direct evidence that density-dependent regulation underpins the temporal stability of abundant species in a diverse animal community

    PubMed Central

    Henderson, Peter A.; Magurran, Anne E.

    2014-01-01

    To understand how ecosystems are structured and stabilized, and to identify when communities are at risk of damage or collapse, we need to know how the abundances of the taxa in the entire assemblage vary over ecologically meaningful timescales. Here, we present an analysis of species temporal variability within a single large vertebrate community. Using an exceptionally complete 33-year monthly time series following the dynamics of 81 species of fishes, we show that the most abundant species are least variable in terms of temporal biomass, because they are under density-dependent (negative feedback) regulation. At the other extreme, a relatively large number of low abundance transient species exhibit the greatest population variability. The high stability of the consistently common high abundance species—a result of density-dependence—is reflected in the observation that they consistently represent over 98% of total fish biomass. This leads to steady ecosystem nutrient and energy flux irrespective of the changes in species number and abundance among the large number of low abundance transient species. While the density-dependence of the core species ensures stability under the existing environmental regime, the pool of transient species may support long-term stability by replacing core species should environmental conditions change. PMID:25100702
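    The central claim, that density-dependent (negative-feedback) regulation suppresses temporal variability relative to unregulated dynamics, can be illustrated with a toy Gompertz model against a random walk. All parameters here are illustrative, not estimated from the fish data:

```python
import numpy as np

rng = np.random.default_rng(7)
T = 396         # 33 years of monthly samples, matching the study's span
sigma = 0.3     # environmental noise on log-abundance (illustrative)
K = 5.0         # log-scale equilibrium abundance (illustrative)

# Density-dependent (Gompertz) log-abundance: pulled back toward K
b = 0.5         # strength of the negative feedback
x_dd = np.empty(T)
x_dd[0] = K
for t in range(1, T):
    x_dd[t] = x_dd[t - 1] + b * (K - x_dd[t - 1]) + rng.normal(0.0, sigma)

# Density-independent dynamics: a random walk with the same noise
x_rw = K + np.cumsum(rng.normal(0.0, sigma, size=T))

# The regulated series has a much smaller temporal variance
var_dd, var_rw = x_dd.var(), x_rw.var()
```

    The regulated series fluctuates in a narrow band around K (stationary variance sigma^2 / (1 - (1-b)^2)), while the random walk's variance grows without bound, mirroring the contrast between the stable core species and the highly variable transients.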

  16. Survival in extreme environments - on the current knowledge of adaptations in tardigrades.

    PubMed

    Møbjerg, N; Halberg, K A; Jørgensen, A; Persson, D; Bjørn, M; Ramløv, H; Kristensen, R M

    2011-07-01

    Tardigrades are microscopic animals found worldwide in aquatic as well as terrestrial ecosystems. They belong to the invertebrate superclade Ecdysozoa, as do the two major invertebrate model organisms: Caenorhabditis elegans and Drosophila melanogaster. We present a brief description of the tardigrades and highlight species that are currently used as models for physiological and molecular investigations. Tardigrades are uniquely adapted to a range of environmental extremes. Cryptobiosis, currently referred to as a reversible ametabolic state induced by e.g. desiccation, is common especially among limno-terrestrial species. It has been shown that the entry and exit of cryptobiosis may involve synthesis of bioprotectants in the form of selective carbohydrates and proteins as well as high levels of antioxidant enzymes and other free radical scavengers. However, at present a general scheme of mechanisms explaining this phenomenon is lacking. Importantly, recent research has shown that tardigrades even in their active states may be extremely tolerant to environmental stress, handling extreme levels of ionizing radiation, large fluctuation in external salinity and avoiding freezing by supercooling to below -20 °C, presumably relying on efficient DNA repair mechanisms and osmoregulation. This review summarizes the current knowledge on adaptations found among tardigrades, and presents new data on tardigrade cell numbers and osmoregulation. © 2011 The Authors. Acta Physiologica © 2011 Scandinavian Physiological Society.

  17. Solar Imaging UV/EUV Spectrometers Using TVLS Gratings

    NASA Technical Reports Server (NTRS)

    Thomas, Roger J.

    2003-01-01

    It is a particular challenge to develop a stigmatic spectrograph for UV, EUV wavelengths since the very low normal-incidence reflectance of standard materials most often requires that the design be restricted to a single optical element which must simultaneously provide both reimaging and spectral dispersion. This problem has been solved in the past by the use of toroidal gratings with uniform line-spaced rulings (TULS). A number of solar extreme ultraviolet (EUV) spectrometers have been based on such designs, including SOHO/CDS, Solar-B/EIS, and the sounding rockets Solar Extreme ultraviolet Research Telescope and Spectrograph (SERTS) and Extreme Ultraviolet Normal Incidence Spectrograph (EUNIS). More recently, Kita, Harada, and collaborators have developed the theory of spherical gratings with varied line-space rulings (SVLS) operated at unity magnification, which have been flown on several astronomical satellite missions. We now combine these ideas into a spectrometer concept that puts varied-line space rulings onto toroidal gratings. Such TVLS designs are found to provide excellent imaging even at very large spectrograph magnifications and beam-speeds, permitting extremely high-quality performance in remarkably compact instrument packages. Optical characteristics of three new solar spectrometers based on this concept are described: SUMI and RAISE, two sounding rocket payloads, and NEXUS, currently being proposed as a Small-Explorer (SMEX) mission.

  18. A dynamical systems approach to studying midlatitude weather extremes

    NASA Astrophysics Data System (ADS)

    Messori, Gabriele; Caballero, Rodrigo; Faranda, Davide

    2017-04-01

    Extreme weather occurrences carry enormous social and economic costs and routinely garner widespread scientific and media coverage. The ability to predict these events is therefore a topic of crucial importance. Here we propose a novel predictability pathway for extreme events, by building upon recent advances in dynamical systems theory. We show that simple dynamical systems metrics can be used to identify sets of large-scale atmospheric flow patterns with similar spatial structure and temporal evolution on time scales of several days to a week. In regions where these patterns favor extreme weather, they afford a particularly good predictability of the extremes. We specifically test this technique on the atmospheric circulation in the North Atlantic region, where it provides predictability of large-scale wintertime surface temperature extremes in Europe up to 1 week in advance.
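    One widely used metric in this dynamical-systems literature (e.g. by Faranda and co-workers) estimates the local dimension of a flow pattern from the recurrence statistics of its analogues: exceedances of g = -log(distance) over a high quantile are approximately exponential with mean 1/d. A sketch of that estimator, sanity-checked on a set of known dimension; the quantile q = 0.98 is an assumed choice:

```python
import numpy as np

def local_dimension(state, trajectory, q=0.98):
    """Estimate the local dimension of `state` within `trajectory` via
    the extreme-value recurrence argument: if P(dist < r) ~ r^d near the
    state, then exceedances of g = -log(dist) over a high threshold are
    asymptotically exponential with mean 1/d."""
    dist = np.linalg.norm(trajectory - state, axis=1)
    g = -np.log(dist[dist > 0])          # large g = close analogue
    u = np.quantile(g, q)                # high threshold on g
    return 1.0 / np.mean(g[g > u] - u)   # inverse mean excess

# Sanity check on points uniform on a 2-D square: dimension should be ~2
rng = np.random.default_rng(3)
cloud = rng.uniform(0.0, 1.0, size=(50_000, 2))
d = local_dimension(np.array([0.5, 0.5]), cloud)
```

    Applied to daily atmospheric fields, `trajectory` would be the flattened circulation maps and `state` a given day's map; low local dimension flags the recurrent, persistent patterns that the abstract links to enhanced predictability of the associated extremes.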

  19. Impacts of climate change on precipitation and discharge extremes through the use of statistical downscaling approaches in a Mediterranean basin.

    PubMed

    Piras, Monica; Mascaro, Giuseppe; Deidda, Roberto; Vivoni, Enrique R

    2016-02-01

    The Mediterranean region is characterized by high precipitation variability, often enhanced by orography, with strong seasonality and large inter-annual fluctuations, and by highly heterogeneous terrain and land-surface properties. As a consequence, catchments in this area are often prone to hydrometeorological extremes, including storms, floods and flash floods. A number of climate studies focused on the Mediterranean region predict that extreme events will occur with higher intensity and frequency, thus requiring further analyses to assess their effect at the land surface, particularly in small and medium-sized watersheds. In this study, climate and hydrologic simulations produced within the Climate Induced Changes on the Hydrology of Mediterranean Basins (CLIMB) EU FP7 research project were used to analyze how precipitation extremes propagate into discharge extremes in the Rio Mannu basin (472.5 km²), located in Sardinia, Italy. The basin's hydrologic response to climate forcings in a reference (1971-2000) and a future (2041-2070) period was simulated through the combined use of a set of global and regional climate models, statistical downscaling techniques, and a process-based distributed hydrologic model. We analyzed and compared the distributions of annual maxima extracted from the hourly and daily precipitation and peak-discharge time series simulated by the hydrologic model under climate forcing. For this purpose, yearly maxima were fitted with the Generalized Extreme Value (GEV) distribution using a regional approach. Next, we discussed common and contrasting behaviors of the precipitation and discharge maxima distributions to better understand how hydrological transformations affect the propagation of extremes. Finally, we show how rainfall statistical downscaling algorithms produce more reliable forcings for hydrological models than coarse climate model outputs. Copyright © 2015 Elsevier B.V. All rights reserved.
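    The annual-maxima step described above can be sketched with scipy's GEV implementation. The synthetic Gumbel-distributed maxima below stand in for the hydrologic-model output, and note that scipy's shape parameter c equals minus the ξ of the usual GEV convention:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual maximum daily precipitation (mm): 30 years of maxima,
# a stand-in for the simulated series analyzed in the study
annual_maxima = rng.gumbel(loc=60.0, scale=15.0, size=30)

# Fit the GEV by maximum likelihood (scipy convention: c = -xi)
c, loc, scale = genextreme.fit(annual_maxima)

# T-year return level = the (1 - 1/T) quantile of the fitted GEV
return_level_50 = genextreme.ppf(1 - 1.0 / 50.0, c, loc=loc, scale=scale)
```

    In the study's regional approach the GEV parameters are pooled across sites rather than fitted per series as here; the per-series fit is only the simplest version of the same idea.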

  20. Changes in the extreme wave heights over the Baltic Sea

    NASA Astrophysics Data System (ADS)

    Kudryavtseva, Nadia; Soomere, Tarmo

    2017-04-01

    Storms over the Baltic Sea and northwestern Europe have a large impact on the population, offshore industry, and shipping. Understanding extreme events in sea wave heights, and how they change due to climate change and variability, is critical for the assessment of flooding risks and for coastal protection. The BACC II Assessment of Climate Change for the Baltic Sea Basin showed that extreme-event analysis of wind waves, as well as satellite observation of wave heights, is currently not well addressed. Here we discuss the analysis of all existing satellite altimetry data over the Baltic Sea Basin with regard to extremes in wave heights. In this talk, we present for the first time an analysis of 100-yr return periods, fitted generalized Pareto and Weibull distributions, and the number and frequency of extreme events in wave heights in the Baltic Sea measured by multi-mission satellite altimetry. The data span more than 23 years and provide excellent spatial coverage over the Baltic Sea, allowing detailed study of spatial variations and changes in extreme wave heights. The analysis is based on an application of the Initial Distribution Method, the Annual Maxima method and the Peak-Over-Threshold approach to satellite altimetry data, all validated against in-situ wave height measurements. We show that the 100-yr return levels of wave heights exhibit significant spatial changes over the Baltic Sea, indicating a decrease in the southern part of the Baltic Sea and an increase in adjacent areas, which can significantly affect coastal vulnerability. We compare the observed shift with storm track database data and discuss a spatial correlation and possible connection between the changes in the storm tracks over the Baltic Sea and the changes in the extreme wave heights.
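    The peak-over-threshold step mentioned above can be sketched as follows: fit a generalized Pareto distribution to excesses over a high threshold, then invert it for the N-year return level. The synthetic wave-height series, the 98th-percentile threshold, and 365 observations per year are assumptions for illustration, not the talk's actual choices:

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
# Synthetic significant wave heights (m), a stand-in for altimetry data
hs = rng.weibull(1.5, size=20_000) * 1.8

# Peak-over-threshold: fit a generalized Pareto to excesses over u
u = float(np.quantile(hs, 0.98))
excesses = hs[hs > u] - u
xi, _, sigma = genpareto.fit(excesses, floc=0.0)   # shape xi, scale sigma

n_per_year = 365                       # assumed observations per year
zeta = excesses.size / hs.size         # per-observation exceedance prob.

def return_level(T):
    """T-year return level from the fitted GPD tail."""
    m = T * n_per_year * zeta          # expected exceedances in T years
    if abs(xi) < 1e-9:                 # exponential-tail limit
        return u + sigma * np.log(m)
    return u + (sigma / xi) * (m ** xi - 1.0)

h100 = return_level(100)               # ~100-yr return level (m)
```

    Mapping such return levels on a spatial grid of altimetry tracks is what reveals the regional increases and decreases the abstract reports.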

  1. Climate change impacts on tropical cyclones and extreme sea levels in the South Pacific — A regional assessment

    NASA Astrophysics Data System (ADS)

    Walsh, Kevin J. E.; McInnes, Kathleen L.; McBride, John L.

    2012-01-01

    This paper reviews the current understanding of the effect of climate change on extreme sea levels in the South Pacific region. This region contains many locations that are vulnerable to extreme sea levels in the current climate, and projections indicate that this vulnerability will increase in the future. The recent publication of authoritative statements on the relationship between global warming and global sea level rise, tropical cyclones and the El Niño-Southern Oscillation phenomenon has motivated this review. Confident predictions of global mean sea level rise are modified by regional differences in the steric (density-related) component of sea level rise and changing gravitational interactions between the ocean and the ice sheets which affect the regional distribution of the eustatic (mass-related) contribution to sea level rise. The most extreme sea levels in this region are generated by tropical cyclones. The intensity of the strongest tropical cyclones is likely to increase, but many climate models project a substantial decrease in tropical cyclone numbers in this region, which may lead to an overall decrease in the total number of intense tropical cyclones. This projection, however, needs to be better quantified using improved high-resolution climate model simulations of tropical cyclones. Future changes in ENSO may lead to large regional variations in tropical cyclone incidence and sea level rise, but these impacts are also not well constrained. While storm surges from tropical cyclones give the largest sea level extremes in the parts of this region where they occur, other more frequent high sea level events can arise from swell generated by distant storms. Changes in wave climate are projected for the tropical Pacific due to anthropogenically-forced changes in atmospheric circulation. 
Future changes in sea level extremes will be caused by a combination of changes in mean sea level, regional sea level trends, tropical cyclone incidence and wave climate. Recommendations are given for research to increase understanding of the response of these factors to climate change. Implications of the results for adaptation research are also discussed.

  2. Hierarchies in Quantum Gravity: Large Numbers, Small Numbers, and Axions

    NASA Astrophysics Data System (ADS)

    Stout, John Eldon

    Our knowledge of the physical world is mediated by relatively simple, effective descriptions of complex processes. By their very nature, these effective theories obscure any phenomena outside their finite range of validity, discarding information crucial to understanding the full, quantum gravitational theory. However, we may gain enormous insight into the full theory by understanding how effective theories with extreme characteristics--for example, those which realize large-field inflation or have disparate hierarchies of scales--can be naturally realized in consistent theories of quantum gravity. The work in this dissertation focuses on understanding the quantum gravitational constraints on these "extreme" theories in well-controlled corners of string theory. Axion monodromy provides one mechanism for realizing large-field inflation in quantum gravity. These models spontaneously break an axion's discrete shift symmetry and, assuming that the corrections induced by this breaking remain small throughout the excursion, create a long, quasi-flat direction in field space. This weakly-broken shift symmetry has been used to construct a dynamical solution to the Higgs hierarchy problem, dubbed the "relaxion." We study this relaxion mechanism and show that--without major modifications--it cannot be naturally embedded within string theory. In particular, we find corrections to the relaxion potential--due to the ten-dimensional backreaction of monodromy charge--that conflict with naive notions of technical naturalness and render the mechanism ineffective. The super-Planckian field displacements necessary for large-field inflation may also be realized via the collective motion of many aligned axions. However, it is not clear that string theory provides the structures necessary for this to occur. 
    We search for these structures by explicitly constructing the leading-order potential for C4 axions and computing the maximum possible field displacement in all compactifications of type IIB string theory on toric Calabi-Yau hypersurfaces with h^{1,1} ≤ 4 in the Kreuzer-Skarke database. While none of these examples can sustain a super-Planckian displacement--the largest possible is 0.3 M_pl--we find an alignment mechanism responsible for large displacements in random matrix models at large h^{1,1} >> 1, indicating that large-field inflation may be feasible in compactifications with tens or hundreds of axions. These results represent a modest step toward a complete understanding of large hierarchies and naturalness in quantum gravity.

  3. 21st Century Changes in Precipitation Extremes Over the United States: Can Climate Analogues Help or Hinder?

    NASA Astrophysics Data System (ADS)

    Gao, X.; Schlosser, C. A.

    2013-12-01

    Global warming is expected to alter the frequency and/or magnitude of extreme precipitation events. Such changes could have substantial ecological, economic, and sociological consequences. However, climate models in general do not correctly reproduce the frequency and intensity distribution of precipitation, especially at the regional scale. In this study, gridded data from a dense network of surface precipitation gauges and a global atmospheric analysis at a coarser scale are combined to develop a diagnostic framework for the large-scale meteorological conditions (i.e. flow features, moisture supply) that dominate during extreme precipitation. This diagnostic framework is first evaluated with the late 20th century simulations from an ensemble of climate models participating in the Coupled Model Intercomparison Project Phase 5 (CMIP5), and is found to produce total and interannual numbers of extreme days that are more consistent with observations (and less uncertain) than the model-based precipitation over the two study regions, the south-central and western United States. The framework is then applied to the CMIP5 multi-model projections for two radiative forcing scenarios (Representative Concentration Pathways 4.5 and 8.5) to assess potential future changes in the probability of precipitation extremes over the same regions. We further analyze the accompanying circulation features, and their changes, that may be responsible for shifts in extreme precipitation in response to a changed climate. The results from this study may guide hazardous weather watches and help society develop adaptive strategies for preventing catastrophic losses.
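
    The basic quantity such a framework is evaluated on, the annual number of extreme precipitation days relative to a gauge-based climatological threshold, can be sketched as follows. This is a minimal illustration, not the authors' diagnostic; the 99th-percentile wet-day threshold and the function name are assumptions.

```python
import numpy as np

def annual_extreme_day_counts(daily_precip, years, pct=99.0):
    """Count, per year, the days whose precipitation exceeds the
    climatological pct-th percentile of wet days (> 1 mm)."""
    daily_precip = np.asarray(daily_precip, dtype=float)
    years = np.asarray(years)
    wet = daily_precip[daily_precip > 1.0]        # wet-day climatology
    threshold = np.percentile(wet, pct)
    counts = {}
    for y in np.unique(years):
        mask = years == y
        counts[int(y)] = int(np.sum(daily_precip[mask] > threshold))
    return threshold, counts
```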

  4. Macroscopic response to microscopic intrinsic noise in three-dimensional Fisher fronts.

    PubMed

    Nesic, S; Cuerno, R; Moro, E

    2014-10-31

    We study the dynamics of three-dimensional Fisher fronts in the presence of density fluctuations. To this end we simulate the Fisher equation subject to stochastic internal noise, and study how the front moves and roughens as a function of the number of particles in the system, N. Our results suggest that the macroscopic behavior of the system is driven by the microscopic dynamics at its leading edge where number fluctuations are dominated by rare events. Contrary to naive expectations, the strength of front fluctuations decays extremely slowly as 1/logN, inducing large-scale fluctuations which we find belong to the one-dimensional Kardar-Parisi-Zhang universality class of kinetically rough interfaces. Hence, we find that there is no weak-noise regime for Fisher fronts, even for realistic numbers of particles in macroscopic systems.
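
    The deterministic (infinite-N) limit of the Fisher equation referenced above provides a useful baseline: in that limit the front advances at the classical speed 2*sqrt(D*r), whereas the internal noise studied in the abstract slows and roughens it. A minimal sketch of this noiseless baseline follows; the discretization parameters are illustrative choices, not taken from the paper.

```python
import numpy as np

# Deterministic Fisher-KPP front: u_t = D u_xx + r u (1 - u).
# In the N -> infinity (noiseless) limit the front speed approaches
# c = 2*sqrt(D*r); finite-N internal noise (the regime studied in the
# abstract) reduces this speed by a correction ~ 1/log^2 N.
D, r = 1.0, 1.0
dx, dt = 0.2, 0.01              # explicit scheme: D*dt/dx**2 = 0.25 <= 0.5
L, T = 160.0, 50.0
x = np.arange(0.0, L, dx)
u = (x < 5.0).astype(float)     # step initial condition

def front_position(u, x):
    """Location of the u = 0.5 crossing (linear interpolation)."""
    i = np.where(u >= 0.5)[0][-1]
    if i + 1 >= len(u):
        return x[i]
    return x[i] + dx * (u[i] - 0.5) / (u[i] - u[i + 1])

positions, times = [], []
t = 0.0
while t < T:
    lap = (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2
    lap[0] = lap[-1] = 0.0      # pin the ends (u=1 behind, u=0 ahead)
    u = u + dt * (D * lap + r * u * (1 - u))
    t += dt
    positions.append(front_position(u, x))
    times.append(t)

# Average speed over the second half of the run (past the transient)
half = len(times) // 2
speed = (positions[-1] - positions[half]) / (times[-1] - times[half])
```

With these parameters the measured speed falls just below 2, consistent with the slow algebraic approach of pulled fronts to their asymptotic speed.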

  5. Are extreme events (statistically) special? (Invited)

    NASA Astrophysics Data System (ADS)

    Main, I. G.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A. F.; McCloskey, J.

    2009-12-01

    We address the generic problem of testing for scale-invariance in extreme events, i.e. are the biggest events in a population simply a scaled model of those of smaller size, or are they in some way different? Are large earthquakes, for example, ‘characteristic’? Do they ‘know’ how big they will be before the event nucleates, or is the size of the event determined only in the avalanche-like process of rupture? In either case, what are the implications for estimates of time-dependent seismic hazard? One way of testing for departures from scale invariance is to examine the frequency-size statistics, commonly used as a benchmark in a number of applications in the Earth and environmental sciences. Using frequency data, however, introduces a number of problems in data analysis. The inevitably small number of data points for extreme events and, more generally, the non-Gaussian statistical properties strongly affect the validity of prior assumptions about the nature of uncertainties in the data. The simple use of traditional least squares (still common in the literature) introduces an inherent bias to the best fit result. We show first that the sampled frequencies in finite real and synthetic data sets (the latter based on the Epidemic-Type Aftershock Sequence model) converge to a central limit only very slowly, due to temporal correlations in the data. A specific correction for temporal correlations enables an estimate of convergence properties to be mapped non-linearly onto a Gaussian one. Uncertainties closely follow a Poisson distribution of errors across the whole range of seismic moment for typical catalogue sizes. In this sense the confidence limits are scale-invariant. A systematic sample bias effect due to counting whole numbers in a finite catalogue makes a ‘characteristic’-looking extreme event distribution a likely outcome of an underlying scale-invariant probability distribution.
This highlights the tendency of ‘eyeball’ fits to unconsciously (but wrongly in this case) assume Gaussian errors. We develop methods to correct for these effects, and show that the current best fit maximum likelihood regression model for the global frequency-moment distribution in the digital era is a power law, i.e. mega-earthquakes continue to follow the Gutenberg-Richter trend of smaller earthquakes with no (as yet) observable cut-off or characteristic extreme event. The results may also have implications for the interpretation of other time-limited geophysical time series that exhibit power-law scaling.
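
    One standard form of the maximum likelihood fit the abstract refers to is the Aki/Utsu estimator of the Gutenberg-Richter b-value, which avoids the least-squares bias discussed above. A minimal sketch follows; the function name and the binning correction are the conventional textbook forms, not code from this work.

```python
import math

def b_value_mle(mags, m_min, dm=0.1):
    """Aki/Utsu maximum-likelihood estimate of the Gutenberg-Richter
    b-value from magnitudes >= m_min. dm is the magnitude binning
    width (use dm=0 for continuous magnitudes); Utsu's dm/2 term
    corrects for binning. Avoids the bias of least-squares fits to
    binned frequency data."""
    sample = [m for m in mags if m >= m_min]
    mean_m = sum(sample) / len(sample)
    return math.log10(math.e) / (mean_m - (m_min - dm / 2.0))
```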

  6. Testing for scale-invariance in extreme events, with application to earthquake occurrence

    NASA Astrophysics Data System (ADS)

    Main, I.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A.; McCloskey, J.

    2009-04-01

    We address the generic problem of testing for scale-invariance in extreme events, i.e. are the biggest events in a population simply a scaled model of those of smaller size, or are they in some way different? Are large earthquakes, for example, ‘characteristic’? Do they ‘know’ how big they will be before the event nucleates, or is the size of the event determined only in the avalanche-like process of rupture? In either case, what are the implications for estimates of time-dependent seismic hazard? One way of testing for departures from scale invariance is to examine the frequency-size statistics, commonly used as a benchmark in a number of applications in the Earth and environmental sciences. Using frequency data, however, introduces a number of problems in data analysis. The inevitably small number of data points for extreme events and, more generally, the non-Gaussian statistical properties strongly affect the validity of prior assumptions about the nature of uncertainties in the data. The simple use of traditional least squares (still common in the literature) introduces an inherent bias to the best fit result. We show first that the sampled frequencies in finite real and synthetic data sets (the latter based on the Epidemic-Type Aftershock Sequence model) converge to a central limit only very slowly, due to temporal correlations in the data. A specific correction for temporal correlations enables an estimate of convergence properties to be mapped non-linearly onto a Gaussian one. Uncertainties closely follow a Poisson distribution of errors across the whole range of seismic moment for typical catalogue sizes. In this sense the confidence limits are scale-invariant. A systematic sample bias effect due to counting whole numbers in a finite catalogue makes a ‘characteristic’-looking extreme event distribution a likely outcome of an underlying scale-invariant probability distribution.
This highlights the tendency of ‘eyeball’ fits to unconsciously (but wrongly in this case) assume Gaussian errors. We develop methods to correct for these effects, and show that the current best fit maximum likelihood regression model for the global frequency-moment distribution in the digital era is a power law, i.e. mega-earthquakes continue to follow the Gutenberg-Richter trend of smaller earthquakes with no (as yet) observable cut-off or characteristic extreme event. The results may also have implications for the interpretation of other time-limited geophysical time series that exhibit power-law scaling.

  7. [Extreme reactive thrombocytosis in a healthy 6 year-old child].

    PubMed

    de Lama Caro-Patón, G; García-Salido, A; Iglesias-Bouzas, M I; Guillén, M; Cañedo-Villaroya, E; Martínez-Romera, I; Serrano-González, A; Casado-Flores, J

    2014-11-01

    Thrombocytosis is usually an incidental finding in children. Reactive (secondary) thrombocytosis is the more common form, with infectious diseases being its most prevalent cause. Based on platelet count, thrombocytosis is classified into four degrees; in the extreme degree the platelet count exceeds 1,000,000/mm³. We describe a case of extreme reactive thrombocytosis in a healthy 6-year-old child who required admission to critical care for diagnosis and treatment (maximum platelet count 7,283,000/mm³). We review the different causes of thrombocytosis in childhood, the differential diagnosis, and the treatments available in cases of extreme thrombocytosis. Copyright © 2013 Asociación Española de Pediatría. Published by Elsevier España. All rights reserved.

  8. Extreme-scale motions in turbulent plane Couette flows

    NASA Astrophysics Data System (ADS)

    Lee, Myoungkyu; Moser, Robert D.

    2018-05-01

    We study the size of large-scale motions in turbulent plane Couette flows at moderate Reynolds number up to $Re_\tau = 500$. Direct numerical simulation domains were as large as $100\pi\delta \times 2\delta \times 5\pi\delta$, where $\delta$ is half the distance between the walls. The results indicate that there are structures with streamwise extent, as measured by the wavelength, as long as 78$\delta$ and at least 310$\delta$ at $Re_\tau$ = 220 and 500, respectively. The presence of these very long structures is apparent in the spectra of all three velocity components and the Reynolds stress. In DNS using a smaller domain, the large structures are constrained, eliminating the streamwise variations present in the larger domain. Effects of a smaller domain are also present in the mean velocity and the streamwise velocity variance in the outer flow.
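
    Reading a dominant streamwise wavelength off a one-dimensional spectrum on a periodic domain, as done for DNS data of this kind, can be sketched as follows. This is a toy illustration, not the authors' analysis code; the function name is an assumption.

```python
import numpy as np

def dominant_wavelength(u, Lx):
    """Wavelength of the most energetic Fourier mode of a periodic
    signal u(x) sampled uniformly on a domain of length Lx."""
    u = np.asarray(u, dtype=float)
    spec = np.abs(np.fft.rfft(u - u.mean()))**2   # energy per mode
    k = np.arange(len(spec))                      # integer wavenumbers
    kmax = k[1:][np.argmax(spec[1:])]             # skip the mean (k=0) mode
    return Lx / kmax
```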

  9. Facilitating large-scale clinical trials: in Asia.

    PubMed

    Choi, Han Yong; Ko, Jae-Wook

    2010-01-01

    The number of clinical trials conducted in Asian countries has started to increase as a result of expansion of the pharmaceutical market in this area. There is a growing opportunity for large-scale clinical trials because of the large number of patients, significant market potential, good quality of data, and cost-effective, qualified medical infrastructure. However, carrying out large-scale clinical trials in Asia presents several major challenges, including quality control of data, budget control, laboratory validation, monitoring capacity, authorship, staff training, and nonstandard treatment. There are also several difficulties in collaborating on international trials in Asia, because Asia is an extremely diverse continent. The major challenges are language differences, diversity in patterns of disease and current treatments, a large gap in experience with performing multinational trials, and regulatory differences among the Asian countries. In addition, there are also differences in the understanding of global clinical trials, medical facilities, indemnity assurance, and culture, including food and religion. Standardizing these differences so that regional and local data can provide evidence of efficacy will require sustained effort. At this time, there are no large clinical trials led by urologists in Asia, but it is anticipated that the role of urologists in clinical trials will continue to increase. Copyright © 2010 Elsevier Inc. All rights reserved.

  10. How changes of climate extremes affect summer and winter crop yields and water productivity in the southeast USA

    NASA Astrophysics Data System (ADS)

    Tian, D.; Cammarano, D.

    2017-12-01

    Modeling changes of crop production at regional scale is important to make adaptation measures for sustainably food supply under global change. In this study, we explore how changing climate extremes in the 20th and 21st century affect maize (summer crop) and wheat (winter crop) yields in an agriculturally important region: the southeast United States. We analyze historical (1950-1999) and projected (2006-2055) precipitation and temperature extremes by calculating the changes of 18 climate extreme indices using the statistically downscaled CMIP5 data from 10 general circulation models (GCMs). To evaluate how these climate extremes affect maize and wheat yields, historical baseline and projected maize and wheat yields under RCP4.5 and RCP8.5 scenarios are simulated using the DSSAT-CERES maize and wheat models driven by the same downscaled GCMs data. All of the changes are examined at 110 locations over the study region. The results show that most of the precipitation extreme indices do not have notable change; mean precipitation, precipitation intensity, and maximum 1-day precipitation are generally increased; the number of rainy days is decreased. The temperature extreme indices mostly showed increased values on mean temperature, number of high temperature days, diurnal temperature range, consecutive high temperature days, maximum daily maximum temperature, and minimum daily minimum temperature; the number of low temperature days and number of consecutive low temperature days are decreased. The conditional probabilistic relationships between changes in crop yields and changes in extreme indices suggested different responses of crop yields to climate extremes during sowing to anthesis and anthesis to maturity periods. 
Wheat yields and crop water productivity for wheat are increased due to an increased CO2 concentration and minimum temperature; evapotranspiration, maize yields, and crop water productivity for wheat are decreased owing to the increased temperature extremes. We found the effects of precipitation changes on both yields are relatively uncertain.
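
    A few of the precipitation extreme indices named above (maximum 1-day precipitation, number of rainy days, and a simple intensity index) can be computed from a daily series roughly as follows. This is a sketch in the spirit of the standard ETCCDI definitions; the 1 mm wet-day threshold and the names are assumptions, not taken from the study.

```python
import numpy as np

def extreme_indices(daily_precip, wet_thresh=1.0):
    """A few standard annual precipitation extreme indices (in the
    spirit of the ETCCDI definitions) from one year of daily
    precipitation totals [mm]."""
    p = np.asarray(daily_precip, dtype=float)
    wet = p >= wet_thresh
    rx1day = float(p.max())                          # max 1-day precipitation
    n_rainy = int(wet.sum())                         # number of rainy days
    sdii = float(p[wet].mean()) if n_rainy else 0.0  # simple intensity index
    return {"rx1day": rx1day, "rainy_days": n_rainy, "sdii": sdii}
```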

  11. Using Blue Stragglers to Predict Retained Black Hole Population in Globular Clusters

    NASA Astrophysics Data System (ADS)

    Hermanek, Keith; Chatterjee, Sourav; Rasio, Frederic

    2018-01-01

    Large numbers of black holes (BHs) are expected to form in massive star clusters typical of the globular clusters (GCs). Sophisticated theoretical models suggest that many of these BHs can be retained in present-day GCs. Observations have also identified several BH candidates in Galactic and extragalactic GCs (e.g., Maccarone et al. 2007; Irwin et al. 2010; Strader et al. 2012; Chomiuk et al. 2013; Miller-Jones et al. 2014). It has also been shown that high-mass and high-density clusters such as GCs are efficient factories of merging binary BHs similar to those observed by the LIGO observatories (Abbott et al. 2016a,b,c,d,e; Rodriguez et al. 2016). Understanding the formation rate and properties of binary BHs depends on a detailed understanding of how the BHs dynamically evolve within GCs. Nevertheless, directly detecting BHs in GCs is extremely challenging; only BHs in binaries with limited configurations can be directly detected, via gravitational-wave, X-ray, or radio emission. We propose an indirect method of inferring the number of undetected retained BHs in a GC by investigating the dynamical effects of a large number of BHs on the production of other tracer populations such as Blue Straggler Stars (BSS). Using a large grid of detailed GC models, we show that there is a clear anti-correlation between the number of BSS in a cluster and the number of retained BHs. Being the most massive species, large numbers of retained BHs will dominate the core of the cluster as a result of mass segregation, driving other low-mass species such as main-sequence stars away from the central high-density regions. BSS are expected to form from physical collisions between main-sequence (MS) stars mediated by binary encounters (e.g., Chatterjee et al. 2013) in the cores of GCs. Production of BSS by collision or mass-transfer channels is suppressed if a large number of retained BHs in a cluster restricts the number of MS stars in the core.
Extensive observational data exist on the number and radial distribution of BSS in GCs. Thus, this anti-correlation between the number of retained BHs and the number of BSS, once carefully calibrated by theoretical models, can be used to infer the population of undetected BHs in GCs.

  12. Changes in extremes due to half a degree warming in observations and models

    NASA Astrophysics Data System (ADS)

    Fischer, E. M.; Schleussner, C. F.; Pfleiderer, P.

    2017-12-01

    Assessing the climate impacts of half-a-degree warming increments is high on the post-Paris science agenda. Discriminating those effects is particularly challenging for climate extremes such as heavy precipitation and heat extremes, for which model uncertainties are generally large, and for which internal variability is so important that it can easily offset or strongly amplify the forced local changes induced by half a degree of warming. Despite these challenges, we provide evidence for large-scale changes in the intensity and frequency of climate extremes due to half a degree of warming. We first assess the difference in extreme climate indicators in observational data for the 1960s and 1970s versus the recent past, two periods that differ by half a degree. We identify distinct differences for the global and continental-scale occurrence of heat and heavy precipitation extremes. We show that those observed changes in heavy precipitation and heat extremes broadly agree with simulated historical differences and are informative for the projected differences between 1.5 and 2°C of warming despite different radiative forcings. We therefore argue that evidence from the observational record can inform the debate about discernible climate impacts in the light of model uncertainty by providing a conservative estimate of the implications of 0.5°C warming. A limitation of using the observational record arises from potential non-linearities in the response of climate extremes to a certain level of warming. We test for potential non-linearities in the response of heat and heavy precipitation extremes in a large ensemble of transient climate simulations. We further quantify differences between a time-window approach in a coupled-model large ensemble and time-slice experiments using prescribed SSTs performed in the context of the HAPPI-MIP project.
Thereby we provide different lines of evidence that half a degree warming leads to substantial changes in the expected occurrence of heat and heavy precipitation extremes.

  13. Role of absorbing aerosols on hot extremes in India in a GCM

    NASA Astrophysics Data System (ADS)

    Mondal, A.; Sah, N.; Venkataraman, C.; Patil, N.

    2017-12-01

    Temperature extremes and heat waves in North-Central India during the summer months of March through June are known for causing significant impact in terms of human health, productivity and mortality. While greenhouse gas-induced global warming is generally believed to intensify the magnitude and frequency of such extremes, aerosols are usually associated with an overall cooling, by virtue of their dominant radiation scattering nature, in most world regions. Recently, large-scale atmospheric conditions leading to heat wave and extreme temperature conditions have been analysed for the North-Central Indian region. However, the role of absorbing aerosols, including black carbon and dust, is still not well understood, in mediating hot extremes in the region. In this study, we use 30-year simulations from a chemistry-coupled atmosphere-only General Circulation Model (GCM), ECHAM6-HAM2, forced with evolving aerosol emissions in an interactive aerosol module, along with observed sea surface temperatures, to examine large-scale and mesoscale conditions during hot extremes in India. The model is first validated with observed gridded temperature and reanalysis data, and is found to represent observed variations in temperature in the North-Central region and concurrent large-scale atmospheric conditions during high temperature extremes realistically. During these extreme events, changes in near surface properties include a reduction in single scattering albedo and enhancement in short-wave solar heating rate, compared to climatological conditions. This is accompanied by positive anomalies of black carbon and dust aerosol optical depths. We conclude that the large-scale atmospheric conditions such as the presence of anticyclones and clear skies, conducive to heat waves and high temperature extremes, are exacerbated by absorbing aerosols in North-Central India. Future air quality regulations are expected to reduce sulfate particles and their masking of GHG warming. 
It is concurrently important to mitigate emissions of warming black carbon particles, to manage future climate change-induced hot extremes.

  14. Active galaxies observed during the Extreme Ultraviolet Explorer all-sky survey

    NASA Technical Reports Server (NTRS)

    Marshall, H. L.; Fruscione, A.; Carone, T. E.

    1995-01-01

    We present observations of active galactic nuclei (AGNs) obtained with the Extreme Ultraviolet Explorer (EUVE) during the all-sky survey. A total of 13 sources were detected at a significance of 2.5 sigma or better: seven Seyfert galaxies, five BL Lac objects, and one quasar. The fraction of BL Lac objects is higher in our sample than in hard X-ray surveys but is consistent with the soft X-ray Einstein Slew Survey, indicating that the main reason for the large number of BL Lac objects in the extreme ultraviolet (EUV) and soft X-ray bands is their steeper X-ray spectra. We show that the number of AGNs observed in both the EUVE and ROSAT Wide Field Camera surveys can readily be explained by modelling the EUV spectra with a simple power law in the case of BL Lac objects and with an additional EUV excess in the case of Seyferts and quasars. Allowing for cold matter absorption in Seyfert galaxy hosts drives up the inferred average continuum slope to 2.0 +/- 0.5 (at 90% confidence), compared to a slope of 1.0 usually found from soft X-ray data. If Seyfert galaxies without EUV excesses form a significant fraction of the population, then the average spectrum of those with bumps should be even steeper. We place a conservative limit on neutral gas in BL Lac objects: N(sub H) less than 10(exp 20)/sq cm.

  15. 3D WebGIS and Visualization Issues for Architectures and Large Sites

    NASA Astrophysics Data System (ADS)

    De Amicis, R.; Conti, G.; Girardi, G.; Andreolli, M.

    2011-09-01

    Traditionally, within the field of archaeology and, more generally, within the cultural heritage domain, Geographical Information Systems (GIS) have been mostly used as support for cataloguing activities, essentially operating as gateways to large geo-referenced archives of specialised cultural heritage information. Additionally, GIS have proved to be essential in helping cultural heritage institutions improve management of their historical information, providing the means for detection of otherwise hard-to-discover spatial patterns and supplying the computational tools necessary to perform spatial clustering, proximity, and orientation analysis. This paper presents a platform developed to address both of the aforementioned issues, by allowing geo-referenced cataloguing of multi-media resources of cultural relevance as well as user-friendly access through an interactive 3D geobrowser that operates as a single point of access to the available digital repositories. The solution was showcased in the context of "Festival dell'economia" (the Fair of Economics), a major event recently held in Trento, Italy, where it allowed visitors to interactively access an extremely large repository of information, and the associated metadata, available across the area of the Autonomous Province of Trento. Within the event, this extremely large repository was made accessible via the network, through web services, from a 3D interactive geobrowser developed by the authors. The 3D scene was enriched with a number of Points of Interest (POIs) linking to information available within various databases. The software package was deployed on a complex hardware set-up composed of a large composite panoramic screen covering a horizontal field of view of 240 degrees.

  16. Red, redder, reddest: SCUBA-2 imaging of colour-selected Herschel sources

    NASA Astrophysics Data System (ADS)

    Duivenvoorden, S.; Oliver, S.; Scudder, J. M.; Greenslade, J.; Riechers, D. A.; Wilkins, S. M.; Buat, V.; Chapman, S. C.; Clements, D. L.; Cooray, A.; Coppin, K. E. K.; Dannerbauer, H.; De Zotti, G.; Dunlop, J. S.; Eales, S. A.; Efstathiou, A.; Farrah, D.; Geach, J. E.; Holland, W. S.; Hurley, P. D.; Ivison, R. J.; Marchetti, L.; Petitpas, G.; Sargent, M. T.; Scott, D.; Symeonidis, M.; Vaccari, M.; Vieira, J. D.; Wang, L.; Wardlow, J.; Zemcov, M.

    2018-06-01

    High-redshift, luminous, dusty star-forming galaxies (DSFGs) constrain the extremity of galaxy formation theories. The most extreme are discovered through follow-up on candidates in large area surveys. Here, we present extensive 850 μm SCUBA-2 follow-up observations of 188 red DSFG candidates from the Herschel Multitiered Extragalactic Survey (HerMES) Large Mode Survey, covering 274 deg2. We detected 87 per cent with a signal-to-noise ratio >3 at 850 μm. We introduce a new method for incorporating the confusion noise in our spectral energy distribution fitting by sampling correlated flux density fluctuations from a confusion limited map. The new 850 μm data provide a better constraint on the photometric redshifts of the candidates, with photometric redshift errors decreasing from σz/(1 + z) ≈ 0.21 to 0.15. Comparison with spectroscopic redshifts also shows little bias (<(z - zspec)/(1 + zspec)> = 0.08). The mean photometric redshift is found to be 3.6 with a dispersion of 0.4 and we identify 21 DSFGs with a high probability of lying at z > 4. After simulating our selection effects we find number counts are consistent with phenomenological galaxy evolution models. There is a statistically significant excess of WISE-1 and SDSS sources near our red galaxies, giving a strong indication that lensing may explain some of the apparently extreme objects. Nevertheless, our sample includes examples of galaxies with the highest star formation rates in the Universe (≫103 M⊙ yr-1).

  17. Estimation of Future Return Levels for Heavy Rainfall in the Iberian Peninsula: Comparison of Methodologies

    NASA Astrophysics Data System (ADS)

    Parey, S.

    2014-12-01

    F. J. Acero (1), S. Parey (2), T. T. H. Hoang (2), D. Dacunha-Castelle (3). (1) Dpto. Física, Universidad de Extremadura, Avda. de Elvas s/n, 06006 Badajoz, Spain; (2) EDF/R&D, 6 quai Watier, 78401 Chatou Cedex, France; (3) Laboratoire de Mathématiques, Université Paris 11, Orsay, France. Trends can already be detected in daily rainfall amounts in the Iberian Peninsula (IP), and these will have an impact on the extreme levels. In this study, we compare different ways to estimate future return levels for heavy rainfall, based on statistical extreme value theory. Both Peaks over Threshold (POT) and block maxima with the Generalized Extreme Value (GEV) distribution are used and their results compared when linear trends are assumed in the parameters: the threshold and scale parameter for POT, and the location and scale parameter for GEV. But rainfall over the IP is a special variable in that a large number of the values are 0; thus, the impact of taking this into account is discussed too. Another approach is then tested, based on the evolutions of the mean and variance obtained from the time series of rainy days only, and of the number of rainy days. A statistical test, similar to that designed for temperature in Parey et al. (2013), is used to assess whether the trends in extremes can be considered as mostly due to these evolutions when considering only rainy days. The results show that this is mainly the case: the extremes of the residuals, after removing the trends in mean and standard deviation, cannot be distinguished from those of a stationary process. Thus, future return levels can be estimated from the stationary return level of these residuals and an estimate of the future mean and standard deviation. Moreover, an estimate of the future number of rainy days is used to retrieve the return levels for all days.
All of these comparisons are made for an ensemble of high-quality rainfall time series observed in the Iberian Peninsula over the period 1961-2010, from which we want to estimate a 20-year return level expected in 2020. The evolutions and the impact of the different approaches are discussed for three seasons: fall, spring, and winter. Parey S., Hoang T. T. H., Dacunha-Castelle D.: The importance of mean and variance in predicting changes in temperature extremes, Journal of Geophysical Research: Atmospheres, Vol. 118, 1-12, 2013.
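
    The return levels discussed here follow from the GEV quantile function. A minimal sketch of the stationary T-year return level is given below; for a non-stationary fit with linear trends in the parameters, as in the abstract, one would evaluate the trended location and scale at the target year (e.g. 2020). Function and argument names are assumptions.

```python
import math

def gev_return_level(mu, sigma, xi, T):
    """T-year return level of a GEV(mu, sigma, xi) distribution of
    annual (block) maxima: the level exceeded on average once every
    T years, i.e. the (1 - 1/T) quantile."""
    y = -math.log(1.0 - 1.0 / T)          # reduced Gumbel variate
    if abs(xi) < 1e-9:                    # Gumbel limit, xi -> 0
        return mu - sigma * math.log(y)
    return mu + (sigma / xi) * (y ** (-xi) - 1.0)
```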

  18. 11. INTERIOR OF BEDROOM NUMBER ONE SHOWING OPEN DOOR FROM ...

    Library of Congress Historic Buildings Survey, Historic Engineering Record, Historic Landscapes Survey

    11. INTERIOR OF BEDROOM NUMBER ONE SHOWING OPEN DOOR FROM LIVING ROOM AT EXTREME PHOTO LEFT, OPEN DOOR TO WALK-IN CLOSET AT PHOTO LEFT CENTER, OPEN DOOR TO BATHROOM AT PHOTO CENTER, AND OPEN DOOR TO BEDROOM NUMBER TWO AT EXTREME PHOTO RIGHT. VIEW TO WEST. - Rush Creek Hydroelectric System, Worker Cottage, Rush Creek, June Lake, Mono County, CA

  19. Modeling extreme drought impacts on terrestrial ecosystems when thresholds are exceeded

    NASA Astrophysics Data System (ADS)

    Holm, J. A.; Rammig, A.; Smith, B.; Medvigy, D.; Lichstein, J. W.; Dukes, J. S.; Allen, C. D.; Beier, C.; Larsen, K. S.; Ficken, C. D.; Pockman, W.; Anderegg, W.; Luo, Y.

    2016-12-01

    Recent IPCC Assessment Reports suggest that, with predicted climate changes, future precipitation- and heat-related extreme events will become stronger and more frequent, with the potential for prolonged droughts. To prepare for these changes and their impacts, we need to develop a better understanding of terrestrial ecosystem responses to extreme drought events. In particular, we focus here on large-extent and long-lasting extreme drought events with noticeable impacts on the functioning of forested ecosystems. While most ecosystem manipulation experiments have been motivated by ongoing and predicted climate change, the majority have applied only relatively moderate droughts, not addressing the "very" extreme tail of these scenarios, i.e. "extreme extremes (EEs)". We explore the response of forest ecosystems to EEs using two demographic-based dynamic global vegetation models (DGVMs) (i.e. ED2, LPJ-GUESS) in which the abundances of different plant functional types, as well as tree size- and age-class structure, are emergent properties of resource competition. We evaluate the models' capabilities to represent extreme drought scenarios (i.e., 50% and 90% reduction in precipitation for 1-year, 2-year, and 4-year drought scenarios) at two dry forested sites: Palo Verde, Costa Rica (tropical) and EucFACE, Australia (temperate). Through the DGVM modeling outcomes we determine the following five testable hypotheses for future experiments: 1) EEs cannot be extrapolated from mild extremes, due to plant plasticity and functional composition. 2) Response to EEs depends on functional diversity, trait combinations, and phenology, such that both models predicted that plant biomass did not recover even after 100 years. 3) Mortality from drought reduces the pressure on resources and prevents further damage by subsequent years of drought. 4) Early-successional stands are more vulnerable to extreme droughts while older stands are more resilient.
5) Elevated atmospheric CO2 alleviates impacts of extreme droughts while increased temperature exacerbates mortality. This study highlighted a number of questions about our current understanding of EEs and their corresponding thresholds and tipping points, and provides an analysis of confidence in model representation and accuracy of processes related to EEs.

  20. Measures of dependence for multivariate Lévy distributions

    NASA Astrophysics Data System (ADS)

    Boland, J.; Hurd, T. R.; Pivato, M.; Seco, L.

    2001-02-01

    Recent statistical analysis of a number of financial databases is summarized. Increasing agreement is found that logarithmic equity returns show a certain type of asymptotic behavior of the largest events, namely that the probability density functions have power law tails with an exponent α≈3.0. This behavior does not vary much over different stock exchanges or over time, despite large variations in trading environments. The present paper proposes a class of multivariate distributions which generalizes the observed qualities of univariate time series. A new consequence of the proposed class is the "spectral measure" which completely characterizes the multivariate dependences of the extreme tails of the distribution. This measure on the unit sphere in M-dimensions, in principle completely general, can be determined empirically by looking at extreme events. If it can be observed and determined, it will prove to be of importance for scenario generation in portfolio risk management.
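    The tail exponent α of such power-law tails is commonly estimated from the largest order statistics with the Hill estimator. A minimal sketch in Python; the synthetic Student-t sample and the cutoff k=500 are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def hill_tail_exponent(data, k):
    """Hill estimator: alpha = 1 / mean(log(x_(i) / x_(k+1)))
    over the k largest absolute values x_(1) >= ... >= x_(k)."""
    x = np.sort(np.abs(data))[::-1]              # descending order statistics
    return 1.0 / np.mean(np.log(x[:k] / x[k]))   # x[k] is the (k+1)-th largest

# Synthetic heavy-tailed sample: a Student-t with 3 degrees of freedom has a
# power-law tail with exponent near 3, matching what is reported for equities.
rng = np.random.default_rng(0)
sample = rng.standard_t(df=3, size=100_000)
alpha = hill_tail_exponent(sample, k=500)
print(round(alpha, 2))
```

    The estimate is sensitive to the cutoff k, so in practice α is read off a stable plateau of estimates across a range of k values.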

  1. Correlation of seasonal variations in phosphorous and nitrogen species in upper Black Warrior River with duckweed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gabrielson, F.C. Jr.; Malatino, A.M.; Santa Cruz, G.J.

    1980-10-01

    Water samples taken throughout the year from a drainage system that had supported giant duckweed blooms were analyzed for nitrogen and phosphorus. Although seasonal separation of the data indicates possible differences within an impoundment (Bayview Lake), extreme variations make meaningful conclusions difficult. Daily discharge from a large number of points may have masked seasonal differences. Extensive plant mats were present at minimal levels of nitrogen and phosphorus. The growth rate seemed to be governed more by climate than by nutrient conditions. Laboratory investigations indicate that giant duckweed can grow under a wide range of nutrient conditions, including high heavy metal concentrations. Growth rate data show that without a continual input of nutrients, maximum growth rates do not usually continue beyond 14 to 20 days regardless of the initial single-element concentration. With a continuous nutrient input, growth would probably only be inhibited by extreme climate conditions.

  2. Extreme AO coronagraphy laboratory demonstration in the context of SPHERE

    NASA Astrophysics Data System (ADS)

    Martinez, P.; Aller Carpentier, E.; Kasper, M.

    2010-10-01

    Exoplanetary science through direct imaging and spectroscopy will expand greatly with the imminent arrival of new instruments at the VLT (SPHERE), Gemini (GPI), and Subaru (HiCIAO) observatories. All these ground-based adaptive optics instruments include an extremely high performance adaptive optics (XAO) system, advanced starlight cancellation techniques (e.g. coronagraphy), and speckle calibration techniques (e.g. spectral, angular, or polarimetric). In this context we report laboratory results obtained with the High-Order Test bench (HOT), the adaptive optics facility at the European Southern Observatory headquarters. Under 0.5 arcsec dynamical seeing, efficiently corrected by an XAO system delivering an H-band Strehl ratio above 90%, we discuss contrast levels obtained with an apodized-pupil Lyot coronagraph using differential imaging techniques (spectral and polarimetric). Accounting for system differences (e.g. deformable mirror actuator number), we demonstrate good agreement between experimental results and expectations for SPHERE and GPI, while the HiCIAO contrast goals have already been met.

  3. On long-term ozone trends at Hohenpeissenberg

    NASA Technical Reports Server (NTRS)

    Claude, H.; Vandersee, W.; Wege, K.

    1994-01-01

    More than 2000 ozone soundings and a large number of Dobson observations have been performed since 1967 following a consistent procedure. The resulting, very homogeneous data sets were used to evaluate significant long-term trends in both the troposphere and the stratosphere. The trend amounts to about plus 2 percent per year in the troposphere and to about minus 0.5 percent per year in the stratosphere. Extremely low ozone records obtained during winter 1991/92 are discussed in the light of the long-term series. The winter mean of the ozone column is the lowest of the series. The ozone deficit occurred mainly in the lower stratosphere. One cause may be the Pinatubo cloud. Even compared with the extreme winter mean following the El Chichon eruption, the ozone content was lower. Additionally, ozone was reduced by dynamical effects due to unusual weather situations.

  4. Neurologic complications in common wrist and hand surgical procedures

    PubMed Central

    Verdecchia, Nicole; Johnson, Julie; Baratz, Mark; Orebaugh, Steven

    2018-01-01

    Nerve dysfunction after upper extremity orthopedic surgery is a recognized complication and may result from a variety of causes. Hand and wrist surgery requires incisions and retraction that necessarily border on small peripheral nerves, which may be difficult to identify and protect with absolute certainty. This article reviews the rates and ranges of reported nerve dysfunction with respect to common surgical interventions for the distal upper extremity, including wrist arthroplasty, wrist arthrodesis, wrist arthroscopy, distal radius open reduction and internal fixation, carpal tunnel release, and thumb carpometacarpal surgery. A relatively large range of neurologic complications is reported; however, many of the cited studies involve relatively small numbers of patients, and only rarely are neurologic complications included as primary outcome measures. Knowledge of these neurologic outcomes should help the surgeon better counsel patients with regard to perioperative risk, as well as provide insight into the workup and management of any adverse neurologic outcomes that may arise.

  5. Membrane bioreactors for treating waste streams.

    PubMed

    Howell, J A; Arnot, T C; Liu, W

    2003-03-01

    Membrane bioreactors (MBRs) have a number of advantages for treating wastewater containing large quantities of BOD. This paper reviews the inherent advantages of an MBR, which include high potential biomass loadings, lower sludge yields, and retention of specialized organisms that may not settle well in clarifiers. A major problem in effluent treatment occurs when mixed inorganic and organic wastes occur with high concentrations of pollutants. Inorganics that might cause extremes of pH and/or salinity will inhibit microbial growth and only specialized organisms can survive under these conditions. Refractory organics are only biodegraded with difficulty by specialized organisms, which usually do not resist the extreme inorganic environments. The use of membrane bioreactors to help separate the micro-organisms from the inorganic compounds, yet permit the organics to permeate, has been developed in two different designs that are outlined in this paper. The use of membrane contactors in a multimembrane stripping system to treat acidic chlorinated wastes is proposed and discussed.

  6. Sometimes two arms are enough--an unusual life-stage in brittle stars (Echinodermata: Ophiuroidea).

    PubMed

    Stöhr, Sabine; Alme, Øydis

    2015-08-03

    Off West Africa (Angola-Morocco), benthos samples were collected in the years 2005-2012. These contained 124 specimens of brittle stars with two long arms and three extremely short or absent arms and an elongated, narrow disc. These unusual brittle stars, as well as 33 specimens with five fully developed arms, were identified as Amphiura ungulata. The specimens with unequal arms were juvenile stages, whereas adults had five equal arms. The large number of specimens with unequal arms suggests that this condition is not the result of damage and regeneration, but a normal growth pattern in this species. This study documents the morphology by SEM, amends the species description, and discusses possible explanations for the evolution of this condition. Although brittle star species with unequal arm growth have been reported, this is an extreme case that was unknown before this study.

  7. The use of ERTS-1 satellite data in Great Lakes mesometeorological studies

    NASA Technical Reports Server (NTRS)

    Lyons, W. A. (Principal Investigator)

    1972-01-01

    The author has identified the following significant results. In the original proposal, it was hoped that ERTS could, with its extremely high resolution and multispectral capability, detect many meteorological phenomena occurring at the low end of the mesoscale motion spectrum (1 - 100 km). These included convective cloud phenomena, internal wave patterns, air pollution, snow squalls, etc. For meteorologists, ERTS-1 has more than lived up to initial hopes. First-look inspection of images has produced a large number of truly remarkable finds. Some of the most significant are: (1) Images of Lake Ontario during late summer have revealed several extremely good examples of lake breeze frontal cloud patterns. (2) Detection of suspended particulates from the Chicago-Gary industrial complex in the 50,000 to 150,000 tons/year category. (3) Inadvertent weather modification due to anthropogenic condensation and ice nuclei from urban areas.

  8. Extremely high efficient nanoreactor with Au@ZnO catalyst for photocatalysis

    NASA Astrophysics Data System (ADS)

    Su, Chung-Yi; Yang, Tung-Han; Gurylev, Vitaly; Huang, Sheng-Hsin; Wu, Jenn-Ming; Perng, Tsong-Pyng

    2015-10-01

    We fabricated a photocatalytic Au@ZnO@PC (polycarbonate) nanoreactor composed of monolayered Au nanoparticles chemisorbed on conformal ZnO nanochannel arrays within the PC membrane. A commercial PC membrane was used as the template for deposition of a ZnO shell into the pores by atomic layer deposition (ALD). Thioctic acid (TA), providing sufficient steric stabilization, was used as a molecular linker for functionalization of Au nanoparticles with a diameter of 10 nm. High coverage of Au nanoparticles anchored on the inner wall of the ZnO nanochannels greatly improved the photocatalytic activity for degradation of Rhodamine B. The membrane nanoreactor achieved 63% degradation of Rhodamine B within only 26.88 ms of effective reaction time owing to its superior mass transfer efficiency, as shown by a Damköhler number analysis. Mass transfer limitation can be eliminated in the present study due to the extremely large surface-to-volume ratio of the membrane nanoreactor.
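    As a rough consistency check of the reported figures, and assuming first-order degradation kinetics (an assumption made here for illustration; the paper's argument rests on a Damköhler number analysis that is not reproduced), 63% conversion in 26.88 ms implies an apparent rate constant of roughly 37 per second:

```python
import math

# First-order kinetics (illustrative assumption): X = 1 - exp(-k * t)
X = 0.63          # fractional degradation of Rhodamine B (from the abstract)
t = 26.88e-3      # effective reaction time in seconds (from the abstract)
k = -math.log(1 - X) / t
print(round(k, 1))  # apparent first-order rate constant, 1/s
```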

  9. Analysis on flood generation processes by means of a continuous simulation model

    NASA Astrophysics Data System (ADS)

    Fiorentino, M.; Gioia, A.; Iacobellis, V.; Manfreda, S.

    2006-03-01

    In the present research, we exploited a continuous hydrological simulation to investigate the key variables responsible for flood peak formation. With this purpose, a distributed hydrological model (DREAM) is used in cascade with a rainfall generator (IRP-Iterated Random Pulse) to simulate a large number of extreme events, providing insight into the main controls of flood generation mechanisms. The investigated variables are those used in theoretically derived probability distributions of floods based on the concept of partial contributing area (e.g. Iacobellis and Fiorentino, 2000). The continuous simulation model is used to investigate the hydrological losses occurring during extreme events, the variability of the source area contributing to the flood peak, and its lag-time. Results suggest interesting simplifications of the theoretical probability distribution of floods according to the different climatic and geomorphologic environments. The study is applied to two basins located in Southern Italy with different climatic characteristics.

  10. Drivers and implications of recent large fire years in boreal North America

    NASA Astrophysics Data System (ADS)

    Veraverbeke, S.; Rogers, B. M.; Goulden, M.; Jandt, R.; Miller, C. E.; Wiggins, E. B.; Randerson, J. T.

    2016-12-01

    High latitude ecosystems are rapidly transforming because of climate change. Boreal North America recently experienced two exceptionally large fire years: 2014 in the Northwest Territories, Canada, and 2015 in Alaska, USA. We used geospatial climate, lightning, fire, and vegetation datasets to assess the mechanisms contributing to these recent extreme years and the causes of recent decadal-scale changes in fire dynamics. We found that the two events had a record number of lightning ignitions and unusually high levels of burning near the boreal treeline, contributing to emissions of 164 ± 32 Tg C in the Northwest Territories and 65 ± 13 Tg C in Interior Alaska. The annual number of ignitions in both regions has displayed a significant increasing trend since 1975, driven by an increase in lightning ignitions. We found that vapor pressure deficit (VPD) in June, lightning, and ignition events were significantly correlated on interannual timescales. Future climate-driven increases in VPD and lightning near the treeline ecotone may enable northward forest expansion into tundra ecosystems.

  11. Fast and Epsilon-Optimal Discretized Pursuit Learning Automata.

    PubMed

    Zhang, JunQi; Wang, Cheng; Zhou, MengChu

    2015-10-01

    Learning automata (LA) are powerful tools for reinforcement learning. A discretized pursuit LA is the most popular one among them. During an iteration its operation consists of three basic phases: 1) selecting the next action; 2) finding the optimal estimated action; and 3) updating the state probability. However, when the number of actions is large, the learning becomes extremely slow because there are too many updates to be made at each iteration. The increased updates are mostly from phases 1 and 3. A new fast discretized pursuit LA with assured ε -optimality is proposed to perform both phases 1 and 3 with the computational complexity independent of the number of actions. Apart from its low computational complexity, it achieves faster convergence speed than the classical one when operating in stationary environments. This paper can promote the applications of LA toward the large-scale-action oriented area that requires efficient reinforcement learning tools with assured ε -optimality, fast convergence speed, and low computational complexity for each iteration.
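    The three phases can be sketched as follows. This is a simplified, illustrative pursuit automaton, not the authors' exact algorithm; the reward probabilities, the resolution, and the initial estimate sampling are assumptions made for the example:

```python
import random

def discretized_pursuit(reward_probs, steps=20000, resolution=5000, seed=1):
    """Simplified discretized pursuit learning automaton (illustrative)."""
    rng = random.Random(seed)
    r = len(reward_probs)
    p = [1.0 / r] * r                      # action probability vector
    counts, rewards = [0] * r, [0] * r
    for i in range(r):                     # seed the reward estimates
        for _ in range(25):
            counts[i] += 1
            rewards[i] += rng.random() < reward_probs[i]
    delta = 1.0 / resolution               # discretized step size
    for _ in range(steps):
        a = rng.choices(range(r), weights=p)[0]                     # phase 1
        counts[a] += 1
        rewards[a] += rng.random() < reward_probs[a]
        best = max(range(r), key=lambda i: rewards[i] / counts[i])  # phase 2
        for i in range(r):                 # phase 3: pursue the best estimate
            if i != best:
                move = min(delta, p[i])
                p[i] -= move
                p[best] += move
    return max(range(r), key=lambda i: p[i])

best_action = discretized_pursuit([0.3, 0.8, 0.5])
print(best_action)
```

    Note that the naive phase 1 and phase 3 updates above cost O(r) per iteration in the number of actions r, which is exactly the per-iteration cost the paper's scheme removes.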

  12. Experimental methods for studying microbial survival in extraterrestrial environments.

    PubMed

    Olsson-Francis, Karen; Cockell, Charles S

    2010-01-01

    Microorganisms can be used as model systems for studying biological responses to extraterrestrial conditions; however, the methods for studying their responses are extremely challenging. Since the first high-altitude microbiological experiment in 1935, a large number of facilities have been developed for short- and long-term microbial exposure experiments. Examples are the BIOPAN facility, used for short-term exposure, and the EXPOSE facility aboard the International Space Station, used for long-term exposure. Furthermore, simulation facilities have been developed to conduct microbiological experiments in the laboratory environment. A large number of microorganisms have been used for exposure experiments; these include pure cultures and microbial communities. Analyses of these experiments have involved both culture-dependent and culture-independent methods. This review highlights and discusses the facilities available for microbiology experiments, both in space and in simulation environments. A description of the microorganisms and the techniques used to analyse survival is included. Finally, we discuss the implications of microbiological studies for future missions and for space applications. Copyright 2009 Elsevier B.V. All rights reserved.

  13. Ensemble-based evaluation of extreme water levels for the eastern Baltic Sea

    NASA Astrophysics Data System (ADS)

    Eelsalu, Maris; Soomere, Tarmo

    2016-04-01

    The risks and damages associated with coastal flooding, which naturally increase with the magnitude of extreme storm surges, are one of the largest concerns of countries with extensive low-lying nearshore areas. The relevant risks are even more pronounced for semi-enclosed water bodies such as the Baltic Sea, where subtidal (weekly-scale) variations in the water volume of the sea substantially contribute to the water level and lead to a large spread in projections of future extreme water levels. We explore the options for using large ensembles of projections to more reliably evaluate return periods of extreme water levels. Single projections of the ensemble are constructed by fitting several sets of block maxima with various extreme value distributions. The ensemble is based on two simulated data sets produced at the Swedish Meteorological and Hydrological Institute: a hindcast by the Rossby Centre Ocean model sampled with a resolution of 6 h, and a similar hindcast by the circulation model NEMO with a resolution of 1 h. As the annual maxima of water levels in the Baltic Sea are not always uncorrelated, we employ maxima for both calendar years and stormy seasons. As the shape parameter of the Generalised Extreme Value distribution changes its sign and varies substantially in magnitude along the eastern coast of the Baltic Sea, the use of a single distribution for the entire coast is inappropriate. The ensemble involves projections based on the Generalised Extreme Value, Gumbel and Weibull distributions. The parameters of these distributions are evaluated in three different ways: the maximum likelihood method, and the method of moments based on both biased and unbiased estimates. The total number of projections in the ensemble is 40. As some of the resulting estimates contain limited additional information, the members of pairs of projections that are highly correlated are assigned weights of 0.6.
A comparison of the ensemble-based projections of extreme water levels and their return periods with similar estimates derived from local observations reveals an interesting pattern of match and mismatch. The match is almost perfect in measurement sites where local effects (e.g., wave-induced set-up or local surge in very shallow areas that are not resolved by circulation models) do not contribute to the observed values of water level. There is, however, substantial mismatch between projected and observed extreme values for most of the Estonian coast. The mismatch is largest for sections that are open to high waves and for several bays that cut deeply into the mainland but are open to the predominant strong wind directions. Detailed quantification of this mismatch eventually makes it possible to develop substantially improved estimates of extreme water levels in sections where local effects contribute considerably to the total water level.
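    A single member of such an ensemble (a GEV fitted to block maxima by maximum likelihood, with a return level read off) can be sketched with SciPy. The water-level values below are invented for illustration; the study itself fits the RCO and NEMO hindcasts with three estimation methods:

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical annual-maximum water levels (cm); illustrative numbers only
annual_maxima = np.array([112., 134., 98., 156., 121., 143., 108., 167.,
                          129., 118., 151., 139., 102., 175., 126., 133.,
                          147., 115., 160., 124.])

# Maximum likelihood fit of the GEV (SciPy's shape c corresponds to -xi)
c, loc, scale = genextreme.fit(annual_maxima)

# The 100-year return level is the quantile with exceedance probability 1/100
level_100yr = genextreme.ppf(1 - 1 / 100, c, loc=loc, scale=scale)
print(round(float(level_100yr), 1))
```

    Repeating the fit with other distributions (e.g. scipy.stats.gumbel_r) and other estimation methods yields the kind of multi-member ensemble described above.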

  14. The New York City Operations Support Tool: Supporting Water Supply Operations for Millions in an Era of Changing Patterns in Hydrological Extreme Events

    NASA Astrophysics Data System (ADS)

    Matonse, A. H.; Porter, J. H.; Frei, A.

    2015-12-01

    Providing an average of 1.1 billion gallons (~4.2 x 10^6 cubic meters) of drinking water per day to approximately nine million people in New York City (NYC) and four upstate counties, the NYC water supply is among the world's largest unfiltered systems. In addition to providing a reliable water supply in terms of water quantity and quality, the city has to fulfill other flow objectives to serve downstream communities. At times, such as during extreme hydrological events, water quality issues may restrict water usage for parts of the system. To support a risk-based water supply decision-making process, NYC has developed the Operations Support Tool (OST). OST combines a water supply systems model with reservoir water quality models, near-real-time data ingestion, database management, and an ensemble hydrological forecast. A number of reports have addressed the frequency and intensity of extreme hydrological events across the continental US. In the northeastern US, studies have indicated an increase in the frequency of extremely large precipitation and streamflow events during the most recent decades. In this presentation we describe OST and, using case studies, demonstrate how this tool has been useful in supporting operational decisions. We also want to motivate a discussion about how ongoing changes in the patterns of hydrological extreme events elevate the challenge faced by water supply managers, and about the role of the scientific community in integrating nonstationarity approaches into hydrologic forecasting and modeling.

  15. Mitigation of environmental problems in Lake Victoria, East Africa: causal chain and policy options analyses.

    PubMed

    Odada, Eric O; Olago, Daniel O; Kulindwa, Kassim; Ntiba, Micheni; Wandiga, Shem

    2004-02-01

    Lake Victoria is an international waterbody that offers the riparian communities a large number of extremely important environmental services. Over the past three decades or so, the lake has come under increasing and considerable pressure from a variety of interlinked human activities such as overfishing, species introductions, industrial pollution, eutrophication, and sedimentation. In this paper we examine the root causes for overfishing and pollution in Lake Victoria and give possible policy options that can help remediate or mitigate the environmental degradation.

  16. Matrix Perturbation Techniques in Structural Dynamics

    NASA Technical Reports Server (NTRS)

    Caughey, T. K.

    1973-01-01

    Matrix perturbation techniques are developed which can be used in the dynamical analysis of structures where the range of numerical values in the matrices is extreme or where the nature of the damping matrix requires that complex-valued eigenvalues and eigenvectors be used. The techniques can be advantageously used in a variety of fields such as earthquake engineering, ocean engineering, aerospace engineering, and other fields concerned with the dynamical analysis of large complex structures or systems of second-order differential equations. A number of simple examples are included to illustrate the techniques.

  17. Space Situational Awareness of Large Numbers of Payloads from a Single Deployment

    DTIC Science & Technology

    2014-09-01

    challenges [12]. 3. CHIPSAT CLOUD DEPLOYMENT AND CATALOGING One of the six satellites deployed as part of the April 18, 2014, Falcon 9 launch to the ISS was... Cloud Modeling Techniques," Journal of Spacecraft and Rockets, Vol. 33, No. 4, 550–555, 1996. 2. Swinerd, G.G., Barrows, S.P., and Crowther, R., "Short... Big Sky, Montana, 2013. 11. Voss, H.D., Dailey, J.F., Crowley, J.C., et al., "TSAT Globalstar ELaNa-5 Extremely Low-Earth Orbit (ELEO) Satellite," 28th

  18. Ear Deformations Give Bats a Physical Mechanism for Fast Adaptation of Ultrasonic Beam Patterns

    NASA Astrophysics Data System (ADS)

    Gao, Li; Balakrishnan, Sreenath; He, Weikai; Yan, Zhen; Müller, Rolf

    2011-11-01

    A large number of mammals, including humans, have intricate outer ear shapes that diffract incoming sound in a direction- and frequency-specific manner. Through this physical process, the outer ear shapes encode sound-source information into the sensory signals from each ear. Our results show that horseshoe bats could dynamically control these diffraction processes through fast nonrigid ear deformations. The bats’ ear shapes can alter between extreme configurations in about 100 ms and thereby change their acoustic properties in ways that would suit different acoustic sensing tasks.

  19. Mid-Century Warming in the Los Angeles Region and its Uncertainty using Dynamical and Statistical Downscaling

    NASA Astrophysics Data System (ADS)

    Sun, F.; Hall, A. D.; Walton, D.; Capps, S. B.; Qu, X.; Huang, H. J.; Berg, N.; Jousse, A.; Schwartz, M.; Nakamura, M.; Cerezo-Mota, R.

    2012-12-01

    Using a combination of dynamical and statistical downscaling techniques, we projected mid-21st-century warming in the Los Angeles region at 2-km resolution. To account for uncertainty associated with the trajectory of future greenhouse gas emissions, we examined projections for both "business-as-usual" (RCP8.5) and "mitigation" (RCP2.6) emissions scenarios from the Fifth Coupled Model Intercomparison Project (CMIP5). To account for the considerable uncertainty associated with the choice of global climate model, we downscaled results for all available global climate models in CMIP5. For the business-as-usual scenario, we find that by the mid-21st century, the most likely warming is roughly 2.6°C averaged over the region's land areas, with 95% confidence that the warming lies between 0.9 and 4.2°C. The high resolution of the projections reveals a pronounced spatial pattern in the warming: high elevations and inland areas separated from the coast by at least one mountain complex warm 20 to 50% more than areas near the coast or within the Los Angeles basin. This warming pattern is especially apparent in summertime. The summertime warming contrast between the inland and coastal zones has a large effect on the most likely expected number of extremely hot days per year. Coastal locations and areas within the Los Angeles basin see roughly two to three times the number of extremely hot days, while high elevations and inland areas typically experience approximately three to five times the number of extremely hot days. Under the mitigation emissions scenario, the most likely warming and increase in heat extremes are somewhat smaller. However, the majority of the warming seen in the business-as-usual scenario still occurs at all locations in the most likely case under the mitigation scenario, and heat extremes still increase significantly. This warming study is the first part of a series of studies in our project.
More climate change impacts on the Santa Ana winds, rainfall, snowfall and snowmelt, clouds, and surface hydrology are forthcoming and can be found at www.atmos.ucla.edu/csrl. [Figure: the ensemble-mean, annual-mean surface air temperature change and its uncertainty from the available CMIP5 GCMs under the RCP8.5 (left) and RCP2.6 (right) emissions scenarios; unit: °C.]

  20. Estimation of breeding values using selected pedigree records.

    PubMed

    Morton, Richard; Howarth, Jordan M

    2005-06-01

    Fish bred in tanks or ponds cannot be easily tagged individually. The parentage of any individual may be determined by DNA fingerprinting, but this is sufficiently expensive that large numbers cannot be fingerprinted. The measurement of the objective trait can be made on a much larger sample relatively cheaply. This article deals with experimental designs for selecting individuals to be fingerprinted and for the estimation of individual and family breeding values. The general setup provides estimates both for genetic effects regarded as fixed or random and for fixed effects due to known regressors. The family effects can be well estimated even when very small numbers are fingerprinted, provided that they are the individuals with the most extreme phenotypes.

  1. Future Gamma-Ray Observations of Pulsars and their Environments

    NASA Technical Reports Server (NTRS)

    Thompson, David J.

    2006-01-01

    Pulsars and pulsar wind nebulae seen at gamma-ray energies offer insight into particle acceleration to very high energies under extreme conditions. Pulsed emission provides information about the geometry and interaction processes in the magnetospheres of these rotating neutron stars, while the pulsar wind nebulae yield information about high-energy particles interacting with their surroundings. During the next decade, a number of new and expanded gamma-ray facilities will become available for pulsar studies, including Astro-rivelatore Gamma a Immagini LEggero (AGILE) and Gamma-ray Large Area Space Telescope (GLAST) in space and a number of higher-energy ground-based systems. This review describes the capabilities of such observatories to answer some of the open questions about the highest-energy processes involving neutron stars.

  2. Lux in obscuro II: photon orbits of extremal AdS black holes revisited

    NASA Astrophysics Data System (ADS)

    Tang, Zi-Yu; Ong, Yen Chin; Wang, Bin

    2017-12-01

    A large class of spherically symmetric static extremal black hole spacetimes possesses a stable null photon sphere on their horizons. For the extremal Kerr-Newman family, the photon sphere only really coincides with the horizon in the sense clarified by Doran. The condition under which a photon orbit is stable on an asymptotically flat extremal Kerr-Newman black hole horizon has recently been clarified; it is found that a sufficiently large angular momentum destabilizes the photon orbit, whereas an electrical charge tends to stabilize it. We investigated the effect of a negative cosmological constant on this observation, and found the same behavior in the case of extremal asymptotically Kerr-Newman-AdS black holes in (3+1) -dimensions. In (2+1) -dimensions, in the presence of an electrical charge, the angular momentum never becomes large enough to destabilize the photon orbit. We comment on the instabilities of black hole spacetimes with a stable photon orbit.

  3. Practice makes perfect in memory recall.

    PubMed

    Romani, Sandro; Katkov, Mikhail; Tsodyks, Misha

    2016-04-01

    A large variability in performance is observed when participants recall briefly presented lists of words. The sources of such variability are not known. Our analysis of a large data set of free recall revealed a small fraction of participants that reached an extremely high performance, including many trials with the recall of complete lists. Moreover, some of them developed a number of consistent input-position-dependent recall strategies, in particular recalling words consecutively ("chaining") or in groups of consecutively presented words ("chunking"). The time course of acquisition and particular choice of positional grouping were variable among participants. Our results show that acquiring positional strategies plays a crucial role in improvement of recall performance. © 2016 Romani et al.; Published by Cold Spring Harbor Laboratory Press.

  4. North American Extreme Temperature Events and Related Large Scale Meteorological Patterns: A Review of Statistical Methods, Dynamics, Modeling, and Trends

    NASA Technical Reports Server (NTRS)

    Grotjahn, Richard; Black, Robert; Leung, Ruby; Wehner, Michael F.; Barlow, Mathew; Bosilovich, Michael G.; Gershunov, Alexander; Gutowski, William J., Jr.; Gyakum, John R.; Katz, Richard W.; hide

    2015-01-01

    The objective of this paper is to review statistical methods, dynamics, modeling efforts, and trends related to temperature extremes, with a focus on extreme events of short duration that affect parts of North America. These events are associated with large-scale meteorological patterns (LSMPs). The statistics, dynamics, and modeling sections of this paper are written to be autonomous and so can be read separately. Methods to define extreme event statistics and to identify and connect LSMPs to extreme temperature events are presented. Recent advances in statistical techniques connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. Various LSMPs, ranging from synoptic to planetary-scale structures, are associated with extreme temperature events. Current knowledge about the synoptics and the dynamical mechanisms leading to the associated LSMPs is incomplete. Systematic studies of the physics of LSMP life cycles, comprehensive model assessment of LSMP-extreme temperature event linkages, and LSMP properties are needed. Generally, climate models capture the observed properties of heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Modeling studies have identified the impact of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs to more specifically understand the role of LSMPs in past and future extreme temperature changes. Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated. The paper concludes with unresolved issues and research questions.

  5. [Probiotics: innocuousness, prevention and risks].

    PubMed

    Brunser, Oscar

    2017-01-01

    Probiotics have been defined as live microorganisms which, when ingested in adequate numbers, confer health benefits to the host. They are currently consumed without any age restrictions, and adverse effects such as sepsis, a marker of the risk of invasion of the bloodstream, are extremely infrequent. However, some health professionals express doubts about probiotics being truly innocuous. This review discusses the incidence of sepsis secondary to probiotic use, mainly lactobacilli and bifidobacteria, evaluated through molecular biology or classic culture techniques, showing that the incidence of sepsis across large numbers of individuals over decennia is extremely low, of the order of 0.02% in some centers or as low as 1 case per million population in France. These data are important considering the use of different species and strains of these microorganisms. A few studies have reported other adverse effects, but many of these have problems with their design that cast doubt on the validity of their results. On the contrary, it has been shown that probiotic microorganisms exert positive stimulatory effects on innate and acquired immunity, with a decrease in the manifestations of atopy and eczema. These positive effects are further evidenced by the beneficial effects of many species of probiotics in preventing necrotizing enterocolitis in patients as functionally labile as premature babies.

  6. An experimental point of view on hydration/solvation in halophilic proteins

    PubMed Central

    Talon, Romain; Coquelle, Nicolas; Madern, Dominique; Girard, Eric

    2014-01-01

    Protein-solvent interactions govern the behaviors of proteins isolated from extreme halophiles. In this work, we compared the solvent envelopes of two orthologous tetrameric malate dehydrogenases (MalDHs) from halophilic and non-halophilic bacteria. The crystal structure of the MalDH from the non-halophilic bacterium Chloroflexus aurantiacus (Ca MalDH) solved, de novo, at 1.7 Å resolution exhibits numerous water molecules in its solvation shell. We observed that a large number of these water molecules are arranged in pentagonal polygons in the first hydration shell of Ca MalDH. Some of them are clustered in large networks, which cover non-polar amino acid surface. The crystal structure of MalDH from the extreme halophilic bacterium Salinibacter ruber (Sr) solved at 1.55 Å resolution shows that its surface is strongly enriched in acidic amino acids. The structural comparison of these two models is the first direct observation of the relative impact of acidic surface enrichment on the water structure organization between a halophilic protein and its non-adapted counterpart. The data show that surface acidic amino acids disrupt pentagonal water networks in the hydration shell. These crystallographic observations are discussed with respect to halophilic protein behaviors in solution. PMID:24600446

  7. An experimental point of view on hydration/solvation in halophilic proteins.

    PubMed

    Talon, Romain; Coquelle, Nicolas; Madern, Dominique; Girard, Eric

    2014-01-01

    Protein-solvent interactions govern the behaviors of proteins isolated from extreme halophiles. In this work, we compared the solvent envelopes of two orthologous tetrameric malate dehydrogenases (MalDHs) from halophilic and non-halophilic bacteria. The crystal structure of the MalDH from the non-halophilic bacterium Chloroflexus aurantiacus (Ca MalDH) solved, de novo, at 1.7 Å resolution exhibits numerous water molecules in its solvation shell. We observed that a large number of these water molecules are arranged in pentagonal polygons in the first hydration shell of Ca MalDH. Some of them are clustered in large networks, which cover non-polar amino acid surface. The crystal structure of MalDH from the extreme halophilic bacterium Salinibacter ruber (Sr) solved at 1.55 Å resolution shows that its surface is strongly enriched in acidic amino acids. The structural comparison of these two models is the first direct observation of the relative impact of acidic surface enrichment on the water structure organization between a halophilic protein and its non-adapted counterpart. The data show that surface acidic amino acids disrupt pentagonal water networks in the hydration shell. These crystallographic observations are discussed with respect to halophilic protein behaviors in solution.

  8. Relating Regional Arctic Sea Ice and climate extremes over Europe

    NASA Astrophysics Data System (ADS)

    Ionita-Scholz, Monica; Grosfeld, Klaus; Lohmann, Gerrit; Scholz, Patrick

    2016-04-01

    The potential increase of temperature extremes under climate change is a major threat to society, as temperature extremes have a deep impact on the environment, hydrology, agriculture, society and the economy. Hence, the analysis of the mechanisms underlying their occurrence, including their relationships with the large-scale atmospheric circulation and sea ice concentration, is of major importance. At the same time, the decline in Arctic sea ice cover during the last 30 years has been widely documented, and it is clear that this change is having profound impacts at regional as well as planetary scales. As such, this study aims to investigate the relation between autumn regional sea ice concentration variability and cold winters in Europe, as identified by the numbers of cold nights (TN10p), cold days (TX10p), ice days (ID) and consecutive frost days (CFD). We analyze the relationship between Arctic sea ice variation in autumn (September-October-November) averaged over eight different Arctic regions (Barents/Kara Seas, Beaufort Sea, Chukchi/Bering Seas, Central Arctic, Greenland Sea, Labrador Sea/Baffin Bay, Laptev/East Siberian Seas and Northern Hemisphere) and variations in atmospheric circulation and climate extreme indices in the following winter season over Europe using composite map analysis. The composite map analysis shows that the response of winter extreme temperatures over Europe is closely connected to changes in Arctic sea ice variability. However, this signal is not symmetrical between high and low sea ice years. Moreover, the response of temperature extremes over Europe to sea ice variability over the different Arctic regions differs substantially. The regions which have the strongest impact on extreme winter temperatures over Europe are: Barents/Kara Seas, Beaufort Sea, Central Arctic and the Northern Hemisphere.
For the years of high sea ice concentration in the Barents/Kara Seas there is a reduction in the number of cold nights, cold days, ice days and consecutive frost days over the western part of Europe. In the opposite case of low sea ice concentration over the Barents/Kara Seas, an increase of up to 8 days/winter in cold nights and cold days is observed over the whole of Europe, and an increase of up to 4 days/winter in the number of ID and CFD is observed over the same regions. The cold winters over Europe (low sea ice years) are associated with an anomalous anticyclone and the downstream development of a mid-latitude trough, which in turn favours the advection of cold air from the north, providing favourable conditions for severe winters over Europe. We suggest that these results can help to improve the seasonal predictions of winter extreme events over Europe. Due to the non-linear response to high vs. low sea ice years, the skill of the predictions might depend on the sign and amplitude of the anomalies.
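
The composite map analysis used above reduces, at its core, to averaging a field over the high-ice years, averaging it over the low-ice years, and differencing the two means. A minimal sketch with hypothetical data (the mapping `field_by_year` from a year to a flattened anomaly field is an assumption for illustration, not the authors' dataset):

```python
def composite_difference(field_by_year, high_years, low_years):
    """Composite difference: mean field over high-ice years minus
    mean field over low-ice years, element-wise over grid cells."""
    def mean_field(years):
        fields = [field_by_year[y] for y in years]
        n = len(fields)
        # element-wise mean across the selected years
        return [sum(vals) / n for vals in zip(*fields)]

    hi = mean_field(high_years)
    lo = mean_field(low_years)
    return [h - l for h, l in zip(hi, lo)]
```

The sign of the resulting anomaly map then indicates how the winter extreme indices respond to high versus low sea ice years.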

  9. Polygenic determinants in extremes of high-density lipoprotein cholesterol

    PubMed Central

    Dron, Jacqueline S.; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A.; Robinson, John F.; McIntyre, Adam D.; Ban, Matthew R.; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J.; Lettre, Guillaume; Tardif, Jean-Claude

    2017-01-01

    HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. PMID:28870971
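
A polygenic trait score of the kind used above is, generically, a weighted sum of effect-allele counts across a SNP panel; the study's specific panel and weights are not reproduced here, so the following is only an illustrative sketch:

```python
def polygenic_score(allele_counts, effect_sizes):
    """Generic polygenic trait score: for each SNP, multiply the
    effect-allele count (0, 1, or 2) by its per-SNP effect size
    and sum over the panel."""
    return sum(g * w for g, w in zip(allele_counts, effect_sizes))
```

Patients whose score falls in the extreme tails of a reference distribution would be the ones described above as having "common variant accumulation."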

  10. Polygenic determinants in extremes of high-density lipoprotein cholesterol.

    PubMed

    Dron, Jacqueline S; Wang, Jian; Low-Kam, Cécile; Khetarpal, Sumeet A; Robinson, John F; McIntyre, Adam D; Ban, Matthew R; Cao, Henian; Rhainds, David; Dubé, Marie-Pierre; Rader, Daniel J; Lettre, Guillaume; Tardif, Jean-Claude; Hegele, Robert A

    2017-11-01

    HDL cholesterol (HDL-C) remains a superior biochemical predictor of CVD risk, but its genetic basis is incompletely defined. In patients with extreme HDL-C concentrations, we concurrently evaluated the contributions of multiple large- and small-effect genetic variants. In a discovery cohort of 255 unrelated lipid clinic patients with extreme HDL-C levels, we used a targeted next-generation sequencing panel to evaluate rare variants in known HDL metabolism genes, simultaneously with common variants bundled into a polygenic trait score. Two additional cohorts were used for validation and included 1,746 individuals from the Montréal Heart Institute Biobank and 1,048 individuals from the University of Pennsylvania. Findings were consistent between cohorts: we found rare heterozygous large-effect variants in 18.7% and 10.9% of low- and high-HDL-C patients, respectively. We also found common variant accumulation, indicated by extreme polygenic trait scores, in an additional 12.8% and 19.3% of overall cases of low- and high-HDL-C extremes, respectively. Thus, the genetic basis of extreme HDL-C concentrations encountered clinically is frequently polygenic, with contributions from both rare large-effect and common small-effect variants. Multiple types of genetic variants should be considered as contributing factors in patients with extreme dyslipidemia. Copyright © 2017 by the American Society for Biochemistry and Molecular Biology, Inc.

  11. Large-scale femtoliter droplet array for digital counting of single biomolecules.

    PubMed

    Kim, Soo Hyeon; Iwai, Shino; Araki, Suguru; Sakakihara, Shouichi; Iino, Ryota; Noji, Hiroyuki

    2012-12-07

    We present a novel device employing one million femtoliter droplets immobilized on a substrate for the quantitative detection of extremely low concentrations of biomolecules in a sample. Surface-modified polystyrene beads carrying either zero or a single biomolecule-reporter enzyme complex are efficiently isolated into femtoliter droplets formed on hydrophilic-in-hydrophobic surfaces. Using a conventional micropipette, this is achieved by sequential injection first with an aqueous solution containing beads, and then with fluorinated oil. The concentration of target biomolecules is estimated from the ratio of the number of signal-emitting droplets to the total number of trapped beads (digital counting). The performance of our digital counting device was demonstrated by detecting a streptavidin-β-galactosidase conjugate with a limit of detection (LOD) of 10 zM. The sensitivity of our device was >20-fold higher than that noted in previous studies where a smaller number of reactors (fifty thousand reactors) were used. Such a low LOD was achieved because of the large number of droplets in an array, allowing simultaneous examination of a large number of beads. When combined with bead-based enzyme-linked immunosorbent assay (digital ELISA), the LOD for the detection of prostate specific antigen reached 2 aM. This value, again, was improved over that noted in a previous study, because of the decreased coefficient of variance of the background measurement determined by the Poisson noise. Our digital counting device using one million droplets has great potential as a highly sensitive, portable immunoassay device that could be used to diagnose diseases.
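
Digital counting of this kind is usually read out with a Poisson correction: if bead/droplet loading is Poissonian, the fraction f of signal-emitting ("on") droplets gives the mean occupancy λ = -ln(1 - f), which is proportional to the target concentration. A sketch under that standard assumption (not code from the paper):

```python
import math

def digital_count(n_positive, n_total):
    """Estimate the mean number of target molecules per droplet from
    the fraction of signal-emitting droplets, assuming Poisson loading:
    P(>=1 molecule) = 1 - exp(-lam)  =>  lam = -ln(1 - f_on)."""
    f_on = n_positive / n_total
    return -math.log(1.0 - f_on)

# e.g. 100,000 "on" droplets out of 1,000,000 trapped beads
lam = digital_count(100_000, 1_000_000)  # mean occupancy ~0.105
```

At very low occupancy λ is nearly equal to the raw on-fraction, which is why the simple ratio described in the abstract works in the ultra-dilute regime.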

  12. Mid-21st-century climate changes increase predicted fire occurrence and fire season length, Northern Rocky Mountains, United States

    USGS Publications Warehouse

    Riley, Karin L.; Loehman, Rachel A.

    2016-01-01

    Climate changes are expected to increase fire frequency, fire season length, and cumulative area burned in the western United States. We focus on the potential impact of mid-21st-century climate changes on annual burn probability, fire season length, and large fire characteristics including number and size for a study area in the Northern Rocky Mountains. Although large fires are rare they account for most of the area burned in western North America, burn under extreme weather conditions, and exhibit behaviors that preclude methods of direct control. Allocation of resources, development of management plans, and assessment of fire effects on ecosystems all require an understanding of when and where fires are likely to burn, particularly under altered climate regimes that may increase large fire occurrence. We used the large fire simulation model FSim to model ignition, growth, and containment of wildfires under two climate scenarios: contemporary (based on instrumental weather) and mid-century (based on an ensemble average of global climate models driven by the A1B SRES emissions scenario). Modeled changes in fire patterns include increased annual burn probability, particularly in areas of the study region with relatively short contemporary fire return intervals; increased individual fire size and annual area burned; and fewer years without large fires. High fire danger days, represented by threshold values of Energy Release Component (ERC), are projected to increase in number, especially in spring and fall, lengthening the climatic fire season. For fire managers, ERC is an indicator of fire intensity potential and fire economics, with higher ERC thresholds often associated with larger, more expensive fires. Longer periods of elevated ERC may significantly increase the cost and complexity of fire management activities, requiring new strategies to maintain desired ecological conditions and limit fire risk. 
Increased fire activity (within the historical range of frequency and severity, and depending on the extent to which ecosystems are adapted) may maintain or restore ecosystem functionality; however, in areas that are highly departed from historical fire regimes or where there is disequilibrium between climate and vegetation, ecosystems may be rapidly and persistently altered by wildfires, especially those that burn under extreme conditions.

  13. North American extreme temperature events and related large scale meteorological patterns: A review of statistical methods, dynamics, modeling, and trends

    DOE PAGES

    Grotjahn, Richard; Black, Robert; Leung, Ruby; ...

    2015-05-22

    This paper reviews research approaches and open questions regarding data, statistical analyses, dynamics, modeling efforts, and trends in relation to temperature extremes. Our specific focus is upon extreme events of short duration (roughly less than 5 days) that affect parts of North America. These events are associated with large scale meteorological patterns (LSMPs). Methods used to define extreme event statistics and to identify and connect LSMPs to extreme temperatures are presented. Recent advances in statistical techniques can connect LSMPs to extreme temperatures through appropriately defined covariates that supplement more straightforward analyses. A wide array of LSMPs, ranging from synoptic to planetary scale phenomena, has been implicated as contributors to extreme temperature events. Current knowledge about the physical nature of these contributions and the dynamical mechanisms leading to the implicated LSMPs is incomplete. There is a pressing need for (a) systematic study of the physics of LSMP life cycles and (b) comprehensive model assessment of LSMP-extreme temperature event linkages and LSMP behavior. Generally, climate models capture the observed heat waves and cold air outbreaks with some fidelity. However, they overestimate warm wave frequency, underestimate cold air outbreak frequency, and underestimate the collective influence of low-frequency modes on temperature extremes. Climate models have been used to investigate past changes and project future trends in extreme temperatures. Overall, modeling studies have identified important mechanisms such as the effects of large-scale circulation anomalies and land-atmosphere interactions on changes in extreme temperatures. However, few studies have examined changes in LSMPs more specifically to understand the role of LSMPs in past and future extreme temperature changes.
Even though LSMPs are resolvable by global and regional climate models, they are not necessarily well simulated, so more research is needed to understand the limitations of climate models and improve model skill in simulating extreme temperatures and their associated LSMPs. The paper concludes with unresolved issues and research questions.

  14. An Investigation of Three Extremity Armor Systems: Determination of Physiological, Biomechanical, and Physical Performance Effects and Quantification of Body Area Coverage

    DTIC Science & Technology

    2012-03-19

    THREE EXTREMITY ARMOR SYSTEMS: DETERMINATION OF PHYSIOLOGICAL, BIOMECHANICAL, AND PHYSICAL PERFORMANCE EFFECTS AND QUANTIFICATION OF BODY AREA COVERAGE. Contract number MIPR #M9545006MPR6CC7. Performing organization report number NATICK/TR-12/014.

  15. Recent progress in 3-D imaging of sea freight containers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fuchs, Theobald, E-mail: theobold.fuchs@iis.fraunhofer.de; Schön, Tobias; Sukowski, Frank

    The inspection of very large objects like sea freight containers with X-ray Computed Tomography (CT) is an emerging technology. A complete 3-D CT scan of a sea freight container takes several hours. Of course, this is too slow to apply to a large number of containers. However, the benefits of 3-D CT for sealed freight are obvious: detection of potential threats or illicit cargo without the legal complications, high time consumption and risks for security personnel of a manual inspection. Recently, distinct progress was made in the field of reconstruction from projections with only a relatively low number of angular positions. Instead of today's 500 to 1000 rotational steps, as needed for conventional CT reconstruction techniques, this new class of algorithms has the potential to reduce the number of projection angles by approximately a factor of 10. The main drawback of these advanced iterative methods is their high computational cost. But as computational power is getting steadily cheaper, there will be practical applications of these complex algorithms in the foreseeable future. In this paper, we discuss the properties of iterative image reconstruction algorithms and show results of their application to CT of extremely large objects, scanning a sea freight container. A specific test specimen is used to quantitatively evaluate the image quality in terms of spatial and contrast resolution, depending on the number of projections.
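
Iterative reconstruction methods of the kind discussed here repeatedly correct an image estimate against each measured projection. A classic building block is the Kaczmarz (ART) sweep, sketched below on a generic dense linear system Ax = b (illustrative only; real CT systems are vastly larger and sparse, and the paper's specific algorithm is not given):

```python
def kaczmarz(A, b, sweeps=200, x=None):
    """Kaczmarz/ART iteration: cyclically project the current estimate x
    onto the hyperplane defined by each measurement row of A."""
    n = len(A[0])
    x = [0.0] * n if x is None else list(x)
    for _ in range(sweeps):
        for row, bi in zip(A, b):
            dot = sum(r * xi for r, xi in zip(row, x))
            nrm = sum(r * r for r in row)
            lam = (bi - dot) / nrm          # signed distance to the hyperplane
            x = [xi + lam * r for xi, r in zip(x, row)]
    return x
```

With few projection angles, such row-action methods can still converge to a useful image, which is the behavior the abstract describes, at the price of many sweeps over the data.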

  16. Epidemiology of extremity fractures in the Netherlands.

    PubMed

    Beerekamp, M S H; de Muinck Keizer, R J O; Schep, N W L; Ubbink, D T; Panneman, M J M; Goslings, J C

    2017-07-01

    Insight into epidemiologic data on extremity fractures is relevant to identify people at risk. By analyzing age- and gender-specific fracture incidence and treatment patterns we may adjust future policy, take preventive measures and optimize health care management. Current epidemiologic data on extremity fractures and their treatment are scarce, outdated or cover only a small spectrum of fractures. The aim of this study was to assess trends in incidence and treatment of extremity fractures between 2004 and 2012 in relation to gender and age. We used a combination of national registries of patients aged ≥16 years with extremity fractures. Fractures were coded by the International Classification of Diseases (ICD) 10 and allocated to an anatomic region. ICD-10 codes were used for combining the data of the registries. Absolute numbers, incidences, numbers of patients treated in university hospitals and surgically treated patients were reported. A binary logistic regression was used to calculate trends during the study period. From 2004 to 2012 the Dutch population aged ≥16 years grew from 13,047,018 to 13,639,412 inhabitants, particularly in the higher age groups of 46 years and older. The absolute number of extremity fractures increased significantly from 129,188 to 176,129 (OR 1.308 [1.299-1.318]), except for forearm and lower leg fractures. Incidences increased significantly (3-4%) for wrist, hand/finger, hip/upper leg, ankle and foot/toe fractures. In contrast to the older age categories (66 years and older), in the younger age categories (16 to 35 years) extremity fractures were more frequent in men than in women. Treatments gradually moved towards non-university hospitals for all except forearm fractures. Both relative and absolute numbers increased for surgical treatment of clavicle/shoulder, forearm, wrist and hand/finger fractures. Conversely, lower extremity fractures showed an increase in non-surgical treatment, except for lower leg fractures.
During the study period, we observed an increasing incidence of extremity fractures and a shift towards surgical treatment. Patient numbers in university hospitals declined. If these trends continue, policy makers would be well advised to consider the changing demands in extremity fracture treatment and pro-actively increase capacity and resources. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Characterizing the Spatial Contiguity of Extreme Precipitation over the US in the Recent Past

    NASA Astrophysics Data System (ADS)

    Touma, D. E.; Swain, D. L.; Diffenbaugh, N. S.

    2016-12-01

    The spatial characteristics of extreme precipitation over an area can define the hydrologic response in a basin, subsequently affecting the flood risk in the region. Here, we examine the spatial extent of extreme precipitation in the US by defining its "footprint": a contiguous area of rainfall exceeding a certain threshold (e.g., 90th percentile) on a given day. We first characterize the climatology of extreme rainfall footprint sizes across the US from 1980-2015 using Daymet, a high-resolution observational gridded rainfall dataset. We find that there are distinct regional and seasonal differences in average footprint sizes of extreme daily rainfall. In the winter, the Midwest shows footprints exceeding 500,000 sq. km while the Front Range exhibits footprints of 10,000 sq. km. In contrast, the summer average footprint size is generally smaller and more uniform across the US, ranging from 10,000 sq. km in the Southwest to 100,000 sq. km in Montana and North Dakota. Moreover, we find significant increasing trends in average footprint size over 1980-2015, specifically in the Southwest in the winter and the Northeast in the spring. While gridded daily rainfall datasets allow for a practical framework for calculating footprint size, this calculation heavily depends on the interpolation methods used in creating the dataset. Therefore, we assess footprint size using the GHCN-Daily station network and use geostatistical methods to define footprints of extreme rainfall directly from station data. Compared to the findings from Daymet, preliminary results using this method show fewer small daily footprint sizes over the US, while large footprints are of similar number and magnitude to those in Daymet.
Overall, defining the spatial characteristics of extreme rainfall, as well as observed and expected changes in these characteristics, allows us to better understand the hydrologic response to extreme rainfall and how to better characterize flood risks.
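
A footprint, as defined above, is a connected region of grid cells exceeding a threshold, so its size can be computed with standard connected-component labelling. A minimal 4-connected flood-fill sketch on a toy grid (not the authors' Daymet pipeline; cell areas would also need weighting on a real geographic grid):

```python
def footprint_sizes(grid, threshold):
    """Return the sizes (cell counts) of all 4-connected regions of
    cells whose value exceeds `threshold`, via iterative flood fill."""
    rows, cols = len(grid), len(grid[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for i in range(rows):
        for j in range(cols):
            if grid[i][j] > threshold and not seen[i][j]:
                stack, size = [(i, j)], 0
                seen[i][j] = True
                while stack:
                    r, c = stack.pop()
                    size += 1
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < rows and 0 <= cc < cols
                                and grid[rr][cc] > threshold
                                and not seen[rr][cc]):
                            seen[rr][cc] = True
                            stack.append((rr, cc))
                sizes.append(size)
    return sizes
```

On a daily rainfall grid the threshold would be the local 90th percentile, and the footprint size distribution per day gives the climatology discussed above.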

  18. Drivers and seasonal predictability of extreme wind speeds in the ECMWF System 4 and a statistical model

    NASA Astrophysics Data System (ADS)

    Walz, M. A.; Donat, M.; Leckebusch, G. C.

    2017-12-01

    As extreme wind speeds are responsible for large socio-economic losses in Europe, a skillful prediction would be of great benefit for disaster prevention as well as for the actuarial community. Here we evaluate patterns of large-scale atmospheric variability and the seasonal predictability of extreme wind speeds (e.g. >95th percentile) in the European domain in the dynamical seasonal forecast system ECMWF System 4, and compare to the predictability based on a statistical prediction model. The dominant patterns of atmospheric variability show distinct differences between reanalysis and ECMWF System 4, with most patterns in System 4 extended downstream in comparison to ERA-Interim. The dissimilar manifestations of the patterns within the two models lead to substantially different drivers associated with the occurrence of extreme winds in the respective model. While the ECMWF System 4 is shown to provide some predictive power over Scandinavia and the eastern Atlantic, only very few grid cells in the European domain have significant correlations for extreme wind speeds in System 4 compared to ERA-Interim. In contrast, a statistical model predicts extreme wind speeds during boreal winter in better agreement with the observations. Our results suggest that System 4 does not seem to capture the potential predictability of extreme winds that exists in the real world, and therefore fails to provide reliable seasonal predictions for lead months 2-4. This is likely related to the unrealistic representation of large-scale patterns of atmospheric variability. Hence our study points to potential improvements of dynamical prediction skill by improving the simulation of large-scale atmospheric dynamics.

  19. Nonparametric Regression Subject to a Given Number of Local Extreme Value

    DTIC Science & Technology

    2001-07-01

    compilation report: ADP013708 thru ADP013761, UNCLASSIFIED. Nonparametric regression subject to a given number of local extreme values, Ali Majidi and Laurie Davies. ...locations of the local extremes for the smoothing algorithm. ...We make the smoothing problem precise... is the solution of QP3.

  20. Extreme phenophase delays and their relationship with natural forcings in Beijing over the past 260 years.

    PubMed

    Liu, Yang; Zhang, Mingqing; Fang, Xiuqi

    2018-03-20

    By merging reconstructed phenological series from published articles with observations of the China Phenology Observation Network (CPON), the first blooming date of Amygdalus davidiana (FBA) in Beijing between 1741 and 2000 is reconstructed. The Butterworth method is used to remove the multi-year variations, generating a phenological series of annual variations in the first blooming date of A. davidiana. The extreme delay years in the phenological series are identified using the percentage threshold method. The characteristics of the extreme delays and the correspondence of these events with natural forcings are analysed. The main results are as follows. In the annual phenological series, the extreme delays mainly appeared in single years; only A.D. 1800-1801, 1816-1817 and 1983-1984 were events of two consecutive extreme years. Approximately 85% of the extreme delays occurred during 1-2 years after large volcanic eruptions (VEI ≥ 4) in the eastern or western rim of the Pacific Ocean, and the same proportion of the extreme delays followed El Niño events. About 73% of the extreme delay years fall in the valleys of sunspot cycles or the Dalton minimum period, in the year itself or the previous year. According to the certainty factor (CF), large eruptions have the greatest influence on the extreme delays, followed by sunspot activity and then ENSO. An extreme phenological delay year is most likely to occur after a large eruption, particularly when that eruption occurs during an El Niño year and the preceding several years fall in the descending portion or valley of the sunspot cycle.
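
The percentage threshold method referred to above amounts to flagging years whose delay anomaly exceeds a chosen percentile of the series. A sketch using a simple nearest-rank percentile (the paper's exact threshold and ranking convention are not specified here, so this is an assumption for illustration):

```python
def extreme_delay_years(years, anomalies, pct=90):
    """Flag years whose delay anomaly exceeds the pct-th percentile
    of the whole series, using a nearest-rank percentile."""
    s = sorted(anomalies)
    # nearest-rank index, clamped to the valid range
    k = max(0, min(len(s) - 1, round(pct / 100 * len(s)) - 1))
    cut = s[k]
    return [y for y, a in zip(years, anomalies) if a > cut]
```

Applied to the detrended FBA anomaly series, this rule would pick out the extreme delay years whose correspondence with eruptions, ENSO and sunspot phase is then examined.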

  1. Extreme phenophase delays and their relationship with natural forcings in Beijing over the past 260 years

    NASA Astrophysics Data System (ADS)

    Liu, Yang; Zhang, Mingqing; Fang, Xiuqi

    2018-03-01

    By merging reconstructed phenological series from published articles with observations of the China Phenology Observation Network (CPON), the first blooming date of Amygdalus davidiana (FBA) in Beijing between 1741 and 2000 is reconstructed. The Butterworth method is used to remove the multi-year variations, generating a phenological series of annual variations in the first blooming date of A. davidiana. The extreme delay years in the phenological series are identified using the percentage threshold method. The characteristics of the extreme delays and the correspondence of these events with natural forcings are analysed. The main results are as follows. In the annual phenological series, the extreme delays mainly appeared in single years; only A.D. 1800-1801, 1816-1817 and 1983-1984 were events of two consecutive extreme years. Approximately 85% of the extreme delays occurred during 1-2 years after large volcanic eruptions (VEI ≥ 4) in the eastern or western rim of the Pacific Ocean, and the same proportion of the extreme delays followed El Niño events. About 73% of the extreme delay years fall in the valleys of sunspot cycles or the Dalton minimum period, in the year itself or the previous year. According to the certainty factor (CF), large eruptions have the greatest influence on the extreme delays, followed by sunspot activity and then ENSO. An extreme phenological delay year is most likely to occur after a large eruption, particularly when that eruption occurs during an El Niño year and the preceding several years fall in the descending portion or valley of the sunspot cycle.

  2. Intentional Voice Command Detection for Trigger-Free Speech Interface

    NASA Astrophysics Data System (ADS)

    Obuchi, Yasunari; Sumiyoshi, Takashi

    In this paper we introduce a new framework of audio processing, which is essential to achieve a trigger-free speech interface for home appliances. If the speech interface works continually in real environments, it must extract occasional voice commands and reject everything else. It is extremely important to reduce the number of false alarms because the number of irrelevant inputs is much larger than the number of voice commands even for heavy users of appliances. The framework, called Intentional Voice Command Detection, is based on voice activity detection, but enhanced by various speech/audio processing techniques such as emotion recognition. The effectiveness of the proposed framework is evaluated using a newly-collected large-scale corpus. The advantages of combining various features were tested and confirmed, and the simple LDA-based classifier demonstrated acceptable performance. The effectiveness of various methods of user adaptation is also discussed.

  3. Filtering of non-linear instabilities. [from finite difference solution of fluid dynamics equations]

    NASA Technical Reports Server (NTRS)

    Khosla, P. K.; Rubin, S. G.

    1979-01-01

    For Courant numbers larger than one and cell Reynolds numbers larger than two, oscillations and in some cases instabilities are typically found with implicit numerical solutions of the fluid dynamics equations. This behavior has sometimes been associated with the loss of diagonal dominance of the coefficient matrix. It is shown here that these problems can in fact be related to the choice of the spatial differences, with the resulting instability related to aliasing or nonlinear interaction. Appropriate 'filtering' can reduce the intensity of these oscillations and in some cases possibly eliminate the instability. These filtering procedures are equivalent to a weighted average of conservation and non-conservation differencing. The entire spectrum of filtered equations retains a three-point character as well as second-order spatial accuracy. Burgers equation has been considered as a model. Several filters are examined in detail, and smooth solutions have been obtained for extremely large cell Reynolds numbers.
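
    The "weighted average of conservation and non-conservation differencing" can be illustrated with a minimal sketch for the convection term of Burgers' equation; the function name, the weight parameter, and the choice of plain central differences are illustrative assumptions, not the paper's scheme:

```python
def filtered_convection(u, dx, w=0.5):
    """Central-difference approximation of the Burgers convection term,
    blending the conservation form d(u^2/2)/dx (weight w) with the
    non-conservation form u*du/dx (weight 1-w). Interior points only;
    boundary entries are left at zero."""
    n = len(u)
    out = [0.0] * n
    for i in range(1, n - 1):
        cons = (u[i + 1] ** 2 - u[i - 1] ** 2) / (4 * dx)     # d(u^2/2)/dx
        noncons = u[i] * (u[i + 1] - u[i - 1]) / (2 * dx)     # u * du/dx
        out[i] = w * cons + (1 - w) * noncons
    return out
```

    For a smooth field the two forms agree; they differ, and the weight matters, precisely where the solution varies sharply.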

  4. A discrete Markov metapopulation model for persistence and extinction of species.

    PubMed

    Thompson, Colin J; Shtilerman, Elad; Stone, Lewi

    2016-09-07

    A simple discrete generation Markov metapopulation model is formulated for studying the persistence and extinction dynamics of a species in a given region which is divided into a large number of sites or patches. Assuming a linear site occupancy probability from one generation to the next, we obtain exact expressions for the time evolution of the expected number of occupied sites and the mean time to extinction (MTE). Under quite general conditions we show that the MTE, to leading order, is proportional to the logarithm of the initial number of occupied sites, in precise agreement with similar expressions for continuous time-dependent stochastic models. Our key contribution is a novel application of generating function techniques and simple asymptotic methods to obtain a second order asymptotic expression for the MTE which is extremely accurate over the entire range of model parameter values.
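
    A minimal Monte Carlo sketch of such a model (illustrative, not the authors' exact formulation; the specific linear occupancy rule and parameter values are assumptions) shows the logarithmic growth of the MTE with the initial number of occupied sites:

```python
import random

def mean_time_to_extinction(n_sites, n0, a=0.8, trials=100, seed=1):
    """Monte Carlo estimate of the mean time to extinction (MTE) for a
    toy discrete-generation metapopulation: each of n_sites patches is
    occupied in the next generation independently with probability
    a * occ / n_sites (linear in current occupancy; a < 1 guarantees
    eventual extinction)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        occ, t = n0, 0
        while occ > 0:
            p = a * occ / n_sites
            occ = sum(1 for _ in range(n_sites) if rng.random() < p)
            t += 1
        total += t
    return total / trials

# MTE grows roughly like log(n0), as in the exact result described above:
for n0 in (5, 50, 400):
    print(n0, round(mean_time_to_extinction(500, n0), 1))
```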

  5. BPS magnetic monopole bags

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Ki-Myeong; Weinberg, Erick J.; Physics Department, Columbia University, New York, New York 10027

    2009-01-15

    We explore the characteristics of spherical bags made of large numbers of BPS magnetic monopoles. There are two extreme limits. In the Abelian bag, N zeros of the Higgs field are arranged in a quasiregular lattice on a sphere of radius R_cr ≈ N/v, where v is the Higgs vacuum expectation value. The massive gauge fields of the theory are largely confined to a thin shell at this radius that separates an interior with almost vanishing magnetic and Higgs fields from an exterior region with long-range Coulomb magnetic and Higgs fields. In the other limiting case, which we term a non-Abelian bag, the N zeros of the Higgs field are all at the origin, but there is again a thin shell of radius R_cr. In this case the region enclosed by this shell can be viewed as a large monopole core, with small Higgs field but nontrivial massive and massless gauge fields.

  6. Managing protected health information in distributed research network environments: automated review to facilitate collaboration.

    PubMed

    Bredfeldt, Christine E; Butani, Amy; Padmanabhan, Sandhyasree; Hitz, Paul; Pardee, Roy

    2013-03-22

    Multi-site health sciences research is becoming more common, as it enables investigation of rare outcomes and diseases and new healthcare innovations. Multi-site research usually involves the transfer of large amounts of research data between collaborators, which increases the potential for accidental disclosures of protected health information (PHI). Standard protocols for preventing release of PHI are extremely vulnerable to human error, particularly when the shared data sets are large. To address this problem, we developed an automated program (SAS macro) to identify possible PHI in research data before it is transferred between research sites. The macro reviews all data in a designated directory to identify suspicious variable names and data patterns. The macro looks for variables that may contain personal identifiers such as medical record numbers and social security numbers. In addition, the macro identifies dates and numbers that may identify people who belong to small groups, who may be identifiable even in the absence of traditional identifiers. Evaluation of the macro on 100 sample research data sets indicated a recall of 0.98 and precision of 0.81. When implemented consistently, the macro has the potential to streamline the PHI review process and significantly reduce accidental PHI disclosures.
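
    The actual tool is a SAS macro; the same review logic can be sketched in Python. The variable-name list and value patterns below are illustrative assumptions, not the macro's actual rules:

```python
import re

# Flag suspicious variable names and SSN-like values for manual review.
SUSPICIOUS_NAMES = re.compile(r"(ssn|mrn|medical_record|birth|dob|name|address)", re.I)
SSN_PATTERN = re.compile(r"^\d{3}-?\d{2}-?\d{4}$")

def review_dataset(columns):
    """columns: dict mapping variable name -> list of string values.
    Returns a list of (variable, reason) findings; every finding still
    requires a human decision before data are transferred."""
    findings = []
    for var, values in columns.items():
        if SUSPICIOUS_NAMES.search(var):
            findings.append((var, "suspicious variable name"))
        if any(SSN_PATTERN.match(str(v)) for v in values):
            findings.append((var, "SSN-like values"))
    return findings

data = {"patient_mrn": ["102938", "564738"],
        "lab_value": ["1.2", "0.8"],
        "contact": ["123-45-6789"]}
print(review_dataset(data))
```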

  7. ARTS: automated randomization of multiple traits for study design.

    PubMed

    Maienschein-Cline, Mark; Lei, Zhengdeng; Gardeux, Vincent; Abbasi, Taimur; Machado, Roberto F; Gordeuk, Victor; Desai, Ankit A; Saraf, Santosh; Bahroos, Neil; Lussier, Yves

    2014-06-01

    Collecting data from large studies on high-throughput platforms, such as microarray or next-generation sequencing, typically requires processing samples in batches. There are often systematic but unpredictable biases from batch-to-batch, so proper randomization of biologically relevant traits across batches is crucial for distinguishing true biological differences from experimental artifacts. When a large number of traits are biologically relevant, as is common for clinical studies of patients with varying sex, age, genotype and medical background, proper randomization can be extremely difficult to prepare by hand, especially because traits may affect biological inferences, such as differential expression, in a combinatorial manner. Here we present ARTS (automated randomization of multiple traits for study design), which aids researchers in study design by automatically optimizing batch assignment for any number of samples, any number of traits and any batch size. ARTS is implemented in Perl and is available at github.com/mmaiensc/ARTS. ARTS is also available in the Galaxy Tool Shed, and can be used at the Galaxy installation hosted by the UIC Center for Research Informatics (CRI) at galaxy.cri.uic.edu.
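
    The batch-optimization idea can be sketched as a simple search over random assignments, scored by trait balance. This is an illustrative reimplementation of the concept, not the published ARTS algorithm:

```python
import random

def imbalance(batches, samples):
    """Total across traits and levels of (max - min) batch counts:
    0 means every batch has identical trait composition."""
    score = 0
    for trait in samples[0]:
        for level in {s[trait] for s in samples}:
            counts = [sum(1 for i in batch if samples[i][trait] == level)
                      for batch in batches]
            score += max(counts) - min(counts)
    return score

def randomize(samples, batch_size, n_iter=1000, seed=0):
    """Keep the best of n_iter random batch assignments (lists of sample
    indices), scored by trait imbalance across batches."""
    rng = random.Random(seed)
    idx = list(range(len(samples)))
    best, best_score = None, float("inf")
    for _ in range(n_iter):
        rng.shuffle(idx)
        batches = [idx[i:i + batch_size] for i in range(0, len(idx), batch_size)]
        score = imbalance(batches, samples)
        if score < best_score:
            best, best_score = [list(b) for b in batches], score
    return best
```

    For many samples and traits an exhaustive search is infeasible, which is why randomized search with a balance score is a natural design here.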

  8. The transcriptome of Bathymodiolus azoricus gill reveals expression of genes from endosymbionts and free-living deep-sea bacteria.

    PubMed

    Egas, Conceição; Pinheiro, Miguel; Gomes, Paula; Barroso, Cristina; Bettencourt, Raul

    2012-08-01

    Deep-sea environments are largely unexplored habitats where a surprising number of species may be found in large communities, thriving despite the darkness, extreme cold, and high pressure. Their unique geochemical features result in reducing environments rich in methane and sulfides, sustaining complex chemosynthetic ecosystems that represent one of the most surprising findings in the oceans in the last 40 years. The deep-sea Lucky Strike hydrothermal vent field, located on the Mid-Atlantic Ridge, is home to large vent mussel communities in which Bathymodiolus azoricus represents the dominant faunal biomass, owing its survival to symbiotic associations with methylotrophic or methanotrophic and thiotrophic bacteria. The recent transcriptome sequencing and analysis of gill tissues from B. azoricus revealed a number of genes of bacterial origin, analyzed here to provide a functional insight into the gill microbial community. The transcripts supported a metabolically active microbiome and a variety of mechanisms and pathways, including sulfur and methane metabolism. Taxonomic affiliation of transcripts and 16S rRNA community profiling revealed a microbial community dominated by thiotrophic and methanotrophic endosymbionts of B. azoricus and the presence of a Sulfurovum-like epsilonbacterium.

  9. An intelligent surveillance platform for large metropolitan areas with dense sensor deployment.

    PubMed

    Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A; Smilansky, Zeev

    2013-06-07

    This paper presents an intelligent surveillance platform based on the usage of large numbers of inexpensive sensors designed and developed inside the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs at a minimum, the surveillance platform is based on the usage of inexpensive visual sensors which apply efficient motion detection and tracking algorithms to transform the video signal into a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine which comprises three components applying three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches which are able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and alarm and video stream distribution towards the emergency teams. The resulting surveillance system is well suited for deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage.

  10. An approach to radiation safety department benchmarking in academic and medical facilities.

    PubMed

    Harvey, Richard P

    2015-02-01

    Based on anecdotal evidence and networking with colleagues at other facilities, it has become evident that some radiation safety departments are not adequately staffed and radiation safety professionals need to increase their staffing levels. Discussions with management regarding radiation safety department staffing often lead to similar conclusions. Management acknowledges the Radiation Safety Officer (RSO) or Director of Radiation Safety's concern but asks the RSO to provide benchmarking and justification for additional full-time equivalents (FTEs). The RSO must determine a method to benchmark and justify additional staffing needs while struggling to maintain a safe and compliant radiation safety program. Benchmarking and justification are extremely important tools that are commonly used to demonstrate the need for increased staffing in other disciplines, and they can be used by radiation safety professionals as well. Parameters that most RSOs would expect to be positive predictors of radiation safety staff size generally are, and these can be emphasized in benchmarking and justification report summaries. Facilities with large radiation safety departments tend to have large numbers of authorized users, be broad-scope programs, be subject to increased controls regulations, have large clinical operations, have significant numbers of academic radiation-producing machines, and have laser safety responsibilities.

  11. Methodological Considerations in Estimation of Phenotype Heritability Using Genome-Wide SNP Data, Illustrated by an Analysis of the Heritability of Height in a Large Sample of African Ancestry Adults

    PubMed Central

    Chen, Fang; He, Jing; Zhang, Jianqi; Chen, Gary K.; Thomas, Venetta; Ambrosone, Christine B.; Bandera, Elisa V.; Berndt, Sonja I.; Bernstein, Leslie; Blot, William J.; Cai, Qiuyin; Carpten, John; Casey, Graham; Chanock, Stephen J.; Cheng, Iona; Chu, Lisa; Deming, Sandra L.; Driver, W. Ryan; Goodman, Phyllis; Hayes, Richard B.; Hennis, Anselm J. M.; Hsing, Ann W.; Hu, Jennifer J.; Ingles, Sue A.; John, Esther M.; Kittles, Rick A.; Kolb, Suzanne; Leske, M. Cristina; Monroe, Kristine R.; Murphy, Adam; Nemesure, Barbara; Neslund-Dudas, Christine; Nyante, Sarah; Ostrander, Elaine A; Press, Michael F.; Rodriguez-Gil, Jorge L.; Rybicki, Ben A.; Schumacher, Fredrick; Stanford, Janet L.; Signorello, Lisa B.; Strom, Sara S.; Stevens, Victoria; Van Den Berg, David; Wang, Zhaoming; Witte, John S.; Wu, Suh-Yuh; Yamamura, Yuko; Zheng, Wei; Ziegler, Regina G.; Stram, Alexander H.; Kolonel, Laurence N.; Marchand, Loïc Le; Henderson, Brian E.; Haiman, Christopher A.; Stram, Daniel O.

    2015-01-01

    Height has an extremely polygenic pattern of inheritance. Genome-wide association studies (GWAS) have revealed hundreds of common variants that are associated with human height at genome-wide levels of significance. However, only a small fraction of phenotypic variation can be explained by the aggregate of these common variants. In a large study of African-American men and women (n = 14,419), we genotyped and analyzed 966,578 autosomal SNPs across the entire genome using a linear mixed model variance components approach implemented in the program GCTA (Yang et al Nat Genet 2010), and estimated an additive heritability of 44.7% (se: 3.7%) for this phenotype in a sample of evidently unrelated individuals. While this estimated value is similar to that given by Yang et al in their analyses, we remain concerned about two related issues: (1) whether in the complete absence of hidden relatedness, variance components methods have adequate power to estimate heritability when a very large number of SNPs are used in the analysis; and (2) whether estimation of heritability may be biased, in real studies, by low levels of residual hidden relatedness. We addressed the first question in a semi-analytic fashion by directly simulating the distribution of the score statistic for a test of zero heritability with and without low levels of relatedness. The second question was addressed by a very careful comparison of the behavior of estimated heritability for both observed (self-reported) height and simulated phenotypes compared to imputation R2 as a function of the number of SNPs used in the analysis. 
These simulations help to address the important question about whether today's GWAS SNPs will remain useful for imputing causal variants that are discovered using very large sample sizes in future studies of height, or whether the causal variants themselves will need to be genotyped de novo in order to build a prediction model that ultimately captures a large fraction of the variability of height, and by implication other complex phenotypes. Our overall conclusions are that when study sizes are quite large (5,000 or so) the additive heritability estimate for height is not apparently biased upwards using the linear mixed model; however there is evidence in our simulation that a very large number of causal variants (many thousands) each with very small effect on phenotypic variance will need to be discovered to fill the gap between the heritability explained by known versus unknown causal variants. We conclude that today's GWAS data will remain useful in the future for causal variant prediction, but that finding the causal variants that need to be predicted may be extremely laborious. PMID:26125186
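
    The first step of a GCTA-style analysis, building the genetic relationship matrix (GRM) from standardized SNP genotypes, can be sketched as follows. This is a pure-Python illustration assuming polymorphic SNPs (a monomorphic column would divide by zero); the actual heritability estimate then comes from restricted maximum likelihood using this matrix, which is not shown:

```python
import math

def grm(genotypes):
    """Genetic relationship matrix from a genotype matrix
    (individuals x SNPs, entries 0/1/2 minor-allele counts):
    standardize each SNP column by its allele frequency, then
    A = Z Z^T / m, with m the number of SNPs."""
    n = len(genotypes)
    m = len(genotypes[0])
    z = [[0.0] * m for _ in range(n)]
    for j in range(m):
        col = [genotypes[i][j] for i in range(n)]
        p = sum(col) / (2 * n)              # sample allele frequency
        sd = math.sqrt(2 * p * (1 - p))     # SD under Hardy-Weinberg
        for i in range(n):
            z[i][j] = (col[i] - 2 * p) / sd
    return [[sum(z[i][k] * z[j][k] for k in range(m)) / m
             for j in range(n)] for i in range(n)]
```

    Off-diagonal entries near zero are what "evidently unrelated individuals" means operationally; residual hidden relatedness shows up as small but systematic off-diagonal structure.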

  13. Temporal Clustering of Regional-Scale Extreme Precipitation Events in Southern Switzerland

    NASA Astrophysics Data System (ADS)

    Barton, Yannick; Giannakaki, Paraskevi; Von Waldow, Harald; Chevalier, Clément; Pfahl, Stephan; Martius, Olivia

    2017-04-01

    Temporal clustering of extreme precipitation events on subseasonal time scales is a form of compound extremes and is of crucial importance for the formation of large-scale flood events. Here, the temporal clustering of regional-scale extreme precipitation events in southern Switzerland is studied. These precipitation events are relevant for the flooding of lakes in southern Switzerland and northern Italy. This research determines whether temporal clustering is present and then identifies the dynamics that are responsible for the clustering. An observation-based gridded precipitation dataset of Swiss daily rainfall sums and ECMWF reanalysis datasets are used. To analyze the clustering in the precipitation time series, a modified version of Ripley's K function is used. It measures the average number of extreme events within a given time period and is used to characterize temporal clustering on subseasonal time scales and to determine the statistical significance of the clustering. Significant clustering of regional-scale precipitation extremes is found on subseasonal time scales during the fall season. Four high-impact clustering episodes are then selected and the dynamics responsible for the clustering are examined. During the four clustering episodes, all heavy precipitation events were associated with an upper-level breaking Rossby wave over western Europe, and in most cases strong diabatic processes upstream over the Atlantic played a role in the amplification of these breaking waves. Atmospheric blocking downstream over eastern Europe supported this wave breaking during two of the clustering episodes. During one of the clustering periods, several extratropical transitions of tropical cyclones in the Atlantic contributed to the formation of high-amplitude ridges over the Atlantic basin and downstream wave breaking. During another event, blocking over Alaska assisted the phase locking of the Rossby waves downstream over the Atlantic.
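
    A much-simplified one-dimensional version of such a K-function statistic (no edge correction and no significance testing, and not the authors' exact modification) can be sketched as:

```python
def ripley_k_1d(event_days, window, period_length):
    """Average number of *other* extreme events within +/- `window` days
    of each event, divided by the expectation 2 * window * rate for a
    homogeneous Poisson process. Values well above 1 suggest temporal
    clustering; edge effects at the series ends are ignored here."""
    n = len(event_days)
    rate = n / period_length
    near = 0
    for i, ti in enumerate(event_days):
        near += sum(1 for j, tj in enumerate(event_days)
                    if j != i and abs(tj - ti) <= window)
    observed = near / n
    expected = 2 * window * rate
    return observed / expected

# Two tight clusters of events versus evenly spaced events:
print(ripley_k_1d([0, 1, 2, 500, 501, 502], 5, 1000))   # clustered: >> 1
print(ripley_k_1d([0, 200, 400, 600, 800], 5, 1000))    # regular: 0
```

    In practice significance would be assessed by comparing against Monte Carlo samples of the null process, as the study does with its modified version.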

  14. Dynamical analysis of extreme precipitation in the US northeast based on large-scale meteorological patterns

    NASA Astrophysics Data System (ADS)

    Agel, Laurie; Barlow, Mathew; Colby, Frank; Binder, Hanin; Catto, Jennifer L.; Hoell, Andrew; Cohen, Judah

    2018-05-01

    Previous work has identified six large-scale meteorological patterns (LSMPs) of dynamic tropopause height associated with extreme precipitation over the Northeast US, with extreme precipitation defined as the top 1% of daily station precipitation. Here, we examine the three-dimensional structure of the tropopause LSMPs in terms of circulation and factors relevant to precipitation, including moisture, stability, and synoptic mechanisms associated with lifting. Within each pattern, the link between the different factors and extreme precipitation is further investigated by comparing the relative strength of the factors between days with and without the occurrence of extreme precipitation. The six tropopause LSMPs include two ridge patterns, two eastern US troughs, and two troughs centered over the Ohio Valley, with a strong seasonality associated with each pattern. Extreme precipitation in the ridge patterns is associated with both convective mechanisms (instability combined with moisture transport from the Great Lakes and Western Atlantic) and synoptic forcing related to Great Lakes storm tracks and embedded shortwaves. Extreme precipitation associated with eastern US troughs involves intense southerly moisture transport and strong quasi-geostrophic forcing of vertical velocity. Ohio Valley troughs are associated with warm fronts and intense warm conveyor belts that deliver large amounts of moisture ahead of storms, but little direct quasi-geostrophic forcing. Factors that show the largest difference between days with and without extreme precipitation include integrated moisture transport, low-level moisture convergence, warm conveyor belts, and quasi-geostrophic forcing, with the relative importance varying between patterns.

  15. Higher-order organisation of extremely amplified, potentially functional and massively methylated 5S rDNA in European pikes (Esox sp.).

    PubMed

    Symonová, Radka; Ocalewicz, Konrad; Kirtiklis, Lech; Delmastro, Giovanni Battista; Pelikánová, Šárka; Garcia, Sonia; Kovařík, Aleš

    2017-05-18

    Pikes represent an important genus (Esox) harbouring a pre-duplication karyotype (2n = 2x = 50) of economically important salmonid pseudopolyploids. Here, we have characterized the 5S ribosomal RNA genes (rDNA) in Esox lucius and its closely related E. cisalpinus using cytogenetic, molecular and genomic approaches. Intragenomic homogeneity and copy number estimation was carried out using Illumina reads. The higher-order structure of rDNA arrays was investigated by the analysis of long PacBio reads. Position of loci on chromosomes was determined by FISH. DNA methylation was analysed by methylation-sensitive restriction enzymes. The 5S rDNA loci occupy exclusively (peri)centromeric regions on 30-38 acrocentric chromosomes in both E. lucius and E. cisalpinus. The large number of loci is accompanied by extreme amplification of genes (>20,000 copies), which is, to the best of our knowledge, one of the highest copy numbers of rRNA genes ever reported in animals. Conserved secondary structures of predicted 5S rRNAs indicate that most of the amplified genes are potentially functional. Only a few SNPs were found in genic regions, indicating their high homogeneity, while intergenic spacers were more heterogeneous and several families were identified. Analysis of 10-30 kb-long molecules sequenced by the PacBio technology (containing about 40% of total 5S rDNA) revealed that the vast majority (96%) of genes are organised in large, several-kilobase-long blocks. Dispersed genes or short tandems were less common (4%). The adjacent 5S blocks were directly linked, separated by intervening DNA, and even inverted. The 5S units differing in the intergenic spacers formed both homogeneous and heterogeneous (mixed) blocks, indicating a variable degree of homogenisation between the loci. The 5S rDNA of both E. lucius and E. cisalpinus was heavily methylated at CG dinucleotides. 
Extreme amplification of 5S rRNA genes in the Esox genome occurred in the absence of significant pseudogenisation suggesting its recent origin and/or intensive homogenisation processes. The dense methylation of units indicates that powerful epigenetic mechanisms have evolved in this group of fish to silence amplified genes. We discuss how the higher-order repeat structures impact on homogenisation of 5S rDNA in the genome.

  16. A Bit-Encoding Based New Data Structure for Time and Memory Efficient Handling of Spike Times in an Electrophysiological Setup.

    PubMed

    Ljungquist, Bengt; Petersson, Per; Johansson, Anders J; Schouenborg, Jens; Garwicz, Martin

    2018-04-01

    Recent neuroscientific and technical developments of brain machine interfaces have put increasing demands on neuroinformatic databases and data handling software, especially when managing data in real time from large numbers of neurons. Extrapolating from these developments, we set out here to construct a scalable software architecture that would enable near-future massive parallel recording, organization and analysis of neurophysiological data on a standard computer. To this end we combined, for the first time in the present context, bit-encoding of spike data with a specific communication format for real time transfer and storage of neuronal data, synchronized by a common time base across all unit sources. We demonstrate that our architecture can simultaneously handle data from more than one million neurons and provide, in real time (< 25 ms), feedback based on analysis of previously recorded data. In addition to managing recordings from very large numbers of neurons in real time, it has the capacity to handle the extensive periods of recording time necessary in certain scientific and clinical applications. Furthermore, the bit-encoding proposed has the additional advantage of allowing an extremely fast analysis of spatiotemporal spike patterns in a large number of neurons. Thus, we conclude that this architecture is well suited to support current and near-future Brain Machine Interface requirements.
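
    The bit-encoding idea can be sketched as follows; the function names and the binning scheme are illustrative assumptions, not the paper's format:

```python
def encode_spikes(spike_bins):
    """Pack a unit's spike train (iterable of time-bin indices) into one
    integer whose set bits mark bins containing at least one spike."""
    word = 0
    for b in spike_bins:
        word |= 1 << b
    return word

def coincident_spikes(word_a, word_b):
    """Number of time bins in which both units spiked: a bitwise AND
    followed by a population count. Operating on whole machine words at
    once is what makes pattern analysis over many units fast."""
    return bin(word_a & word_b).count("1")

a = encode_spikes([0, 3, 5, 9])
b = encode_spikes([3, 4, 5, 10])
print(coincident_spikes(a, b))  # bins 3 and 5 -> 2
```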

  17. Impact of an extreme climatic event on community assembly.

    PubMed

    Thibault, Katherine M; Brown, James H

    2008-03-04

    Extreme climatic events are predicted to increase in frequency and magnitude, but their ecological impacts are poorly understood. Such events are large, infrequent, stochastic perturbations that can change the outcome of entrained ecological processes. Here we show how an extreme flood event affected a desert rodent community that has been monitored for 30 years. The flood (i) caused catastrophic, species-specific mortality; (ii) eliminated the incumbency advantage of previously dominant species; (iii) reset long-term population and community trends; (iv) interacted with competitive and metapopulation dynamics; and (v) resulted in rapid, wholesale reorganization of the community. This and a previous extreme rainfall event were punctuational perturbations-they caused large, rapid population- and community-level changes that were superimposed on a background of more gradual trends driven by climate and vegetation change. Captured by chance through long-term monitoring, the impacts of such large, infrequent events provide unique insights into the processes that structure ecological communities.

  18. The Relationship between Spatial and Temporal Magnitude Estimation of Scientific Concepts at Extreme Scales

    NASA Astrophysics Data System (ADS)

    Price, Aaron; Lee, H.

    2010-01-01

    Many astronomical objects, processes, and events exist and occur at extreme scales of spatial and temporal magnitudes. Our research draws upon the psychological literature, replete with evidence of linguistic and metaphorical links between the spatial and temporal domains, to compare how students estimate spatial and temporal magnitudes associated with objects and processes typically taught in science class. We administered spatial and temporal scale estimation tests, with many astronomical items, to 417 students enrolled in 12 undergraduate science courses. Results show that while the temporal test was more difficult, students’ overall performance patterns between the two tests were mostly similar. However, asymmetrical correlations between the two tests indicate that students think of the extreme ranges of spatial and temporal scales in different ways, which is likely influenced by their classroom experience. When making incorrect estimations, students tended to underestimate the difference between the everyday scale and the extreme scales on both tests. This suggests the use of a common logarithmic mental number line for both spatial and temporal magnitude estimation. However, there are differences between the two tests in the errors students make in the everyday range. Among the implications discussed is the use of spatio-temporal reference frames, instead of smooth bootstrapping, to help students maneuver between scales of magnitude and the use of logarithmic transformations between reference frames. Implications for astronomy range from learning about spectra to large scale galaxy structure.

  19. Influence of turbulence, orientation, and site configuration on the response of buildings to extreme wind.

    PubMed

    Aly, Aly Mousaad

    2014-01-01

    Atmospheric turbulence results from the vertical movement of air, together with flow disturbances around surface obstacles, which make low- and moderate-level winds extremely irregular. Recent advancements in wind engineering have led to the construction of new facilities for testing residential homes at relatively high Reynolds numbers. However, the generation of a fully developed turbulence in these facilities is challenging. The author proposed techniques for the testing of residential buildings and architectural features in flows that lack fully developed turbulence. While these methods are effective for small structures, the extension of the approach for large and flexible structures is not possible yet. The purpose of this study is to investigate the role of turbulence in the response of tall buildings to extreme winds. In addition, the paper presents a detailed analysis to investigate the influence of upstream terrain conditions, wind direction angle (orientation), and the interference effect from the surroundings on the response of high-rise buildings. The methodology presented can be followed to help decision makers choose among innovative solutions like aerodynamic mitigation, structural member size adjustment, and/or damping enhancement, with an objective to improve the resiliency and the serviceability of buildings.

  20. Development of an instrument for assessing workstyle in checkout cashier work (BAsIK).

    PubMed

    Kjellberg, Katarina; Palm, Peter; Josephson, Malin

    2012-01-01

    Checkout cashier work consists of handling a large number of items during a work shift, which implies repetitive movements of the shoulders, arms and hands/wrists, and a high work rate. The work is associated with a high prevalence of disorders in the neck and upper extremity. The concept of workstyle explains how ergonomic and psychosocial factors interact in the development of work-related upper extremity disorders. The aim of the project was to develop an instrument for the occupational health services to use in efforts to prevent upper extremity disorders in checkout cashier work. The instrument is based on the workstyle concept and is intended to be used as a tool to identify high-risk workstyle and needs for interventions, such as training and education. The instrument, BAsIK, consists of four parts: a questionnaire about workstyle, an observation protocol for work technique, a checklist about the design of the checkout and a questionnaire about work organization. The instrument was developed by selecting workstyle items developed for office work and adapting them to checkout cashier work, through discussions with researchers and ergonomists, focus-group interviews with cashiers, observations of video recordings of cashiers, and studies of existing guidelines and checklists.

  1. Influence of Turbulence, Orientation, and Site Configuration on the Response of Buildings to Extreme Wind

    PubMed Central

    2014-01-01

    Atmospheric turbulence results from the vertical movement of air, together with flow disturbances around surface obstacles, which make low- and moderate-level winds extremely irregular. Recent advancements in wind engineering have led to the construction of new facilities for testing residential homes at relatively high Reynolds numbers. However, the generation of fully developed turbulence in these facilities is challenging. The author proposed techniques for testing residential buildings and architectural features in flows that lack fully developed turbulence. While these methods are effective for small structures, extending the approach to large and flexible structures is not yet possible. The purpose of this study is to investigate the role of turbulence in the response of tall buildings to extreme winds. In addition, the paper presents a detailed analysis of the influence of upstream terrain conditions, wind direction angle (orientation), and interference effects from surrounding buildings on the response of high-rise buildings. The methodology presented can help decision makers choose among innovative solutions such as aerodynamic mitigation, structural member size adjustment, and/or damping enhancement, with the objective of improving the resiliency and serviceability of buildings. PMID:24701140

  2. Altered neuromuscular control of leg stiffness following soccer-specific exercise.

    PubMed

    Oliver, Jon L; De Ste Croix, Mark B A; Lloyd, Rhodri S; Williams, Craig A

    2014-11-01

    To examine changes to neuromuscular control of leg stiffness following 42 min of soccer-specific exercise. Ten youth soccer players, aged 15.8 ± 0.4 years, stature 1.73 ± 0.06 m and mass 59.8 ± 9.7 kg, hopped on a force plate at a self-selected frequency before and after simulated soccer exercise performed on a non-motorised treadmill. During hopping, muscle activity was measured using surface electromyography from four lower limb muscles and analysed to determine feedforward- and feedback-mediated activity, as well as co-contraction. There was a small, non-significant change in stiffness following exercise (26.6 ± 10.6 vs. 24.0 ± 7.0 kN·m⁻¹, p > 0.05, ES = 0.25), with half the group increasing and half decreasing their stiffness. Changes in stiffness were significantly related to changes in centre of mass (CoM) displacement (r = 0.90, p < 0.01, extremely large correlation) but not to changes in peak ground reaction force (r = 0.58, p > 0.05, large correlation). A number of significant relationships were observed between changes in stiffness and CoM displacement and changes in feedforward, feedback and eccentric muscle activity of the soleus and vastus lateralis muscles following exercise (r = 0.64-0.98, p < 0.05, large-extremely large correlations), but not with changes in co-contraction (r = 0.11-0.55, p > 0.05, small-large correlations). Following soccer-specific exercise, individual changes in feedforward- and reflex-mediated activity of the soleus and vastus lateralis, and not co-contraction around the knee and ankle, modulate changes in CoM displacement and leg stiffness.

  3. Initial Low-Reynolds Number Iced Aerodynamic Performance for CRM Wing

    NASA Technical Reports Server (NTRS)

    Woodard, Brian; Diebold, Jeff; Broeren, Andy; Potapczuk, Mark; Lee, Sam; Bragg, Michael

    2015-01-01

    NASA, FAA, ONERA, and other partner organizations have embarked on a significant, collaborative research effort to address the technical challenges associated with icing on large-scale, three-dimensional swept wings. These are extremely complex phenomena important to the design, certification and safe operation of small and large transport aircraft. There is increasing demand to balance trade-offs in aircraft efficiency, cost and noise that tend to compete directly with allowable performance degradations over an increasing range of icing conditions. Computational fluid dynamics codes have reached such a level of maturity that they are being proposed by manufacturers for use in the certification of aircraft for flight in icing. However, sufficient high-quality data to evaluate their performance on iced swept wings are not currently available in the public domain, and significant knowledge gaps remain.

  4. GM2 gangliosidosis in Saudi Arabia: multiple mutations and considerations for future carrier screening.

    PubMed

    Kaya, Namik; Al-Owain, Mohammad; Abudheim, Nada; Al-Zahrani, Jawaher; Colak, Dilek; Al-Sayed, Moeen; Milanlioglu, Aysel; Ozand, Pinar T; Alkuraya, Fowzan S

    2011-06-01

    The GM2 gangliosidoses, Tay-Sachs and Sandhoff diseases, are a class of lysosomal storage diseases in which relentless neurodegeneration results in devastating neurological disability and premature death. Primary prevention is the most effective intervention, since no effective therapy is currently available. The extremely successful model for the prevention of GM2 gangliosidosis in the Ashkenazi Jewish community is largely attributable to the very limited number of founder mutations in that population. Consistent with our previous observation of allelic heterogeneity in consanguineous populations, we show here that these diseases are largely caused by private mutations, which present a major obstacle to replicating the Ashkenazi success story. Alternative solutions are proposed which can also be implemented for other autosomal recessive diseases in our population. Copyright © 2011 Wiley-Liss, Inc.

  5. Dedifferentiated Liposarcoma of Sigmoid Mesocolon - A Case Report.

    PubMed

    Constantinoiu, Silviu; Achim, Ion-Florin; Cretu, Oana-Eliza; Dumitru, Tatiana; Constantin, Adrian; Enache, Simona; Mates, Ioan Nicolae

    2016-01-01

    Dedifferentiated liposarcoma is a liposarcoma that contains a well-differentiated liposarcoma component juxtaposed to areas of high-grade non-lipogenic sarcoma, and is believed to arise from well-differentiated liposarcoma after several years. Dedifferentiated liposarcoma most commonly occurs in the retroperitoneum, while an intraperitoneal location is extremely rare; only seven cases have been reported in the literature. Many pathologists recognize that a large number of intra-abdominal poorly differentiated sarcomas are dedifferentiated liposarcomas. We present the case of a 73-year-old patient with multiple cardiovascular comorbidities, stroke sequelae and a large abdominal mass evolving for 3 years. He was referred to our clinic for abdominal pain and bowel disorders. Although all clinical and imaging aspects suggested a gastrointestinal stromal tumour, the histological examination revealed the diagnosis of a dedifferentiated liposarcoma.

  6. Comparison of Reconstruction and Control algorithms on the ESO end-to-end simulator OCTOPUS

    NASA Astrophysics Data System (ADS)

    Montilla, I.; Béchet, C.; Lelouarn, M.; Correia, C.; Tallon, M.; Reyes, M.; Thiébaut, É.

    Extremely Large Telescopes are very challenging concerning their Adaptive Optics requirements. Their diameters, the specifications demanded by the science for which they are being designed, and the planned use of Extreme Adaptive Optics systems imply a huge increase in the number of degrees of freedom of the deformable mirrors. It is necessary to study new reconstruction algorithms to implement real-time control in Adaptive Optics at the required speed. We have studied the performance, applied to the case of the European ELT, of three different algorithms: the matrix-vector multiplication (MVM) algorithm, considered as a reference; the Fractal Iterative Method (FrIM); and the Fourier Transform Reconstructor (FTR). The algorithms have been tested on ESO's OCTOPUS software, which simulates the atmosphere, the deformable mirror, the sensor and the closed-loop control. The MVM is the default reconstruction and control method implemented in OCTOPUS, but it scales as O(N²) operations per loop, so it is not considered a fast algorithm for wave-front reconstruction and control on an Extremely Large Telescope. The two other methods are the fast algorithms studied in the E-ELT Design Study. The performance, as well as the response in the presence of noise and under various atmospheric conditions, has been compared using a Single Conjugate Adaptive Optics configuration for a 42 m diameter ELT with a total of 5402 actuators. These comparisons, made on a common simulator, highlight the pros and cons of the various methods and give us a better understanding of the type of reconstruction algorithm that an ELT demands.
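
    The O(N²) per-loop cost of the MVM reconstructor comes from a single dense matrix-vector product: actuator commands c = R s, where R is a precomputed control matrix and s the wave-front sensor slope vector. A minimal sketch, with purely illustrative sizes and a random placeholder control matrix (the real E-ELT case has thousands of actuators and a calibrated reconstructor):

```python
import numpy as np

# Toy sketch of the matrix-vector multiplication (MVM) reconstructor.
# Sizes and matrix values are illustrative placeholders, not an AO system.
rng = np.random.default_rng(0)

n_slopes = 200      # hypothetical number of WFS slope measurements
n_actuators = 100   # hypothetical number of DM actuators

R = rng.standard_normal((n_actuators, n_slopes))  # placeholder control matrix
s = rng.standard_normal(n_slopes)                 # measured slopes

# One loop iteration: a dense product costing O(n_actuators * n_slopes)
# operations, which is what makes plain MVM slow at ELT scales.
c = R @ s

print(c.shape)
```

    Fast reconstructors such as FrIM and FTR avoid this dense product by exploiting the structure of the problem.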

  7. Benefits of an ultra large and multiresolution ensemble for estimating available wind power

    NASA Astrophysics Data System (ADS)

    Berndt, Jonas; Hoppe, Charlotte; Elbern, Hendrik

    2016-04-01

    In this study we investigate the benefits of an ultra large ensemble with up to 1000 members, including multiple nesting with a target horizontal resolution of 1 km. The ensemble shall be used as a basis to detect events of extreme errors in wind power forecasting. The forecast quantity is the wind vector at wind turbine hub height (~ 100 m) in the short range (1 to 24 hours). Current wind power forecast systems already rely on NWP ensemble models. However, only calibrated ensembles from meteorological institutions have served as input so far, with limited spatial resolution (~ 10-80 km) and member number (~ 50), and perturbations tailored to the specific needs of wind power production are still missing. As a result, single extreme error events occasionally occur that are not detected by such ensemble power forecasts. The numerical forecast model used in this study is the Weather Research and Forecasting Model (WRF). Model uncertainties are represented by stochastic parametrization of sub-grid processes via stochastically perturbed parametrization tendencies, in conjunction with the complementary stochastic kinetic-energy backscatter scheme already provided by WRF. We perform continuous ensemble updates by comparing each ensemble member with available observations using a sequential importance resampling filter, to improve the model accuracy while maintaining ensemble spread. Additionally, we use different ensemble systems from global models (ECMWF and GFS) as input and boundary conditions to capture different synoptic conditions. Critical weather situations connected to extreme error events are located, and corresponding perturbation techniques are applied. The demanding computational effort is overcome by utilising the supercomputer JUQUEEN at the Forschungszentrum Juelich.
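
    The continuous ensemble update described above can be illustrated with one generic sequential importance resampling step; the Gaussian observation likelihood, member values, and sizes below are illustrative assumptions, not the actual WRF configuration:

```python
import numpy as np

# Minimal sketch of one sequential importance resampling (SIR) step:
# weight ensemble members by their likelihood given an observation,
# then resample so high-weight members are duplicated. All numbers
# are synthetic placeholders.
rng = np.random.default_rng(1)

n_members = 50
ensemble = rng.normal(10.0, 2.0, size=n_members)  # forecast hub-height wind speeds (m/s)
obs, obs_err = 9.0, 1.0                           # hypothetical observation and its error

# Importance weights ~ Gaussian likelihood of each member given the observation
w = np.exp(-0.5 * ((ensemble - obs) / obs_err) ** 2)
w /= w.sum()

# Systematic resampling: one uniform offset, evenly spaced positions
positions = (np.arange(n_members) + rng.random()) / n_members
idx = np.searchsorted(np.cumsum(w), positions)
resampled = ensemble[idx]

print(resampled.mean())
```

    After resampling, each member would be perturbed again to maintain ensemble spread, as the abstract notes.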

  8. GESTATIONAL AGE AT BIRTH AND RISK OF TESTICULAR CANCER

    PubMed Central

    Crump, Casey; Sundquist, Kristina; Winkleby, Marilyn A.; Sieh, Weiva; Sundquist, Jan

    2011-01-01

    Most testicular germ cell tumors originate from carcinoma in situ cells in fetal life, possibly related to sex hormone imbalances in early pregnancy. Previous studies of association between gestational age at birth and testicular cancer have yielded discrepant results and have not examined extreme preterm birth. Our objective was to determine whether low gestational age at birth is independently associated with testicular cancer in later life. We conducted a national cohort study of 354,860 men born in Sweden in 1973–1979, including 19,214 born preterm (gestational age <37 weeks) of whom 1,279 were born extremely preterm (22–29 weeks), followed for testicular cancer incidence through 2008. A total of 767 testicular cancers (296 seminomas and 471 nonseminomatous germ cell tumors) were identified in 11.2 million person-years of follow-up. Extreme preterm birth was associated with an increased risk of testicular cancer (hazard ratio 3.95; 95% CI, 1.67–9.34) after adjusting for other perinatal factors, family history of testicular cancer, and cryptorchidism. Only five cases (three seminomas and two nonseminomas) occurred among men born extremely preterm, limiting the precision of risk estimates. No association was found between later preterm birth, post-term birth, or low or high fetal growth and testicular cancer. These findings suggest that extreme but not later preterm birth may be independently associated with testicular cancer in later life. They are based on a small number of cases and will need confirmation in other large cohorts. Elucidation of the key prenatal etiologic factors may potentially lead to preventive interventions in early life. PMID:22314417

  9. Evaluating extreme precipitation events using a mesoscale atmosphere model

    NASA Astrophysics Data System (ADS)

    Yucel, I.; Onen, A.

    2012-04-01

    Evidence shows that global warming and climate change have a direct influence on changes in precipitation and the hydrological cycle. Extreme weather events such as heavy rainfall and flooding are projected to become much more frequent as the climate warms. Mesoscale atmospheric models coupled with land surface models provide efficient forecasts of meteorological events at long lead times, and should therefore be used for flood forecasting and warning, as they provide more continuous monitoring of precipitation over large areas. This study examines the performance of the Weather Research and Forecasting (WRF) model in reproducing the temporal and spatial characteristics of a number of extreme precipitation events observed in the West Black Sea Region of Turkey. These extreme precipitation events usually resulted in flood conditions as the associated hydrologic response of the basin. The performance of the WRF system is further investigated by using the three-dimensional variational (3D-VAR) data assimilation scheme within WRF. WRF performance with and without data assimilation at high spatial resolution (4 km) is evaluated by comparison with gauge precipitation and satellite-estimated rainfall data from Multi Precipitation Estimates (MPE). WRF-derived precipitation showed capability in capturing the timing of the precipitation extremes and, to some extent, the spatial distribution and magnitude of the heavy rainfall events. These precipitation characteristics are enhanced with the use of the 3D-VAR scheme in the WRF system. Data assimilation improved area-averaged precipitation forecasts by 9 percent, and at some points there was a quantitative match in precipitation events, which is critical for hydrologic forecast applications.

  10. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  11. Topographic relationships for design rainfalls over Australia

    NASA Astrophysics Data System (ADS)

    Johnson, F.; Hutchinson, M. F.; The, C.; Beesley, C.; Green, J.

    2016-02-01

    Design rainfall statistics are the primary inputs used to assess flood risk across river catchments. These statistics normally take the form of Intensity-Duration-Frequency (IDF) curves that are derived from extreme value probability distributions fitted to observed daily, and sub-daily, rainfall data. The design rainfall relationships are often required for catchments where there are limited rainfall records, particularly catchments in remote areas with high topographic relief and hence some form of interpolation is required to provide estimates in these areas. This paper assesses the topographic dependence of rainfall extremes by using elevation-dependent thin plate smoothing splines to interpolate the mean annual maximum rainfall, for periods from one to seven days, across Australia. The analyses confirm the important impact of topography in explaining the spatial patterns of these extreme rainfall statistics. Continent-wide residual and cross validation statistics are used to demonstrate the 100-fold impact of elevation in relation to horizontal coordinates in explaining the spatial patterns, consistent with previous rainfall scaling studies and observational evidence. The impact of the complexity of the fitted spline surfaces, as defined by the number of knots, and the impact of applying variance stabilising transformations to the data, were also assessed. It was found that a relatively large number of 3570 knots, suitably chosen from 8619 gauge locations, was required to minimise the summary error statistics. Square root and log data transformations were found to deliver marginally superior continent-wide cross validation statistics, in comparison to applying no data transformation, but detailed assessments of residuals in complex high rainfall regions with high topographic relief showed that no data transformation gave superior performance in these regions. 
These results are consistent with the understanding that in areas with modest topographic relief, as for most of the Australian continent, extreme rainfall is closely aligned with elevation, but in areas with high topographic relief the impacts of topography on rainfall extremes are more complex. The interpolated extreme rainfall statistics, using no data transformation, have been used by the Australian Bureau of Meteorology to produce new IDF data for the Australian continent. The comprehensive methods presented for the evaluation of gridded design rainfall statistics will be useful for similar studies, in particular the importance of balancing the need for a continentally-optimum solution that maintains sufficient definition at the local scale.
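
    As an illustration of the elevation-dependent interpolation idea (not the trivariate thin plate smoothing splines actually used in the study), one can treat scaled elevation as a third coordinate in a thin-plate-spline radial basis interpolator; the data, the scaling factor, and the smoothing value below are all synthetic assumptions:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Sketch: interpolate an extreme-rainfall statistic with elevation as a
# third coordinate, exaggerated relative to the horizontal coordinates to
# mimic the ~100-fold impact of elevation reported in the study.
# All gauge data here are synthetic.
rng = np.random.default_rng(2)

n = 200
x, y = rng.uniform(0, 10, n), rng.uniform(0, 10, n)   # horizontal coords (arbitrary units)
elev_km = rng.uniform(0, 2, n)                        # gauge elevations (km)
rain = 50 + 40 * elev_km + rng.normal(0, 2, n)        # synthetic mean annual maxima (mm)

scale = 100.0  # assumed relative weighting of elevation vs horizontal position
coords = np.column_stack([x, y, scale * elev_km])
spline = RBFInterpolator(coords, rain, kernel="thin_plate_spline", smoothing=1.0)

# Estimate at a hypothetical ungauged site at 1 km elevation
est = spline(np.array([[5.0, 5.0, scale * 1.0]]))
print(est)
```

    The real analysis additionally chose the number of spline knots (3570 of 8619 gauges) to minimise continent-wide error statistics, which a simple RBF fit does not capture.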

  12. Background Noises Versus Intraseasonal Variation Signals: Small vs. Large Convective Cloud Objects From CERES Aqua Observations

    NASA Technical Reports Server (NTRS)

    Xu, Kuan-Man

    2015-01-01

    During inactive phases of the Madden-Julian Oscillation (MJO), there are plenty of deep but small convective systems and far fewer deep and large ones. During active phases of the MJO, an increase in the occurrence of large and deep cloud clusters results from an amplification of large-scale motions by stronger convective heating. This study is designed to quantitatively examine the roles of small and large cloud clusters during the MJO life cycle. We analyze the cloud object data from Aqua CERES (Clouds and the Earth's Radiant Energy System) observations between July 2006 and June 2010 for tropical deep convective (DC) and cirrostratus (CS) cloud object types according to the real-time multivariate MJO index, which assigns the tropics to one of the eight MJO phases each day. A cloud object is a contiguous region of the earth with a single dominant cloud-system type. The criteria for defining these cloud types are overcast footprints and cloud top pressures less than 400 hPa, but DC has higher cloud optical depths (≥10) than CS (<10). The size distributions, defined as the footprint numbers as a function of cloud object diameter, for particular MJO phases depart greatly from the combined (8-phase) distribution at large cloud-object diameters, due to the reduced or increased numbers of cloud objects related to changes in the large-scale environments. The median diameter corresponding to the combined distribution is determined and used to partition all cloud objects into "small" and "large" groups of a particular phase. The two groups corresponding to the combined distribution have nearly equal numbers of footprints. The median diameters are 502 km for DC and 310 km for cirrostratus. The range of variation between two extreme phases (typically, the most active and most depressed phases) for the small group is 6-11% in terms of the numbers of cloud objects and the total footprint numbers. The corresponding range for the large group is 19-44%.
    In terms of the probability density functions of radiative and cloud physical properties, there are virtually no differences between the MJO phases for the small group, but there are significant differences for the large group for both DC and CS types. These results suggest that the intraseasonal variation signals reside in the large cloud clusters, while the small cloud clusters represent background noise resulting from various types of tropical waves with different wavenumbers and propagation speeds and directions.
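
    The small/large partition described above can be sketched as choosing a split diameter at which the two groups carry nearly equal total footprint numbers; the diameters and footprint model below are synthetic assumptions, not CERES cloud-object data:

```python
import numpy as np

# Toy partition of cloud objects into "small" and "large" groups at a
# footprint-weighted median diameter, so the two groups contain nearly
# equal total footprint numbers. Diameters are synthetic.
rng = np.random.default_rng(3)

diam = rng.lognormal(mean=5.5, sigma=0.5, size=10_000)  # object diameters (km)
footprints = (diam / 20.0) ** 2                         # assume footprints scale with area

order = np.argsort(diam)
cum = np.cumsum(footprints[order])
split_diam = diam[order][np.searchsorted(cum, 0.5 * cum[-1])]

small = footprints[diam <= split_diam].sum()
large = footprints[diam > split_diam].sum()
print(split_diam, small / (small + large))  # fraction close to 0.5 by construction
```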

  13. A Review of Recent Advances in Research on Extreme Heat Events

    NASA Technical Reports Server (NTRS)

    Horton, Radley M.; Mankin, Justin S.; Lesk, Corey; Coffel, Ethan; Raymond, Colin

    2016-01-01

    Reviewing recent literature, we report that changes in extreme heat event characteristics such as magnitude, frequency, and duration are highly sensitive to changes in mean global-scale warming. Numerous studies have detected significant changes in the observed occurrence of extreme heat events, irrespective of how such events are defined. Further, a number of these studies have attributed present-day changes in the risk of individual heat events and the documented global-scale increase in such events to anthropogenic-driven warming. Advances in process-based studies of heat events have focused on the proximate land-atmosphere interactions through soil moisture anomalies, and changes in occurrence of the underlying atmospheric circulation associated with heat events in the mid-latitudes. While evidence for a number of hypotheses remains limited, climate change nevertheless points to tail risks of possible changes in heat extremes that could exceed estimates generated from model outputs of mean temperature. We also explore risks associated with compound extreme events and nonlinear impacts associated with extreme heat.

  14. Rare royal families in honeybees, Apis mellifera

    NASA Astrophysics Data System (ADS)

    Moritz, Robin F. A.; Lattorff, H. Michael G.; Neumann, Peter; Kraus, F. Bernhard; Radloff, Sarah E.; Hepburn, H. Randall

    2005-10-01

    The queen is the dominant female in the honeybee colony, Apis mellifera, and controls reproduction. Queen larvae are selected by the workers and are fed a special diet (royal jelly), which determines caste. Because queens mate with many males, a large number of subfamilies coexist in the colony. As a consequence, there is considerable potential for conflict among the subfamilies over queen rearing. Here we show that honeybee queens are not reared at random but are preferentially reared from rare “royal” subfamilies, which have extremely low frequencies in the colony's worker force but a high frequency in the queens reared.

  15. Surrogate waveform models

    NASA Astrophysics Data System (ADS)

    Blackman, Jonathan; Field, Scott; Galley, Chad; Scheel, Mark; Szilagyi, Bela; Tiglio, Manuel

    2015-04-01

    With the advanced detector era just around the corner, there is a strong need for fast and accurate models of gravitational waveforms from compact binary coalescence. Fast surrogate models can be built out of an accurate but slow waveform model with minimal to no loss in accuracy, but this may require a large number of evaluations of the underlying model. This may be prohibitively expensive if the underlying model is extremely slow, for example if we wish to build a surrogate for numerical relativity. We examine alternative approaches to building surrogate models that allow for a sparser set of input waveforms. Research supported in part by NSERC.

  16. A Review of Computational Intelligence Methods for Eukaryotic Promoter Prediction.

    PubMed

    Singh, Shailendra; Kaur, Sukhbir; Goel, Neelam

    2015-01-01

    In past decades, the prediction of genes in DNA sequences has attracted the attention of many researchers, but due to the complex structure of DNA it is extremely difficult to locate genes correctly. A large number of regulatory regions present in DNA help in the transcription of a gene. The promoter is one such region, and finding its location is a challenging problem. Various computational methods for promoter prediction have been developed over the past few years. This paper reviews these promoter prediction methods. Several difficulties and pitfalls encountered by these methods are also detailed, along with future research directions.

  17. Mental health in Colombia

    PubMed Central

    Chaskel, Roberto; Gaviria, Silvia L.; Espinel, Zelde; Taborda, Eliana; Vanegas, Roland; Shultz, James M.

    2015-01-01

    A hallmark of Colombia is population-wide exposure to violence. To understand the realities of mental health in Colombia requires attention to the historical context of 60 years of unrelenting armed conflict overlaid upon high rates of homicide, gang activity and prevalent gender-based and intra-familial violence. The number of patients affected by trauma is extremely large, and the population burden of alcohol misuse and illicit drug use is significant. These patterns have brought the subspecialties of trauma and addiction psychiatry to the forefront, and highlight the need for novel treatments that integrate psychotherapeutic and psychopharmacological modalities. PMID:29093873

  18. Extended hydrodynamic theory of the peak and minimum pool boiling heat fluxes

    NASA Technical Reports Server (NTRS)

    Lienhard, J. H.; Dhir, V. K.

    1973-01-01

    The hydrodynamic theory of the extreme pool boiling heat fluxes is expanded to embrace a variety of problems that have not previously been analyzed. These problems include the prediction of the peak heat flux on a variety of finite heaters, the influence of viscosity on the Taylor and Helmholtz instability mechanisms with application to film boiling and to the peak heat flux in viscous liquids, the formalization of the analogy between high-current-density electrolysis and boiling, and the description of boiling in the low-gravity limit. The predictions are verified with a large number of new data.

  19. FASTPM: a new scheme for fast simulations of dark matter and haloes

    NASA Astrophysics Data System (ADS)

    Feng, Yu; Chu, Man-Yat; Seljak, Uroš; McDonald, Patrick

    2016-12-01

    We introduce FASTPM, a highly scalable approximated particle mesh (PM) N-body solver, which implements the PM scheme while enforcing correct linear displacement (1LPT) evolution via modified kick and drift factors. Employing a two-dimensional domain decomposition scheme, FASTPM scales extremely well to very large numbers of CPUs. In contrast to the COmoving Lagrangian Acceleration (COLA) approach, we do not need to split the force or separately track the 2LPT solution, reducing the code complexity and memory requirements. We compare FASTPM with different numbers of steps (Ns) and force resolution factors (B) against three benchmarks: the halo mass function from a friends-of-friends halo finder; the halo and dark matter power spectra; and the cross-correlation coefficient (or stochasticity), relative to a high-resolution TREEPM simulation. We show that the modified time stepping scheme reduces the halo stochasticity when compared to COLA with the same number of steps and force resolution. While increasing Ns and B improves the transfer function and cross-correlation coefficient, for many applications FASTPM achieves sufficient accuracy at low Ns and B. For example, an Ns = 10, B = 2 simulation provides a substantial saving (a factor of 10) of computing time relative to an Ns = 40, B = 3 simulation, yet the halo benchmarks are very similar at z = 0. We find that for abundance-matched haloes the stochasticity remains low even for Ns = 5. FASTPM compares well against less expensive schemes, being only 7 (4) times more expensive than a 2LPT initial conditions generator for Ns = 10 (Ns = 5). Some of the applications where FASTPM can be useful are generating a large number of mocks, producing non-linear statistics where one varies a large number of nuisance or cosmological parameters, or serving as part of an initial conditions solver.
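
    The kick and drift structure that FASTPM modifies is the standard symplectic kick-drift-kick (KDK) leapfrog of PM codes; the toy harmonic force below is an illustrative assumption and omits the cosmological growth-factor weighting that FASTPM actually applies to its kick and drift factors:

```python
import numpy as np

# Schematic kick-drift-kick (KDK) leapfrog step of a particle-mesh solver.
# FASTPM replaces the plain dt factors with modified kick/drift factors so
# that large steps reproduce linear (1LPT) growth; here we show only the
# plain KDK structure on a toy unit harmonic oscillator.
def kdk_step(pos, vel, acc_fn, dt):
    vel = vel + 0.5 * dt * acc_fn(pos)   # half kick
    pos = pos + dt * vel                 # full drift
    vel = vel + 0.5 * dt * acc_fn(pos)   # half kick
    return pos, vel

pos, vel = np.array([1.0]), np.array([0.0])
n_steps = 1000
dt = 2 * np.pi / n_steps                 # integrate one full oscillator period
for _ in range(n_steps):
    pos, vel = kdk_step(pos, vel, lambda x: -x, dt)

print(pos, vel)  # symplectic KDK returns very close to the initial state
```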

  20. Properties of Extreme Precipitation and Their Uncertainties in 3-year GPM Precipitation Radar Data

    NASA Astrophysics Data System (ADS)

    Liu, N.; Liu, C.

    2017-12-01

    Extremely high precipitation rates are often related to flash floods and have devastating impacts on human society and the environment. To better understand these rare events, 3-year Precipitation Features (PFs) are defined by grouping the contiguous areas with nonzero near-surface precipitation derived using the Global Precipitation Measurement (GPM) Ku-band Precipitation Radar (KuPR). The properties of PFs with extreme precipitation rates greater than 20, 50, and 100 mm/hr, such as their geographical distribution, volumetric precipitation contribution, and seasonal and diurnal variations, are examined. In addition to the large seasonal and regional variations, the rare extreme precipitation rates often make a large contribution to the local total precipitation. Extreme precipitation rates occur more often over land than over ocean. The challenges in the retrieval of extreme precipitation might arise from the attenuation correction and from large uncertainties in the Z-R relationships converting near-surface radar reflectivity to precipitation rates. These potential uncertainties are examined by using collocated ground-based radar reflectivity and precipitation retrievals.
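
    To illustrate why the Z-R step is a large uncertainty source at extreme rates, here is a minimal sketch of a power-law Z-R conversion using the classic Marshall-Palmer coefficients (a = 200, b = 1.6); this textbook relation is used only for illustration and is not the actual GPM KuPR retrieval:

```python
# Sketch of a Z-R conversion from radar reflectivity to rain rate using the
# Marshall-Palmer relation Z = a * R**b, with Z in mm^6 m^-3 and R in mm/h.
# Extreme rates are very sensitive to the assumed (a, b) coefficients.
def rain_rate(dbz, a=200.0, b=1.6):
    z = 10.0 ** (dbz / 10.0)          # dBZ -> linear reflectivity
    return (z / a) ** (1.0 / b)       # invert Z = a * R**b

for dbz in (30, 45, 55):
    print(dbz, round(rain_rate(dbz), 1))
```

    With these coefficients, roughly 55 dBZ already corresponds to the ~100 mm/hr extreme threshold used in the study, so small reflectivity or coefficient errors translate into large rain-rate errors there.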

  1. The spatiotemporal changes in precipitation extremes over Canada and their connections to large-scale climate patterns

    NASA Astrophysics Data System (ADS)

    Yang, Y.; Gan, T. Y.; Tan, X.

    2017-12-01

    In the past few decades, extreme climate events have become more common around the world, and Canada has suffered numerous extreme precipitation events. In this paper, trend analysis, change point analysis, probability distribution functions, principal component analysis, and wavelet analysis were used to investigate the spatial and temporal patterns of extreme precipitation in Canada. Ten extreme precipitation indices were calculated using long-term daily precipitation data from 164 gauging stations. Several large-scale climate patterns, such as the El Niño-Southern Oscillation (ENSO), Pacific Decadal Oscillation (PDO), Pacific-North American pattern (PNA), and North Atlantic Oscillation (NAO), were selected to analyze the relationships between extreme precipitation and climate indices. Convective Available Potential Energy (CAPE), specific humidity, and surface temperature were employed to investigate the potential causes of the trends. The results show statistically significant positive trends for most indices, indicating increasing extreme precipitation. The majority of indices display increasing trends along the southern border of Canada, while decreasing trends dominate in the central Canadian Prairies (CP). In addition, strong connections are found between extreme precipitation and the climate indices, and the effects of each climate pattern differ by region. The seasonal CAPE, specific humidity, and temperature are found to be closely related to Canadian extreme precipitation.
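
A minimal version of the Mann-Kendall trend test used in studies like this one can be sketched as follows (without the tie correction and Sen's slope estimator a production implementation would include):

```python
import math

# Minimal Mann-Kendall trend test (no tie correction), illustrative only.
def mann_kendall(x):
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z  # |z| > 1.96 is significant at the 5% level

# A steadily increasing annual index gives a clearly positive trend.
s, z = mann_kendall([float(v) for v in range(30)])
```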

  2. NeuroTessMesh: A Tool for the Generation and Visualization of Neuron Meshes and Adaptive On-the-Fly Refinement.

    PubMed

    Garcia-Cantero, Juan J; Brito, Juan P; Mata, Susana; Bayona, Sofia; Pastor, Luis

    2017-01-01

    Gaining a better understanding of the human brain continues to be one of the greatest challenges for science, largely because of the overwhelming complexity of the brain and the difficulty of analyzing the features and behavior of dense neural networks. Regarding analysis, 3D visualization has proven to be a useful tool for the evaluation of complex systems. However, the large number of neurons in non-trivial circuits, together with their intricate geometry, makes the visualization of a neuronal scenario an extremely challenging computational problem. Previous work in this area dealt with the generation of 3D polygonal meshes that approximated the cells' overall anatomy but did not attempt to deal with the extremely high storage and computational costs required to manage a complex scene. This paper presents NeuroTessMesh, a tool specifically designed to cope with many of the problems associated with the visualization of neural circuits composed of large numbers of cells. In addition, this method facilitates the recovery and visualization of the 3D geometry of cells included in databases, such as NeuroMorpho, and provides the tools needed to approximate missing information such as the soma's morphology. This method takes as its only input the available compact, yet incomplete, morphological tracings of the cells as acquired by neuroscientists. It uses a multiresolution approach that combines an initial, coarse mesh generation with subsequent on-the-fly adaptive mesh refinement stages using tessellation shaders. For the coarse mesh generation, a novel approach based on the Finite Element Method allows approximation of the 3D shape of the soma from its incomplete description. Subsequently, the adaptive refinement process performed on the graphics card generates meshes that provide good visual quality at a reasonable computational cost, both in terms of memory and rendering time. All the described techniques have been integrated into NeuroTessMesh, which is available to the scientific community, to generate, visualize, and save the adaptive-resolution meshes.

  3. A statistical mechanics approach to computing rare transitions in multi-stable turbulent geophysical flows

    NASA Astrophysics Data System (ADS)

    Laurie, J.; Bouchet, F.

    2012-04-01

    Many turbulent flows undergo sporadic random transitions after long periods of apparent statistical stationarity. For instance, the paths of the Kuroshio [1], the Earth's magnetic field reversals, atmospheric flows [2], MHD experiments [3], 2D turbulence experiments [4,5], and 3D flows [6] show this kind of behavior. Understanding this phenomenon is extremely difficult due to the complexity, the large number of degrees of freedom, and the non-equilibrium nature of these turbulent flows. It is, however, a key issue for many geophysical problems. A straightforward study of these transitions, through direct numerical simulation of the governing equations, is nearly always impracticable. This is mainly a complexity problem, due to the large number of degrees of freedom involved in genuine turbulent flows and the extremely long time between two transitions. In this talk, we consider two-dimensional and geostrophic turbulence models with stochastic forcing. We consider regimes where two or more attractors coexist. As an alternative to direct numerical simulation, we propose a non-equilibrium statistical mechanics approach to the computation of this phenomenon. Our strategy is based on large deviation theory [7], derived from a path integral representation of the stochastic process. Among the trajectories connecting two non-equilibrium attractors, we determine the most probable one. Moreover, we also determine the transition rates and in which cases this most probable trajectory is a typical one. Interestingly, we prove that in the class of models we consider, a mechanism exists for diffusion over sets of connected attractors. For the types of stochastic forces that allow this diffusion, the transition between attractors is not a rare event, and it is then very difficult to characterize the flow as bistable. For another class of stochastic forces, however, this diffusion mechanism is prevented, and genuine bistability or multi-stability is observed. We discuss how these results are probably connected to the long-debated existence of multi-stability in the atmosphere and oceans.

  4. "Complex" Posttraumatic Stress Disorder/Disorders of Extreme Stress (CP/DES) in Sexually Abused Children: An Exploratory Study.

    ERIC Educational Resources Information Center

    Hall, Darlene Kordich

    1999-01-01

    Compares three groups of young sexually abused children on seven "Complex" Posttraumatic Stress Disorder/Disorders of Extreme Stress (CP/DES) indices. As the cumulative number of trauma types increased, the number of CP/DES symptoms rose. Results suggest that CP/DES also characterizes sexually abused children, especially those who have…

  5. Shifting patterns of mild weather in response to projected radiative forcing

    NASA Astrophysics Data System (ADS)

    van der Wiel, Karin; Kapnick, Sarah; Vecchi, Gabriel

    2017-04-01

    Traditionally, climate change research has focused on changes in mean climate (e.g., global mean temperature, sea level rise, glacier melt) or changes in extreme events (e.g., hurricanes, extreme precipitation, droughts, heat waves, wildfires). Though extreme events have the potential to disrupt society, extreme conditions are by definition rare. In contrast, mild weather occurs frequently, and many human activities are built around it. Examples include football games, dog walks, bike rides, and outdoor weddings, but also activities of direct economic impact, e.g., construction work, infrastructure projects, road or rail transportation, air travel, and landscaping. The absence of mild weather affects society in various ways; understanding current and future mild weather is therefore of high scientific interest. We present a global analysis of mild weather based on simple and relatable criteria, and we explore changes in mild weather occurrence in response to radiative forcing. A high-resolution global climate model, GFDL HiFLOR, is used to allow investigation of local features and changes. In response to RCP4.5, we find a slight global mean decrease in the annual number of mild days, projected both for the near future (-4 d/yr, 2016-2035) and for the end of this century (-10 d/yr, 2081-2100). Projected regional and seasonal redistributions of mild days are substantially greater: tropical regions are projected to see large decreases, while the mid-latitudes are projected to see small increases in the number of mild days. Mediterranean climates are projected to see a shift of mild weather away from the local summer toward the shoulder seasons. These changes are larger than the interannual variability of mild weather caused by the El Niño-Southern Oscillation. Finally, we use reanalysis data to show an observed global decrease in the recent past, and we verify that the observed regional changes in mild weather resemble the projections.
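
A mild-day count of the kind described can be sketched as follows; the thresholds below are illustrative stand-ins, not the study's actual criteria.

```python
# Illustrative "mild day" classifier; the thresholds are our own
# stand-ins, not the exact criteria used by the study.
def is_mild(tmax_c, precip_mm):
    return 18.0 <= tmax_c <= 30.0 and precip_mm < 1.0

# Four sample days: (daily max temperature in C, precipitation in mm).
days = [(22.0, 0.0), (35.0, 0.0), (25.0, 5.0), (19.5, 0.2)]
n_mild = sum(is_mild(t, p) for t, p in days)  # counts days 1 and 4
```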

  6. The Climatology of Extreme Surge-Producing Extratropical Cyclones in Observations and Models

    NASA Astrophysics Data System (ADS)

    Catalano, A. J.; Broccoli, A. J.; Kapnick, S. B.

    2016-12-01

    Extreme coastal storms devastate heavily populated areas around the world by producing powerful winds that can create a large storm surge. Both tropical and extratropical cyclones (ETCs) occur over the northwestern Atlantic Ocean, and the risks associated with ETCs can be just as severe as those associated with tropical storms (e.g., high winds, storm surge). At The Battery in New York City, 17 of the 20 largest storm surge events were a consequence of ETCs, which are more prevalent than tropical cyclones in the northeastern United States. We therefore analyze the climatology of ETCs capable of producing a large storm surge along the northeastern coast of the United States. For a historical analysis, water level data were collected from National Oceanic and Atmospheric Administration (NOAA) tide gauges at three locations (Sewell's Pt., VA; The Battery, NY; and Boston, MA). We perform a k-means cluster analysis of sea level pressure from the ECMWF 20th Century Reanalysis (ERA-20C) to explore natural groupings of observed storms with similar characteristics. We then composite the cluster results with features of the atmospheric circulation to examine the influence of interannual and multidecadal variability such as the North Atlantic Oscillation. Since observational records contain a small number of well-documented ETCs, the capability of a high-resolution coupled climate model to realistically simulate such extreme coastal storms is also assessed. Global climate models provide a means of simulating a much larger sample of extreme events, allowing better resolution of the tail of the distribution. We employ a tracking algorithm to identify ETCs in a multi-century simulation under present-day conditions, and conduct quantitative comparisons of the cyclogenesis, cyclolysis, and cyclone densities of simulated ETCs against storms from recent history (using reanalysis products).
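
The k-means step can be sketched as follows, with each storm reduced to a toy two-number stand-in for a flattened sea level pressure field; initialization here is deterministic for reproducibility, whereas real analyses typically use random restarts or k-means++.

```python
# Toy k-means in the spirit of the cluster analysis described above.
def kmeans(points, k, iters=20):
    # Deterministic, spread-out initialization (for reproducibility).
    centers = [points[(i * len(points)) // k] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[j].append(p)
        centers = [tuple(sum(v) / len(g) for v in zip(*g)) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers

# Two synthetic "storm families" with distinct pressure signatures,
# e.g. (central pressure in hPa, pressure at a reference point).
storms = ([(1000.0 + i, 990.0) for i in range(5)]
          + [(960.0 + i, 1020.0) for i in range(5)])
centers = sorted(kmeans(storms, 2))  # recovers the two family means
```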

  7. Heart failure in numbers: Estimates for the 21st century in Portugal.

    PubMed

    Fonseca, Cândida; Brás, Daniel; Araújo, Inês; Ceia, Fátima

    2018-02-01

    Heart failure (HF) is a major public health problem that affects a large number of individuals and is associated with high mortality and morbidity. This study aims to estimate the probable scenario for HF prevalence and its consequences in the short, medium, and long term in Portugal. The assessment is based on the EPICA (Epidemiology of Heart Failure and Learning) project, which was designed to estimate the prevalence of chronic heart failure in mainland Portugal in 1998. Estimates of HF prevalence were made for individuals aged over 25 years, distributed by age group and gender, based on data from the 2011 Census by Statistics Portugal. The expected demographic changes, particularly the marked aging of the population, mean that a large number of Portuguese will likely be affected by this syndrome. Assuming that current clinical practices are maintained, the prevalence of HF in mainland Portugal will increase by 30% by 2035 and by 33% by 2060, compared to 2011, resulting in 479,921 and 494,191 affected individuals, respectively. In addition to the large number of HF patients expected, the hospitalizations and mortality associated with this syndrome are estimated to significantly increase its economic impact. It is therefore extremely important to raise awareness of this syndrome, as this will favor early diagnosis and referral of patients, facilitating better management of HF and helping to decrease the burden it imposes on Portugal. Copyright © 2017 Sociedade Portuguesa de Cardiologia. Publicado por Elsevier España, S.L.U. All rights reserved.
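
A back-of-envelope check of the quoted projections: a 30% rise to 479,921 cases by 2035 implies a 2011 baseline of roughly 369,000, and the 2060 figure is then consistent with the stated 33% increase.

```python
# Consistency check of the prevalence projections quoted above.
cases_2035 = 479_921          # projected cases, +30% vs. 2011
cases_2060 = 494_191          # projected cases, +33% vs. 2011

baseline_2011 = cases_2035 / 1.30           # implied 2011 case count
growth_2060 = cases_2060 / baseline_2011 - 1.0  # implied 2060 growth
```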

  8. Multiple Auto-Adapting Color Balancing for Large Number of Images

    NASA Astrophysics Data System (ADS)

    Zhou, X.

    2015-04-01

    This paper presents a powerful technique for color balance between images. It works not only for small numbers of images but also for arbitrarily large collections. Multiple adaptive methods are used. To obtain a color-seamless mosaic dataset, local color is adjusted adaptively toward the target color. Local statistics of the source images are computed based on a so-called adaptive dodging window. The adaptive target colors are computed statistically according to multiple target models. A gamma function is derived from the adaptive target and the adaptive local statistics of the source, and is applied to the source images to obtain the color-balanced output images. Five target color surface models are proposed: color point (single color), color grid, and first-, second-, and third-order 2D polynomials. Least-squares fitting is used to obtain the polynomial target color surfaces. Target color surfaces are computed automatically based on all source images or on an external target image. Special objects such as water and snow are filtered by a percentage cut or a given mask. The performance is extremely fast, supporting on-the-fly color balancing of large numbers of images (potentially hundreds of thousands). The detailed algorithm and formulae are described, and rich examples, including large mosaic datasets (e.g., one containing 36,006 images), are given. The results show that this technique can be successfully used on various imagery to obtain color-seamless mosaics; the algorithm has been used successfully in Esri ArcGIS.
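
One reading of the gamma step (our sketch, not the ArcGIS implementation) chooses gamma so that a source image's local mean is mapped onto the adaptive target color, with intensities normalized to (0, 1]:

```python
import math

# Hypothetical gamma-derivation step: pick g so that the source
# local mean maps exactly onto the target color, since
# (source_mean ** g == target_mean) when g = log(target)/log(source).
def gamma_for(source_mean, target_mean):
    return math.log(target_mean) / math.log(source_mean)

def apply_gamma(pixels, g):
    return [p ** g for p in pixels]

src = [0.2, 0.3, 0.4, 0.5]        # normalized source intensities
src_mean = sum(src) / len(src)    # 0.35
g = gamma_for(src_mean, 0.5)      # brighten toward a target mean of 0.5
out = apply_gamma(src, g)         # every pixel shifts toward the target
```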

  9. Very large hail occurrence in Poland from 2007 to 2015

    NASA Astrophysics Data System (ADS)

    Pilorz, Wojciech

    2015-10-01

    Very large hail is defined as hailstones at least 5 cm in diameter. The phenomenon is rare, but its significant consequences, not only for agriculture but also for automobiles, households, and people outdoors, make it essential to examine. Hail occurrence is closely connected with the frequency and type of storms; the storm type most prone to producing large hail is the supercell. The geographical distribution of hailstorms was compared with the geographical distribution of storms in Poland, and similarities were found. The area with the largest number of storms is southeastern Poland, and the analyzed European Severe Weather Database (ESWD) data showed that most very large hail reports occurred in this part of the country. The probable reason is the long persistence of tropical air masses over southeastern Poland. The spatial distribution analysis also shows more hail incidents over the Upper Silesia, Lesser Poland, Subcarpathia, and Świętokrzyskie regions. The information source on hail occurrence was the ESWD, an open database where anyone can add reports and search for reports meeting given criteria. Sixty-nine hailstorms in the period 2007-2015 were examined, yielding 121 very large hail reports; there is a large disproportion in the numbers of hailstorms and hail reports between individual years. The very large hail season in Poland begins in May and ends in September, peaking in July. Most hail occurs between 12:00 and 17:00 UTC, but there were some cases of very large (one extremely large) hail at night and in the early morning hours. Although very large hail is a spectacular phenomenon, its local character implies a potentially high rate of unreported events, which is the most significant problem in hail research.

  10. Extreme Value Theory and the New Sunspot Number Series

    NASA Astrophysics Data System (ADS)

    Acero, F. J.; Carrasco, V. M. S.; Gallego, M. C.; García, J. A.; Vaquero, J. M.

    2017-04-01

    Extreme value theory was employed to study solar activity using the new sunspot number index. The block maxima approach was used at yearly (1700-2015), monthly (1749-2016), and daily (1818-2016) scales, selecting the maximum sunspot number value for each solar cycle, and the peaks-over-threshold (POT) technique was used after a declustering process only for the daily data. Both techniques led to negative values for the shape parameters. This implies that the extreme sunspot number value distribution has an upper bound. The return level (RL) values obtained from the POT approach were greater than when using the block maxima technique. Regarding the POT approach, the 110 year (550 and 1100 year) RLs were lower (higher) than the daily maximum observed sunspot number value of 528. Furthermore, according to the block maxima approach, the 10-cycle RL lay within the block maxima daily sunspot number range, as expected, but it was striking that the 50- and 100-cycle RLs were also within that range. Thus, it would seem that the RL is reaching a plateau, and, although one must be cautious, it would be difficult to attain sunspot number values greater than 550. The extreme value trends from the four series (yearly, monthly, and daily maxima per solar cycle, and POT after declustering the daily data) were analyzed with the Mann-Kendall test and Sen’s method. Only the negative trend of the daily data with the POT technique was statistically significant.
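
The GEV return-level formula underlying this kind of analysis can be sketched as follows; the parameter values are hypothetical, chosen only to show how a negative shape parameter implies a finite upper bound and a plateauing return level.

```python
import math

# GEV return level for a block-maxima fit (xi != 0):
#   RL(T) = mu + (sigma / xi) * (y**(-xi) - 1),  y = -ln(1 - 1/T).
# For xi < 0 the distribution is bounded above by mu - sigma / xi.
def gev_return_level(mu, sigma, xi, T):
    y = -math.log(1.0 - 1.0 / T)
    return mu + sigma / xi * (y ** (-xi) - 1.0)

# Hypothetical negative-shape fit (NOT the paper's fitted values).
mu, sigma, xi = 200.0, 60.0, -0.3
rl10 = gev_return_level(mu, sigma, xi, 10)    # 10-block return level
rl100 = gev_return_level(mu, sigma, xi, 100)  # 100-block return level
bound = mu - sigma / xi                       # finite upper bound
```

Return levels grow with the return period but can never exceed the bound, which is the plateau behavior the abstract describes.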

  11. The persistence of the large volumes in black holes

    NASA Astrophysics Data System (ADS)

    Ong, Yen Chin

    2015-08-01

    Classically, black holes admit maximal interior volumes that grow asymptotically linearly in time. We show that such volumes remain large when Hawking evaporation is taken into account. Even if a charged black hole approaches the extremal limit during this evolution, its volume continues to grow; although an exactly extremal black hole does not have a "large interior". We clarify this point and discuss the implications of our results to the information loss and firewall paradoxes.

  12. Distributed sensor coordination for advanced energy systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tumer, Kagan

    Motivation: The ability to collect key system-level information is critical to the safe, efficient, and reliable operation of advanced power systems. Recent advances in sensor technology have enabled some level of decision making directly at the sensor level. However, coordinating large numbers of sensors, particularly heterogeneous sensors, to achieve system-level objectives such as predicting plant efficiency, reducing downtime, or predicting outages requires sophisticated coordination algorithms. Indeed, a critical issue in such systems is how to ensure that the interactions of a large number of heterogeneous system components do not interfere with one another and lead to undesirable behavior. Objectives and Contributions: The long-term objective of this work is to provide sensor deployment, coordination, and networking algorithms for large numbers of sensors to ensure the safe, reliable, and robust operation of advanced energy systems. Our two specific objectives are to: 1. Derive sensor performance metrics for heterogeneous sensor networks. 2. Demonstrate the effectiveness, scalability, and reconfigurability of heterogeneous sensor networks in advanced power systems. The key technical contribution of this work is to push the coordination step into the design of the sensors' objective functions, allowing networks of heterogeneous sensors to be controlled. By ensuring that the control and coordination are not specific to particular sensor hardware, this approach enables the design and operation of large heterogeneous sensor networks. In addition to the coordination mechanism, this approach allows the system to be reconfigured in response to changing needs (e.g., sudden external events requiring new responses) or changing sensor network characteristics (e.g., sudden changes in plant condition).
Impact: The impact of this work extends to a large class of problems relevant to the National Energy Technology Laboratory, including sensor placement, heterogeneous sensor coordination, and sensor network control in advanced power systems. Each application has specific needs, but all share one crucial underlying problem: how to ensure that the interactions of a large number of heterogeneous agents lead to coordinated system behavior. This work describes a new paradigm that addresses that issue in a systematic way. Key Results and Findings: All milestones have been completed. Our results demonstrate that by properly shaping agent objective functions, we can develop large (up to 10,000 devices) heterogeneous sensor networks with key desirable properties. The first milestone shows that properly choosing agent-specific objective functions increases system performance by up to 99.9% compared to global evaluations. The second milestone shows that evolutionary algorithms learn excellent sensor network coordination policies prior to network deployment, and that these policies can be refined online once the network is deployed. The third milestone shows that the resulting sensor networks are extremely robust to sensor noise: networks with up to 25% sensor noise are capable of providing measurements with errors on the order of 10⁻³. The fourth milestone shows that the resulting sensor networks are extremely robust to sensor failure, with 25% of the sensors failing causing no significant performance loss after system reconfiguration.
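
The objective-shaping idea can be illustrated with the "difference reward" form common in this line of work (our toy illustration, not the project's exact metric): each sensor is credited with the global utility minus the utility computed with that sensor removed, so redundant sensors receive no credit and each sensor's local signal aligns with the system objective.

```python
# Toy difference-reward sketch for shaping sensor objective functions.
def global_utility(readings):
    # Toy system utility: number of distinct regions observed.
    return len(set(r for r in readings if r is not None))

def difference_reward(readings, i):
    # Credit sensor i with the utility it adds: G(z) - G(z without i).
    without_i = readings[:i] + [None] + readings[i + 1:]
    return global_utility(readings) - global_utility(without_i)

# Sensors 0 and 1 redundantly cover region "A"; 2 and 3 are unique.
readings = ["A", "A", "B", "C"]
d = [difference_reward(readings, i) for i in range(len(readings))]
```

The redundant sensors receive zero credit while the unique ones receive full credit, which is exactly the incentive structure needed for coordinated coverage.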

  13. Extremely correlated Fermi liquid theory of the t-J model in 2 dimensions: low energy properties

    NASA Astrophysics Data System (ADS)

    Shastry, B. Sriram; Mai, Peizhi

    2018-01-01

    Low-energy properties of the metallic state of the two-dimensional t-J model are presented for second-neighbor hopping with hole doping (t′ ≤ 0) and electron doping (t′ > 0), for various superexchange energies J. We use a closed set of equations for the Green's functions obtained from the extremely correlated Fermi liquid theory. These equations reproduce the known low-energy features of the large-U Hubbard model in infinite dimensions. The density- and temperature-dependent quasiparticle weight, decay rate, and peak spectral heights over the Brillouin zone are calculated. We also calculate the resistivity, Hall conductivity, Hall number, and cotangent Hall angle. The spectral features display high thermal sensitivity at modest T for density n ≳ 0.8, implying a suppression of the effective Fermi liquid temperature by two orders of magnitude relative to the bare bandwidth. The cotangent Hall angle exhibits T² behavior at low T, followed by an interesting kink at higher T. The Hall number exhibits strong renormalization due to correlations. Flipping the sign of t′ changes the curvature of the resistivity-versus-T curves between convex and concave. Our results provide a natural route for understanding the observed differences in the temperature-dependent resistivity of strongly correlated electron-doped and hole-doped matter.

  14. The Effects of Statistical Multiplicity of Infection on Virus Quantification and Infectivity Assays.

    PubMed

    Mistry, Bhaven A; D'Orsogna, Maria R; Chou, Tom

    2018-06-19

    Many biological assays are employed in virology to quantify parameters of interest. Two such classes of assays, virus quantification assays (VQAs) and infectivity assays (IAs), aim to estimate the number of viruses present in a solution and the ability of a viral strain to successfully infect a host cell, respectively. VQAs operate at extremely dilute concentrations, and results can be subject to stochastic variability in virus-cell interactions. At the other extreme, high viral-particle concentrations are used in IAs, resulting in large numbers of viruses infecting each cell, enough for a measurable change in total transcription activity. Furthermore, host cells can be infected in any concentration regime by multiple particles, resulting in a statistical multiplicity of infection and yielding potentially significant variability in the assay signal and parameter estimates. We develop probabilistic models for statistical multiplicity of infection at low and high viral-particle-concentration limits and apply them to the plaque (VQA), endpoint dilution (VQA), and luciferase reporter (IA) assays. A web-based tool implementing our models and analysis is also developed and presented. We test our proposed methods for inferring experimental parameters from data using numerical simulations and show improvement over existing procedures in all limits. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
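
A Poisson sketch of statistical multiplicity of infection (a toy version of the class of models described, not the paper's exact formulation): if particles land on cells independently with mean m per cell, the infected fraction is 1 − exp(−m), and the multiply-infected fraction grows rapidly in the high-concentration IA regime.

```python
import math

# Toy Poisson model of statistical multiplicity of infection.
def frac_infected(m):
    # Fraction of cells hit by at least one viral particle.
    return 1.0 - math.exp(-m)

def frac_multiple(m):
    # Fraction of cells hit by two or more particles -- the statistical
    # multiplicity that inflates signal in high-concentration assays.
    return 1.0 - math.exp(-m) * (1.0 + m)

low, high = 0.01, 5.0  # dilute VQA regime vs. concentrated IA regime
```

At m = 0.01 nearly every infected cell received exactly one particle, while at m = 5 almost all infected cells received several, which is why the two assay classes need different statistical treatments.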

  15. The role of categorization and scale endpoint comparisons in numerical information processing: A two-process model.

    PubMed

    Tao, Tao; Wyer, Robert S; Zheng, Yuhuang

    2017-03-01

    We propose a two-process conceptualization of numerical information processing to describe how people form impressions of a score that is described along a bounded scale. According to the model, people spontaneously categorize a score as high or low. Furthermore, they compare the numerical discrepancy between the score and the endpoint of the scale to which it is closer, if they are not confident of their categorization, and use implications of this comparison as a basis for judgment. As a result, their evaluation of the score is less extreme when the range of numbers along the scale is large (e.g., from 0 to 100) than when it is small (from 0 to 10). Six experiments support this two-process model and demonstrate its generalizability. Specifically, the magnitude of numbers composing the scale has less impact on judgments (a) when the score being evaluated is extreme, (b) when individuals are unmotivated to engage in endpoint comparison processes (i.e., they are low in need for cognition), and (c) when they are unable to do so (i.e., they are under cognitive load). Moreover, the endpoint to which individuals compare the score can depend on their regulatory focus. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Hydatid cyst of the uterus.

    PubMed Central

    Başgül, A; Kavak, Z N; Gökaslan, H; Küllü, S

    2002-01-01

    BACKGROUND: Hydatidosis is a common zoonosis that affects large numbers of humans and animals, especially in poorly developed countries. The infesting parasite has four forms, named Echinococcus granulosus, E. multilocularis, E. vogeli, and E. oligarthrus (very rare in humans). The most frequently involved organ is the liver, followed by the lung. Involvement of the genital tract is rare, and occurrence in the uterus is an extreme rarity. We report a case of hydatid cyst in the uterus. CASE: A 70-year-old female with a history of hydatid cysts of the liver was admitted to hospital complaining of low abdominal pain. On physical and gynecological examination, no pathological finding was detected; however, the uterus was significantly large for a postmenopausal patient. Transvaginal sonography revealed a cystic mass in the uterus measuring 7 x 6 cm. After further examination, a subtotal hysterectomy was performed. Microscopic examination showed scolices of Echinococcus granulosus. CONCLUSION: Hydatid cysts in the genital tract are rare, and occurrence in the uterus is an extreme rarity. Differentiation between a hydatid cyst and malignant disease of the affected organ is difficult. To avoid misdiagnosis, careful examination of pelvic masses should be carried out in endemic areas to detect hydatid cysts. PMID:12530482

  17. Climate, icing, and wild arctic reindeer: past relationships and future prospects.

    PubMed

    Hansen, Brage Bremset; Aanes, Ronny; Herfindal, Ivar; Kohler, Jack; Saether, Bernt-Erik

    2011-10-01

    Across the Arctic, heavy rain-on-snow (ROS) is an "extreme" climatic event that is expected to become increasingly frequent with global warming. This has potentially large ecosystem implications through changes in snowpack properties and ground-icing, which can block herbivores' access to winter food and thereby suppress their population growth rates. However, the supporting empirical evidence for this is still limited. We monitored late winter snowpack properties to examine the causes and consequences of ground-icing in a Svalbard reindeer (Rangifer tarandus platyrhynchus) metapopulation. In this high-arctic area, heavy ROS occurred annually, and ground-ice covered from 25% to 96% of low-altitude habitat in the sampling period (2000-2010). The extent of ground-icing increased with the annual number of days with heavy ROS (≥10 mm) and had a strong negative effect on reindeer population growth rates. Our results have important implications, as a downscaled climate projection (2021-2050) suggests a substantial future increase in ROS and icing. The present study is the first to demonstrate empirically that a warmer and wetter winter climate influences large herbivore population dynamics by generating ice-locked pastures. This may serve as an early warning of the importance of changes in winter climate and extreme weather events in arctic ecosystems.

  18. Spawning activity of the Australian lungfish Neoceratodus forsteri in an impoundment.

    PubMed

    Roberts, D T; Mallett, S; Krück, N C; Loh, W; Tibbetts, I

    2014-01-01

    This study assessed the spawning activity of the threatened Australian lungfish Neoceratodus forsteri by measuring egg densities within the artificial habitat of a large impoundment (Lake Wivenhoe, Australia). Eggs were sampled (August to November 2009) from multiple locations across the impoundment, but occurred at highest densities in water shallower than 40 cm along shorelines with a dense cover of submerged terrestrial vegetation. The numbers of eggs declined over the study period and all samples were dominated by early developmental stages and high proportions of unviable eggs. The quality of the littoral spawning habitats declined over the study as flooded terrestrial grasses decomposed and filamentous algae coverage increased. Water temperatures at the spawning site exhibited extreme variations, ranging over 20·4° C in water shallower than 5 cm. Dissolved oxygen concentrations regularly declined to <1 mg l⁻¹ at 40 and 80 cm water depth. Spawning habitats utilised by N. forsteri within impoundments expose embryos to increased risk of desiccation or excessive submergence through water-level variations, and extremes in temperature and dissolved oxygen concentration that present numerous challenges for successful spawning and recruitment of N. forsteri in large impoundment environments. © 2014 The Fisheries Society of the British Isles.

  19. A comparative analysis of support vector machines and extreme learning machines.

    PubMed

    Liu, Xueyi; Gao, Chuanhou; Li, Ping

    2012-09-01

    The theory of extreme learning machines (ELMs) has recently become increasingly popular. As a new learning algorithm for single-hidden-layer feed-forward neural networks, an ELM offers the advantages of low computational cost, good generalization ability, and ease of implementation. Hence the comparison and model selection between ELMs and other state-of-the-art machine learning approaches have become significant and have attracted many research efforts. This paper performs a comparative analysis of basic ELMs and support vector machines (SVMs) from two viewpoints that differ from previous works: one is the Vapnik-Chervonenkis (VC) dimension, and the other is their performance under different training sample sizes. It is shown that the VC dimension of an ELM is equal to the number of hidden nodes of the ELM with probability one. Additionally, their generalization ability and computational complexity are exhibited with changing training sample size. ELMs have weaker generalization ability than SVMs for small samples but can generalize as well as SVMs for large samples. Remarkably, ELMs show great superiority in computational speed, especially for large-scale problems. The results obtained can provide insight into the essential relationship between the two methods, and can also serve as complementary knowledge for their past experimental and theoretical comparisons. Copyright © 2012 Elsevier Ltd. All rights reserved.
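
    The basic ELM recipe the abstract refers to — a random, untrained hidden layer plus a least-squares output layer — can be sketched in a few lines. The function names, toy data, and network size below are illustrative, not from the paper:

```python
import numpy as np

def elm_train(X, y, n_hidden, rng):
    """Fit a basic ELM: random hidden weights, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights, never trained
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # only the output layer is solved for
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(200, 1))
y = np.sin(3.0 * X[:, 0])                         # toy regression target
W, b, beta = elm_train(X, y, n_hidden=50, rng=rng)
mse = float(np.mean((elm_predict(X, W, b, beta) - y) ** 2))
```

    Consistent with the abstract's VC-dimension result, the capacity of this model is set entirely by `n_hidden`, and training cost reduces to a single least-squares solve.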

  20. Efficient reconstruction method for ground layer adaptive optics with mixed natural and laser guide stars.

    PubMed

    Wagner, Roland; Helin, Tapio; Obereder, Andreas; Ramlau, Ronny

    2016-02-20

    The imaging quality of modern ground-based telescopes such as the planned European Extremely Large Telescope is affected by atmospheric turbulence. In consequence, they heavily depend on stable and high-performance adaptive optics (AO) systems. Using measurements of incoming light from guide stars, an AO system compensates for the effects of turbulence by adjusting so-called deformable mirror(s) (DMs) in real time. In this paper, we introduce a novel reconstruction method for ground layer adaptive optics. In the literature, a common approach to this problem is to use Bayesian inference in order to model the specific noise structure appearing due to spot elongation. This approach leads to large coupled systems with high computational effort. Recently, fast solvers of linear order, i.e., with computational complexity O(n), where n is the number of DM actuators, have emerged. However, the quality of such methods typically degrades in low flux conditions. Our key contribution is to achieve the high quality of the standard Bayesian approach while at the same time maintaining the linear order speed of the recent solvers. Our method is based on performing a separate preprocessing step before applying the cumulative reconstructor (CuReD). The efficiency and performance of the new reconstructor are demonstrated using the OCTOPUS, the official end-to-end simulation environment of the ESO for extremely large telescopes. For more specific simulations we also use the MOST toolbox.

  1. Spectral theory of extreme statistics in birth-death systems

    NASA Astrophysics Data System (ADS)

    Meerson, Baruch

    2008-03-01

    Statistics of rare events, or large deviations, in chemical reactions and systems of birth-death type have attracted a great deal of interest in many areas of science including cell biochemistry, astrochemistry, epidemiology, population biology, etc. Large deviations become of vital importance when the discrete (non-continuum) nature of a population of ``particles'' (molecules, bacteria, cells, animals or even humans) and the stochastic character of interactions can drive the population to extinction. I will briefly review the novel spectral method [1-3] for calculating the extreme statistics of a broad class of birth-death processes and reactions involving a single species. The spectral method combines the probability generating function formalism with the Sturm-Liouville theory of linear differential operators. It involves a controlled perturbative treatment based on a natural large parameter of the problem: the average number of particles/individuals in a stationary or metastable state. For extinction (first-passage) problems the method yields accurate results for the extinction statistics and for the quasi-stationary probability distribution, including the tails, of metastable states. I will demonstrate the power of the method on the example of a branching and annihilation reaction, A → 2A, 2A → 0, representative of a rather general class of processes. *M. Assaf and B. Meerson, Phys. Rev. Lett. 97, 200602 (2006). *M. Assaf and B. Meerson, Phys. Rev. E 74, 041115 (2006). *M. Assaf and B. Meerson, Phys. Rev. E 75, 031122 (2007).
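
    The branching-annihilation reaction A → 2A, 2A → 0 can be simulated directly with a Gillespie-type algorithm; this sketch (the rates, seed, and run length are arbitrary choices, not values from the talk) exhibits the long-lived metastable state whose extinction statistics the spectral method targets:

```python
import random

def gillespie(n0, b, a, t_max, rng):
    """Simulate A -> 2A (propensity b*n) and 2A -> 0 (propensity a*n*(n-1)/2)."""
    n, t = n0, 0.0
    while t < t_max and n > 0:
        r_branch = b * n
        r_annih = a * n * (n - 1) / 2.0
        total = r_branch + r_annih
        t += rng.expovariate(total)          # time to next reaction
        if rng.random() < r_branch / total:  # choose which reaction fires
            n += 1
        else:
            n -= 2
    return n

rng = random.Random(42)
# the quasi-stationary population is of order N = 2*b/a = 100 here; extinction
# takes a time exponentially long in N, so a short run stays near N
n_final = gillespie(n0=100, b=1.0, a=0.02, t_max=50.0, rng=rng)
```

    The large parameter N mentioned in the abstract is exactly this quasi-stationary population size, which controls both the fluctuations and the (exponentially long) mean extinction time.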

  2. The analysis of ensembles of moderately saturated interstellar lines

    NASA Technical Reports Server (NTRS)

    Jenkins, E. B.

    1986-01-01

    It is shown that the combined equivalent widths for a large population of Gaussian-like interstellar line components, each with different central optical depths tau(0) and velocity dispersions b, exhibit a curve of growth (COG) which closely mimics that of a single, pure Gaussian distribution in velocity. Two parametric distribution functions for the line populations are considered: a bivariate Gaussian for tau(0) and b, and a power law distribution for tau(0) combined with a Gaussian dispersion for b. First, COGs for populations having an extremely large number of nonoverlapping components are derived, and the implications are shown by focusing on the doublet-ratio analysis for a pair of lines whose f-values differ by a factor of two. The consequences of having, instead of an almost infinite number of lines, a relatively small collection of components added together for each member of a doublet are examined. The theory of how the equivalent widths grow for populations of overlapping Gaussian profiles is developed. Examples of the composite COG analysis applied to existing collections of high-resolution interstellar line data are presented.
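
    The equivalent width of a blend of Gaussian-like components follows from W = ∫ (1 − e^(−τ(v))) dv, with the optical depths of overlapping components summed before exponentiating. A small numerical sketch (the component parameters are invented for illustration) shows the saturation behavior underlying the doublet-ratio analysis:

```python
import numpy as np

def equivalent_width(v, tau0s, centers, bs):
    """W = integral of (1 - exp(-tau(v))) dv, tau(v) a sum of Gaussian components."""
    tau = np.zeros_like(v)
    for t0, c, b in zip(tau0s, centers, bs):
        tau += t0 * np.exp(-((v - c) / b) ** 2)
    return float(np.sum(1.0 - np.exp(-tau)) * (v[1] - v[0]))

v = np.linspace(-200.0, 200.0, 4001)                    # velocity grid, km/s
w_weak = equivalent_width(v, [2.0, 3.0], [-30.0, 25.0], [8.0, 6.0])
w_strong = equivalent_width(v, [4.0, 6.0], [-30.0, 25.0], [8.0, 6.0])
doublet_ratio = w_strong / w_weak   # < 2: the components are partially saturated
```

    For the pair of lines whose f-values differ by a factor of two, doubling tau(0) yields a ratio below 2 exactly when the components are saturated — the signature exploited in the doublet-ratio analysis.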

  3. Zebrafish as a systems toxicology model for developmental neurotoxicity testing.

    PubMed

    Nishimura, Yuhei; Murakami, Soichiro; Ashikawa, Yoshifumi; Sasagawa, Shota; Umemoto, Noriko; Shimada, Yasuhito; Tanaka, Toshio

    2015-02-01

    The developing brain is extremely sensitive to many chemicals. Exposure to neurotoxicants during development has been implicated in various neuropsychiatric and neurological disorders, including autism spectrum disorder, attention deficit hyperactive disorder, schizophrenia, Parkinson's disease, and Alzheimer's disease. Although rodents have been widely used for developmental neurotoxicity testing, experiments using large numbers of rodents are time-consuming, expensive, and raise ethical concerns. Using alternative non-mammalian animal models may relieve some of these pressures by allowing testing of large numbers of subjects while reducing expenses and minimizing the use of mammalian subjects. In this review, we discuss some of the advantages of using zebrafish in developmental neurotoxicity testing, focusing on central nervous system development, neurobehavior, toxicokinetics, and toxicodynamics in this species. We also describe some important examples of developmental neurotoxicity testing using zebrafish combined with gene expression profiling, neuroimaging, or neurobehavioral assessment. Zebrafish may be a systems toxicology model that has the potential to reveal the pathways of developmental neurotoxicity and to provide a sound basis for human risk assessments. © 2014 Japanese Teratology Society.

  4. Magnetic nanorings and manipulation of nanowires

    NASA Astrophysics Data System (ADS)

    Chien, C. L.

    2006-03-01

    The properties of nanoscale entities, such as nanorings and nanowires, and the response of such entities to external fields are dictated by their geometrical shapes and sizes, which can be manipulated by fabrication. We have developed a method for fabricating a large number of nanorings (10^10) with different sizes in the range of 100 nm and different ring cross sections. During magnetic reversal, both the vortex state and the rotating onion state appear with different proportions, which depend on the ring diameter, ring cross section, and the profile of the ring cross section. In the case of nanowires in suspension, the large aspect ratio of the nanowires can be exploited for manipulation despite extremely small Reynolds numbers of 10^-5. Using an AC electric field applied to microelectrodes, both magnetic and non-magnetic nanowires can be efficiently assembled into desired patterns. We also demonstrate rotation of nanowires with precisely controlled rotation speed and chirality, as well as an electrically driven nanowire micromotor a few micrometers in size. In collaboration with F. Q. Zhu, D. L. Fan, O. Tchernyshyov, R. C. Cammarata (Johns Hopkins University) and X. C. Zhu and J. G. Zhu (Carnegie-Mellon University).

  5. An origin of good electrical conduction in La{sub 4}BaCu{sub 5}O{sub 13+δ}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mori, Daiki; Asai, Shinichiro; Terasaki, Ichiro, E-mail: terra@cc.nagoya-u.ac.jp

    2015-07-21

    We have prepared a set of polycrystalline samples of the metallic copper oxide La{sub 4}BaCu{sub 5−x}Co{sub x}O{sub 13+δ} (0 ≤ x ≤ 0.35) and have measured the resistivity from 4 to 800 K. All the resistivities show metallic temperature dependence with a small magnitude less than 2 mΩ cm at 800 K, indicating that the metallic conduction is robust against impurities. The robust metallic conduction further suggests that this class of oxide is a promising candidate for electrical leads at high temperature, which might replace platinum. A detailed measurement and analysis of the Hall resistivity have revealed that at least two components are responsible for the electrical conduction, in which a large number of electrons of moderate mobility coexist with a much smaller number of holes of extremely high mobility. This large electron density screens the impurity potential well and retains the metallic conduction against 7% impurity doping.

  6. The numerical solution of the Helmholtz equation for wave propagation problems in underwater acoustics

    NASA Technical Reports Server (NTRS)

    Bayliss, A.; Goldstein, C. I.; Turkel, E.

    1984-01-01

    The Helmholtz equation (-Δ - k²n²)u = 0, with a variable index of refraction, n, and a suitable radiation condition at infinity, serves as a model for a wide variety of wave propagation problems. A numerical algorithm was developed and a computer code implemented that can effectively solve this equation in the intermediate frequency range. The equation is discretized using the finite element method, thus allowing for the modeling of complicated geometries (including interfaces) and complicated boundary conditions. A global radiation boundary condition is imposed at the far field boundary that is exact for an arbitrary number of propagating modes. The resulting large, non-selfadjoint system of linear equations with indefinite symmetric part is solved using the preconditioned conjugate gradient method applied to the normal equations. A new preconditioner is developed based on the multigrid method. This preconditioner is vectorizable and is extremely effective over a wide range of frequencies, provided the number of grid levels is reduced for large frequencies. A heuristic argument is given that indicates the superior convergence properties of this preconditioner.
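
    The abstract's setting is a 2-D/3-D finite-element discretization with a multigrid-preconditioned iterative solver; as a much smaller illustration of the same kind of discrete problem, here is a 1-D finite-difference Helmholtz solve with a first-order outgoing radiation condition (all parameters are illustrative, and this is not the paper's method):

```python
import numpy as np

# Solve (-d2/dx2 - k^2) u = 0 on [0, L] with u(0) = 1 and a first-order
# outgoing radiation condition u'(L) = i k u(L); exact solution u = exp(i k x).
k, L, N = 10.0, 1.0, 400
h = L / N
x = np.linspace(0.0, L, N + 1)
A = np.zeros((N + 1, N + 1), dtype=complex)
rhs = np.zeros(N + 1, dtype=complex)
A[0, 0] = 1.0
rhs[0] = 1.0                                   # Dirichlet condition at x = 0
for j in range(1, N):
    A[j, j - 1] = A[j, j + 1] = -1.0 / h**2    # standard 3-point Laplacian
    A[j, j] = 2.0 / h**2 - k**2                # the -k^2 shift makes it indefinite
A[N, N - 1] = -1.0 / h                         # one-sided radiation condition:
A[N, N] = 1.0 / h - 1j * k                     # (u_N - u_{N-1})/h = i k u_N
u = np.linalg.solve(A, rhs)
max_err = float(np.max(np.abs(u - np.exp(1j * k * x))))
```

    The interior rows show where the indefiniteness comes from (the −k² shift), which is what makes unpreconditioned iterative methods struggle and motivates the multigrid preconditioner discussed above.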

  7. Diversity and community structure of fungi through a permafrost core profile from the Qinghai-Tibet Plateau of China.

    PubMed

    Hu, Weigang; Zhang, Qi; Li, Dingyao; Cheng, Gang; Mu, Jing; Wu, Qingbai; Niu, Fujun; An, Lizhe; Feng, Huyuan

    2014-12-01

    While a vast number of studies have addressed the prokaryotic diversity in permafrost, characterized by subzero temperatures, low water activity, and extremely low rates of nutrient and metabolite transfer, fungal patterns have received surprisingly limited attention. Here, fungal diversity and community structure were investigated by a culture-dependent technique combined with cloning-restriction fragment length polymorphism (RFLP) analysis of sediments in a 10-m-long permafrost core from the Qinghai-Tibet Plateau of China. A total of 62 fungal phylotypes related to 10 distinct classes representing three phyla were recovered from 5031 clones generated in 13 environmental gene libraries. A large proportion of the phylotypes (25/62) were only distantly related to described fungal species and appeared to represent novel diversity. Ascomycota was the predominant group of fungi with respect to both clone and phylotype number. Our results suggest that cosmopolitan psychrophilic or psychrotolerant fungi exist in permafrost sediments and that fungal community composition varied with increasing depth, with communities largely distributed according to core layers. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. A stochastic storm surge generator for the German North Sea and the multivariate statistical assessment of the simulation results

    NASA Astrophysics Data System (ADS)

    Wahl, Thomas; Jensen, Jürgen; Mudersbach, Christoph

    2010-05-01

    Storm surges along the German North Sea coastline led to major damages in the past and the risk of inundation is expected to increase in the course of ongoing climate change. Knowledge of the characteristics of possible storm surges is essential for performing integrated risk analyses, e.g. based on the source-pathway-receptor concept. The latter includes storm surge simulation/analyses (source), modelling of dike/dune breach scenarios (pathway) and the quantification of potential losses (receptor). In subproject 1b of the German joint research project XtremRisK (www.xtremrisk.de), a stochastic storm surge generator for the south-eastern North Sea area is developed. The input data for the multivariate model are high resolution sea level observations from tide gauges during extreme events. Based on 25 parameters (19 sea level parameters and 6 time parameters), observed storm surge hydrographs consisting of three tides are parameterised. After fitting common parametric probability distributions and running a large number of Monte Carlo simulations, the final reconstruction leads to a set of 100,000 (default) synthetic storm surge events with a one-minute resolution. Such a data set can potentially serve as the basis for a large number of applications. For risk analyses, storm surges with peak water levels exceeding the design water levels are of special interest. The occurrence probabilities of the simulated extreme events are estimated based on multivariate statistics, considering the parameters "peak water level" and "fullness/intensity". In the past, most studies considered only the peak water levels during extreme events, which might not be the most important parameter in all cases. Here, a 2D-Archimedean copula model is used to estimate the joint probabilities of the selected parameters, accounting for the dependence structure independently of the margins.
In coordination with subproject 1a, the results will be used as the input for the XtremRisK subprojects 2 to 4. The project is funded by the German Federal Ministry of Education and Research (BMBF) (Project No. 03 F 0483 B).
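
    The joint exceedance probability of two surge parameters under an Archimedean copula can be computed directly from the copula function. This sketch uses a Gumbel copula as one common Archimedean choice; the abstract does not specify which family or parameter value the project used, so both are assumptions for illustration:

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel (Archimedean) copula C(u, v); theta = 1 gives independence."""
    return math.exp(-((-math.log(u)) ** theta
                      + (-math.log(v)) ** theta) ** (1.0 / theta))

def joint_exceedance(u, v, theta):
    """P(U > u, V > v) = 1 - u - v + C(u, v), with uniform margins U, V."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

# both "peak water level" and "fullness/intensity" above their 99th percentiles
p_indep = joint_exceedance(0.99, 0.99, theta=1.0)  # independence: 0.01 * 0.01
p_dep = joint_exceedance(0.99, 0.99, theta=3.0)    # positive dependence
```

    Ignoring the dependence between the two parameters (θ = 1) underestimates the joint exceedance probability considerably, which is why the margins alone are not sufficient for the risk analysis.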

  9. PROBABILITIES OF TEMPERATURE EXTREMES IN THE U.S.

    EPA Science Inventory

    The model Temperature Extremes Version 1.0 provides the capability to estimate the probability, for 332 locations in the 50 U.S. states, that an extreme temperature will occur for one or more consecutive days and/or for any number of days in a given month or season, based on stat...

  10. Shift of extreme spring streamflow on the Belorussian rivers and its association with changes of cyclonic activity over Eastern Europe

    NASA Astrophysics Data System (ADS)

    Partasenok, Irina; Chekan, Gregory

    2014-05-01

    The intra-annual distribution of precipitation is the most variable component of the water resources of Belarus. This distribution is controlled by extratropical cyclones from the Atlantic Ocean and the Mediterranean that bring most of the precipitation to the nation. The aim of our study was therefore to quantify major characteristics of these cyclones and to estimate the effects of their passage through Belorussian territory on the regional water budget, including floods and low water conditions. We documented the long-term fluctuations of streamflow and the occurrence of extreme phenomena on the rivers of Belarus during the post-World War II period. It was established that the annual water budget of the nation varies from year to year without systematic tendencies. At the same time, analysis of the intra-annual distribution of streamflow reveals significant changes since the 1970s: an increase in winter and a decrease in spring runoff. As a result, the frequency of extreme spring floods has decreased. These changes in water regime are associated with climatic anomalies caused by large-scale alterations in atmospheric circulation, specifically in the trajectories of cyclones. As a manifestation of these circulation changes, we observe an increase in surface air temperatures, more frequent cold season thaws, a redistribution of seasonal precipitation totals, and a decrease in the fraction of frozen precipitation in the shoulder seasons. Analysis of cyclonic activity over Belarus during the past 60 years in the cold season (December through February) shows the largest number of cyclones in 1950-1970. During this period, the largest number of spring floods caused by snowmelt on the rivers of Belarus was reported. Since 1970, we observe a decrease in the total number of cyclones but also an increasing strength (deepening) of the remaining cyclones in the cold season. That has led to some precipitation increase. 
During the last four decades, more frequent zonal air movement in the atmosphere and substantial surface air temperature increase in the winter season provoked the prevalence of winter thaw conditions. The thaws interfered with accumulation of snowpack before the beginning of spring snowmelt and promoted decrease in the number of spring floods on the rivers of Belarus.

  11. Spectroscopy of molecules in very high rotational states using an optical centrifuge.

    PubMed

    Yuan, Liwei; Toro, Carlos; Bell, Mack; Mullin, Amy S

    2011-01-01

    We have developed a high power optical centrifuge for measuring the spectroscopy of molecules in extreme rotational states. The optical centrifuge has a pulse energy that is more than 2 orders of magnitude greater than in earlier instruments. The large pulse energy allows us to drive substantial number densities of molecules to extreme rotational states in order to measure new spectroscopic transitions that are not accessible with traditional methods. Here we demonstrate the use of the optical centrifuge for measuring IR transitions of N2O from states that have been inaccessible until now. In these studies, the optical centrifuge drives N2O molecules into states with J ~ 200 and we use high resolution transient IR probing to measure the appearance of population in states with J = 93-99 that result from collisional cooling of the centrifuged molecules. High resolution Doppler broadened line profile measurements yield information about the rotational and translational energy distributions in the optical centrifuge.

  12. Replica and extreme-value analysis of the Jarzynski free-energy estimator

    NASA Astrophysics Data System (ADS)

    Palassini, Matteo; Ritort, Felix

    2008-03-01

    We analyze the Jarzynski estimator of free-energy differences from nonequilibrium work measurements. By a simple mapping onto Derrida's Random Energy Model, we obtain a scaling limit for the expectation of the bias of the estimator. We then derive analytical approximations in three different regimes of the scaling parameter x = log(N)/W, where N is the number of measurements and W the mean dissipated work. Our approach is valid for a generic distribution of the dissipated work, and is based on a replica symmetry breaking scheme for x >> 1, the asymptotic theory of extreme value statistics for x << 1, and a direct approach for x near one. The combination of the three analytic approximations describes well Monte Carlo data for the expectation value of the estimator, for a wide range of values of N, from N=1 to large N, and for different work distributions. Based on these results, we introduce improved free-energy estimators and discuss the application to the analysis of experimental data.
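
    The Jarzynski estimator itself is one line: ΔF̂ = −ln⟨e^(−W)⟩_N (in units of kT). A Monte Carlo sketch with a Gaussian work distribution — for which the exact answer is ΔF = μ − σ²/2 — illustrates the N-dependent bias analyzed above; all numerical values are illustrative:

```python
import math
import random

def jarzynski_estimate(works):
    """dF_hat = -ln( (1/N) * sum(exp(-W_i)) ), with W measured in units of kT."""
    return -math.log(sum(math.exp(-w) for w in works) / len(works))

rng = random.Random(1)
mu, sigma = 3.0, 2.0
true_dF = mu - sigma**2 / 2.0        # exact result for a Gaussian work distribution

def mean_bias(n_samples, n_reps=200):
    """Average bias of the estimator over many repeated 'experiments'."""
    est = [jarzynski_estimate([rng.gauss(mu, sigma) for _ in range(n_samples)])
           for _ in range(n_reps)]
    return sum(est) / n_reps - true_dF

bias_few = mean_bias(5)      # few work measurements: the estimator is biased high
bias_many = mean_bias(500)   # more measurements: the bias shrinks
```

    In the notation of the abstract, increasing N at fixed mean dissipated work moves the scaling parameter x = log(N)/W upward, and the positive bias of the estimator decreases accordingly.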

  13. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers.

    PubMed

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems.

  14. Characterization of Halophilic Bacterial Communities in Turda Salt Mine (Romania)

    NASA Astrophysics Data System (ADS)

    Carpa, Rahela; Keul, Anca; Muntean, Vasile; Dobrotă, Cristina

    2014-09-01

    Halophilic organisms are having adaptations to extreme salinity, the majority of them being Archaean, which have the ability to grow at extremely high salt concentrations, (from 3 % to 35 %). Level of salinity causes natural fluctuations in the halophilic populations that inhabit this particular habitat, raising problems in maintaining homeostasis of the osmotic pressure. Samples such as salt and water taken from Turda Salt Mine were analyzed in order to identify the eco-physiological bacterial groups. Considering the number of bacteria of each eco-physiological group, the bacterial indicators of salt quality (BISQ) were calculated and studied for each sample. The phosphatase, catalase and dehydrogenases enzymatic activities were quantitatively determined and the enzymatic indicators of salt quality (EISQ) were calculated. Bacterial isolates were analyzed using 16S rRNA gene sequence analysis. Universal bacterial primers, targeting the consensus region of the bacterial 16S rRNA gene were used. Analysis of a large fragment, of 1499 bp was performed to improve discrimination at the species level.

  15. Extremely Scalable Spiking Neuronal Network Simulation Code: From Laptops to Exascale Computers

    PubMed Central

    Jordan, Jakob; Ippen, Tammo; Helias, Moritz; Kitayama, Itaru; Sato, Mitsuhisa; Igarashi, Jun; Diesmann, Markus; Kunkel, Susanne

    2018-01-01

    State-of-the-art software tools for neuronal network simulations scale to the largest computing systems available today and enable investigations of large-scale networks of up to 10 % of the human cortex at a resolution of individual neurons and synapses. Due to an upper limit on the number of incoming connections of a single neuron, network connectivity becomes extremely sparse at this scale. To manage computational costs, simulation software ultimately targeting the brain scale needs to fully exploit this sparsity. Here we present a two-tier connection infrastructure and a framework for directed communication among compute nodes accounting for the sparsity of brain-scale networks. We demonstrate the feasibility of this approach by implementing the technology in the NEST simulation code and we investigate its performance in different scaling scenarios of typical network simulations. Our results show that the new data structures and communication scheme prepare the simulation kernel for post-petascale high-performance computing facilities without sacrificing performance in smaller systems. PMID:29503613

  16. Extreme brain events: Higher-order statistics of brain resting activity and its relation with structural connectivity

    NASA Astrophysics Data System (ADS)

    Amor, T. A.; Russo, R.; Diez, I.; Bharath, P.; Zirovich, M.; Stramaglia, S.; Cortes, J. M.; de Arcangelis, L.; Chialvo, D. R.

    2015-09-01

    The brain exhibits a wide variety of spatiotemporal patterns of neuronal activity recorded using functional magnetic resonance imaging as the so-called blood-oxygen-level-dependent (BOLD) signal. An active area of work includes efforts to best describe the plethora of these patterns evolving continuously in the brain. Here we explore the third-moment statistics of the brain BOLD signals in the resting state as a proxy to capture extreme BOLD events. We find that the brain signal typically exhibits nonzero skewness, with positive values for cortical regions and negative values for subcortical regions. Furthermore, the combined analysis of structural and functional connectivity demonstrates that relatively more connected regions exhibit activity with high negative skewness. Overall, these results highlight the relevance of recent results emphasizing that the spatiotemporal location of the relatively large-amplitude events in the BOLD time series contains relevant information to reproduce a number of features of the brain dynamics during resting state in health and disease.

  17. Studies of Coronae and Large Volcanoes on Venus: Constraining the Diverse Outcomes of Small-Scale Mantle Upwellings on Venus

    NASA Technical Reports Server (NTRS)

    Stofan, Ellen R.

    2005-01-01

    Proxemy Research had a grant from NASA to perform science research on upwelling and volcanism on Venus. This was a 3 year Planetary Geology and Geophysics grant to E. Stofan, entitled Coronae and Large volcanoes on Venus. This grant closes on 12/31/05. Here we summarize the scientific progress and accomplishments of this grant. Scientific publications and abstracts of presentations are indicated in the final section. This was a very productive grant and the progress that was made is summarized. Attention is drawn to the publications and abstracts published in each year. The proposal consisted of two tasks, one examining coronae and one studying large volcanoes. The corona task (Task 1) consisted of three parts: 1) a statistical study of the updated corona population, with Sue Smrekar, Lori Glaze, Paula Martin and Steve Baloga; 2) geologic analysis of several specific groups of coronae, with Sue Smrekar and others; and 3) determining the histories and significance of a number of coronae with extreme amounts of volcanism, with Sue Smrekar. Task 2, studies of large volcanoes, consisted of two subtasks. In the first, we studied the geologic history of several volcanoes, with John Guest, Peter Grindrod, Antony Brian and Steve Anderson. In the second subtask, I analyzed a number of Venusian volcanoes with evidence of summit diking along with Peter Grindrod and Francis Nimmo.

  18. Experimental Study of Homogeneous Isotropic Slowly-Decaying Turbulence in Giant Grid-Wind Tunnel Set Up

    NASA Astrophysics Data System (ADS)

    Aliseda, Alberto; Bourgoin, Mickael; Eswirp Collaboration

    2014-11-01

    We present preliminary results from a recent grid turbulence experiment conducted at the ONERA wind tunnel in Modane, France. The ESWIRP Collaboration was conceived to probe the smallest scales of a canonical turbulent flow at very high Reynolds numbers. To achieve this, the largest scales of the turbulence need to be extremely large so that, even with the large separation of scales, the smallest scales remain well above the spatial and temporal resolution of the instruments. The ONERA wind tunnel in Modane (8 m diameter test section) was chosen as the practical limit of the largest scales achievable in a laboratory setting. A giant inflatable grid (M = 0.8 m) was conceived to induce slowly-decaying homogeneous isotropic turbulence in a large region of the test section, with minimal structural risk. An international team of researchers collected hot wire anemometry, ultrasound anemometry, resonant cantilever anemometry, fast pitot tube anemometry, cold wire thermometry and high-speed particle tracking data of this canonical turbulent flow. While analysis of this large database, which will become publicly available over the next 2 years, has only started, the Taylor-scale Reynolds number is estimated to be between 400 and 800, with Kolmogorov scales as large as a few mm. The ESWIRP Collaboration is an international team of scientists formed to investigate experimentally the smallest scales of turbulence. It was funded by the European Union to take advantage of the largest wind tunnel in Europe for fundamental research.

  19. Hot days induced by precipitation deficits at the global scale

    PubMed Central

    Mueller, Brigitte; Seneviratne, Sonia I.

    2012-01-01

    Global warming increases the occurrence probability of hot extremes, and improving the predictability of such events is thus becoming of critical importance. Hot extremes have been shown to be induced by surface moisture deficits in some regions. In this study, we assess whether such a relationship holds at the global scale. We find that wide areas of the world display a strong relationship between the number of hot days in the regions’ hottest month and preceding precipitation deficits. The occurrence probability of an above-average number of hot days is over 70% after precipitation deficits in most parts of South America as well as the Iberian Peninsula and Eastern Australia, and over 60% in most of North America and Eastern Europe, while it is below 30–40% after wet conditions in these regions. Using quantile regression analyses, we show that the impact of precipitation deficits on the number of hot days is asymmetric, i.e. extreme high numbers of hot days are most strongly influenced. This relationship also applies to the 2011 extreme event in Texas. These findings suggest that effects of soil moisture-temperature coupling are geographically more widespread than commonly assumed. PMID:22802672

  20. The Number Density Evolution of Extreme Emission Line Galaxies in 3D-HST: Results from a Novel Automated Line Search Technique for Slitless Spectroscopy

    NASA Astrophysics Data System (ADS)

    Maseda, Michael V.; van der Wel, Arjen; Rix, Hans-Walter; Momcheva, Ivelina; Brammer, Gabriel B.; Franx, Marijn; Lundgren, Britt F.; Skelton, Rosalind E.; Whitaker, Katherine E.

    2018-02-01

    The multiplexing capability of slitless spectroscopy is a powerful asset in creating large spectroscopic data sets, but issues such as spectral confusion make the interpretation of the data challenging. Here we present a new method to search for emission lines in the slitless spectroscopic data from the 3D-HST survey utilizing the Wide-Field Camera 3 on board the Hubble Space Telescope. Using a novel statistical technique, we can detect compact (extended) emission lines at 90% completeness down to fluxes of 1.5 (3.0) × 10⁻¹⁷ erg s⁻¹ cm⁻², close to the noise level of the grism exposures, for objects detected in the deep ancillary photometric data. Unlike previous methods, the Bayesian nature allows for probabilistic line identifications, namely redshift estimates, based on secondary emission line detections and/or photometric redshift priors. As a first application, we measure the comoving number density of Extreme Emission Line Galaxies (rest-frame [O III] λ5007 equivalent widths in excess of 500 Å). We find that these galaxies are nearly 10× more common above z ∼ 1.5 than at z ≲ 0.5. With upcoming large grism surveys such as Euclid and WFIRST, as well as grisms featured prominently on the NIRISS and NIRCam instruments on the James Webb Space Telescope, methods like the one presented here will be crucial for constructing emission line redshift catalogs in an automated and well-understood manner. This work is based on observations taken by the 3D-HST Treasury Program and the CANDELS Multi-Cycle Treasury Program with the NASA/ESA HST, which is operated by the Association of Universities for Research in Astronomy, Inc., under NASA contract NAS5-26555.

  1. Climate projection of synoptic patterns forming extremely high wind speed over the Barents Sea

    NASA Astrophysics Data System (ADS)

    Surkova, Galina; Krylov, Aleksey

    2017-04-01

The frequency of extreme weather events is low, but their consequences for human well-being may be hazardous. These rare events are not always well simulated by climate models directly; it is sometimes more effective to analyze numerical projections of the large-scale synoptic patterns that generate the extreme weather. In mid-latitudes, for example, surface wind speed depends mainly on the sea level pressure (SLP) field, i.e., its configuration and horizontal pressure gradient. This idea was applied to the analysis of extreme wind speed events over the Barents Sea. A calendar of high surface wind speeds V (10 m above the surface) was prepared for events with V exceeding the 99th percentile in the central part of the Barents Sea. The probability distribution of V was analyzed using ERA-Interim reanalysis data (6-hourly, 0.75 x 0.75 degrees latitude-longitude) for the period 1981-2010, yielding 240 storm-wind days. The sea level pressure field over the sea and surrounding area was extracted for each storm wind event. For the future climate (scenario RCP8.5), SLP projections from CMIP5 numerical experiments were used: projected SLP fields (2006-2100) from more than 20 climate models were correlated with the present-day storm-wind SLP fields. Our calculations show a positive trend in the annual frequency of storm SLP patterns over the Barents Sea by the end of the 21st century.
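The event-selection step (flagging days whose 6-hourly wind speed exceeds the 99th percentile) can be sketched as follows; the data here are synthetic, whereas the study uses ERA-Interim values:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a 30-year, 6-hourly 10-m wind-speed series
# (ERA-Interim-like sampling: four values per day).
n_days = 30 * 365
speeds = rng.weibull(2.0, size=n_days * 4) * 8.0  # Weibull: a common wind-speed model

# Threshold: the 99th percentile of the full series.
threshold = np.percentile(speeds, 99)

# Count "storm days": calendar days with at least one 6-hourly exceedance.
daily = speeds.reshape(n_days, 4)
storm_days = int(np.sum((daily > threshold).any(axis=1)))

print(threshold, storm_days)  # a few hundred storm days over 30 years
```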

  2. Extreme Binge Drinking among 12th-Grade Students in the U.S.: Prevalence and Predictors

    PubMed Central

    Patrick, Megan E.; Schulenberg, John E.; Martz, Meghan E.; Maggs, Jennifer L.; O’Malley, Patrick M.; Johnston, Lloyd

    2013-01-01

    Importance The prevalence of underage alcohol use has been studied extensively but binge drinking among youth in the U.S. is not yet well understood. In particular, adolescents may drink much larger amounts than the threshold (5 drinks) often used in definitions of binge drinking. Delineating various levels of binge drinking, including extreme levels, and understanding predictors of such extreme binge drinking among adolescents will benefit public health efforts. Objective To examine the prevalence and predictors of 5+ binge drinking and of 10+ and 15+ extreme binge drinking among 12th graders in the U.S. Design A non-clinical nationally representative sample. Setting High school seniors in the annual Monitoring the Future study between 2005 and 2011. Participants The sample included 16,332 12th graders (modal age 18) in the U.S. Response rates were 79–85%. Main Outcome Measures Prevalence of consuming 5+, 10+, and 15+ drinks in a row in the past two weeks. Results Between 2005 and 2011, 20.2% of high school seniors reported 5+ binge drinking, 10.5% reported 10+ extreme binge drinking, and 5.6% reported 15+ extreme binge drinking in the past 2 weeks. Rates of 5+ binge drinking and 10+ extreme binge drinking have declined since 2005, but rates of 15+ extreme binge drinking have not. Students with college-educated parents were more likely to consume 5+ drinks but less likely to consume 15+ drinks than students whose parents were not college educated. Students from more rural areas were more likely than students from large metropolitan areas to drink 15+ drinks. Socializing with substance-using peers, number of evenings out with friends, substance-related attitudes, and other substance use (cigarettes, marijuana) predicted all three levels of binge and extreme binge drinking. Conclusions Binge drinking at the traditionally defined 5+ drinking level was common among high school seniors representative of all 12th graders in the contiguous U.S. 
A significant segment of students also reported extreme binge drinking at levels two and three times higher. These data suggest the importance of assessing multiple levels of binge drinking behavior and their predictors among adolescents in order to target effective screening and intervention efforts. PMID:24042318

  3. A Mitogenomic Phylogeny of Living Primates

    PubMed Central

    Finstermeier, Knut; Zinner, Dietmar; Brameier, Markus; Meyer, Matthias; Kreuz, Eva; Hofreiter, Michael; Roos, Christian

    2013-01-01

Primates, the mammalian order including our own species, comprise 480 species in 78 genera. Thus, they represent the third largest of the 18 orders of eutherian mammals. Although recent phylogenetic studies on primates are increasingly built on molecular datasets, most have focused on taxonomic subgroups within the order. Complete mitochondrial (mt) genomes have proven extremely useful in deciphering within-order relationships, even up to deep nodes. Using 454 sequencing, we sequenced 32 new complete mt genomes, adding 20 previously unrepresented genera to the phylogenetic reconstruction of the primate tree. With 13 new sequences, the number of complete mt genomes within the parvorder Platyrrhini was greatly extended, resulting in a largely resolved branching pattern among New World monkey families. We added 10 new Strepsirrhini mt genomes to the 15 previously available, thus almost doubling the number of mt genomes within this clade. Our data allow precise date estimates for all nodes and offer new insights into primate evolution. One major result is a relatively young date for the most recent common ancestor of all living primates, estimated at 66-69 million years ago, suggesting that the divergence of extant primates started close to the K/T boundary. Although some relationships remain unclear, the large number of mt genomes used allowed us to reconstruct a robust primate phylogeny that is largely in agreement with previous publications. Finally, we show that mt genomes are a useful tool for resolving primate phylogenetic relationships at various taxonomic levels. PMID:23874967

  4. Heat Budget of Large Rivers: Sensitivity to Stream Morphology

    NASA Astrophysics Data System (ADS)

    Lancaster, S. T.; Haggerty, R.

    2014-12-01

In order to assess the feasibility of effecting measurable changes in the heat budget of a large river through restoration, we use a numerical model to analyze the sensitivity of that heat budget to morphological manipulations, specifically those resulting in a narrower main channel with more alcoves. We base model parameters primarily on the gravel-bedded middle Snake River near Marsing, Idaho. The heat budget is represented by an advection-dispersion-reaction equation with, in addition to radiative, evaporative, and sensible heat fluxes, a hyporheic flux term that models lateral flow from the main stream, through bars, and into alcoves and side channels. This term effectively introduces linear dispersion of water temperatures with respect to time, so that the magnitude of the hyporheic term in the heat budget is expected to scale with the "hyporheic number," defined in terms of a dimensionless hyporheic flow rate and a dimensionless mean residence time of water entering the hyporheic zone. Simulations varying the parameters for channel width and hyporheic flow indicate that, for a large river such as the middle Snake River, feasible changes in channel width would produce downstream changes in heat flux an order of magnitude larger than would relatively extreme changes in hyporheic number. Changes, such as reduced channel width and increased hyporheic number, that tend to reduce temperatures in the summer, when temperatures are increasing with time and downstream distance, actually tend to increase temperatures in the fall, when temperatures are decreasing with time and distance.
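The abstract does not reproduce the governing equation. As a hedged illustration only, a generic one-dimensional advection-dispersion-reaction heat budget with an added hyporheic exchange term (all symbols here are illustrative choices, not the authors' notation) might take the form:

```latex
\frac{\partial T}{\partial t} + u\,\frac{\partial T}{\partial x}
  = D\,\frac{\partial^{2} T}{\partial x^{2}}
  + \frac{\Phi_{\text{rad}} + \Phi_{\text{evap}} + \Phi_{\text{sens}}}{\rho\, c_{p}\, d}
  + q_{h}\,\bigl(\langle T\rangle_{\tau} - T\bigr)
```

where T is water temperature, u velocity, D a dispersion coefficient, the Φ terms the surface heat fluxes, ρc_p the volumetric heat capacity, d mean depth, q_h a hyporheic exchange rate, and ⟨T⟩_τ the temperature of water returning from the hyporheic zone after a mean residence time τ. The last term is what introduces the lagged, dispersion-like behavior of temperature in time.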

  5. Springtime extreme moisture transport into the Arctic and its impact on sea ice concentration

    NASA Astrophysics Data System (ADS)

    Yang, Wenchang; Magnusdottir, Gudrun

    2017-05-01

Recent studies suggest that springtime moisture transport into the Arctic can initiate sea ice melt that extends to a large area in the following summer and fall, which can help explain Arctic sea ice interannual variability. Yet the impact of individual moisture transport events, especially extreme ones, on synoptic to intraseasonal time scales is unclear, and this is the focus of the current study. Springtime extreme moisture transport into the Arctic, identified from a daily data set, is found to occur predominantly at Atlantic longitudes. Lag composite analysis shows that these extreme events are accompanied by a substantial sea ice concentration reduction over the Greenland-Barents-Kara Seas that lasts around a week. Surface air temperature also becomes anomalously high over these seas and anomalously cold to the west of Greenland as well as over the interior Eurasian continent. The blocking weather regime over the North Atlantic is mainly responsible for the extreme moisture transport, accounting for more than 60% of the total extreme days, while the negative North Atlantic Oscillation regime is hardly observed at all during the extreme transport days. These extreme moisture transport events appear to be preceded, by as much as 2 weeks, by eastward-propagating large-scale tropical convective forcing, although this link lacks statistical significance.
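Lag composite analysis, as used above, averages a field over windows centered on each event date. A minimal sketch on a toy daily series (the event structure and magnitudes are invented for illustration):

```python
import numpy as np

def lag_composite(series, event_idx, lags):
    """Event-averaged (composite) value of `series` at each lag.

    series    : 1-D daily series (e.g., a sea ice concentration anomaly)
    event_idx : indices of extreme-transport event days
    lags      : day offsets relative to the event (negative = before)
    """
    windows = [series[i + lags] for i in event_idx
               if i + lags.min() >= 0 and i + lags.max() < len(series)]
    return np.mean(windows, axis=0)

# Toy series: weak noise plus a roughly week-long dip after each "event".
rng = np.random.default_rng(4)
n = 5000
series = 0.1 * rng.standard_normal(n)
events = rng.choice(np.arange(30, n - 30), size=60, replace=False)
for e in events:
    series[e:e + 7] -= 1.0   # reduction lasting about a week

lags = np.arange(-10, 15)
comp = lag_composite(series, events, lags)
pre = comp[lags < 0].mean()
post = comp[(lags >= 0) & (lags < 7)].mean()
print(pre, post)  # the composite isolates the post-event reduction
```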

  6. Extreme interplanetary rotational discontinuities at 1 AU

    NASA Astrophysics Data System (ADS)

    Lepping, R. P.; Wu, C.-C.

    2005-11-01

    This study is concerned with the identification and description of a special subset of four Wind interplanetary rotational discontinuities (from an earlier study of 134 directional discontinuities by Lepping et al. (2003)) with some "extreme" characteristics, in the sense that every case has (1) an almost planar current sheet surface, (2) a very large discontinuity angle (ω), (3) at least moderately strong normal field components (>0.8 nT), and (4) the overall set has a very broad range of transition layer thicknesses, with one being as thick as 50 RE and another at the other extreme being 1.6 RE, most being much thicker than are usually studied. Each example has a well-determined surface normal (n) according to minimum variance analysis and corroborated via time delay checking of the discontinuity with observations at IMP 8 by employing the local surface planarity. From the variance analyses, most of these cases had unusually large ratios of intermediate-to-minimum eigenvalues (λI/λmin), being on average 32 for three cases (with a fourth being much larger), indicating compact current sheet transition zones, another (the fifth) extreme property. For many years there has been a controversy as to the relative distribution of rotational (RDs) to tangential discontinuities (TDs) in the solar wind at 1 AU (and elsewhere, such as between the Sun and Earth), even to the point where some authors have suggested that RDs with large ∣Bn∣s are probably not generated or, if generated, are unstable and therefore very rare. Some of this disagreement apparently has been due to the different selection criteria used, e.g., some allowed eigenvalue ratios (λI/λmin) to be almost an order of magnitude lower than 32 in estimating n, usually introducing unacceptable error in n and therefore also in ∣Bn∣. 
However, we suggest that RDs may not be so rare at 1 AU; rather, good-quality cases (where ∣Bn∣ confidently exceeds its estimated error) appear to be uncommon, and cases with large ∣Bn∣ may indeed be rare. Finally, the relative numbers of RDs and TDs were re-estimated using the full 134 events of the original Lepping et al. (2003) study, which had used the RDs' propagation speeds for this estimation (an unconventional approach), but now considering only the normal field components, the more conventional approach. This yielded slightly different conclusions, depending on the specific assumptions used, casting doubt on the unconventional approach.
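The surface normals quoted above come from minimum variance analysis (MVA): the normal estimate is the eigenvector of the field's variance matrix with the smallest eigenvalue, and the eigenvalue ratio λI/λmin measures how well determined it is. A hedged sketch on synthetic data (an idealized rotation, not Wind measurements):

```python
import numpy as np

def minimum_variance_normal(B):
    """Estimate a discontinuity normal by minimum variance analysis.

    B : (N, 3) array of magnetic field vectors measured across the layer.
    Returns the unit normal (min-variance eigenvector) and the
    intermediate-to-minimum eigenvalue ratio used as a quality indicator.
    """
    M = np.cov(B, rowvar=False)            # 3x3 variance matrix of the field
    eigvals, eigvecs = np.linalg.eigh(M)   # eigenvalues in ascending order
    normal = eigvecs[:, 0]                 # direction of minimum variance
    ratio = eigvals[1] / eigvals[0]        # lambda_int / lambda_min
    return normal, ratio

# Synthetic rotational-discontinuity-like interval: the field rotates in
# the x-y plane while the z (normal) component stays nearly constant.
rng = np.random.default_rng(1)
t = np.linspace(0.0, np.pi, 500)
B = np.column_stack([
    5.0 * np.cos(t),
    5.0 * np.sin(t),
    1.0 + 0.02 * rng.standard_normal(t.size),
])

n, ratio = minimum_variance_normal(B)
print(np.abs(n), ratio)  # normal close to (0, 0, 1); large ratio = well determined
```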

  7. Optical phased array configuration for an extremely large telescope.

    PubMed

    Meinel, Aden Baker; Meinel, Marjorie Pettit

    2004-01-20

Extremely large telescopes are currently under consideration by several groups in several countries. Extrapolation of current technology up to 30 m indicates a cost of over $1 billion. Innovative concepts are being explored to find significant cost reductions. We explore the concept of an Optical Phased Array (OPA) telescope. Each element of the OPA is a separate Cassegrain telescope. Collimated beams from the array are sent via an associated set of delay lines to a central beam combiner. This array of small telescope elements offers the possibility of starting with a low-cost array of a few rings of elements, then adding structure and additional Cassegrain elements until the desired telescope diameter is attained. We address the salient features of such an extremely large telescope and its cost elements relative to more conventional options.

  8. Sequences of extremal radially excited rotating black holes.

    PubMed

    Blázquez-Salcedo, Jose Luis; Kunz, Jutta; Navarro-Lérida, Francisco; Radu, Eugen

    2014-01-10

In Einstein-Maxwell-Chern-Simons theory the extremal Reissner-Nordström solution is no longer the only extremal solution with vanishing angular momentum once the Chern-Simons coupling constant reaches a critical value. Instead, a whole sequence of rotating extremal J=0 solutions arises, labeled by the node number of the magnetic U(1) potential. Although associated with the same near-horizon solution, the masses of these radially excited extremal solutions converge to the mass of the extremal Reissner-Nordström solution. On the other hand, not all near-horizon solutions are also realized as global solutions.

  9. Managing protected health information in distributed research network environments: automated review to facilitate collaboration

    PubMed Central

    2013-01-01

Background Multi-site health sciences research is becoming more common, as it enables investigation of rare outcomes and diseases and new healthcare innovations. Multi-site research usually involves the transfer of large amounts of research data between collaborators, which increases the potential for accidental disclosures of protected health information (PHI). Standard protocols for preventing release of PHI are extremely vulnerable to human error, particularly when the shared data sets are large. Methods To address this problem, we developed an automated program (SAS macro) to identify possible PHI in research data before it is transferred between research sites. The macro reviews all data in a designated directory to identify suspicious variable names and data patterns. The macro looks for variables that may contain personal identifiers such as medical record numbers and social security numbers. In addition, the macro identifies dates and numbers that may identify people who belong to small groups, who may be identifiable even in the absence of traditional identifiers. Results Evaluation of the macro on 100 sample research data sets indicated a recall of 0.98 and precision of 0.81. Conclusions When implemented consistently, the macro has the potential to streamline the PHI review process and significantly reduce accidental PHI disclosures. PMID:23521861
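The SAS macro itself is not reproduced in the abstract. As a hedged Python sketch of the same idea, a pattern scan over a column of values, with rules that are purely illustrative rather than the macro's actual ones:

```python
import re

# Illustrative patterns only; the actual macro's rules are not shown here.
PHI_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),              # SSN-shaped
    "mrn_like": re.compile(r"\b(?:MRN|mrn)[:#]?\s*\d{6,10}\b"),  # MRN-shaped
    "date": re.compile(r"\b\d{1,2}/\d{1,2}/\d{4}\b"),         # slash dates
}

def flag_phi(values):
    """Return {pattern_name: [matching values]} for a column of strings."""
    hits = {name: [] for name in PHI_PATTERNS}
    for v in values:
        for name, pat in PHI_PATTERNS.items():
            if pat.search(str(v)):
                hits[name].append(v)
    return {name: vals for name, vals in hits.items() if vals}

column = ["123-45-6789", "visit on 03/14/2010", "MRN: 00412345", "aspirin 81 mg"]
print(flag_phi(column))  # flags the first three values, not the medication note
```

Flagged values would then go to a human reviewer, matching the macro's role as a screening (high-recall) rather than a final-decision tool.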

  10. From drop impact physics to spray cooling models: a critical review

    NASA Astrophysics Data System (ADS)

    Breitenbach, Jan; Roisman, Ilia V.; Tropea, Cameron

    2018-03-01

    Spray-wall interaction is an important process encountered in a large number of existing and emerging technologies and is the underlying phenomenon associated with spray cooling. Spray cooling is a very efficient technology, surpassing all other conventional cooling methods, especially those not involving phase change and not exploiting the latent heat of vaporization. However, the effectiveness of spray cooling is dependent on a large number of parameters, including spray characteristics like drop size, velocity and number density, the surface morphology, but also on the temperature range and thermal properties of the materials involved. Indeed, the temperature of the substrate can have significant influence on the hydrodynamics of drop and spray impact, an aspect which is seldom considered in model formulation. This process is extremely complex, thus most design rules to date are highly empirical in nature. On the other hand, significant theoretical progress has been made in recent years about the interaction of single drops with heated walls and improvements to the fundamentals of spray cooling can now be anticipated. The present review has the objective of summarizing some of these recent advances and to establish a framework for future development of more reliable and universal physics-based correlations to describe quantities involved in spray cooling.

  11. Multiscale numerical simulations of magnetoconvection at low magnetic Prandtl and Rossby numbers.

    NASA Astrophysics Data System (ADS)

    Maffei, S.; Calkins, M. A.; Julien, K. A.; Marti, P.

    2017-12-01

The dynamics of the Earth's outer core is characterized by low values of the Rossby (Ro), Ekman, and magnetic Prandtl numbers. These values indicate the broad spectrum of temporal and spatial scales that needs to be accounted for in realistic numerical simulations of the system. Current direct numerical simulations are not capable of reaching this extreme regime, suggesting that a new class of models is required to account for the rich dynamics expected in the natural system. Here we present results from a quasi-geostrophic, multiscale model based on the scale separation implied by the low Ro typical of rapidly rotating systems. We investigate a plane layer geometry where convection is driven by an imposed temperature gradient and the hydrodynamic equations are modified by a large-scale magnetic field. Analytical investigation shows that at values of the thermal and magnetic Prandtl numbers relevant for liquid metals, the energetic requirements for the onset of convection are not significantly altered even in the presence of strong magnetic fields. Results from strongly forced nonlinear numerical simulations show the presence of an inverse cascade, typical of 2-D turbulence, when no or only a weak magnetic field is applied. For higher values of the magnetic field the inverse cascade is quenched.

  12. Response of a 2-story test-bed structure for the seismic evaluation of nonstructural systems

    NASA Astrophysics Data System (ADS)

    Soroushian, Siavash; Maragakis, E. "Manos"; Zaghi, Arash E.; Rahmanishamsi, Esmaeel; Itani, Ahmad M.; Pekcan, Gokhan

    2016-03-01

A full-scale, two-story, two-by-one-bay steel braced frame was subjected to a number of unidirectional ground motions using three shake tables at the UNR-NEES site. The test-bed frame was designed to study the seismic performance of nonstructural systems including steel-framed gypsum partition walls, suspended ceilings, and fire sprinkler systems. The frame can be configured to perform as an elastic or inelastic system, generating large floor accelerations or large inter-story drifts, respectively. In this study, the dynamic performance of the linear and nonlinear test-beds was comprehensively studied. The seismic performance of nonstructural systems installed in the linear and nonlinear test-beds was assessed during extreme excitations. In addition, the dynamic interactions of the test-bed and installed nonstructural systems are investigated.

  13. A study of the viability of exploiting memory content similarity to improve resilience to memory errors

    DOE PAGES

    Levy, Scott; Ferreira, Kurt B.; Bridges, Patrick G.; ...

    2014-12-09

Building the next generation of extreme-scale distributed systems will require overcoming several challenges related to system resilience. As the number of processors in these systems grows, the failure rate increases proportionally. One of the most common sources of failure in large-scale systems is memory. In this paper, we propose a novel runtime for transparently exploiting memory content similarity to improve system resilience by reducing the rate at which memory errors lead to node failure. We evaluate the viability of this approach by examining memory snapshots collected from eight high-performance computing (HPC) applications and two important HPC operating systems. Based on the characteristics of the similarity uncovered, we conclude that our proposed approach shows promise for addressing system resilience in large-scale systems.
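A minimal sketch of the underlying idea, detecting identical memory pages by content hashing, assuming fixed 4 KiB pages; the paper's actual similarity analysis may differ:

```python
import hashlib
import os

PAGE_SIZE = 4096

def page_similarity(snapshot: bytes) -> float:
    """Fraction of pages in a memory snapshot that duplicate another page.

    Groups fixed-size pages by content hash; pages sharing a hash could be
    backed by one copy, or used to reconstruct a page hit by a memory error.
    """
    counts = {}
    for off in range(0, len(snapshot), PAGE_SIZE):
        digest = hashlib.sha256(snapshot[off:off + PAGE_SIZE]).digest()
        counts[digest] = counts.get(digest, 0) + 1
    n_pages = sum(counts.values())
    duplicated = sum(c for c in counts.values() if c > 1)
    return duplicated / n_pages

# Toy snapshot: six zero-filled pages (common in application heaps) plus
# two unique random pages.
snap = bytes(PAGE_SIZE) * 6 + os.urandom(PAGE_SIZE) + os.urandom(PAGE_SIZE)
print(page_similarity(snap))  # six of the eight pages share content
```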

  14. Ultra-broadband ptychography with self-consistent coherence estimation from a high harmonic source

    NASA Astrophysics Data System (ADS)

    Odstrčil, M.; Baksh, P.; Kim, H.; Boden, S. A.; Brocklesby, W. S.; Frey, J. G.

    2015-09-01

With the aim of improving imaging using table-top extreme ultraviolet sources, we demonstrate coherent diffraction imaging (CDI) with a relative bandwidth of 20%. The coherence properties of the illumination probe are identified using the same imaging setup. The presented method allows the use of fewer monochromating optics, yielding higher flux at the sample and thus higher resolution or shorter exposure times. This is important for ptychography, where a large number of diffraction patterns must be collected. Our microscopy setup was tested by reconstructing an extended sample to demonstrate the quality of the reconstruction. We show that a tabletop EUV microscope based on high harmonic generation can reconstruct samples with a large field of view and high resolution without additional prior knowledge about the sample or illumination.

  15. The GMT-Consortium Large Earth Finder (G-CLEF) : An Optical Echelle Spectrograph for the Giant Magellan Telescope (GMT) with Multi-Object Spectroscopy (MOS) Capability

    NASA Astrophysics Data System (ADS)

    Szentgyorgyi, Andrew

    2017-09-01

The GMT-Consortium Large Earth Finder (G-CLEF) is an optical-band echelle spectrograph that has been selected as the first-light instrument for the Giant Magellan Telescope (GMT). G-CLEF is a general-purpose, high-dispersion instrument that is fiber fed and capable of extremely precise radial velocity (PRV) measurements. G-CLEF will have a novel multi-object spectroscopy (MOS) capability that will be useful for a number of exoplanet science programs. I describe the general properties of G-CLEF and the systems engineering analyses, especially for PRV, that drove the current G-CLEF design. The requirements for calibration of the MOS channel are presented along with several novel approaches for achieving moderate radial velocity precision in the MOS mode.

  16. Genomic diversity of the human intestinal parasite Entamoeba histolytica

    PubMed Central

    2012-01-01

    Background Entamoeba histolytica is a significant cause of disease worldwide. However, little is known about the genetic diversity of the parasite. We re-sequenced the genomes of ten laboratory cultured lines of the eukaryotic pathogen Entamoeba histolytica in order to develop a picture of genetic diversity across the genome. Results The extreme nucleotide composition bias and repetitiveness of the E. histolytica genome provide a challenge for short-read mapping, yet we were able to define putative single nucleotide polymorphisms in a large portion of the genome. The results suggest a rather low level of single nucleotide diversity, although genes and gene families with putative roles in virulence are among the more polymorphic genes. We did observe large differences in coverage depth among genes, indicating differences in gene copy number between genomes. We found evidence indicating that recombination has occurred in the history of the sequenced genomes, suggesting that E. histolytica may reproduce sexually. Conclusions E. histolytica displays a relatively low level of nucleotide diversity across its genome. However, large differences in gene family content and gene copy number are seen among the sequenced genomes. The pattern of polymorphism indicates that E. histolytica reproduces sexually, or has done so in the past, which has previously been suggested but not proven. PMID:22630046

  17. Emaciation and larval filarioid nematode infection in boreal owls (Aegolius funereus).

    PubMed

    Larrat, Sylvain; Dallaire, André D; Lair, Stéphane

    2012-01-01

Microfilariae are generally considered non-pathogenic in wild birds. The objective of the current communication is to report host reactions to microfilarial infection of unusual intensity in emaciated boreal owls (Aegolius funereus). An unusually large number of boreal owls (n = 21) were submitted to the Canadian Cooperative Wildlife Health Center-Quebec Region for post-mortem examination during the winter of 2009. Nineteen of the 21 birds were considered emaciated based on atrophy of adipose tissue and pectoral muscles and suboptimal weight. Microscopic examination of a subset of nine owls revealed the presence of microfilariae in six. Three of the birds with a heavy parasite burden had masses of larval nematodes obstructing large vessels of the lungs. The emaciated owls are believed to have died from starvation due to a cyclic decrease in prey abundance in the boreal forest. This cycle also drives winter movements of boreal owls to urbanized areas of southern Quebec, presumably accounting for the large number of birds submitted in 2009. In the most severely infected owls, the extreme microfilarial burden might have altered circulatory dynamics and gas exchange, and probably imposed some metabolic cost. Consequently, microfilariae could have contributed significantly to the death of some of these owls.

  18. An Intelligent Surveillance Platform for Large Metropolitan Areas with Dense Sensor Deployment

    PubMed Central

    Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A.; Smilansky, Zeev

    2013-01-01

This paper presents an intelligent surveillance platform based on the use of large numbers of inexpensive sensors, designed and developed within the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs at a minimum, the surveillance platform is based on inexpensive visual sensors that apply efficient motion detection and tracking algorithms to transform the video signal into a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine comprising three components that apply three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and the distribution of alarms and video streams to the emergency teams. The resulting surveillance system is well suited for deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage. PMID:23748169

  19. Flood protection diversification to reduce probabilities of extreme losses.

    PubMed

    Zhou, Qian; Lambert, James H; Karvetski, Christopher W; Keisler, Jeffrey M; Linkov, Igor

    2012-11-01

    Recent catastrophic losses because of floods require developing resilient approaches to flood risk protection. This article assesses how diversification of a system of coastal protections might decrease the probabilities of extreme flood losses. The study compares the performance of portfolios each consisting of four types of flood protection assets in a large region of dike rings. A parametric analysis suggests conditions in which diversifications of the types of included flood protection assets decrease extreme flood losses. Increased return periods of extreme losses are associated with portfolios where the asset types have low correlations of economic risk. The effort highlights the importance of understanding correlations across asset types in planning for large-scale flood protection. It allows explicit integration of climate change scenarios in developing flood mitigation strategy. © 2012 Society for Risk Analysis.
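The diversification argument can be illustrated with a hedged Monte Carlo sketch: equicorrelated, heavy-tailed losses across four asset types, with the correlation level varied. The lognormal loss model and the benchmark threshold are illustrative choices, not the article's actual portfolio model:

```python
import numpy as np

rng = np.random.default_rng(2)

def extreme_loss_prob(corr, n_assets=4, n_sim=200_000):
    """P(total loss exceeds a fixed extreme level) for equicorrelated,
    heavy-tailed (lognormal) losses across n_assets protection asset types."""
    C = np.full((n_assets, n_assets), corr)   # equicorrelation matrix
    np.fill_diagonal(C, 1.0)
    z = rng.multivariate_normal(np.zeros(n_assets), C, size=n_sim)
    total = np.exp(z).sum(axis=1)             # portfolio loss per scenario
    level = n_assets * np.exp(2.326)          # every asset at its own ~99th pct
    return float(np.mean(total > level))

p_low = extreme_loss_prob(0.1)   # weakly correlated asset types
p_high = extreme_loss_prob(0.9)  # strongly correlated asset types
print(p_low, p_high)  # low correlation makes extreme portfolio losses rarer
```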

  20. Applications of Extreme Value Theory in Public Health.

    PubMed

    Thomas, Maud; Lemaitre, Magali; Wilson, Mark L; Viboud, Cécile; Yordanov, Youri; Wackernagel, Hans; Carrat, Fabrice

    2016-01-01

We show how Extreme Value Theory (EVT) can be used in public health to predict future extreme events. We applied EVT to weekly rates of Pneumonia and Influenza (P&I) deaths over 1979-2011, and further explored the daily number of emergency department visits in a network of 37 hospitals over 2004-2014. Maxima of grouped consecutive observations were fitted to a generalized extreme value distribution, which was then used to estimate the probability of extreme values in specified time periods. An annual P&I death rate of 12 per 100,000 (the highest maximum observed) should be exceeded once over the next 30 years, and in any given year there is a 3% risk that the P&I death rate will exceed this value. Over the past 10 years, the observed maximum increase in the daily number of visits from the same weekday between two consecutive weeks was 1,133; we estimated the monthly probability of exceeding a daily increase of 1,000 at 0.37%. The EVT method can be applied to various topics in epidemiology, thus contributing to public health planning for extreme events.
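The block-maxima approach described above can be sketched as follows. The data are synthetic, and for a self-contained example we fit a Gumbel distribution (the shape-zero member of the GEV family) by the method of moments rather than a full GEV maximum-likelihood fit:

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for 33 years of weekly P&I death rates (per 100,000).
weekly = rng.gumbel(loc=6.0, scale=1.5, size=33 * 52)

# Block maxima: the largest rate observed in each year.
annual_max = weekly.reshape(33, 52).max(axis=1)

# Method-of-moments Gumbel fit to the annual maxima.
euler_gamma = 0.5772156649
scale = annual_max.std(ddof=1) * np.sqrt(6.0) / np.pi
loc = annual_max.mean() - euler_gamma * scale

def gumbel_sf(x):
    """P(annual maximum > x) under the fitted distribution."""
    return 1.0 - np.exp(-np.exp(-(x - loc) / scale))

# Probability that a future year's maximum exceeds the largest value
# observed so far, and the implied return period in years.
p = gumbel_sf(annual_max.max())
print(p, 1.0 / p)
```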

  1. Impacts of Anthropogenic Aerosols on Regional Climate: Extreme Events, Stagnation, and the United States Warming Hole

    NASA Astrophysics Data System (ADS)

    Mascioli, Nora R.

Extreme temperatures, heat waves, heavy rainfall events, drought, and extreme air pollution events have adverse effects on human health, infrastructure, agriculture and economies. The frequency, magnitude and duration of these events are expected to change in the future in response to increasing greenhouse gases and decreasing aerosols, but future climate projections are uncertain. A significant portion of this uncertainty arises from uncertainty in the effects of aerosol forcing: to what extent were the effects from greenhouse gases masked by aerosol forcing over the historical observational period, and how much will decreases in aerosol forcing influence regional and global climate over the remainder of the 21st century? The observed frequency and intensity of extreme heat and precipitation events have increased in the U.S. over the latter half of the 20th century. Using aerosol only (AER) and greenhouse gas only (GHG) simulations from 1860 to 2005 in the GFDL CM3 chemistry-climate model, I disentangle the competing influences of aerosols and greenhouse gases on these extreme events. I find that small changes in extremes in the "all forcing" simulations reflect cancellations between the effects of increasing anthropogenic aerosols and greenhouse gases. In AER, extreme high temperatures and the number of days with temperatures above the 90th percentile decline over most of the U.S., while in GHG high temperature extremes increase over most of the U.S. The spatial response patterns in AER and GHG are significantly anti-correlated, suggesting a preferred regional mode of response that is largely independent of the type of forcing. Extreme precipitation over the eastern U.S. decreases in AER, particularly in winter, and increases over the eastern and central U.S. in GHG, particularly in spring. Over the 21st century under the RCP8.5 emissions scenario, the patterns of extreme temperature and precipitation change associated with greenhouse gas forcing dominate.
The temperature response pattern in AER and GHG is characterized by strong responses over the western U.S. and weak or opposite signed responses over the southeast U.S., raising the question of whether the observed U.S. "warming hole" could have a forced component. To address this question, I systematically examine observed seasonal temperature trends over all time periods of at least 10 years during 1901-2015. In the northeast and southern U.S., significant summertime cooling occurs from the early 1950s to the mid 1970s, which I partially attribute to increasing anthropogenic aerosol emissions (median fraction of the observed temperature trends explained is 0.69 and 0.17, respectively). In winter, the northeast and southern U.S. cool significantly from the early 1950s to the early 1990s, which I attribute to long-term phase changes in the North Atlantic Oscillation and the Pacific Decadal Oscillation. Rather than being a single phenomenon stemming from a single cause, both the warming hole and its dominant drivers vary by season, region, and time period. Finally, I examine historical and projected future changes in atmospheric stagnation. Stagnation, which is characterized by weak winds and an absence of precipitation, is a meteorological contributor to heat waves, extreme pollution, and drought. Using CM3, I show that regional stagnation trends over the historical period (1860-2005) are driven by changes in anthropogenic aerosol emissions, rather than rising greenhouse gases. In the northeastern and central United States, aerosol-induced changes in surface and upper level winds produce significant decreases in the number of stagnant summer days, while decreasing precipitation in the southeast US increases the number of stagnant summer days. Outside of the U.S., significant drying over eastern China in response to rising aerosol emissions contributed to increased stagnation during 1860-2005. 
Additionally, this region was found to be particularly sensitive to changes in local aerosol emissions, indicating that decreasing Chinese emissions in efforts to improve air quality will also decrease stagnation. In Europe, I find a dipole response pattern during the historical period wherein stagnation decreases over southern Europe and increases over northern Europe in response to global increases in aerosol emissions. In the future, declining aerosol emissions will likely lead to a reversal of the historical stagnation trends, with increasing greenhouse gases again playing a secondary role. Aerosols have a significant effect on a number of societally important extreme events, including heat waves, intense rainfall events, drought, and stagnation. Further, uncertainty in the strength of aerosol masking of historical greenhouse gas forcing is a significant source of spread in future climate projections. Quantifying these aerosol effects is therefore critical for our ability to accurately project and prepare for future changes in extreme events.

  2. Hand dominance in intravenous drug using patients does not affect peripheral venous access sites identified by ultrasound.

    PubMed

    Kaban, Nicole L; Avitabile, Nicholas C; Siadecki, Sebastian D; Saul, Turandot

    2016-06-01

    The peripheral veins in the arms and forearms of patients with a history of intravenous (IV) drug use may be sclerosed, calcified, or collapsed due to damage from previous injections. These patients may consequently require alternative, more invasive types of vascular access including central venous or intraosseous catheters. We investigated the relationship between hand dominance and the presence of patent upper extremity (UE) veins specifically in patients with a history of IV drug use. We predicted that injection into the non-dominant UE would occur with a higher frequency than the dominant UE, leading to fewer damaged veins in the dominant UE. If hand dominance affects which upper extremity has more patent veins, providers could focus their first vascular access attempt on the dominant upper extremity. Adult patients were approached for enrollment if they provided a history of IV drug use into one of their upper extremities. Each upper extremity was examined with a high frequency linear transducer in 3 areas: the antecubital crease, the forearm, and the proximal arm. The number of fully compressible veins ≥1.8 mm in diameter was recorded for each location. The mean difference in the number of veins between the dominant and non-dominant UE was -1.5789. At a .05 significance level, there was insufficient evidence to suggest the number of compressible veins between patients' dominant and non-dominant arms was significantly different (P = .0872). The number of compressible veins visualized with ultrasound was not greater in the dominant upper extremity as expected. Practitioners may gain more information about potential peripheral venous access sites by asking patients their previous injection practice patterns. Copyright © 2016 Elsevier Inc. All rights reserved.
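
    The comparison described above is a paired test of per-patient vein counts between the two arms. As a rough illustration only (not the authors' actual analysis, and with entirely hypothetical counts), the mean difference and paired t statistic can be computed as:

```python
import math

def paired_t(dominant, nondominant):
    """Mean per-patient difference and paired t statistic for vein counts."""
    diffs = [d - n for d, n in zip(dominant, nondominant)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)  # sample variance
    return mean, mean / math.sqrt(var / n)  # t = mean / standard error

# Hypothetical per-patient counts of compressible veins (dominant vs. non-dominant UE)
dom = [3, 2, 4, 3]
nondom = [5, 4, 5, 5]
mean_diff, t = paired_t(dom, nondom)
```

    The p-value would then come from a t distribution with n-1 degrees of freedom; in practice a library routine such as a paired t-test function would be used rather than this hand-rolled sketch.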

  3. Developing a passive load reduction blade for the DTU 10 MW reference turbine

    NASA Astrophysics Data System (ADS)

    de Vaal, J. B.; Nygaard, T. A.; Stenbro, R.

    2016-09-01

    This paper presents the development of a passive load reduction blade for the DTU 10 MW reference wind turbine, using the aero-hydro-servo-elastic analysis tool 3DFloat. Passive load reduction is achieved by introducing sweep to the path of the blade elastic axis, so that out-of-plane bending deflections result in load alleviating torsional deformations of the blade. Swept blades are designed to yield similar annual energy production as a rotor with a reference straight blade. This is achieved by modifying the aerodynamic twist distribution for swept blades based on non-linear blade deflection under steady state loads. The passive load reduction capability of a blade design is evaluated by running a selection of fatigue- and extreme load cases with the analysis tool 3DFloat and determining equivalent fatigue loads, fatigue damage and extreme loads at the blade root and tower base. The influence of sweep on the flutter speed of a blade design is also investigated. A large number of blade designs are evaluated by varying the parameters defining the sweep path of a blade's elastic axis. Results show that a moderate amount of sweep can effectively reduce equivalent fatigue damage and extreme loads, without significantly reducing the flutter speed, or compromising annual energy production.
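
    Equivalent fatigue loads of the kind evaluated here are conventionally reduced from rainflow-counted load cycles via a Palmgren-Miner summation. A minimal sketch, with hypothetical cycle counts and a Wöhler exponent m = 10 as is typical for composite blades (both are assumptions, not values from the study):

```python
def damage_equivalent_load(cycles, m=10.0, n_eq=1e7):
    """Constant-amplitude load range that, applied for n_eq cycles, causes the
    same Palmgren-Miner damage as the counted (n_i, S_i) cycle set."""
    return (sum(n * s ** m for n, s in cycles) / n_eq) ** (1.0 / m)

# Hypothetical rainflow-counted blade-root moment ranges: (cycle count, range in kNm)
cycles = [(100, 50.0), (200, 30.0)]
del_kNm = damage_equivalent_load(cycles)
```

    Comparing this scalar between the swept and straight blade designs is one common way to express the fatigue reduction a passive design achieves.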

  4. PHASE QUANTIZATION STUDY OF SPATIAL LIGHT MODULATOR FOR EXTREME HIGH-CONTRAST IMAGING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dou, Jiangpei; Ren, Deqing, E-mail: jpdou@niaot.ac.cn, E-mail: jiangpeidou@gmail.com

    2016-11-20

    Direct imaging of exoplanets by reflected starlight is extremely challenging due to the large luminosity ratio to the primary star. Wave-front control is a critical technique to attenuate the speckle noise in order to achieve an extremely high contrast. We present a phase quantization study of a spatial light modulator (SLM) for wave-front control to meet the contrast requirement of detection of a terrestrial planet in the habitable zone of a solar-type star. We perform the numerical simulation by employing the SLM with different phase accuracy and actuator numbers, which are related to the achievable contrast. We use an optimization algorithm to solve the quantization problem so that it is matched to the controllable phase step of the SLM. Two optical configurations are discussed with the SLM located before and after the coronagraph focal plane mask. The simulation result has constrained the specification for SLM phase accuracy in the above two optical configurations, which gives us a phase accuracy of 0.4/1000 and 1/1000 waves to achieve a contrast of 10^-10. Finally, we have demonstrated that an SLM with more actuators can deliver a competitive contrast performance on the order of 10^-10 in comparison to that by using a deformable mirror.

  5. Phase Quantization Study of Spatial Light Modulator for Extreme High-contrast Imaging

    NASA Astrophysics Data System (ADS)

    Dou, Jiangpei; Ren, Deqing

    2016-11-01

    Direct imaging of exoplanets by reflected starlight is extremely challenging due to the large luminosity ratio to the primary star. Wave-front control is a critical technique to attenuate the speckle noise in order to achieve an extremely high contrast. We present a phase quantization study of a spatial light modulator (SLM) for wave-front control to meet the contrast requirement of detection of a terrestrial planet in the habitable zone of a solar-type star. We perform the numerical simulation by employing the SLM with different phase accuracy and actuator numbers, which are related to the achievable contrast. We use an optimization algorithm to solve the quantization problem so that it is matched to the controllable phase step of the SLM. Two optical configurations are discussed with the SLM located before and after the coronagraph focal plane mask. The simulation result has constrained the specification for SLM phase accuracy in the above two optical configurations, which gives us a phase accuracy of 0.4/1000 and 1/1000 waves to achieve a contrast of 10^-10. Finally, we have demonstrated that an SLM with more actuators can deliver a competitive contrast performance on the order of 10^-10 in comparison to that by using a deformable mirror.

  6. Statistical Downscaling and Bias Correction of Climate Model Outputs for Climate Change Impact Assessment in the U.S. Northeast

    NASA Technical Reports Server (NTRS)

    Ahmed, Kazi Farzan; Wang, Guiling; Silander, John; Wilson, Adam M.; Allen, Jenica M.; Horton, Radley; Anyah, Richard

    2013-01-01

    Statistical downscaling can be used to efficiently downscale a large number of General Circulation Model (GCM) outputs to a fine temporal and spatial scale. To facilitate regional impact assessments, this study statistically downscales (to 1/8deg spatial resolution) and corrects the bias of daily maximum and minimum temperature and daily precipitation data from six GCMs and four Regional Climate Models (RCMs) for the northeast United States (US) using the Statistical Downscaling and Bias Correction (SDBC) approach. Based on these downscaled data from multiple models, five extreme indices were analyzed for the future climate to quantify future changes of climate extremes. For a subset of models and indices, results based on raw and bias corrected model outputs for the present-day climate were compared with observations, which demonstrated that bias correction is important not only for GCM outputs, but also for RCM outputs. For future climate, bias correction led to a higher level of agreements among the models in predicting the magnitude and capturing the spatial pattern of the extreme climate indices. We found that the incorporation of dynamical downscaling as an intermediate step does not lead to considerable differences in the results of statistical downscaling for the study domain.
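
    The SDBC method itself is not detailed in this abstract, but empirical quantile mapping is a standard building block of statistical bias correction: a model value is replaced by the observed value at the same empirical quantile. A minimal sketch under that assumption, with toy data in which the model runs about 2 degrees warm relative to observations:

```python
def quantile_map(model_sample, obs_sample, value):
    """Empirical quantile mapping: locate `value`'s quantile in the model
    sample, then return the same quantile of the observed sample."""
    # rank of `value` within the model sample (empirical CDF count)
    k = sum(1 for m in model_sample if m <= value)
    obs_sorted = sorted(obs_sample)
    # map the same empirical quantile onto the observed distribution
    idx = min(k * len(obs_sorted) // len(model_sample), len(obs_sorted) - 1)
    return obs_sorted[idx]

model = list(range(10))             # toy model "temperatures" 0..9, biased warm
obs = [t - 2 for t in range(10)]    # observations run ~2 degrees cooler
corrected = quantile_map(model, obs, 5)
```

    Real implementations interpolate between order statistics and handle values outside the calibration range; the point here is only the quantile-to-quantile mapping idea.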

  7. Spatio-Temporal Changes In Non-Extreme Precipitation Variability Over North America

    NASA Astrophysics Data System (ADS)

    Roque, S.

    2016-12-01

    Precipitation variability encompasses attributes associated with the sequencing and duration of events of the full range of magnitudes. However, climate change studies have largely focused on extreme events. Using analyses of long-term weather station data we show that high frequency events, such as fraction of wet days in a year and average duration of wet and dry periods, are undergoing significant changes across North America. The median increase in fraction of wet days in a year indicates that in 2010, North America experienced an additional 11 days of precipitation compared to 1960 (when the median number of wet days was 96), and wet periods that were 0.14 days longer than those in 1960 (when the median was 1.78 days). Further, these changes in high-frequency precipitation are more prevalent and larger than those associated with extremes. Such trends also exist for events of a range of magnitudes. Results reveal the existence of localized clusters with opposing trends to that of broader geographic variation, which illustrates the role of microclimate and other drivers of trends. Such hitherto unknown patterns have the potential to significantly inform our characterization of the resilience and vulnerability of a broad range of ecosystems, and agricultural and socio-economic systems. They can also set new benchmarks for climate model assessments.
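
    The indices discussed above (fraction of wet days, mean wet-spell duration) are straightforward to compute from a daily station series. A minimal sketch, using a 1 mm/day wet-day threshold, which is a common convention but not necessarily the one used in this study:

```python
from itertools import groupby

WET_THRESHOLD_MM = 1.0  # wet-day threshold (a common convention; choices vary)

def wet_stats(daily_precip_mm):
    """Fraction of wet days and mean wet-spell length for a daily series."""
    wet = [p >= WET_THRESHOLD_MM for p in daily_precip_mm]
    frac_wet = sum(wet) / len(wet)
    # lengths of consecutive runs of wet days
    spells = [len(list(run)) for is_wet, run in groupby(wet) if is_wet]
    mean_wet_spell = sum(spells) / len(spells) if spells else 0.0
    return frac_wet, mean_wet_spell

# Toy daily record (mm): two isolated wet days and one two-day wet spell
frac, spell = wet_stats([0.0, 2.0, 0.0, 0.0, 5.0, 1.0, 0.0, 3.0])
```

    Trend analysis would then apply a test such as Mann-Kendall to the yearly values of these statistics.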

  8. A Test-Length Correction to the Estimation of Extreme Proficiency Levels

    ERIC Educational Resources Information Center

    Magis, David; Beland, Sebastien; Raiche, Gilles

    2011-01-01

    In this study, the estimation of extremely large or extremely small proficiency levels, given the item parameters of a logistic item response model, is investigated. On one hand, the estimation of proficiency levels by maximum likelihood (ML), despite being asymptotically unbiased, may yield infinite estimates. On the other hand, with an…

  9. The challenge of precise orbit determination for STSAT-2C using extremely sparse SLR data

    NASA Astrophysics Data System (ADS)

    Kim, Young-Rok; Park, Eunseo; Kucharski, Daniel; Lim, Hyung-Chul; Kim, Byoungsoo

    2016-03-01

    The Science and Technology Satellite (STSAT)-2C is the first Korean satellite equipped with a laser retro-reflector array for satellite laser ranging (SLR). SLR is the only on-board tracking source for precise orbit determination (POD) of STSAT-2C. However, POD for the STSAT-2C is a challenging issue, as the laser measurements of the satellite are extremely sparse, largely due to the inaccurate two-line element (TLE)-based orbit predictions used by the SLR tracking stations. In this study, POD for the STSAT-2C using extremely sparse SLR data is successfully implemented, and new laser-based orbit predictions are obtained. The NASA/GSFC GEODYN II software and seven-day arcs are used for the SLR data processing of two years of normal points from March 2013 to May 2015. To compensate for the extremely sparse laser tracking, the number of estimated parameters is minimized, and only the atmospheric drag coefficients are estimated, at various intervals. The POD results show that the weighted root mean square (RMS) post-fit residuals are less than 10 m, and the 3D day-boundary differences vary from 30 m to 3 km. The average four-day orbit overlaps are less than 20/330/20 m for the radial/along-track/cross-track components. The quality of the new laser-based prediction is verified by SLR observations, and the SLR residuals show better results than those of previous TLE-based predictions. This study demonstrates that POD for the STSAT-2C can be successfully achieved despite the extreme sparseness of SLR data, and the results can deliver more accurate predictions.

  10. Science-Driven Approach to Disaster Risk and Crisis Management

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2014-12-01

    Disasters due to natural extreme events continue to grow in number and intensity. Disaster risk and crisis management requires long-term planning, and to undertake that planning, a science-driven approach is needed to understand and assess disaster risks and to help in impact assessment and in recovery processes after a disaster. Science is used in assessments and rapid modeling of the disaster impact, in forecasting triggered hazards and risk (e.g., a tsunami or a landslide after a large earthquake), in contacts with and medical treatment of the affected population, and in some other actions. At the stage of response to disaster, science helps to routinely analyze the disaster that occurred (e.g., the physical processes that led to the extreme event, hidden vulnerabilities, etc.). At the stage of recovery, natural scientists improve the existing regional hazard assessments; engineers try to use new science to produce new materials and technologies to make safer houses and infrastructure. At the stage of disaster risk mitigation, new scientific methods and approaches are being developed to study natural extreme events; vulnerability of society is periodically investigated, and measures for increasing the resilience of society to extremes are developed; existing disaster management regulations are improved. At the stage of preparedness, integrated research on disaster risks should be developed to understand the roots of potential disasters. Enhanced forecasting and early warning systems are to be developed, reducing predictive uncertainties, and comprehensive disaster risk assessment is to be undertaken at local, regional, national and global levels. Science education should be improved by introducing a trans-disciplinary approach to disaster risks. 
Science can help society by improving awareness about extreme events, enhancing risk communication with policy makers, media and society, and assisting disaster risk management authorities in organizing local and regional training and exercises.

  11. Gunshot-induced fractures of the extremities: a review of antibiotic and debridement practices.

    PubMed

    Sathiyakumar, Vasanth; Thakore, Rachel V; Stinner, Daniel J; Obremskey, William T; Ficke, James R; Sethi, Manish K

    2015-09-01

    The use of antibiotic prophylaxis and debridement is controversial when treating low- and high-velocity gunshot-induced fractures, and established treatment guidelines are currently unavailable. The purpose of this review was to evaluate the literature on prophylactic antibiotic and debridement practices for (1) low-velocity gunshot fractures of the extremities, joints, and pelvis and (2) high-velocity gunshot fractures of the extremities. Low-velocity gunshot fractures of the extremities were subcategorized into operative and non-operative cases, whereas low-velocity gunshot fractures of the joints and pelvis were evaluated based on the presence or absence of concomitant bowel injury. In the absence of surgical necessity for fracture care and of gross wound contamination, vascular injury, a large soft-tissue defect, or associated compartment syndrome, the literature suggests that superficial debridement for low-velocity ballistic fractures with administration of antibiotics is a satisfactory alternative to extensive operative irrigation and debridement. In operative cases or those involving bowel injuries secondary to pelvic fractures, the literature provides support for and against extensive debridement but does suggest the use of intravenous antibiotics. For high-velocity ballistic injuries, the literature points towards the practice of extensive immediate debridement with prophylactic intravenous antibiotics. Our systematic review demonstrates weak evidence for superficial debridement of low-velocity ballistic fractures, extensive debridement for high-velocity ballistic injuries, and antibiotic use for both types of injury. Intra-articular fractures seem to warrant debridement, while pelvic fractures with bowel injury have conflicting evidence for debridement but stronger evidence for antibiotic use. 
Given the relatively low number of studies on this subject, we recommend that further high-quality research on debridement and antibiotic use for gunshot-induced fractures of the extremities be conducted before definitive recommendations and guidelines are developed.

  12. How Many Protein Sequences Fold to a Given Structure? A Coevolutionary Analysis.

    PubMed

    Tian, Pengfei; Best, Robert B

    2017-10-17

    Quantifying the relationship between protein sequence and structure is key to understanding the protein universe. A fundamental measure of this relationship is the total number of amino acid sequences that can fold to a target protein structure, known as the "sequence capacity," which has been suggested as a proxy for how designable a given protein fold is. Although sequence capacity has been extensively studied using lattice models and theory, numerical estimates for real protein structures are currently lacking. In this work, we have quantitatively estimated the sequence capacity of 10 proteins with a variety of different structures using a statistical model based on residue-residue co-evolution to capture the variation of sequences from the same protein family. Remarkably, we find that even for the smallest protein folds, such as the WW domain, the number of foldable sequences is extremely large, exceeding the Avogadro constant. In agreement with earlier theoretical work, the calculated sequence capacity is positively correlated with the size of the protein, or better, the density of contacts. This allows the absolute sequence capacity of a given protein to be approximately predicted from its structure. On the other hand, the relative sequence capacity, i.e., normalized by the total number of possible sequences, is an extremely tiny number and is strongly anti-correlated with the protein length. Thus, although there may be more foldable sequences for larger proteins, it will be much harder to find them. Lastly, we have correlated the evolutionary age of proteins in the CATH database with their sequence capacity as predicted by our model. The results suggest a trade-off between the opposing requirements of high designability and the likelihood of a novel fold emerging by chance. Published by Elsevier Inc.
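
    The contrast between a huge absolute capacity and a vanishing relative capacity is easy to see in logarithms. A back-of-the-envelope sketch, taking a WW-domain-like length of 35 residues and an assumed capacity of 10^24 foldable sequences (both numbers are illustrative, not the paper's estimates):

```python
import math

L = 35                                 # assumed WW-domain-like chain length
log10_total = L * math.log10(20)       # log10 of all 20^L possible sequences
log10_capacity = 24.0                  # assumed ~1e24 foldable sequences (> Avogadro)
log10_relative = log10_capacity - log10_total  # log10 of the relative capacity
```

    Even with a capacity exceeding Avogadro's number, the relative capacity here is roughly 10^-21.5 of sequence space, which is the paper's point that foldable sequences for larger proteins are plentiful in absolute terms yet extraordinarily hard to find by chance.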

  13. Heterogeneous Sensitivity of Tropical Precipitation Extremes during Growth and Mature Phases of Atmospheric Warming

    NASA Astrophysics Data System (ADS)

    Parhi, P.; Giannini, A.; Lall, U.; Gentine, P.

    2016-12-01

    Assessing and managing risks posed by climate variability and change is challenging in the tropics, from both a socio-economic and a scientific perspective. Most of the vulnerable countries with a limited climate adaptation capability are in the tropics. However, climate projections, particularly of extreme precipitation, are highly uncertain there. The CMIP5 (Coupled Model Intercomparison Project - Phase 5) inter-model range of extreme precipitation sensitivity to the global temperature under climate change is much larger in the tropics as compared to the extra-tropics. It ranges from nearly 0% to greater than 30% across models (O'Gorman 2012). The uncertainty is also large in historical gauge or satellite based observational records. These large uncertainties in the sensitivity of tropical precipitation extremes highlight the need to better understand how tropical precipitation extremes respond to warming. We hypothesize that one of the factors explaining the large uncertainty is differing sensitivities during different phases of warming. We consider the 'growth' and 'mature' phases of warming under climate variability, typically associated with an El Niño event. In the remote tropics (away from the tropical Pacific Ocean), the response of the precipitation extremes during the two phases can be through different pathways: i) a direct and fast-changing radiative forcing in an atmospheric column, acting top-down due to the tropospheric warming, and/or ii) an indirect effect via changes in surface temperatures, acting bottom-up through surface water and energy fluxes. We also speculate that the insights gained here might be useful in interpreting the large sensitivity under climate change scenarios, since the physical mechanisms during the two warming phases of climate variability have some correspondence with increasing and stabilized greenhouse gas emission scenarios.

  14. Aerosol impacts on regional trends in atmospheric stagnation

    NASA Astrophysics Data System (ADS)

    Mascioli, N. R.; Fiore, A. M.; Previdi, M. J.

    2017-12-01

    Extreme pollution events pose a significant threat to human health and are a leading cause of premature mortality worldwide. While emissions of atmospheric pollutants and their precursors are projected to decrease in the future due to air quality legislation, future climate change may affect the underlying meteorological conditions that contribute to extreme pollution events. Stagnation events, characterized by weak winds and an absence of precipitation, contribute to extreme pollution by halting the removal of pollutants via advection and wet deposition. Here, we use a global climate model (GFDL-CM3) to show that regional stagnation trends over the historical period (1860-2005) are driven by changes in anthropogenic aerosol emissions, rather than rising greenhouse gases. In the northeastern and central United States, aerosol-induced changes in surface and upper level winds have produced significant decreases in the number of stagnant summer days, while decreasing precipitation in the southeast US has increased the number of stagnant summer days. Significant drying over eastern China in response to aerosol forcing contributed to increased stagnation. Additionally, this region was found to be particularly sensitive to changes in local emissions, indicating that improving air quality will also lessen stagnation. In Europe, we find a dipole pattern wherein stagnation decreases over southern Europe and increases over northern Europe in response to global increases in aerosol emissions. We hypothesize that this is due to changes in the large-scale circulation patterns associated with a poleward shift of the North Atlantic storm track. We find that in the future, the combination of declining aerosol emissions and the continued rise of greenhouse gas emissions will lead to a reversal of the historical stagnation trends.

  15. Application of the Haines Index in the fire warning system

    NASA Astrophysics Data System (ADS)

    Kalin, Lovro; Marija, Mokoric; Tomislav, Kozaric

    2016-04-01

    Croatia, like all Mediterranean countries, is strongly affected by large wildfires, particularly in the coastal region. In the last two decades the number and intensity of fires have increased significantly, which is widely attributed to climate change, i.e., global warming. More extreme fires are observed, and the fire-fighting season has expanded into June and September. The meteorological support for fire protection and planning is therefore all the more important. At the Meteorological and Hydrological Service of Croatia a comprehensive monitoring and warning system has been established. It includes standard components, such as a short-term forecast of the Fire Weather Index (FWI), as well as a long-range forecast. However, due to more frequent hot and dry seasons, the FWI often provides no additional information under extremely high fire danger, since it regularly takes the highest values for long periods. Therefore, additional tools have been investigated. One widely used meteorological product is the Haines index (HI). It provides information on potential fire growth, taking into account only the vertical instability of the atmosphere and not the state of the fuel. Several analyses and studies carried out at the Service confirmed the correlation of high HI values with large and extreme fires. The Haines index forecast has been used at the Service for several years, employing the European Centre for Medium-Range Weather Forecasts (ECMWF) global prediction model as well as the limited-area Aladin model. The verification results show that these forecasts are reliable when compared to radiosonde measurements. All these results supported the introduction of additional fire warnings, which are issued by the Service's Forecast Department.

  16. Response and Resiliency of Wildlife and Vegetation to Large-Scale Wildfires and Climate Change in the North Cascades

    NASA Astrophysics Data System (ADS)

    Bartowitz, K.; Morrison, P.; Romain-Bondi, K.; Smith, C. W.; Warne, L.; McGill, D.

    2016-12-01

    Changing climatic patterns have affected the western US in a variety of ways: decreases in precipitation and snowpack, earlier spring snowmelt, and increased lightning strikes have created a drier, more fire-prone system, despite variability in these characteristics. Wildfires are a natural phenomenon, but have been suppressed for much of the past century. Effects of this evolving fire regime on native vegetation and wildlife are not well understood. Increased frequency and intensity of fires coupled with subsequent drought and extreme heat may inhibit or alter recovery of native ecosystems. We are currently investigating how a mega-fire has affected the presence of western gray squirrels (Sciurus griseus, WGS) in the North Cascades, and the mortality, survival, and recovery of vegetation following these fires and extreme drought. The Methow Valley in WA experienced a record-breaking wildfire in 2014, which disturbed nearly 50% of the priority habitat of the North Cascades population of WGS. WGS were studied at the same plots before and after the fire. WGS were present at over half of the post-burn plots (58%). The number of WGS hair samples collected differed significantly across levels of remaining vegetation: most in moderate, few in low, and none in high. Vegetation recovery was assessed through field data and a chronosequence of satellite images and aerial photography. 75% of the 2014 fire burned non-forested vegetation; ponderosa pine forests comprised the rest. The forests experienced about 70% initial mortality. Recovery of the forest appears slower than in the shrub-steppe. First-year seedling survival was poor due to an extremely hot, dry summer, while second-year survival appears higher due to a cool, moist spring and summer. One year after a large, multi-severity fire, we found WGS may be more resilient to disturbances such as fire than previously thought. 
Future studies of WGS will help elucidate long-term response to large-scale fires, and aid in the management of the state-threatened species. The combination of severe fire and extreme heat/drought may result in shifts from shrub-steppe to grass/forb communities, as well as range contraction of ponderosa pine forests. The study reveals the importance of subsequent climatic conditions on vegetation recovery after a fire.

  17. Cascades of alternating pitchfork and flip bifurcations in H-bridge inverters

    NASA Astrophysics Data System (ADS)

    Avrutin, Viktor; Zhusubaliyev, Zhanybai T.; Mosekilde, Erik

    2017-04-01

    Power electronic DC/AC converters (inverters) play an important role in modern power engineering. These systems are also of considerable theoretical interest because their dynamics is influenced by the presence of two vastly different forcing frequencies. As a consequence, inverter systems may be modeled in terms of piecewise smooth maps with an extremely high number of switching manifolds. We have recently shown that models of this type can demonstrate a complicated bifurcation structure associated with the occurrence of border collisions. Considering the example of a PWM H-bridge single-phase inverter, the present paper discusses a number of unusual phenomena that can occur in piecewise smooth maps with a very large number of switching manifolds. We show in particular how smooth (pitchfork and flip) bifurcations may form a macroscopic pattern that stretches across the overall bifurcation structure. We explain the observed bifurcation phenomena, show under which conditions they occur, and describe them quantitatively by means of an analytic approximation.

  18. Influence of blocking on Northern European and Western Russian heatwaves in large climate model ensembles

    NASA Astrophysics Data System (ADS)

    Schaller, N.; Sillmann, J.; Anstey, J.; Fischer, E. M.; Grams, C. M.; Russo, S.

    2018-05-01

    Better preparedness for summer heatwaves could mitigate their adverse effects on society. This can potentially be attained through an increased understanding of the relationship between heatwaves and one of their main dynamical drivers, atmospheric blocking. In the 1979–2015 period, we find that there is a significant correlation between summer heatwave magnitudes and the number of days influenced by atmospheric blocking in Northern Europe and Western Russia. Using three large global climate model ensembles, we find similar correlations, indicating that these three models are able to represent the relationship between extreme temperature and atmospheric blocking, despite having biases in their simulation of individual climate variables such as temperature or geopotential height. Our results emphasize the need to use large ensembles of different global climate models as single realizations do not always capture this relationship. The three large ensembles further suggest that the relationship between summer heatwaves and atmospheric blocking will not change in the future. This could be used to statistically model heatwaves with atmospheric blocking as a covariate and aid decision-makers in planning disaster risk reduction and adaptation to climate change.
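
    The heatwave-blocking relationship reported above is, at its core, a correlation between two seasonal series. A minimal sketch with hypothetical summers (not the study's data), computing the Pearson correlation between blocking-day counts and a heatwave magnitude index:

```python
import math

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / math.sqrt(sum((a - mx) ** 2 for a in x)
                           * sum((b - my) ** 2 for b in y))

# Hypothetical summers: blocking-influenced days vs. heatwave magnitude index
blocking_days = [5, 12, 8, 20, 3]
heatwave_mag = [1.0, 2.5, 1.8, 3.9, 0.7]
r = pearson_r(blocking_days, heatwave_mag)
```

    A significance test on r (or a rank correlation, which is robust to outliers) would accompany such an estimate in practice.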

  19. Trojan particles: Large porous carriers of nanoparticles for drug delivery

    PubMed Central

    Tsapis, N.; Bennett, D.; Jackson, B.; Weitz, D. A.; Edwards, D. A.

    2002-01-01

    We have combined the drug release and delivery potential of nanoparticle (NP) systems with the ease of flow, processing, and aerosolization potential of large porous particle (LPP) systems by spray drying solutions of polymeric and nonpolymeric NPs into extremely thin-walled macroscale structures. These hybrid LPPs exhibit much better flow and aerosolization properties than the NPs; yet, unlike the LPPs, which dissolve in physiological conditions to produce molecular constituents, the hybrid LPPs dissolve to produce NPs, with the drug release and delivery advantages associated with NP delivery systems. Formation of the large porous NP (LPNP) aggregates occurs via a spray-drying process that ensures the drying time of the sprayed droplet is sufficiently shorter than the characteristic time for redistribution of NPs by diffusion within the drying droplet, implying a local Peclet number much greater than unity. Additional control over LPNPs physical characteristics is achieved by adding other components to the spray-dried solutions, including sugars, lipids, polymers, and proteins. The ability to produce LPNPs appears to be largely independent of molecular component type as well as the size or chemical nature of the NPs. PMID:12200546
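The drying-time criterion above can be checked with a back-of-envelope Peclet number, taken here as the ratio of the NP diffusion time across the droplet to the drying time. All numerical values are illustrative assumptions, not figures from the paper.

```python
# Droplet Peclet number for the spray-drying argument:
# Pe ~ tau_diffusion / tau_drying = R^2 / (D * tau_dry). Pe much greater
# than 1 means NPs cannot redistribute by diffusion before the shell forms.

def peclet(radius_m, diff_coeff_m2_s, drying_time_s):
    """Ratio of the NP diffusion time across the droplet to the drying time."""
    return radius_m ** 2 / (diff_coeff_m2_s * drying_time_s)

# Assumed values (not from the paper): 5-um droplet radius, D ~ 4e-12 m^2/s
# for ~100-nm NPs in water (Stokes-Einstein estimate), ~5 ms drying time.
pe = peclet(5e-6, 4e-12, 5e-3)   # 1250: deep in the Pe >> 1 regime
```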

  20. A unified econophysics explanation for the power-law exponents of stock market activity

    NASA Astrophysics Data System (ADS)

    Gabaix, Xavier; Gopikrishnan, Parameswaran; Plerou, Vasiliki; Stanley, Eugene

    2007-08-01

    We survey a theory (first sketched in Nature in 2003, then fleshed out in the Quarterly Journal of Economics in 2006) of the economic underpinnings of the fat-tailed distributions of a number of financial variables, such as returns and trading volume. Our theory posits that they have a common origin in the strategic trading behavior of very large financial institutions in a relatively illiquid market. We show how the fat-tailed distribution of fund sizes can indeed generate extreme returns and volumes, even in the absence of fundamental news. Moreover, we are able to replicate the individually different empirical values of the power-law exponents for each distribution: 3 for returns, 3/2 for volumes, 1 for the assets under management of large investors. Large investors moderate their trades to reduce their price impact; coupled with a concave price impact function, this leads to volumes being more fat-tailed than returns but less fat-tailed than fund sizes. The trades of large institutions also offer a unified explanation for apparently disconnected empirical regularities that are otherwise a challenge for economic theory.
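A hedged sketch of how tail exponents such as the quoted 3 (returns) or 3/2 (volumes) are estimated in practice: the Hill estimator, applied here to synthetic Pareto draws rather than market data.

```python
# Hill estimator for a power-law tail exponent, the standard tool behind
# empirical exponents like the "cubic law" value of 3 for returns.
# The sample below is synthetic Pareto data, not financial data.
import math
import random

def hill_estimator(data, k):
    """Hill estimate of alpha from the k largest of the observations."""
    xs = sorted(data, reverse=True)
    return k / sum(math.log(xs[i] / xs[k]) for i in range(k))

random.seed(0)
alpha_true = 3.0   # the exponent reported for returns
sample = [random.paretovariate(alpha_true) for _ in range(100_000)]
alpha_hat = hill_estimator(sample, k=2000)   # close to 3
```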

  1. Exact simulation of max-stable processes.

    PubMed

    Dombry, Clément; Engelke, Sebastian; Oesting, Marco

    2016-06-01

    Max-stable processes play an important role as models for spatial extreme events. Their complex structure as the pointwise maximum over an infinite number of random functions makes their simulation difficult. Algorithms based on finite approximations are often inexact and computationally inefficient. We present a new algorithm for exact simulation of a max-stable process at a finite number of locations. It relies on the idea of simulating only the extremal functions, that is, those functions in the construction of a max-stable process that effectively contribute to the pointwise maximum. We further generalize the algorithm by Dieker & Mikosch (2015) for Brown-Resnick processes and use it for exact simulation via the spectral measure. We study the complexity of both algorithms, prove that our new approach via extremal functions is always more efficient, and provide closed-form expressions for their implementation that cover most popular models for max-stable processes and multivariate extreme value distributions. For simulation on dense grids, an adaptive design of the extremal function algorithm is proposed.
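A minimal sketch of the finite approximation that exact algorithms improve on: truncating the de Haan construction Z(x) = max_i W_i(x)/Γ_i after a fixed number of functions. The lognormal spectral functions and the shared-noise dependence used here are assumptions for illustration, not one of the paper's models.

```python
# Truncated de Haan construction Z(x) = max_i W_i(x) / Gamma_i, with Gamma_i
# the arrival times of a unit-rate Poisson process and E[W_i(x)] = 1.
# Stopping after n_funcs functions is exactly the kind of inexact finite
# approximation that the extremal-functions algorithm avoids: a later
# function with small Gamma_i could still change the pointwise maximum.
import math
import random

def approx_max_stable(n_funcs, n_sites, rho=0.5, seed=1):
    rng = random.Random(seed)
    z = [0.0] * n_sites
    gamma = 0.0
    for _ in range(n_funcs):
        gamma += rng.expovariate(1.0)      # next Poisson arrival time Gamma_i
        shared = rng.gauss(0.0, 1.0)       # crude spatial dependence
        for j in range(n_sites):
            eps = rho * shared + math.sqrt(1.0 - rho ** 2) * rng.gauss(0.0, 1.0)
            w = math.exp(eps - 0.5)        # lognormal spectral function, mean 1
            z[j] = max(z[j], w / gamma)
    return z

field = approx_max_stable(n_funcs=5000, n_sites=4)   # ~ unit-Frechet margins
```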

  2. Small-scale dynamo at low magnetic Prandtl numbers

    NASA Astrophysics Data System (ADS)

    Schober, Jennifer; Schleicher, Dominik; Bovino, Stefano; Klessen, Ralf S.

    2012-12-01

    The present-day Universe is highly magnetized, even though the first magnetic seed fields were most probably extremely weak. To explain the growth of the magnetic field strength over many orders of magnitude, fast amplification processes need to operate. The most efficient mechanism known today is the small-scale dynamo, which converts turbulent kinetic energy into magnetic energy, leading to an exponential growth of the magnetic field. The efficiency of the dynamo depends on the type of turbulence, indicated by the slope of the turbulence spectrum v(ℓ) ∝ ℓ^ϑ, where v(ℓ) is the eddy velocity at a scale ℓ. We explore turbulent spectra ranging from incompressible Kolmogorov turbulence with ϑ=1/3 to highly compressible Burgers turbulence with ϑ=1/2. In this work, we analyze the properties of the small-scale dynamo for low magnetic Prandtl numbers Pm, which denotes the ratio of the magnetic Reynolds number, Rm, to the hydrodynamical one, Re. We solve the Kazantsev equation, which describes the evolution of the small-scale magnetic field, using the WKB approximation. In the limit of low magnetic Prandtl numbers, the growth rate is proportional to Rm^((1-ϑ)/(1+ϑ)). We furthermore discuss the critical magnetic Reynolds number Rm_crit, which is required for small-scale dynamo action. The value of Rm_crit is roughly 100 for Kolmogorov turbulence and 2700 for Burgers turbulence. We also find that Rm_crit provides a stronger constraint in the limit of low Pm than it does for large Pm. We conclude that the small-scale dynamo can operate in the regime of low magnetic Prandtl numbers if the magnetic Reynolds number is large enough. Thus, the magnetic field amplification on small scales can take place in a broad range of physical environments and amplify weak magnetic seed fields on short time scales.

  3. Small-scale dynamo at low magnetic Prandtl numbers.

    PubMed

    Schober, Jennifer; Schleicher, Dominik; Bovino, Stefano; Klessen, Ralf S

    2012-12-01

    The present-day Universe is highly magnetized, even though the first magnetic seed fields were most probably extremely weak. To explain the growth of the magnetic field strength over many orders of magnitude, fast amplification processes need to operate. The most efficient mechanism known today is the small-scale dynamo, which converts turbulent kinetic energy into magnetic energy, leading to an exponential growth of the magnetic field. The efficiency of the dynamo depends on the type of turbulence, indicated by the slope of the turbulence spectrum v(ℓ)∝ℓ^{ϑ}, where v(ℓ) is the eddy velocity at a scale ℓ. We explore turbulent spectra ranging from incompressible Kolmogorov turbulence with ϑ=1/3 to highly compressible Burgers turbulence with ϑ=1/2. In this work, we analyze the properties of the small-scale dynamo for low magnetic Prandtl numbers Pm, which denotes the ratio of the magnetic Reynolds number, Rm, to the hydrodynamical one, Re. We solve the Kazantsev equation, which describes the evolution of the small-scale magnetic field, using the WKB approximation. In the limit of low magnetic Prandtl numbers, the growth rate is proportional to Rm^{(1-ϑ)/(1+ϑ)}. We furthermore discuss the critical magnetic Reynolds number Rm_{crit}, which is required for small-scale dynamo action. The value of Rm_{crit} is roughly 100 for Kolmogorov turbulence and 2700 for Burgers turbulence. We also find that Rm_{crit} provides a stronger constraint in the limit of low Pm than it does for large Pm. We conclude that the small-scale dynamo can operate in the regime of low magnetic Prandtl numbers if the magnetic Reynolds number is large enough. Thus, the magnetic field amplification on small scales can take place in a broad range of physical environments and amplify weak magnetic seed fields on short time scales.
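The low-Pm scaling quoted in the abstract, growth rate ∝ Rm^((1-ϑ)/(1+ϑ)), can be checked directly for the two bracketing turbulence types:

```python
# Exponent of Rm in the low-Pm growth rate, gamma ~ Rm^((1 - theta)/(1 + theta)),
# evaluated for the two limiting turbulence spectra discussed in the abstract.

def growth_exponent(theta):
    """Rm exponent of the small-scale dynamo growth rate in the low-Pm limit."""
    return (1.0 - theta) / (1.0 + theta)

kolmogorov = growth_exponent(1.0 / 3.0)   # 1/2: growth rate ~ Rm^(1/2)
burgers = growth_exponent(1.0 / 2.0)      # 1/3: growth rate ~ Rm^(1/3)
```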

  4. A space-based public service platform for terrestrial rescue operations

    NASA Technical Reports Server (NTRS)

    Fleisig, R.; Bernstein, J.; Cramblit, D. C.

    1977-01-01

    The space-based Public Service Platform (PSP) is a multibeam, high-gain communications relay satellite that can provide a variety of functions for a large number of people on earth equipped with extremely small, very low cost transceivers. This paper describes the PSP concept, the rationale used to derive the concept, the criteria for selecting specific communication functions to be performed, and the advantages of performing such functions via satellite. The discussion focuses on the benefits of using a PSP for natural disaster warning; control of attendant rescue/assistance operations; and rescue of people in downed aircraft, aboard sinking ships, lost or injured on land.

  5. Severe storm electricity

    NASA Technical Reports Server (NTRS)

    Rust, W. D.; Macgorman, D. R.

    1985-01-01

    During FY-85, researchers conducted a field program and analyzed data. The field program incorporated coordinated measurements made with a NASA U2. Results include the following: (1) ground truth measurements of lightning for comparison with those obtained by the U2; (2) analysis of dual-Doppler radar and dual-VHF lightning mapping data from a supercell storm; (3) analysis of synoptic conditions during three simultaneous storm systems on 13 May 1983, when unusually large numbers of positive cloud-to-ground (+CG) flashes occurred; (4) analysis of extremely low frequency (ELF) wave forms; and (5) an assessment of a cloud-to-ground strike location system using a combination of mobile laboratory and fixed-base TV video data.

  6. Determination of electron-nucleus collisions geometry with forward neutrons

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zheng, L.; Aschenauer, E.; Lee, J. H.

    2014-12-29

    A large number of physics programs can be explored in electron-nucleus collisions at a future electron-ion collider. Collision geometry is very important in these studies, yet measurements providing event-by-event geometric control were rarely discussed in prior deep-inelastic scattering experiments off a nucleus. This paper provides detailed studies of the potential for tagging collision geometries through forward neutron multiplicity measurements with a zero degree calorimeter. Such a geometry handle, if achieved, could be extremely beneficial in constraining nuclear effects for the electron-nucleus program at an electron-ion collider.

  7. Ursodeoxycholic acid therapy in gallbladder disease, a story not yet completed

    PubMed Central

    Guarino, Michele Pier Luca; Cocca, Silvia; Altomare, Annamaria; Emerenziani, Sara; Cicala, Michele

    2013-01-01

    Gallstone disease represents an important issue in the healthcare system. The principal non-invasive, non-surgical medical treatment for cholesterol gallstones is still oral litholysis with bile acids. The first successful and documented dissolution of cholesterol gallstones was achieved in 1972. Since then, a large number of investigators all over the world have dedicated themselves to biochemical and clinical studies of ursodeoxycholic acid (UDCA), demonstrating its extreme versatility. This editorial aims to provide a brief review of recent developments in UDCA use, current indications for its use, and the more recent advances in understanding its effects as an anti-inflammatory drug. PMID:23964136

  8. The Seasonal Predictability of Extreme Wind Events in the Southwest United States

    NASA Astrophysics Data System (ADS)

    Seastrand, Simona Renee

    Extreme wind events are a common phenomenon in the Southwest United States. Entities such as the United States Air Force (USAF) find the Southwest appealing for many reasons, primarily for its expansive, unpopulated, and electronically unpolluted space for large-scale training and testing. However, wind events can pose hazards for the USAF: surface wind gusts can impact the take-off and landing of all aircraft, can tip the airframes of large wing-surface aircraft during maneuvers close to the ground, and can even impact weapons systems. This dissertation comprises three sections intended to further our knowledge and understanding of wind events in the Southwest. The first section builds a climatology of wind events for seven locations in the Southwest during the twelve 3-month seasons of the year, and further examines the wind events in relation to terrain and the large-scale flow of the atmosphere. The second section builds upon the first by taking the wind events and generating mid-level composites for each of the twelve 3-month seasons. In the third section, teleconnections identified as consistent with the large-scale circulation in the second section were used as predictor variables to build a Poisson regression model for each of the twelve 3-month seasons. The purpose of this research is to increase our understanding of the climatology of extreme wind events, increase our understanding of how the large-scale circulation influences extreme wind events, and create a model to enhance predictability of extreme wind events in the Southwest. Knowledge from this research will help protect personnel and property associated with not only the USAF, but all those in the Southwest.
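The third section's modelling step can be sketched as a Poisson regression of seasonal event counts on a teleconnection index. The sketch below fits such a model by iteratively reweighted least squares on synthetic data; the index, the coefficients, and the counts are assumptions for illustration, not the dissertation's data.

```python
# Poisson regression of seasonal extreme-wind-event counts on a teleconnection
# index, fitted by iteratively reweighted least squares (IRLS). Synthetic data.
import math
import random

def poisson_irls(x, y, iters=50):
    """Fit log E[y] = b0 + b1*x; returns (b0, b1)."""
    b0, b1 = 0.0, 0.0
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        # Working response for the log link; the IRLS weights are w_i = mu_i
        z = [(b0 + b1 * xi) + (yi - mi) / mi for xi, yi, mi in zip(x, y, mu)]
        sw = sum(mu)
        swx = sum(m * xi for m, xi in zip(mu, x))
        swxx = sum(m * xi * xi for m, xi in zip(mu, x))
        swz = sum(m * zi for m, zi in zip(mu, z))
        swxz = sum(m * xi * zi for m, xi, zi in zip(mu, x, z))
        det = sw * swxx - swx * swx          # 2x2 weighted normal equations
        b0 = (swxx * swz - swx * swxz) / det
        b1 = (sw * swxz - swx * swz) / det
    return b0, b1

def rpois(lam, rng):
    """Knuth's Poisson sampler (adequate for the small rates used here)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

rng = random.Random(3)
x = [rng.gauss(0.0, 1.0) for _ in range(500)]             # teleconnection index
y = [rpois(math.exp(1.0 + 0.4 * xi), rng) for xi in x]    # true b0=1.0, b1=0.4
b0, b1 = poisson_irls(x, y)   # recovers roughly (1.0, 0.4)
```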

  9. Increasing climate whiplash in 21st century California

    NASA Astrophysics Data System (ADS)

    Swain, D. L.; Langenbrunner, B.; Neelin, J. D.; Hall, A. D.

    2017-12-01

    Temperate "Mediterranean" climate regimes across the globe are particularly susceptible to wide swings between drought and flood—of which California's rapid transition from record multi-year dryness between 2012-2016 to extreme wetness during 2016-2017 provides a dramatic example. The wide-ranging human and environmental impacts of this recent "climate whiplash" event in a highly-populated, economically critical, and biodiverse region highlight the importance of understanding weather and climate extremes at both ends of the hydroclimatic spectrum. Previous studies have examined the potential contribution of anthropogenic warming to recent California extremes, but findings to date have been mixed and primarily drought-focused. Here, we use specific historical California flood and drought events as thresholds for quantifying long-term changes in precipitation extremes using a large ensemble of multi-decadal climate model simulations (CESM-LENS). We find that greenhouse gas emissions are already responsible for a detectable increase in both wet and dry extremes across portions of California, and that increasing 21st century "climate whiplash" will likely yield large increases in the frequency of both rapid "dry-to-wet" transitions and severe flood events over a wide range of timescales. This projected intensification of California's hydrological cycle would seriously challenge the region's existing water storage, conveyance, and flood control infrastructure—even absent large changes in mean precipitation.

  10. Severe Weather in a Changing Climate: Getting to Adaptation

    NASA Astrophysics Data System (ADS)

    Wuebbles, D. J.; Janssen, E.; Kunkel, K.

    2011-12-01

    Analyses of observation records from U.S. weather stations indicate there is an increasing trend over recent decades in certain types of severe weather, especially large precipitation events. Widespread changes in temperature extremes have been observed over the last 50 years. In particular, the number of heat waves globally (and in some parts of the U.S.) has increased, and there have been widespread increases in the numbers of warm nights. Also, analyses show that we are now breaking twice as many heat records as cold records in the U.S. Since 1957, there has been an increase in the number of events across the U.S. that rank in the historical top 1% of heavy precipitation. Our new analyses of the recurrence frequencies of large precipitation storms show that such events are occurring more often than in the past. The pattern of precipitation change is one of increases generally at higher northern latitudes and drying in the tropics and subtropics over land. It needs to be recognized that every weather event that happens nowadays takes place in the context of the changes in the background climate system. So nothing is entirely "natural" anymore. It is a fallacy to think that individual events are caused entirely by any one thing, either natural variation or human-induced climate change. Every event is influenced by many factors. Human-induced climate change is now a factor in weather events. The changes occurring in precipitation are consistent with the analyses of our changing climate. For extreme precipitation, we know that more precipitation is falling in very heavy events. And we know key reasons why: warmer air holds more water vapor, so when any given weather system moves through, the extra moisture can be dumped as a heavy downpour. As the climate system continues to warm, models of the Earth's climate system indicate severe precipitation events will likely become more commonplace. 
Water vapor will continue to increase in the atmosphere along with the warming, and large precipitation events will likely increase in intensity and frequency. In the presentation, we will not only discuss the recent trends in severe weather and the projections of the impacts of climate change on severe weather in the future, but also specific examples of how this information is being used in developing and applying adaptation policies.

  11. Climate Change and Hydrological Extreme Events - Risks and Perspectives for Water Management in Bavaria and Québec

    NASA Astrophysics Data System (ADS)

    Ludwig, R.

    2017-12-01

    It is not yet established whether and how climate change contributes to the magnitude and frequency of hydrological extreme events, nor how regional water management could adapt to the corresponding risks. The ClimEx project (2015-2019) investigates the effects of climate change on meteorological and hydrological extreme events and their implications for water management in Bavaria and Québec. High Performance Computing is employed to enable the complex simulations in a hydro-climatological model processing chain, resulting in a unique high-resolution and transient (1950-2100) dataset of climatological and meteorological forcing and hydrological response: (1) The climate module has developed a large ensemble of high-resolution (12 km) data from the CRCM5 RCM for Central Europe and North-Eastern North America, downscaled from 50 members of the CanESM2 GCM. The dataset is complemented by all available data from the Euro-CORDEX project to account for the assessment of both natural climate variability and climate change. The large ensemble, with several thousand model years, provides the potential to catch rare extreme events and thus improves the process understanding of extreme events with return periods of 1000+ years. (2) The hydrology module comprises process-based and spatially explicit model setups (e.g. WaSiM) for all major catchments in Bavaria and Southern Québec in high temporal (3 h) and spatial (500 m) resolution. The simulations form the basis for in-depth analyses of hydrological extreme events based on the inputs from the large climate model dataset. This specific data situation makes it possible to establish a new method of `virtual perfect prediction', which assesses climate change impacts on flood risk and water resources management by identifying patterns in the data which reveal preferential triggers of hydrological extreme events. 
The presentation will highlight first results from the analysis of the large scale ClimEx model ensemble, showing the current and future ratio of natural variability and climate change impacts on meteorological extreme events. Selected data from the ensemble is used to drive a hydrological model experiment to illustrate the capacity to better determine the recurrence periods of hydrological extreme events under conditions of climate change.

  12. European Extremely Large Telescope: progress report

    NASA Astrophysics Data System (ADS)

    Tamai, R.; Spyromilio, J.

    2014-07-01

    The European Extremely Large Telescope is a project of the European Southern Observatory to build and operate a 40-m class optical near-infrared telescope. The telescope design effort is largely concluded and construction contracts are being placed with industry and academic/research institutes for the various components. The siting of the telescope in Northern Chile close to the Paranal site allows for an integrated operation of the facility providing significant economies. The progress of the project in various areas is presented in this paper and references to other papers at this SPIE meeting are made.

  13. Accounting for Parameter Uncertainty in Complex Atmospheric Models, With an Application to Greenhouse Gas Emissions Evaluation

    NASA Astrophysics Data System (ADS)

    Swallow, B.; Rigby, M. L.; Rougier, J.; Manning, A.; Thomson, D.; Webster, H. N.; Lunt, M. F.; O'Doherty, S.

    2016-12-01

    In order to understand the underlying processes governing environmental and physical phenomena, a complex mathematical model is usually required. However, there is an inherent uncertainty related to the parameterisation of unresolved processes in these simulators. Here, we focus on the specific problem of accounting for uncertainty in parameter values in an atmospheric chemical transport model. Systematic errors introduced by failing to account for these uncertainties have the potential to have a large effect on resulting estimates of unknown quantities of interest. One approach that is increasingly being used to address this issue is known as emulation, in which a large number of forward runs of the simulator are carried out in order to approximate the response of the output to changes in parameters. However, due to the complexity of some models, it is often infeasible to perform the large number of training runs usually required for full statistical emulators of the environmental processes. We therefore present a simplified model reduction method for approximating uncertainties in complex environmental simulators without the need for very large numbers of training runs. We illustrate the method through an application to the Met Office's atmospheric transport model NAME. We show how our parameter estimation framework can be incorporated into a hierarchical Bayesian inversion, and demonstrate the impact on estimates of UK methane emissions, using atmospheric mole fraction data. We conclude that accounting for uncertainties in the parameterisation of complex atmospheric models is vital if systematic errors are to be minimized and all relevant uncertainties accounted for. We also note that investigations of this nature can prove extremely useful in highlighting deficiencies in the simulator that might otherwise be missed.
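The emulation idea can be illustrated with a toy surrogate: fit a cheap approximation to a handful of training runs of an "expensive" simulator, then query the surrogate instead of the simulator. The stand-in simulator below is hypothetical, not NAME, and a one-parameter quadratic surrogate is far simpler than a full statistical emulator.

```python
# Toy emulator: build a cheap surrogate from three training runs of an
# "expensive" simulator, then evaluate the surrogate in place of the model.

def simulator(p):
    """Stand-in for an expensive model run (pretend each call costs hours)."""
    return 2.0 + 3.0 * p - 1.5 * p * p

def emulator_from_runs(p0, p1, p2, f):
    """Quadratic surrogate through three training runs of f (Lagrange form)."""
    y0, y1, y2 = f(p0), f(p1), f(p2)
    def surrogate(p):
        l0 = (p - p1) * (p - p2) / ((p0 - p1) * (p0 - p2))
        l1 = (p - p0) * (p - p2) / ((p1 - p0) * (p1 - p2))
        l2 = (p - p0) * (p - p1) / ((p2 - p0) * (p2 - p1))
        return y0 * l0 + y1 * l1 + y2 * l2
    return surrogate

emu = emulator_from_runs(-1.0, 0.0, 1.0, simulator)
# Exact here only because the stand-in happens to be quadratic; a real
# emulator only approximates, and that approximation error is part of the
# uncertainty budget in the inversion.
approx_err = abs(emu(0.37) - simulator(0.37))
```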

  14. Construction of a ratiometric fluorescent probe with an extremely large emission shift for imaging hypochlorite in living cells

    NASA Astrophysics Data System (ADS)

    Song, Xuezhen; Dong, Baoli; Kong, Xiuqi; Wang, Chao; Zhang, Nan; Lin, Weiying

    2018-01-01

    Hypochlorite is one of the important reactive oxygen species (ROS) and plays critical roles in many biologically vital processes. Herein, we present a unique ratiometric fluorescent probe (CBP) with an extremely large emission shift for detecting hypochlorite in living cells. Utilizing a positively charged α,β-unsaturated carbonyl group as the reaction site, the probe CBP itself exhibited near-infrared (NIR) fluorescence at 662 nm and displayed strong blue fluorescence at 456 nm upon reacting with hypochlorite. Notably, the extremely large emission shift of 206 nm enables precise measurement of the fluorescence peak intensities and ratios. CBP showed high sensitivity, excellent selectivity, desirable performance at physiological pH, and low cytotoxicity. The bioimaging experiments demonstrate the biological application of CBP for the ratiometric imaging of hypochlorite in living cells.

  15. Short-term effects of upper extremity circuit resistance training on muscle strength and functional independence in patients with paraplegia.

    PubMed

    Yildirim, Adem; Sürücü, Gülseren Dost; Karamercan, Ayşe; Gedik, Dilay Eken; Atci, Nermin; Dülgeroǧlu, Deniz; Özgirgin, Neşe

    2016-11-21

    A number of exercises to strengthen the upper extremities are recommended to increase functional independence and quality of life (QoL) in patients with paraplegia. Circuit resistance training (CRT) is a type of progressive resistive exercise performed repeatedly at fixed mechanical exercise stations. The aim of this study was to investigate the potential benefits of CRT for upper extremity muscle strength, functional independence, and QoL in patients with paraplegia. Twenty-six patients with paraplegia who were participating in a conventional rehabilitation program at a tertiary education and research hospital were enrolled in this study. The participants were randomly assigned to two groups. The exercise group participated in the CRT program, which consisted of repetitive exercises for the upper extremities performed at fixed mechanical stations 5 sessions per week for 6 weeks, in addition to conventional rehabilitation. Participants in the control group received only conventional rehabilitation over the same period. We compared the groups with respect to QoL, isokinetic muscle test outcomes in the upper extremities, the Functional Independence Measure (FIM), and Borg's scale. We observed significant increases in scores on the physical component of the FIM, Borg's scale, and QoL in both the exercise and control groups. Furthermore, the large majority of isokinetic values were significantly more improved in the exercise group compared to the control group. When post-treatment outcomes were compared between the groups, improvements in scores on the physical component of the FIM and in most isokinetic values were significantly greater in the exercise group. This study showed that CRT has positive effects on muscle strength in the upper extremities and the physical disability components of the FIM when added to conventional rehabilitation programs for paraplegic patients. 
However, we observed no significant improvement in QoL scores after adding CRT to a conventional treatment regime. Randomized trial (Level II).

  16. A large, benign prostatic cyst presented with an extremely high serum prostate-specific antigen level.

    PubMed

    Chen, Han-Kuang; Pemberton, Richard

    2016-01-08

    We report a case of a patient who presented with an extremely high serum prostate-specific antigen (PSA) level and underwent radical prostatectomy for presumed prostate cancer. Surprisingly, the whole-mount prostatectomy specimen showed only small-volume, organ-confined prostate adenocarcinoma and a large, benign intraprostatic cyst, which was thought to be responsible for the PSA elevation.

  17. Increasing weather-related impacts on European population under climate and demographic change

    NASA Astrophysics Data System (ADS)

    Forzieri, Giovanni; Cescatti, Alessandro; Batista e Silva, Filipe; Kovats, Sari R.; Feyen, Luc

    2017-04-01

    Over the last three decades the overwhelming majority of disasters have been caused by weather-related events. The observed rise in weather-related disaster losses has been largely attributed to increased exposure and, to a lesser degree, to global warming. Recent studies suggest an intensification in the climatology of multiple weather extremes in Europe over the coming decades in view of climate change, while urbanization continues. In view of these pressures, understanding and quantifying the potential impacts of extreme weather events on future societies is imperative in order to identify where and to what extent livelihoods will be at risk in the future, and to develop timely and effective adaptation and disaster risk reduction strategies. Here we present a comprehensive assessment of single- and multi-hazard impacts on the European population until the year 2100. For this purpose, we developed a novel methodology that quantifies the human impacts as a multiplicative function of hazard, exposure and population vulnerability. We focus on seven of the most impactful weather-related hazards - heat and cold waves, wildfires, droughts, river and coastal floods, and windstorms - and evaluate their spatial and temporal variations in intensity and frequency under a business-as-usual climate scenario. Long-term demographic dynamics were modelled to assess exposure developments under a corresponding middle-of-the-road scenario. Vulnerability of humans to weather extremes was appraised based on more than 2300 records of weather-related disasters. The integration of these elements provides a range of plausible estimates of extreme weather-related risks for future European generations. Expected impacts on population are quantified in terms of fatalities and number of people exposed. 
We find a staggering rise in fatalities from extreme weather events, with the projected death toll by the end of the century amounting to more than 50 times the present number of people killed. Approximately two-thirds of European citizens could then be exposed to a weather-related disaster each year, which will bring about huge rises in health costs to society. Future impacts show a prominent spatial gradient towards southern regions, where weather extremes could become the greatest environmental risk factor for people. The projected changes are dominated by global warming, mainly through a rise in heatwaves, but ongoing urbanization, development in hazard-prone areas and ageing population will likely further increase human risk. The results call for immediate action to achieve the Paris goals on climate mitigation and adaptation in order to protect future European generations.

  18. Asymmetrical Responses of Ecosystem Processes to Positive Versus Negative Precipitation Extremes: a Replicated Regression Experimental Approach

    NASA Astrophysics Data System (ADS)

    Felton, A. J.; Smith, M. D.

    2016-12-01

    Heightened climatic variability due to atmospheric warming is forecast to increase the frequency and severity of climate extremes. In particular, changes to interannual variability in precipitation, characterized by increases in extreme wet and dry years, are likely to impact virtually all terrestrial ecosystem processes. However, to date experimental approaches have yet to explicitly test how ecosystem processes respond to multiple levels of climatic extremity, limiting our understanding of how ecosystems will respond to forecast increases in the magnitude of climate extremes. Here we report the results of a replicated regression experimental approach, in which we imposed 9 and 11 levels of growing season precipitation amount and extremity in mesic grassland during 2015 and 2016, respectively. Each level corresponded to a specific percentile of the long-term record, which produced a large gradient of soil moisture conditions that ranged from extreme wet to extreme dry. In both 2015 and 2016, asymptotic responses to water availability were observed for soil respiration. This asymmetry was driven in part by transitions between soil moisture versus temperature constraints on respiration as conditions became increasingly dry versus increasingly wet. In 2015, aboveground net primary production (ANPP) exhibited asymmetric responses to precipitation that largely mirrored those of soil respiration. In total, our results suggest that in this mesic ecosystem, these two carbon cycle processes were more sensitive to extreme drought than to extreme wet years. Future work will assess ANPP responses for 2016, soil nutrient supply and physiological responses of the dominant plant species. Future efforts are needed to compare our findings across a diverse array of ecosystem types, and in particular how the timing and magnitude of precipitation events may modify the response of ecosystem processes to increasing magnitudes of precipitation extremes.

  19. Habitable planets with high obliquities

    NASA Technical Reports Server (NTRS)

    Williams, D. M.; Kasting, J. F.

    1997-01-01

    Earth's obliquity would vary chaotically from 0 degrees to 85 degrees were it not for the presence of the Moon (J. Laskar, F. Joutel, and P. Robutel, 1993, Nature 361, 615-617). The Moon itself is thought to be an accident of accretion, formed by a glancing blow from a Mars-sized planetesimal. Hence, planets with similar moons and stable obliquities may be extremely rare. This has led Laskar and colleagues to suggest that the number of Earth-like planets with high obliquities and temperate, life-supporting climates may be small. To test this proposition, we have used an energy-balance climate model to simulate Earth's climate at obliquities up to 90 degrees. We show that Earth's climate would become regionally severe in such circumstances, with large seasonal cycles and accompanying temperature extremes on middle- and high-latitude continents which might be damaging to many forms of life. The response of other, hypothetical, Earth-like planets to large obliquity fluctuations depends on their land-sea distribution and on their position within the habitable zone (HZ) around their star. Planets with several modest-sized continents or equatorial supercontinents are more climatically stable than those with polar supercontinents. Planets farther out in the HZ are less affected by high obliquities because their atmospheres should accumulate CO2 in response to the carbonate-silicate cycle. Dense, CO2-rich atmospheres transport heat very effectively and therefore limit the magnitude of both seasonal cycles and latitudinal temperature gradients. We conclude that a significant fraction of extrasolar Earth-like planets may still be habitable, even if they are subject to large obliquity fluctuations.

  20. Anaerobic Thermophiles

    PubMed Central

    Canganella, Francesco; Wiegel, Juergen

    2014-01-01

    The term “extremophile” was introduced to describe any organism capable of living and growing under extreme conditions. With the further development of studies on microbial ecology and taxonomy, a variety of “extreme” environments have been found and an increasing number of extremophiles are being described. Extremophiles have also been investigated in connection with the search for life on other planets, and even in evaluating the hypothesis that life on Earth originally came from space. The first extreme environments to be investigated extensively were those characterized by elevated temperatures. The naturally “hot environments” on Earth range from solar-heated surface soils and water with temperatures up to 65 °C, through subterranean sites such as oil reserves and terrestrial geothermal sites with temperatures ranging from slightly above ambient to above 100 °C, to submarine hydrothermal systems with temperatures exceeding 300 °C. There are also human-made environments with elevated temperatures, such as compost piles, slag heaps, industrial processes and water heaters. Thermophilic anaerobic microorganisms have been known for a long time, but scientists were long reluctant to believe that some organisms not only survive at high temperatures but actually thrive under those hot conditions. They are perhaps one of the most interesting varieties of extremophilic organisms. These microorganisms can thrive at temperatures over 50 °C and, based on their optimal temperature, anaerobic thermophiles can be subdivided into three main groups: thermophiles with an optimal temperature between 50 °C and 64 °C and a maximum at 70 °C, extreme thermophiles with an optimal temperature between 65 °C and 80 °C, and finally hyperthermophiles with an optimal temperature above 80 °C and a maximum above 90 °C.
The finding of novel extremely thermophilic and hyperthermophilic anaerobic bacteria in recent years, and the fact that a large fraction of them belong to the Archaea, has definitely made this area of investigation more exciting. Particularly fascinating are their structural and physiological features, which allow them to withstand extremely selective environmental conditions. These properties are often due to specific biomolecules (DNA, lipids, enzymes, osmolytes, etc.) that have been studied for years as novel sources for biotechnological applications. In some cases (DNA polymerases, thermostable enzymes), the search and its applications have successfully exceeded preliminary expectations, but certainly further exploitation is still needed. PMID:25370030
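
    The three-group classification stated above maps directly onto a small lookup. A minimal sketch using exactly the boundary temperatures given in the text; the function name is invented:

```python
def classify_anaerobic_thermophile(t_opt_c):
    """Classify an anaerobic thermophile by its optimal growth
    temperature in degrees C, using the boundaries from the text."""
    if t_opt_c > 80:
        return "hyperthermophile"      # optimum above 80 C, maximum above 90 C
    if t_opt_c >= 65:
        return "extreme thermophile"   # optimum between 65 C and 80 C
    if t_opt_c >= 50:
        return "thermophile"           # optimum 50-64 C, maximum at 70 C
    return "not thermophilic"

print(classify_anaerobic_thermophile(72))  # -> extreme thermophile
```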

  1. Antral follicle counts are strongly associated with live-birth rates after assisted reproduction, with superior treatment outcome in women with polycystic ovaries.

    PubMed

    Holte, Jan; Brodin, Thomas; Berglund, Lars; Hadziosmanovic, Nermin; Olovsson, Matts; Bergh, Torbjörn

    2011-09-01

    To evaluate the association of antral follicle count (AFC) with in vitro fertilization/intracytoplasmic sperm injection (IVF-ICSI) outcome in a large unselected cohort of patients covering the entire range of AFC. Prospective observational study. University-affiliated private infertility center. 2,092 women undergoing 4,308 IVF-ICSI cycles. AFC analyzed for associations with treatment outcome and statistically adjusted for repeated treatments and age. Pregnancy rate, live-birth rate, and stimulation outcome parameters. The AFC was log-normally distributed. Pregnancy rates and live-birth rates were positively associated with AFC in a log-linear way, leveling out above AFC ∼30. Treatment outcome was superior among women with polycystic ovaries, independent of ovulatory status. The findings remained significant after adjustment for age and number of oocytes retrieved. Pregnancy and live-birth rates are log-linearly related to AFC. Polycystic ovaries, most often excluded from studies on ovarian reserve, fit as one extreme in the spectrum of AFC; a low count constitutes the other extreme, with the lowest ovarian reserve and poor treatment outcome. The findings remained statistically significant after adjustment for the number of oocytes retrieved, suggesting that this measure of ovarian reserve carries information on oocyte quality and not only quantity. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Three-dimensional lattice Boltzmann simulations of microdroplets including contact angle hysteresis on topologically structured surfaces

    DOE PAGES

    Ba, Yan; Kang, Qinjun; Liu, Haihu; ...

    2016-04-14

    In this study, the dynamical behavior of a droplet on a topologically structured surface is investigated by using a three-dimensional color-gradient lattice Boltzmann model. A wetting boundary condition is proposed to model fluid-surface interactions, which is advantageous to improve the accuracy of the simulation and suppress spurious velocities at the contact line. The model is validated by the droplet partial wetting test and reproduction of the Cassie and Wenzel states. A series of simulations are conducted to investigate the behavior of a droplet when subjected to a shear flow. It is found that in the Cassie state, the droplet undergoes a transition from stationary, to slipping, and finally to detachment states as the capillary number increases, while in the Wenzel state, the last state changes to the breakup state. The critical capillary number, above which droplet slipping occurs, is small for the Cassie droplet, but is significantly enhanced for the Wenzel droplet due to the increased contact angle hysteresis. In the Cassie state, the receding contact angle nearly equals the prediction of the Cassie relation, and the advancing contact angle is close to 180°, leading to a small contact angle hysteresis. In the Wenzel state, however, the contact angle hysteresis is extremely large (around 100°). Finally, high droplet mobility can be easily achieved for Cassie droplets, whereas in the Wenzel state, extremely low droplet mobility is identified.
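
    The two quantities controlling the behavior described above, the capillary number and the contact angle hysteresis, can be made concrete with a short sketch. The fluid properties and contact angles below are illustrative assumptions, and the retention estimate uses a Furmidge-type relation, which is a standard scaling rather than anything taken from this paper:

```python
import math

def capillary_number(mu, u, sigma):
    # Ca = mu * u / sigma: viscous drag versus surface tension at the contact line.
    return mu * u / sigma

def retention_force_per_width(sigma, theta_r_deg, theta_a_deg):
    # Furmidge-type estimate of the contact-line retention force per unit
    # width, f = sigma * (cos(theta_r) - cos(theta_a)): the larger the
    # hysteresis, the harder the droplet is pinned.
    tr, ta = math.radians(theta_r_deg), math.radians(theta_a_deg)
    return sigma * (math.cos(tr) - math.cos(ta))

mu, sigma = 1.0e-3, 0.072   # water-like viscosity (Pa s) and tension (N/m), assumed
ca = capillary_number(mu, u=0.05, sigma=sigma)

# Cassie-like small hysteresis versus Wenzel-like ~100 degree hysteresis:
f_cassie = retention_force_per_width(sigma, theta_r_deg=160.0, theta_a_deg=178.0)
f_wenzel = retention_force_per_width(sigma, theta_r_deg=80.0, theta_a_deg=178.0)
print(ca, f_cassie, f_wenzel)
```

    The large gap between the Wenzel-like and Cassie-like retention forces mirrors the abstract's finding that the critical capillary number for slipping is much larger in the Wenzel state.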

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ba, Yan; Kang, Qinjun; Liu, Haihu

    In this study, the dynamical behavior of a droplet on a topologically structured surface is investigated by using a three-dimensional color-gradient lattice Boltzmann model. A wetting boundary condition is proposed to model fluid-surface interactions, which is advantageous to improve the accuracy of the simulation and suppress spurious velocities at the contact line. The model is validated by the droplet partial wetting test and reproduction of the Cassie and Wenzel states. A series of simulations are conducted to investigate the behavior of a droplet when subjected to a shear flow. It is found that in the Cassie state, the droplet undergoes a transition from stationary, to slipping, and finally to detachment states as the capillary number increases, while in the Wenzel state, the last state changes to the breakup state. The critical capillary number, above which droplet slipping occurs, is small for the Cassie droplet, but is significantly enhanced for the Wenzel droplet due to the increased contact angle hysteresis. In the Cassie state, the receding contact angle nearly equals the prediction of the Cassie relation, and the advancing contact angle is close to 180°, leading to a small contact angle hysteresis. In the Wenzel state, however, the contact angle hysteresis is extremely large (around 100°). Finally, high droplet mobility can be easily achieved for Cassie droplets, whereas in the Wenzel state, extremely low droplet mobility is identified.

  4. Amniotic Constriction Bands: Secondary Deformities and Their Treatments.

    PubMed

    Drury, Benjamin T; Rayan, Ghazi M

    2018-01-01

    The purpose of this study was to report the surgical treatment experience of patients with amniotic constriction bands (ACB) over a 35-year interval and detail consequential limb deformities with emphasis on hands and upper extremities, along with the nature and frequency of their surgical treatment methods. Fifty-one patients were identified; 26 were males and 25 females. The total number of deformities was listed. The total number of operations, individual procedures, and operations plus procedures that were done for each patient and their frequency were recorded. The total number of operations was 117, and total number of procedures was 341. More procedures were performed on the upper extremity (85%) than the lower extremity (15%). Including the primary deformity ACB, 16 different hand deformities secondary to ACB were encountered. Sixteen different surgical methods for the upper extremity were utilized; a primary procedure for ACB and secondary reconstructions for all secondary deformities. Average age at the time of the first procedure was 9.3 months. The most common procedures performed, in order of frequency, were excision of ACB plus Z-plasty, release of partial syndactyly, release of fenestrated syndactyly, full-thickness skin grafts, resection of digital bony overgrowth from amputation stumps, and deepening of first and other digital web spaces. Many hand and upper extremity deformities secondary to ACB are encountered. Children with ACB may require more than one operation including multiple procedures. Numerous surgical methods of reconstruction for these children's secondary deformities are necessary in addition to the customary primary procedure of excision of ACB and Z-plasty.

  5. Rare copy number variations in congenital heart disease patients identify unique genes in left-right patterning

    PubMed Central

    Fakhro, Khalid A.; Choi, Murim; Ware, Stephanie M.; Belmont, John W.; Towbin, Jeffrey A.; Lifton, Richard P.; Khokha, Mustafa K.; Brueckner, Martina

    2011-01-01

    Dominant human genetic diseases that impair reproductive fitness and have high locus heterogeneity constitute a problem for gene discovery because the usual criterion of finding more mutations in specific genes than expected by chance may require extremely large populations. Heterotaxy (Htx), a congenital heart disease resulting from abnormalities in left-right (LR) body patterning, has features suggesting that many cases fall into this category. In this setting, appropriate model systems may provide a means to support implication of specific genes. By high-resolution genotyping of 262 Htx subjects and 991 controls, we identify a twofold excess of subjects with rare genic copy number variations in Htx (14.5% vs. 7.4%, P = 1.5 × 10−4). Although 7 of 45 Htx copy number variations were large chromosomal abnormalities, 38 smaller copy number variations altered a total of 61 genes, 22 of which had Xenopus orthologs. In situ hybridization identified 7 of these 22 genes with expression in the ciliated LR organizer (gastrocoel roof plate), a marked enrichment compared with 40 of 845 previously studied genes (sevenfold enrichment, P < 10−6). Morpholino knockdown in Xenopus of Htx candidates demonstrated that five (NEK2, ROCK2, TGFBR2, GALNT11, and NUP188) strongly disrupted both morphological LR development and expression of pitx2, a molecular marker of LR patterning. These effects were specific, because 0 of 13 control genes from rare Htx or control copy number variations produced significant LR abnormalities (P = 0.001). These findings identify genes not previously implicated in LR patterning. PMID:21282601

  6. Rare copy number variations in congenital heart disease patients identify unique genes in left-right patterning.

    PubMed

    Fakhro, Khalid A; Choi, Murim; Ware, Stephanie M; Belmont, John W; Towbin, Jeffrey A; Lifton, Richard P; Khokha, Mustafa K; Brueckner, Martina

    2011-02-15

    Dominant human genetic diseases that impair reproductive fitness and have high locus heterogeneity constitute a problem for gene discovery because the usual criterion of finding more mutations in specific genes than expected by chance may require extremely large populations. Heterotaxy (Htx), a congenital heart disease resulting from abnormalities in left-right (LR) body patterning, has features suggesting that many cases fall into this category. In this setting, appropriate model systems may provide a means to support implication of specific genes. By high-resolution genotyping of 262 Htx subjects and 991 controls, we identify a twofold excess of subjects with rare genic copy number variations in Htx (14.5% vs. 7.4%, P = 1.5 × 10(-4)). Although 7 of 45 Htx copy number variations were large chromosomal abnormalities, 38 smaller copy number variations altered a total of 61 genes, 22 of which had Xenopus orthologs. In situ hybridization identified 7 of these 22 genes with expression in the ciliated LR organizer (gastrocoel roof plate), a marked enrichment compared with 40 of 845 previously studied genes (sevenfold enrichment, P < 10(-6)). Morpholino knockdown in Xenopus of Htx candidates demonstrated that five (NEK2, ROCK2, TGFBR2, GALNT11, and NUP188) strongly disrupted both morphological LR development and expression of pitx2, a molecular marker of LR patterning. These effects were specific, because 0 of 13 control genes from rare Htx or control copy number variations produced significant LR abnormalities (P = 0.001). These findings identify genes not previously implicated in LR patterning.
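
    The reported twofold excess can be recomputed, approximately, with a two-by-two exact test. The integer counts below are reconstructed from the published percentages (14.5% of 262 ≈ 38 carriers; 7.4% of 991 ≈ 73), so this is an illustrative recomputation, not the authors' exact analysis:

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for the table [[a, b], [c, d]]:
    probability of observing >= a carriers in the first row under the
    hypergeometric null of no association."""
    n1, n2 = a + b, c + d      # row totals (cases, controls)
    k = a + c                  # carriers overall
    denom = comb(n1 + n2, k)
    return sum(comb(n1, x) * comb(n2, k - x)
               for x in range(a, min(n1, k) + 1)) / denom

# 38 of 262 Htx subjects vs. 73 of 991 controls carrying a rare genic CNV:
p = fisher_one_sided(38, 262 - 38, 73, 991 - 73)
print(f"one-sided p = {p:.1e}")
```

    The resulting p-value lands in the same order of magnitude as the paper's quoted P = 1.5 × 10(-4), as expected for counts of this size.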

  7. A Water Droplet Pinning and Heat Transfer Characteristics on an Inclined Hydrophobic Surface.

    PubMed

    Al-Sharafi, Abdullah; Yilbas, Bekir Sami; Ali, Haider; AlAqeeli, N

    2018-02-15

    A water droplet pinning on inclined hydrophobic surface is considered and the droplet heat transfer characteristics are examined. Solution crystallization of polycarbonate is carried out to create hydrophobic characteristics on the surface. The pinning state of the water droplet on the extreme inclined hydrophobic surface (0° ≤ δ ≤ 180°, δ being the inclination angle) is assessed. Heat transfer from inclined hydrophobic surface to droplet is simulated for various droplet volumes and inclination angles in line with the experimental conditions. The findings revealed that the hydrophobic surface give rise to large amount of air being trapped within texture, which generates Magdeburg like forces between the droplet meniscus and the textured surface while contributing to droplet pinning at extreme inclination angles. Two counter rotating cells are developed for inclination angle in the range of 0° < δ < 20° and 135° < δ < 180°; however, a single circulation cell is formed inside the droplet for inclination angle of 25° ≤ δ ≤ 135°. The Nusselt number remains high for the range of inclination angle of 45° ≤ δ ≤ 135°. Convection and conduction heat transfer enhances when a single and large circulation cell is formed inside the droplet.

  8. Tomographic local 2D analyses of the WISExSuperCOSMOS all-sky galaxy catalogue

    NASA Astrophysics Data System (ADS)

    Novaes, C. P.; Bernui, A.; Xavier, H. S.; Marques, G. A.

    2018-05-01

    The recent progress in obtaining larger and deeper galaxy catalogues is of fundamental importance for cosmological studies, especially to robustly measure the large scale density fluctuations in the Universe. The present work uses the Minkowski Functionals (MF) to probe the galaxy density field from the WISExSuperCOSMOS (WSC) all-sky catalogue by performing tomographic local analyses in five redshift shells (of thickness δz = 0.05) in the total range of 0.10 < z < 0.35. Here, for the first time, the MF are applied to 2D projections of the galaxy number count (GNC) fields with the purpose of looking for regions in the WSC catalogue with unexpected features compared to ΛCDM mock realisations. Our methodology reveals 1-3 regions of the GNC maps in each redshift shell with uncommon behaviour (extreme regions), i.e., p-value < 1.4%. Indeed, the resulting MF curves show signatures suggesting that the uncommon behaviour is associated with the presence of over- or under-densities there, although contamination by residual foregrounds cannot be discarded. Additionally, even though our analyses indicate a good agreement between data and simulations, we identify one highly extreme region, seemingly associated with a large clustered distribution of galaxies. Our results confirm the usefulness of the MF for analysing GNC maps from photometric galaxy datasets.
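
    A pixel-level version of the three 2D Minkowski Functionals (area, boundary length, and Euler characteristic of an excursion set) can be sketched as follows. This is a generic flat-image implementation written for illustration; the authors' analysis runs on projected sky maps with a dedicated pipeline:

```python
import numpy as np

def minkowski_2d(mask):
    """Area, boundary length, and Euler characteristic of a binary pixel
    mask, computed from the cell complex of its pixels (chi = V - E + F)."""
    m = np.asarray(mask, dtype=bool)
    F = int(m.sum())                                   # faces: "on" pixels
    shared = int((m[:, :-1] & m[:, 1:]).sum()) \
           + int((m[:-1, :] & m[1:, :]).sum())         # edges shared by two pixels
    E = 4 * F - shared                                 # distinct unit edges
    p = np.pad(m, 1)
    corners = p[:-1, :-1] | p[:-1, 1:] | p[1:, :-1] | p[1:, 1:]
    V = int(corners.sum())                             # distinct lattice vertices
    area = F
    boundary = 4 * F - 2 * shared                      # edges on exactly one pixel
    euler = V - E + F
    return area, boundary, euler

# Thresholding a random field gives an excursion set; sweeping the
# threshold and recording the three numbers traces out MF curves.
rng = np.random.default_rng(0)
field = rng.normal(size=(64, 64))
print(minkowski_2d(field > 1.0))
```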

  9. Human skeletal muscle biochemical diversity.

    PubMed

    Tirrell, Timothy F; Cook, Mark S; Carr, J Austin; Lin, Evie; Ward, Samuel R; Lieber, Richard L

    2012-08-01

    The molecular components largely responsible for muscle attributes such as passive tension development (titin and collagen), active tension development (myosin heavy chain, MHC) and mechanosensitive signaling (titin) have been well studied in animals but less is known about their roles in humans. The purpose of this study was to perform a comprehensive analysis of titin, collagen and MHC isoform distributions in a large number of human muscles, to search for common themes and trends in the muscular organization of the human body. In this study, 599 biopsies were obtained from six human cadaveric donors (mean age 83 years). Three assays were performed on each biopsy - titin molecular mass determination, hydroxyproline content (a surrogate for collagen content) and MHC isoform distribution. Titin molecular mass was increased in more distal muscles of the upper and lower limbs. This trend was also observed for collagen. Percentage MHC-1 data followed a pattern similar to collagen in muscles of the upper extremity but this trend was reversed in the lower extremity. Titin molecular mass was the best predictor of anatomical region and muscle functional group. On average, human muscles had more slow myosin than other mammals. Also, larger titins were generally associated with faster muscles. These trends suggest that distal muscles should have higher passive tension than proximal ones, and that titin size variability may potentially act to 'tune' the protein's mechanotransduction capability.

  10. Human skeletal muscle biochemical diversity

    PubMed Central

    Tirrell, Timothy F.; Cook, Mark S.; Carr, J. Austin; Lin, Evie; Ward, Samuel R.; Lieber, Richard L.

    2012-01-01

    SUMMARY The molecular components largely responsible for muscle attributes such as passive tension development (titin and collagen), active tension development (myosin heavy chain, MHC) and mechanosensitive signaling (titin) have been well studied in animals but less is known about their roles in humans. The purpose of this study was to perform a comprehensive analysis of titin, collagen and MHC isoform distributions in a large number of human muscles, to search for common themes and trends in the muscular organization of the human body. In this study, 599 biopsies were obtained from six human cadaveric donors (mean age 83 years). Three assays were performed on each biopsy – titin molecular mass determination, hydroxyproline content (a surrogate for collagen content) and MHC isoform distribution. Titin molecular mass was increased in more distal muscles of the upper and lower limbs. This trend was also observed for collagen. Percentage MHC-1 data followed a pattern similar to collagen in muscles of the upper extremity but this trend was reversed in the lower extremity. Titin molecular mass was the best predictor of anatomical region and muscle functional group. On average, human muscles had more slow myosin than other mammals. Also, larger titins were generally associated with faster muscles. These trends suggest that distal muscles should have higher passive tension than proximal ones, and that titin size variability may potentially act to ‘tune’ the protein's mechanotransduction capability. PMID:22786631

  11. Comparative Performance of Four Single Extreme Outlier Discordancy Tests from Monte Carlo Simulations

    PubMed Central

    Díaz-González, Lorena; Quiroz-Ruiz, Alfredo

    2014-01-01

    Using highly precise and accurate Monte Carlo simulations of 20,000,000 replications and 102 independent simulation experiments with extremely low simulation errors and total uncertainties, we evaluated the performance of four single outlier discordancy tests (Grubbs test N2, Dixon test N8, skewness test N14, and kurtosis test N15) for normal samples of sizes 5 to 20. Statistical contaminations of a single observation resulting from parameters called δ from ±0.1 up to ±20 for modeling the slippage of central tendency or ε from ±1.1 up to ±200 for slippage of dispersion, as well as no contamination (δ = 0 and ε = ±1), were simulated. Because of the use of precise and accurate random and normally distributed simulated data, very large replications, and a large number of independent experiments, this paper presents a novel approach for precise and accurate estimations of power functions of four popular discordancy tests and, therefore, should not be considered as a simple simulation exercise unrelated to probability and statistics. From both criteria of the Power of Test proposed by Hayes and Kinsella and the Test Performance Criterion of Barnett and Lewis, Dixon test N8 performs less well than the other three tests. The overall performance of these four tests could be summarized as N2≅N15 > N14 > N8. PMID:24737992

  12. Comparative performance of four single extreme outlier discordancy tests from Monte Carlo simulations.

    PubMed

    Verma, Surendra P; Díaz-González, Lorena; Rosales-Rivera, Mauricio; Quiroz-Ruiz, Alfredo

    2014-01-01

    Using highly precise and accurate Monte Carlo simulations of 20,000,000 replications and 102 independent simulation experiments with extremely low simulation errors and total uncertainties, we evaluated the performance of four single outlier discordancy tests (Grubbs test N2, Dixon test N8, skewness test N14, and kurtosis test N15) for normal samples of sizes 5 to 20. Statistical contaminations of a single observation resulting from parameters called δ from ±0.1 up to ±20 for modeling the slippage of central tendency or ε from ±1.1 up to ±200 for slippage of dispersion, as well as no contamination (δ = 0 and ε = ±1), were simulated. Because of the use of precise and accurate random and normally distributed simulated data, very large replications, and a large number of independent experiments, this paper presents a novel approach for precise and accurate estimations of power functions of four popular discordancy tests and, therefore, should not be considered as a simple simulation exercise unrelated to probability and statistics. From both criteria of the Power of Test proposed by Hayes and Kinsella and the Test Performance Criterion of Barnett and Lewis, Dixon test N8 performs less well than the other three tests. The overall performance of these four tests could be summarized as N2≅N15 > N14 > N8.
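
    The Monte Carlo recipe described in this record, simulating many normal samples and reading the critical value off an empirical quantile of the test statistic, can be sketched for the Grubbs single-outlier test N2. The replication count here is far below the paper's 20,000,000, and the example data are invented:

```python
import numpy as np

def grubbs_statistic(x):
    """Grubbs single-outlier statistic: the largest absolute deviation
    from the sample mean, scaled by the sample standard deviation."""
    x = np.asarray(x, dtype=float)
    return np.max(np.abs(x - x.mean())) / x.std(ddof=1)

def mc_critical_value(n, alpha=0.05, reps=200_000, seed=0):
    """Monte Carlo critical value for sample size n: the (1 - alpha)
    quantile of the statistic over reps standard-normal samples."""
    rng = np.random.default_rng(seed)
    s = rng.standard_normal((reps, n))
    stats = (np.max(np.abs(s - s.mean(axis=1, keepdims=True)), axis=1)
             / s.std(axis=1, ddof=1))
    return np.quantile(stats, 1.0 - alpha)

# A sample of size 10 with one observation slipped upward (delta-type
# contamination of the central tendency, as in the simulations above):
data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.0, 9.7, 10.1, 9.9, 14.0]
g = grubbs_statistic(data)
crit = mc_critical_value(len(data))
print(f"G = {g:.3f}, critical value = {crit:.3f}, discordant: {g > crit}")
```

    The same machinery, with the statistic swapped out, yields critical values for the Dixon, skewness, and kurtosis tests compared in the paper.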

  13. Whole genome sequences of a male and female supercentenarian, ages greater than 114 years.

    PubMed

    Sebastiani, Paola; Riva, Alberto; Montano, Monty; Pham, Phillip; Torkamani, Ali; Scherba, Eugene; Benson, Gary; Milton, Jacqueline N; Baldwin, Clinton T; Andersen, Stacy; Schork, Nicholas J; Steinberg, Martin H; Perls, Thomas T

    2011-01-01

    Supercentenarians (age 110+ years old) generally delay or escape age-related diseases and disability well beyond the age of 100 and this exceptional survival is likely to be influenced by a genetic predisposition that includes both common and rare genetic variants. In this report, we describe the complete genomic sequences of male and female supercentenarians, both age >114 years old. We show that: (1) the sequence variant spectrum of these two individuals' DNA sequences is largely comparable to existing non-supercentenarian genomes; (2) the two individuals do not appear to carry most of the well-established human longevity enabling variants already reported in the literature; (3) they have a comparable number of known disease-associated variants relative to most human genomes sequenced to-date; (4) approximately 1% of the variants these individuals possess are novel and may point to new genes involved in exceptional longevity; and (5) both individuals are enriched for coding variants near longevity-associated variants that we discovered through a large genome-wide association study. These analyses suggest that there are both common and rare longevity-associated variants that may counter the effects of disease-predisposing variants and extend lifespan. The continued analysis of the genomes of these and other rare individuals who have survived to extremely old ages should provide insight into the processes that contribute to the maintenance of health during extreme aging.

  14. Whole Genome Sequences of a Male and Female Supercentenarian, Ages Greater than 114 Years

    PubMed Central

    Sebastiani, Paola; Riva, Alberto; Montano, Monty; Pham, Phillip; Torkamani, Ali; Scherba, Eugene; Benson, Gary; Milton, Jacqueline N.; Baldwin, Clinton T.; Andersen, Stacy; Schork, Nicholas J.; Steinberg, Martin H.; Perls, Thomas T.

    2012-01-01

    Supercentenarians (age 110+ years old) generally delay or escape age-related diseases and disability well beyond the age of 100 and this exceptional survival is likely to be influenced by a genetic predisposition that includes both common and rare genetic variants. In this report, we describe the complete genomic sequences of male and female supercentenarians, both age >114 years old. We show that: (1) the sequence variant spectrum of these two individuals’ DNA sequences is largely comparable to existing non-supercentenarian genomes; (2) the two individuals do not appear to carry most of the well-established human longevity enabling variants already reported in the literature; (3) they have a comparable number of known disease-associated variants relative to most human genomes sequenced to-date; (4) approximately 1% of the variants these individuals possess are novel and may point to new genes involved in exceptional longevity; and (5) both individuals are enriched for coding variants near longevity-associated variants that we discovered through a large genome-wide association study. These analyses suggest that there are both common and rare longevity-associated variants that may counter the effects of disease-predisposing variants and extend lifespan. The continued analysis of the genomes of these and other rare individuals who have survived to extremely old ages should provide insight into the processes that contribute to the maintenance of health during extreme aging. PMID:22303384

  15. Disastrous torrential floods in mountain areas in Serbia

    NASA Astrophysics Data System (ADS)

    Gavrilovic, Z.

    2009-04-01

    In Serbia, the relief is predominantly hilly and mountainous, intersected by numerous rivers. The greatest number of watercourses are small torrents; however, the proportionally large rivers also have a distinctly torrential character. The highest parts of the catchments are at altitudes above 1500 m, while their confluences are at altitudes of 200-300 m. The catchment and channel slopes are extremely steep. Since the initial natural preconditions are thus satisfied, torrential floods are the consequence. Although the Južna Morava catchments were regulated by erosion control works, there were numerous torrential floods during the last decades. Some of the floods reached disastrous proportions not previously recorded in Serbia or elsewhere in Europe. The 1988 flood of the river Vlasina has been presented to the professional public several times. This flood was not an isolated case. Many large-scale torrential floods occurred in Serbia from 1994 to 2007. As floods occurred again in 2007, the causes of the recorded floods had to be analysed. The analysis pointed out a series of scenarios which were the causes of disastrous torrential floods, and also the disadvantages of the current system of torrent and erosion control. Special attention was focused on the floods which resulted from sudden snow melting. This paper presents the results of the analyses of the extreme torrential floods of the rivers Nišava and Vlasina. Key words: Flood, torrents, torrent control, erosion control

  16. Modeling jointly low, moderate, and heavy rainfall intensities without a threshold selection

    NASA Astrophysics Data System (ADS)

    Naveau, Philippe; Huser, Raphael; Ribereau, Pierre; Hannart, Alexis

    2016-04-01

    In statistics, extreme events are often defined as excesses above a given large threshold. This definition allows hydrologists and flood planners to apply Extreme-Value Theory (EVT) to their time series of interest. Even in the stationary univariate context, this approach has at least two main drawbacks. First, working with excesses implies that a lot of observations (those below the chosen threshold) are completely disregarded. The range of precipitation is artificially chopped into two pieces, namely large intensities and the rest, which necessarily imposes different statistical models for each piece. Second, this strategy raises a nontrivial and very practical difficulty: how to choose the optimal threshold which correctly discriminates between low and heavy rainfall intensities. To address these issues, we propose a statistical model in which EVT results apply not only to heavy, but also to low precipitation amounts (zeros excluded). Our model is in compliance with EVT on both ends of the spectrum and allows a smooth transition between the two tails, while keeping a low number of parameters. In terms of inference, we have implemented and tested two classical methods of estimation: likelihood maximization and probability weighted moments. Last but not least, there is no need to choose a threshold to define low and high excesses. The performance and flexibility of this approach are illustrated on simulated data and on hourly precipitation recorded in Lyon, France.
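
    One simple member of this threshold-free family raises a Generalized Pareto CDF H to a power, F(x) = H(x)**kappa, so the upper tail stays GPD-like while kappa reshapes the low-intensity end. The parameter values below are arbitrary, and this sketch is a schematic of the idea rather than the authors' exact model:

```python
import numpy as np

def sample_ext_gpd(n, sigma, xi, kappa, rng):
    """Draw from F(x) = H(x)**kappa, with H the GPD CDF
    H(x) = 1 - (1 + xi*x/sigma)**(-1/xi), by inverting F."""
    u = rng.uniform(size=n)
    return (sigma / xi) * ((1.0 - u ** (1.0 / kappa)) ** (-xi) - 1.0)

rng = np.random.default_rng(7)
x = sample_ext_gpd(200_000, sigma=1.0, xi=0.2, kappa=2.0, rng=rng)

# The whole positive range is modeled at once: no threshold is chosen,
# yet the upper tail remains heavy (GPD shape xi = 0.2).
print(x.min(), np.median(x), np.quantile(x, 0.999))
```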

  17. Numerical simulation of turbulent forced convection in liquid metals

    NASA Astrophysics Data System (ADS)

    Vodret, S.; Vitale Di Maio, D.; Caruso, G.

    2014-11-01

    In the frame of the future generation of nuclear reactors, liquid metals are foreseen to be used as a primary coolant. Liquid metals are characterized by a very low Prandtl number due to their very high heat diffusivity. As such, they do not meet the so-called Reynolds analogy which assumes a complete similarity between the momentum and the thermal boundary layers via the use of the turbulent Prandtl number. Particularly, in the case of industrial fluid-dynamic calculations where a resolved computation near walls could be extremely time consuming and could need very large computational resources, the use of the classical wall function approach could lead to an inaccurate description of the temperature profile close to the wall. The first aim of the present study is to investigate the ability of a well-established commercial code (ANSYS FLUENT v.14) to deal with this issue, validating a suitable expression for the turbulent Prandtl number. Moreover, a thermal wall-function developed at Universite Catholique de Louvain has been implemented in FLUENT and validated, overcoming the limits of the solver to define it directly. Both the resolved and unresolved approaches have been carried out for a channel flow case and assessed against available direct numerical and large eddy simulations. A comparison between the numerically evaluated Nusselt number and the main correlations available in the literature has also been carried out. Finally, an application of the proposed methodology to a typical sub-channel case has been performed, comparing the results with literature correlations for tube banks.
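The abstract does not state which turbulent Prandtl number expression was validated; as a hedged illustration only, one frequently cited algebraic model is the Kays correlation, which shows why low-Pr fluids depart strongly from the constant Pr_t ≈ 0.85 assumed by the Reynolds analogy:

```python
def turbulent_prandtl_kays(nu_t_over_nu, pr):
    """Kays correlation: Pr_t = 0.85 + 0.7 / Pe_t, with the turbulent
    Peclet number Pe_t = (nu_t / nu) * Pr.  Illustrative only; not
    necessarily the expression validated in the paper."""
    pe_t = nu_t_over_nu * pr
    return 0.85 + 0.7 / pe_t

# For air (Pr ~ 0.7) Pr_t stays near the classical ~0.85-0.9, but for a
# liquid metal (Pr ~ 0.01) the same eddy viscosity gives a much larger Pr_t:
pr_t_air = turbulent_prandtl_kays(100.0, 0.7)
pr_t_sodium = turbulent_prandtl_kays(100.0, 0.01)
```

The large Pr_t at low molecular Prandtl number is precisely what the standard wall functions miss.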

  18. Extremely rare collapse and build-up of turbulence in stochastic models of transitional wall flows.

    PubMed

    Rolland, Joran

    2018-02-01

    This paper presents a numerical and theoretical study of multistability in two stochastic models of transitional wall flows. An algorithm dedicated to the computation of rare events is adapted on these two stochastic models. The main focus is placed on a stochastic partial differential equation model proposed by Barkley. Three types of events are computed in a systematic and reproducible manner: (i) the collapse of isolated puffs and domains initially containing their steady turbulent fraction; (ii) the puff splitting; (iii) the build-up of turbulence from the laminar base flow under a noise perturbation of vanishing variance. For build-up events, an extreme realization of the vanishing variance noise pushes the state from the laminar base flow to the most probable germ of turbulence which in turn develops into a full blown puff. For collapse events, the Reynolds number and length ranges of the two regimes of collapse of laminar-turbulent pipes, independent collapse or global collapse of puffs, are determined. The mean first passage time before each event is then systematically computed as a function of the Reynolds number r and pipe length L in the laminar-turbulent coexistence range of Reynolds number. In the case of isolated puffs, the faster-than-linear growth with Reynolds number of the logarithm of mean first passage time T before collapse is separated in two. One finds that ln(T)=A_{p}r-B_{p}, with A_{p} and B_{p} positive. Moreover, A_{p} and B_{p} are affine in the spatial integral of turbulence intensity of the puff, with the same slope. In the case of pipes initially containing the steady turbulent fraction, the length L and Reynolds number r dependence of the mean first passage time T before collapse is also separated. The author finds that T≍exp[L(Ar-B)] with A and B positive.
The length and Reynolds number dependence of T are then discussed in view of the large deviations theoretical approaches of the study of mean first passage times and multistability, where ln(T) in the limit of small variance noise is studied. Two points of view, local noise of small variance and large length, can be used to discuss the exponential dependence in L of T. In particular, it is shown how a T≍exp[L(A^{'}R-B^{'})] can be derived in a conceptual two degrees of freedom model of a transitional wall flow proposed by Dauchot and Manneville. This is done by identifying a quasipotential in low variance noise, large length limit. This pinpoints the physical effects controlling collapse and build-up trajectories and corresponding passage times with an emphasis on the saddle points between laminar and turbulent states. This analytical analysis also shows that these effects lead to the asymmetric probability density function of kinetic energy of turbulence.
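The exponential dependence of mean first passage times is the hallmark of noise-driven escape over a saddle. A generic double-well toy model (this is a hedged sketch of the Arrhenius/large-deviations scaling, not the Barkley or Dauchot-Manneville models; all parameters are illustrative) shows the same behavior: ln(T) grows roughly linearly in the inverse noise variance.

```python
import numpy as np

def mean_first_passage(eps, n_runs=50, dt=2e-3, x0=-1.0, seed=0):
    """Estimate the mean first passage time from the left well (x = -1)
    over the saddle at x = 0 of V(x) = x**4/4 - x**2/2, for the SDE
    dX = -V'(X) dt + sqrt(2*eps) dW (eps = noise variance scale)."""
    rng = np.random.default_rng(seed)
    sig = np.sqrt(2.0 * eps * dt)
    total = 0.0
    for _ in range(n_runs):
        x, t = x0, 0.0
        while x < 0.0:                      # run until the saddle is crossed
            x += (x - x**3) * dt + sig * rng.standard_normal()
            t += dt
        total += t
    return total / n_runs

# Smaller noise variance -> exponentially longer escape time.
t_hi_noise = mean_first_passage(eps=0.25)
t_lo_noise = mean_first_passage(eps=0.12)
```

Here the barrier height plays the role that L(Ar - B) plays in the abstract's quasipotential estimate.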

  19. Extremely rare collapse and build-up of turbulence in stochastic models of transitional wall flows

    NASA Astrophysics Data System (ADS)

    Rolland, Joran

    2018-02-01

    This paper presents a numerical and theoretical study of multistability in two stochastic models of transitional wall flows. An algorithm dedicated to the computation of rare events is adapted on these two stochastic models. The main focus is placed on a stochastic partial differential equation model proposed by Barkley. Three types of events are computed in a systematic and reproducible manner: (i) the collapse of isolated puffs and domains initially containing their steady turbulent fraction; (ii) the puff splitting; (iii) the build-up of turbulence from the laminar base flow under a noise perturbation of vanishing variance. For build-up events, an extreme realization of the vanishing variance noise pushes the state from the laminar base flow to the most probable germ of turbulence which in turn develops into a full blown puff. For collapse events, the Reynolds number and length ranges of the two regimes of collapse of laminar-turbulent pipes, independent collapse or global collapse of puffs, are determined. The mean first passage time before each event is then systematically computed as a function of the Reynolds number r and pipe length L in the laminar-turbulent coexistence range of Reynolds number. In the case of isolated puffs, the faster-than-linear growth with Reynolds number of the logarithm of mean first passage time T before collapse is separated in two. One finds that ln(T) = A_p r - B_p, with A_p and B_p positive. Moreover, A_p and B_p are affine in the spatial integral of turbulence intensity of the puff, with the same slope. In the case of pipes initially containing the steady turbulent fraction, the length L and Reynolds number r dependence of the mean first passage time T before collapse is also separated. The author finds that T ≍ exp[L(Ar - B)] with A and B positive.
The length and Reynolds number dependence of T are then discussed in view of the large deviations theoretical approaches of the study of mean first passage times and multistability, where ln(T) in the limit of small variance noise is studied. Two points of view, local noise of small variance and large length, can be used to discuss the exponential dependence in L of T. In particular, it is shown how a T ≍ exp[L(A'R - B')] can be derived in a conceptual two degrees of freedom model of a transitional wall flow proposed by Dauchot and Manneville. This is done by identifying a quasipotential in low variance noise, large length limit. This pinpoints the physical effects controlling collapse and build-up trajectories and corresponding passage times with an emphasis on the saddle points between laminar and turbulent states. This analytical analysis also shows that these effects lead to the asymmetric probability density function of kinetic energy of turbulence.

  20. [Professional strategy and institutional isomorphism: the dental health insurance industry in Brazil].

    PubMed

    Vieira, Cristine; Costa, Nilson do Rosário

    2008-01-01

    This article analyzes the organizational model of the dental health insurance industry. The main organizational actors in this industry are the professional cooperatives and the group dental insurance companies. The theoretical basis of the article is the organizational theory developed by Di Maggio and Powell. The dental health industry consists of a great number of small and very dynamic companies; however, a significant share of the clients and profits is concentrated in a few large companies. The results show that the industry has expanded its number of clients since the creation of the National Health Insurance Agency. The regulation regime has forced institutional changes in the firms with regard to market entry, permanence, and exit patterns. There was no evidence that the regulatory rules have interfered with the development and financial condition of the industry. The average profitability of the sector, especially among the group dental insurance companies, is extremely high.

  1. Effects of forebody geometry on subsonic boundary-layer stability

    NASA Technical Reports Server (NTRS)

    Dodbele, Simha S.

    1990-01-01

    As part of an effort to develop computational techniques for design of natural laminar flow fuselages, a computational study was made of the effect of forebody geometry on laminar boundary layer stability on axisymmetric body shapes. The effects of nose radius on the stability of the incompressible laminar boundary layer were computationally investigated using linear stability theory for body length Reynolds numbers representative of small and medium-sized airplanes. The steepness of the pressure gradient and the value of the minimum pressure (both functions of fineness ratio) govern the stability of laminar flow possible on an axisymmetric body at a given Reynolds number. It was found that to keep the laminar boundary layer stable for extended lengths, it is important to have a small nose radius. However, nose shapes with extremely small nose radii produce large pressure peaks at off-design angles of attack and can produce vortices which would adversely affect transition.

  2. Fine mapping of regulatory loci for mammalian gene expression using radiation hybrids

    PubMed Central

    Park, Christopher C; Ahn, Sangtae; Bloom, Joshua S; Lin, Andy; Wang, Richard T; Wu, Tongtong; Sekar, Aswin; Khan, Arshad H; Farr, Christine J; Lusis, Aldons J; Leahy, Richard M; Lange, Kenneth; Smith, Desmond J

    2010-01-01

    We mapped regulatory loci for nearly all protein-coding genes in mammals using comparative genomic hybridization and expression array measurements from a panel of mouse–hamster radiation hybrid cell lines. The large number of breaks in the mouse chromosomes and the dense genotyping of the panel allowed extremely sharp mapping of loci. As the regulatory loci result from extra gene dosage, we call them copy number expression quantitative trait loci, or ceQTLs. The −2log10P support interval for the ceQTLs was <150 kb, containing an average of <2–3 genes. We identified 29,769 trans ceQTLs with −log10P > 4, including 13 hotspots each regulating >100 genes in trans. Further, this work identifies 2,761 trans ceQTLs harboring no known genes, and provides evidence for a mode of gene expression autoregulation specific to the X chromosome. PMID:18362883

  3. A fast, preconditioned conjugate gradient Toeplitz solver

    NASA Technical Reports Server (NTRS)

    Pan, Victor; Schrieber, Robert

    1989-01-01

    A simple factorization is given of an arbitrary hermitian, positive definite matrix in which the factors are well-conditioned, hermitian, and positive definite. In fact, given knowledge of the extreme eigenvalues of the original matrix A, an optimal improvement can be achieved, making the condition numbers of each of the two factors equal to the square root of the condition number of A. This technique is then applied to the solution of hermitian, positive definite Toeplitz systems. Large linear systems with hermitian, positive definite Toeplitz matrices arise in some signal processing applications. A stable fast algorithm is given for solving these systems that is based on the preconditioned conjugate gradient method. The algorithm exploits Toeplitz structure to reduce the cost of an iteration to O(n log n) by applying the fast Fourier Transform to compute matrix-vector products. Matrix factorization is used as a preconditioner.
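The O(n log n) iteration cost comes from embedding the Toeplitz matrix in a circulant and multiplying via the FFT. A minimal sketch (symmetric real case, plain conjugate gradients with the paper's preconditioner omitted for brevity; the example matrix T_ij = 0.5^|i-j| is illustrative):

```python
import numpy as np

def toeplitz_matvec(col, row, x):
    """O(n log n) product T @ x for the Toeplitz matrix with first
    column `col` and first row `row`, via a 2n circulant embedding."""
    n = len(x)
    c = np.concatenate([col, [0.0], row[:0:-1]])   # circulant's first column
    y = np.fft.ifft(np.fft.fft(c) * np.fft.fft(np.concatenate([x, np.zeros(n)])))
    return y[:n].real

def cg_toeplitz(col, row, b, tol=1e-10, maxiter=500):
    """Conjugate gradients for T x = b using only fast matvecs
    (unpreconditioned sketch)."""
    x = np.zeros_like(b)
    r = b - toeplitz_matvec(col, row, x)
    p = r.copy()
    rs = r @ r
    for _ in range(maxiter):
        Ap = toeplitz_matvec(col, row, p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Symmetric positive definite Toeplitz system: T_ij = 0.5**|i - j|.
n = 64
col = 0.5 ** np.arange(n)
x_true = np.random.default_rng(1).standard_normal(n)
b = toeplitz_matvec(col, col, x_true)
x_sol = cg_toeplitz(col, col, b)
```

Each iteration costs two FFTs of length 2n instead of a dense O(n²) product.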

  4. Design and interpretation of cell trajectory assays

    PubMed Central

    Bowden, Lucie G.; Simpson, Matthew J.; Baker, Ruth E.

    2013-01-01

    Cell trajectory data are often reported in the experimental cell biology literature to distinguish between different types of cell migration. Unfortunately, there is no accepted protocol for designing or interpreting such experiments and this makes it difficult to quantitatively compare different published datasets and to understand how changes in experimental design influence our ability to interpret different experiments. Here, we use an individual-based mathematical model to simulate the key features of a cell trajectory experiment. This shows that our ability to correctly interpret trajectory data is extremely sensitive to the geometry and timing of the experiment, the degree of motility bias and the number of experimental replicates. We show that cell trajectory experiments produce data that are most reliable when the experiment is performed in a quasi-one-dimensional geometry with a large number of identically prepared experiments conducted over a relatively short time-interval rather than a few trajectories recorded over particularly long time-intervals. PMID:23985736
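As a hedged illustration of how such trajectory data are generated and quantified, the sketch below simulates a crude 1-D biased random walk (a stand-in for the paper's individual-based model; geometry effects are ignored and all parameters are illustrative) and estimates the motility bias from the replicates' displacements:

```python
import numpy as np

def simulate_trajectories(n_cells, n_steps, drift, step_sd=1.0, seed=0):
    """n_cells independent biased random walks in 1-D: at each step a
    cell moves by drift + Normal(0, step_sd)."""
    rng = np.random.default_rng(seed)
    steps = drift + step_sd * rng.standard_normal((n_cells, n_steps))
    return np.cumsum(steps, axis=1)          # positions, shape (cells, steps)

def estimate_bias(trajs):
    """Estimate drift per step from the final displacements."""
    n_steps = trajs.shape[1]
    return trajs[:, -1].mean() / n_steps

# Two designs with the same total observation budget:
many_short = estimate_bias(simulate_trajectories(400, 25, drift=0.1))
few_long = estimate_bias(simulate_trajectories(10, 1000, drift=0.1))
```

In this idealized unbounded setting both designs are equally informative; the paper's point is that boundaries, timing and geometry break that equivalence in real experiments.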

  5. Optimal Design of Experiments by Combining Coarse and Fine Measurements

    NASA Astrophysics Data System (ADS)

    Lee, Alpha A.; Brenner, Michael P.; Colwell, Lucy J.

    2017-11-01

    In many contexts, it is extremely costly to perform enough high-quality experimental measurements to accurately parametrize a predictive quantitative model. However, it is often much easier to carry out large numbers of experiments that indicate whether each sample is above or below a given threshold. Can many such categorical or "coarse" measurements be combined with a much smaller number of high-resolution or "fine" measurements to yield accurate models? Here, we demonstrate an intuitive strategy, inspired by statistical physics, wherein the coarse measurements are used to identify the salient features of the data, while the fine measurements determine the relative importance of these features. A linear model is inferred from the fine measurements, augmented by a quadratic term that captures the correlation structure of the coarse data. We illustrate our strategy by considering the problems of predicting the antimalarial potency and aqueous solubility of small organic molecules from their 2D molecular structure.
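The two-step idea can be caricatured in a few lines. The sketch below is a simplified stand-in for the paper's strategy (feature selection by class-mean differences rather than the authors' quadratic correlation term; all sizes and weights are invented for illustration): many cheap binary labels pick out the salient features, and a handful of real-valued measurements then fix their weights.

```python
import numpy as np

rng = np.random.default_rng(0)
n_coarse, n_fine, d = 2000, 20, 50
w_true = np.zeros(d)
w_true[:5] = [2.0, -1.5, 1.0, 0.8, -0.6]          # five salient features

# Many cheap above/below-threshold ("coarse") measurements:
X_coarse = rng.standard_normal((n_coarse, d))
y_coarse = X_coarse @ w_true > 0.0

# Few expensive real-valued ("fine") measurements:
X_fine = rng.standard_normal((n_fine, d))
y_fine = X_fine @ w_true + 0.1 * rng.standard_normal(n_fine)

# Step 1: coarse labels identify the salient features
# (difference of class-conditional feature means).
score = np.abs(X_coarse[y_coarse].mean(axis=0) - X_coarse[~y_coarse].mean(axis=0))
salient = np.argsort(score)[-5:]

# Step 2: fine measurements fix the relative weights by least squares.
w_hat = np.zeros(d)
sol, *_ = np.linalg.lstsq(X_fine[:, salient], y_fine, rcond=None)
w_hat[salient] = sol
```

With only 20 fine measurements over 50 features a direct fit would be badly underdetermined; the coarse screen restores identifiability.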

  6. From entanglement witness to generalized Catalan numbers

    NASA Astrophysics Data System (ADS)

    Cohen, E.; Hansen, T.; Itzhaki, N.

    2016-07-01

    Being extremely important resources in quantum information and computation, it is vital to efficiently detect and properly characterize entangled states. We analyze in this work the problem of entanglement detection for arbitrary spin systems. It is demonstrated how a single measurement of the squared total spin can probabilistically discern separable from entangled many-particle states. For achieving this goal, we construct a tripartite analogy between the degeneracy of entanglement witness eigenstates, tensor products of SO(3) representations and classical lattice walks with special constraints. Within this framework, degeneracies are naturally given by generalized Catalan numbers and determine the fraction of states that are decidedly entangled and also known to be somewhat protected against decoherence. In addition, we introduce the concept of a “sterile entanglement witness”, which for large enough systems detects entanglement without affecting much the system’s state. We discuss when our proposed entanglement witness can be regarded as a sterile one.
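In the simplest (spin-1/2) case the constrained lattice walks of the abstract reduce to Dyck paths, whose count is the ordinary Catalan number C_n = C(2n, n)/(n+1); the generalized Catalan numbers arise from the analogous walks with different step and floor constraints. A minimal sketch of the base case:

```python
from math import comb

def catalan_dp(n):
    """Count length-2n walks with +/-1 steps that start and end at
    height 0 and never go below 0 (Dyck paths), by dynamic programming
    over the walk's current height."""
    paths = {0: 1}                        # height -> number of walks
    for _ in range(2 * n):
        nxt = {}
        for h, c in paths.items():
            for h2 in (h - 1, h + 1):
                if h2 >= 0:               # the "never below the floor" constraint
                    nxt[h2] = nxt.get(h2, 0) + c
        paths = nxt
    return paths.get(0, 0)

# The walk count matches the closed form C(2n, n) / (n + 1):
assert all(catalan_dp(n) == comb(2 * n, n) // (n + 1) for n in range(8))
```

Changing the allowed steps and the floor reproduces the generalized counts that give the witness-eigenstate degeneracies.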

  7. From entanglement witness to generalized Catalan numbers.

    PubMed

    Cohen, E; Hansen, T; Itzhaki, N

    2016-07-27

    Being extremely important resources in quantum information and computation, it is vital to efficiently detect and properly characterize entangled states. We analyze in this work the problem of entanglement detection for arbitrary spin systems. It is demonstrated how a single measurement of the squared total spin can probabilistically discern separable from entangled many-particle states. For achieving this goal, we construct a tripartite analogy between the degeneracy of entanglement witness eigenstates, tensor products of SO(3) representations and classical lattice walks with special constraints. Within this framework, degeneracies are naturally given by generalized Catalan numbers and determine the fraction of states that are decidedly entangled and also known to be somewhat protected against decoherence. In addition, we introduce the concept of a "sterile entanglement witness", which for large enough systems detects entanglement without affecting much the system's state. We discuss when our proposed entanglement witness can be regarded as a sterile one.

  8. Supersymmetry breaking and Nambu-Goldstone fermions with cubic dispersion

    NASA Astrophysics Data System (ADS)

    Sannomiya, Noriaki; Katsura, Hosho; Nakayama, Yu

    2017-03-01

    We introduce a lattice fermion model in one spatial dimension with supersymmetry (SUSY) but without particle number conservation. The Hamiltonian is defined as the anticommutator of two nilpotent supercharges Q and Q†. Each supercharge is built solely from spinless fermion operators and depends on a parameter g. The system is strongly interacting for small g, and in the extreme limit g = 0, the number of zero-energy ground states grows exponentially with the system size. By contrast, in the large-g limit, the system is noninteracting and SUSY is broken spontaneously. We study the model for modest values of g and show that under certain conditions spontaneous SUSY breaking occurs in both finite and infinite chains. We analyze the low-energy excitations both analytically and numerically. Our analysis suggests that the Nambu-Goldstone fermions accompanying the spontaneous SUSY breaking have cubic dispersion at low energies.
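The algebraic structure behind the abstract can be illustrated on a toy supercharge (a single fermion mode, not the authors' chain model): for any nilpotent Q, H = QQ† + Q†Q has a non-negative spectrum, and SUSY is spontaneously broken exactly when the ground-state energy is strictly positive.

```python
import numpy as np

def susy_hamiltonian(Q):
    """H = Q Q^dagger + Q^dagger Q for a nilpotent supercharge Q.
    The spectrum is non-negative; SUSY is spontaneously broken iff
    the ground-state energy is strictly positive."""
    Qd = Q.conj().T
    return Q @ Qd + Qd @ Q

# Toy supercharge Q = g * c with c the 2x2 fermion annihilation operator
# (illustration of the algebra only, not the paper's model).
c = np.array([[0.0, 1.0], [0.0, 0.0]])
ground_energy = {g: np.linalg.eigvalsh(susy_hamiltonian(g * c)).min()
                 for g in (0.0, 0.5, 2.0)}
# Here H = g**2 * I, so E0 = g**2: SUSY is unbroken only at g = 0.
```

In the paper's chain the same criterion is applied to a many-body H built from spinless fermions.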

  9. From entanglement witness to generalized Catalan numbers

    PubMed Central

    Cohen, E.; Hansen, T.; Itzhaki, N.

    2016-01-01

    Being extremely important resources in quantum information and computation, it is vital to efficiently detect and properly characterize entangled states. We analyze in this work the problem of entanglement detection for arbitrary spin systems. It is demonstrated how a single measurement of the squared total spin can probabilistically discern separable from entangled many-particle states. For achieving this goal, we construct a tripartite analogy between the degeneracy of entanglement witness eigenstates, tensor products of SO(3) representations and classical lattice walks with special constraints. Within this framework, degeneracies are naturally given by generalized Catalan numbers and determine the fraction of states that are decidedly entangled and also known to be somewhat protected against decoherence. In addition, we introduce the concept of a “sterile entanglement witness”, which for large enough systems detects entanglement without affecting much the system’s state. We discuss when our proposed entanglement witness can be regarded as a sterile one. PMID:27461089

  10. Large transcription units unify copy number variants and common fragile sites arising under replication stress.

    PubMed

    Wilson, Thomas E; Arlt, Martin F; Park, So Hae; Rajendran, Sountharia; Paulsen, Michelle; Ljungman, Mats; Glover, Thomas W

    2015-02-01

    Copy number variants (CNVs) resulting from genomic deletions and duplications and common fragile sites (CFSs) seen as breaks on metaphase chromosomes are distinct forms of structural chromosome instability precipitated by replication inhibition. Although they share a common induction mechanism, it is not known how CNVs and CFSs are related or why some genomic loci are much more prone to their occurrence. Here we compare large sets of de novo CNVs and CFSs in several experimental cell systems to each other and to overlapping genomic features. We first show that CNV hotspots and CFSs occurred at the same human loci within a given cultured cell line. Bru-seq nascent RNA sequencing further demonstrated that although genomic regions with low CNV frequencies were enriched in transcribed genes, the CNV hotspots that matched CFSs specifically corresponded to the largest active transcription units in both human and mouse cells. Consistently, active transcription units >1 Mb were robust cell-type-specific predictors of induced CNV hotspots and CFS loci. Unlike most transcribed genes, these very large transcription units replicated late and organized deletion and duplication CNVs into their transcribed and flanking regions, respectively, supporting a role for transcription in replication-dependent lesion formation. These results indicate that active large transcription units drive extreme locus- and cell-type-specific genomic instability under replication stress, resulting in both CNVs and CFSs as different manifestations of perturbed replication dynamics. © 2015 Wilson et al.; Published by Cold Spring Harbor Laboratory Press.

  11. Large transcription units unify copy number variants and common fragile sites arising under replication stress

    PubMed Central

    Park, So Hae; Rajendran, Sountharia; Paulsen, Michelle; Ljungman, Mats; Glover, Thomas W.

    2015-01-01

    Copy number variants (CNVs) resulting from genomic deletions and duplications and common fragile sites (CFSs) seen as breaks on metaphase chromosomes are distinct forms of structural chromosome instability precipitated by replication inhibition. Although they share a common induction mechanism, it is not known how CNVs and CFSs are related or why some genomic loci are much more prone to their occurrence. Here we compare large sets of de novo CNVs and CFSs in several experimental cell systems to each other and to overlapping genomic features. We first show that CNV hotspots and CFSs occurred at the same human loci within a given cultured cell line. Bru-seq nascent RNA sequencing further demonstrated that although genomic regions with low CNV frequencies were enriched in transcribed genes, the CNV hotspots that matched CFSs specifically corresponded to the largest active transcription units in both human and mouse cells. Consistently, active transcription units >1 Mb were robust cell-type-specific predictors of induced CNV hotspots and CFS loci. Unlike most transcribed genes, these very large transcription units replicated late and organized deletion and duplication CNVs into their transcribed and flanking regions, respectively, supporting a role for transcription in replication-dependent lesion formation. These results indicate that active large transcription units drive extreme locus- and cell-type-specific genomic instability under replication stress, resulting in both CNVs and CFSs as different manifestations of perturbed replication dynamics. PMID:25373142

  12. Space Shuttle wind tunnel testing program

    NASA Technical Reports Server (NTRS)

    Whitnah, A. M.; Hillje, E. R.

    1984-01-01

    A major phase of the Space Shuttle Vehicle (SSV) Development Program was the acquisition of data through the space shuttle wind tunnel testing program. It became obvious that the large number of configuration/environment combinations would necessitate an extremely large wind tunnel testing program. To make the most efficient use of available test facilities and to assist the prime contractor for orbiter design and space shuttle vehicle integration, a unique management plan was devised for the design and development phase. The space shuttle program is reviewed together with the evolutionary development of the shuttle configuration. The wind tunnel testing rationale, the associated test program management plan, and its overall results are reviewed. Information is given for the various facilities and models used within this program. A unique posttest documentation procedure and a summary of the types of tests per discipline, per facility, and per model are presented, with a detailed listing of the posttest documentation.

  13. Automated adaptive inference of phenomenological dynamical models.

    PubMed

    Daniels, Bryan C; Nemenman, Ilya

    2015-08-21

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.
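The core idea, model complexity adapting to the available data, can be caricatured with held-out model selection (a hedged, much simpler stand-in for the paper's Bayesian approach; the polynomial family and noise level are invented for illustration):

```python
import numpy as np

def select_model(x_tr, y_tr, x_va, y_va, max_degree=10):
    """Fit polynomial models of increasing complexity and keep the one
    with the lowest held-out (validation) error, so that the selected
    complexity adapts to the data."""
    best = None
    for deg in range(max_degree + 1):
        coeffs = np.polyfit(x_tr, y_tr, deg)
        err = float(np.mean((np.polyval(coeffs, x_va) - y_va) ** 2))
        if best is None or err < best[0]:
            best = (err, deg, coeffs)
    return best

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(-1.0, 1.0, 80))
y = 1.0 - 2.0 * x**2 + 0.05 * rng.standard_normal(80)   # truth: degree 2
err, deg, _ = select_model(x[::2], y[::2], x[1::2], y[1::2])
```

With more data (or less noise) the selected complexity can grow; with less, it shrinks, avoiding both overfitting and oversimplified ad hoc models.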

  14. A fast low-power optical memory based on coupled micro-ring lasers

    NASA Astrophysics Data System (ADS)

    Hill, Martin T.; Dorren, Harmen J. S.; de Vries, Tjibbe; Leijtens, Xaveer J. M.; den Besten, Jan Hendrik; Smalbrugge, Barry; Oei, Yok-Siang; Binsma, Hans; Khoe, Giok-Djan; Smit, Meint K.

    2004-11-01

    The increasing speed of fibre-optic-based telecommunications has focused attention on high-speed optical processing of digital information. Complex optical processing requires a high-density, high-speed, low-power optical memory that can be integrated with planar semiconductor technology for buffering of decisions and telecommunication data. Recently, ring lasers with extremely small size and low operating power have been made, and we demonstrate here a memory element constructed by interconnecting these microscopic lasers. Our device occupies an area of 18 × 40 µm² on an InP/InGaAsP photonic integrated circuit, and switches within 20 ps with 5.5 fJ optical switching energy. Simulations show that the element has the potential for much smaller dimensions and switching times. Large numbers of such memory elements can be densely integrated and interconnected on a photonic integrated circuit: fast digital optical information processing systems employing large-scale integration should now be viable.

  15. Flooding disturbances increase resource availability and productivity but reduce stability in diverse plant communities.

    PubMed

    Wright, Alexandra J; Ebeling, Anne; de Kroon, Hans; Roscher, Christiane; Weigelt, Alexandra; Buchmann, Nina; Buchmann, Tina; Fischer, Christine; Hacker, Nina; Hildebrandt, Anke; Leimer, Sophia; Mommer, Liesje; Oelmann, Yvonne; Scheu, Stefan; Steinauer, Katja; Strecker, Tanja; Weisser, Wolfgang; Wilcke, Wolfgang; Eisenhauer, Nico

    2015-01-20

    The natural world is increasingly defined by change. Within the next 100 years, rising atmospheric CO₂ concentrations will continue to increase the frequency and magnitude of extreme weather events. Simultaneously, human activities are reducing global biodiversity, with current extinction rates at ~1,000 × what they were before human domination of Earth's ecosystems. The co-occurrence of these trends may be of particular concern, as greater biological diversity could help ecosystems resist change during large perturbations. We use data from a 200-year flood event to show that when a disturbance is associated with an increase in resource availability, the opposite may occur. Flooding was associated with increases in productivity and decreases in stability, particularly in the highest diversity communities. Our results undermine the utility of the biodiversity-stability hypothesis during a large number of disturbances where resource availability increases. We propose a conceptual framework that can be widely applied during natural disturbances.

  16. Community engagement: leadership tool for catastrophic health events.

    PubMed

    Schoch-Spana, Monica; Franco, Crystal; Nuzzo, Jennifer B; Usenza, Christiana

    2007-03-01

    Disasters and epidemics are immense and shocking disturbances that require the judgments and efforts of large numbers of people, not simply those who serve in an official capacity. This article reviews the Working Group on Community Engagement in Health Emergency Planning's recommendations to government decision makers on why and how to catalyze the civic infrastructure for an extreme health event. Community engagement--defined here as structured dialogue, joint problem solving, and collaborative action among formal authorities, citizens at-large, and local opinion leaders around a pressing public matter--can augment officials' abilities to govern in a crisis, improve application of communally held resources in a disaster or epidemic, and mitigate community wide losses. The case of limited medical options in an influenza pandemic serves to demonstrate the civic infrastructure's preparedness, response, and recovery capabilities and to illustrate how community engagement can improve pandemic contingency planning.

  17. Selective significance of genome size in a plant community with heavy metal pollution.

    PubMed

    Vidic, T; Greilhuber, J; Vilhar, B; Dermastia, M

    2009-09-01

    In eukaryotes, nuclear genome sizes vary by more than five orders of magnitude. This variation is not related to organismal complexity, and its origin and biological significance are still disputed. One of the open questions is whether genome size has an adaptive role. We tested the hypothesis that genome size has selective significance, using five grassland communities occurring on a gradient of metal pollution of the soil as a model. We detected a negative correlation between the concentration of contaminating metals in the soil and the number of vascular plant species. Analysis of genome sizes of 70 herbaceous dicot perennial species occurring on the investigated plots revealed a negative correlation between the concentration of contaminating metals in the soil and the proportion of species with large genomes in plant communities. Consistent with the hypothesis, these results show that species with large genomes are at selective disadvantage in extreme environmental conditions.

  18. Automated adaptive inference of phenomenological dynamical models

    PubMed Central

    Daniels, Bryan C.; Nemenman, Ilya

    2015-01-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved. PMID:26293508

  19. Feasibility of utilizing Cherenkov Telescope Array gamma-ray telescopes as free-space optical communication ground stations.

    PubMed

    Carrasco-Casado, Alberto; Vilera, Mariafernanda; Vergaz, Ricardo; Cabrero, Juan Francisco

    2013-04-10

    The signals that will be received on Earth from deep-space probes in future implementations of free-space optical communication will be extremely weak, and new ground stations will have to be developed to support these links. This paper addresses the feasibility of using the technology developed for the gamma-ray telescopes that will make up the Cherenkov Telescope Array (CTA) observatory in the implementation of a new kind of ground station. Chief among the advantages these telescopes offer are the much larger apertures needed to overcome the power limitation shared by ground-based gamma-ray astronomy and optical communication. Moreover, the large number of big telescopes to be built for CTA will make it possible to reduce costs through economies of scale, enabling optical communications with the large telescopes that future deep-space links will require.

  20. The Coast Artillery Journal. Volume 57, Number 6, December 1922

    DTIC Science & Technology

    1922-12-01

    theorems; Chapter III, to application; Chapters IV, V and VI, to infinitesimals and differentials, trigonometric functions, and logarithms and...taneously." There are chapters on complex numbers with simple and direct discussion of the roots of unity; on elementary theorems on the roots of an...through the centuries from the time of Pythagoras, an interest shared on the one extreme by nearly every noted mathematician and on the other extreme by

  1. A rational decision rule with extreme events.

    PubMed

    Basili, Marcello

    2006-12-01

    Risks induced by extreme events are characterized by small or ambiguous probabilities and catastrophic losses or windfall gains. Through a new functional that mimics the restricted Bayes-Hurwicz criterion within the Choquet expected utility approach, it is possible to represent the decision-maker's behavior when facing both risky (large and reliable probability) and extreme (small or ambiguous probability) events. A new formalization of the precautionary principle (PP) is given, and a new functional is proposed that encompasses, for every act, both the extreme outcomes and the expectation over all possible results.
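
    A minimal numeric sketch of a Hurwicz-style blend of expected utility with optimism-pessimism weighting of the extremes conveys the flavor of such a functional. The weights `lam` and `alpha` and the function name are hypothetical; this is a didactic stand-in, not Basili's exact Choquet formulation.

```python
def hurwicz_blend(outcomes, probs, lam=0.8, alpha=0.3):
    """Blend expected utility with a Hurwicz-style weighting of the
    best and worst outcomes. lam weights the 'reliable' expectation;
    alpha is the optimism coefficient applied to the extremes."""
    eu = sum(p * u for p, u in zip(probs, outcomes))
    extreme = alpha * max(outcomes) + (1 - alpha) * min(outcomes)
    return lam * eu + (1 - lam) * extreme

# Two acts with the same expectation (10) but different tail exposure:
safe  = hurwicz_blend([9, 10, 11],   [0.25, 0.5, 0.25])
risky = hurwicz_blend([-50, 10, 70], [0.05, 0.9, 0.05])
```

    A pure expected-utility maximizer is indifferent between the two acts; the blended functional penalizes the act with the catastrophic tail, which is the behavior the precautionary principle formalizes.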

  2. Effects of climate extremes on the terrestrial carbon cycle: concepts, processes and potential future impacts

    PubMed Central

    Frank, Dorothea; Reichstein, Markus; Bahn, Michael; Thonicke, Kirsten; Frank, David; Mahecha, Miguel D; Smith, Pete; van der Velde, Marijn; Vicca, Sara; Babst, Flurin; Beer, Christian; Buchmann, Nina; Canadell, Josep G; Ciais, Philippe; Cramer, Wolfgang; Ibrom, Andreas; Miglietta, Franco; Poulter, Ben; Rammig, Anja; Seneviratne, Sonia I; Walz, Ariane; Wattenbach, Martin; Zavala, Miguel A; Zscheischler, Jakob

    2015-01-01

    Extreme droughts, heat waves, frosts, precipitation, wind storms and other climate extremes may impact the structure, composition and functioning of terrestrial ecosystems, and thus carbon cycling and its feedbacks to the climate system. Yet, the interconnected avenues through which climate extremes drive ecological and physiological processes and alter the carbon balance are poorly understood. Here, we review the literature on carbon cycle relevant responses of ecosystems to extreme climatic events. Given that impacts of climate extremes are considered disturbances, we assume the respective general disturbance-induced mechanisms and processes to also operate in an extreme context. The paucity of well-defined studies currently renders a quantitative meta-analysis impossible, but permits us to develop a deductive framework for identifying the main mechanisms (and coupling thereof) through which climate extremes may act on the carbon cycle. We find that ecosystem responses can exceed the duration of the climate impacts via lagged effects on the carbon cycle. The expected regional impacts of future climate extremes will depend on changes in the probability and severity of their occurrence, on the compound effects and timing of different climate extremes, and on the vulnerability of each land-cover type modulated by management. Although processes and sensitivities differ among biomes, based on expert opinion, we expect forests to exhibit the largest net effect of extremes due to their large carbon pools and fluxes, potentially large indirect and lagged impacts, and long recovery time to regain previous stocks. At the global scale, we presume that droughts have the strongest and most widespread effects on terrestrial carbon cycling. Comparing impacts of climate extremes identified via remote sensing vs. ground-based observational case studies reveals that many regions in the (sub-)tropics are understudied. 
Hence, regional investigations are needed to allow a global upscaling of the impacts of climate extremes on global carbon–climate feedbacks. PMID:25752680

  3. Effects of climate extremes on the terrestrial carbon cycle: concepts, processes and potential future impacts.

    PubMed

    Frank, Dorothea; Reichstein, Markus; Bahn, Michael; Thonicke, Kirsten; Frank, David; Mahecha, Miguel D; Smith, Pete; van der Velde, Marijn; Vicca, Sara; Babst, Flurin; Beer, Christian; Buchmann, Nina; Canadell, Josep G; Ciais, Philippe; Cramer, Wolfgang; Ibrom, Andreas; Miglietta, Franco; Poulter, Ben; Rammig, Anja; Seneviratne, Sonia I; Walz, Ariane; Wattenbach, Martin; Zavala, Miguel A; Zscheischler, Jakob

    2015-08-01

    Extreme droughts, heat waves, frosts, precipitation, wind storms and other climate extremes may impact the structure, composition and functioning of terrestrial ecosystems, and thus carbon cycling and its feedbacks to the climate system. Yet, the interconnected avenues through which climate extremes drive ecological and physiological processes and alter the carbon balance are poorly understood. Here, we review the literature on carbon cycle relevant responses of ecosystems to extreme climatic events. Given that impacts of climate extremes are considered disturbances, we assume the respective general disturbance-induced mechanisms and processes to also operate in an extreme context. The paucity of well-defined studies currently renders a quantitative meta-analysis impossible, but permits us to develop a deductive framework for identifying the main mechanisms (and coupling thereof) through which climate extremes may act on the carbon cycle. We find that ecosystem responses can exceed the duration of the climate impacts via lagged effects on the carbon cycle. The expected regional impacts of future climate extremes will depend on changes in the probability and severity of their occurrence, on the compound effects and timing of different climate extremes, and on the vulnerability of each land-cover type modulated by management. Although processes and sensitivities differ among biomes, based on expert opinion, we expect forests to exhibit the largest net effect of extremes due to their large carbon pools and fluxes, potentially large indirect and lagged impacts, and long recovery time to regain previous stocks. At the global scale, we presume that droughts have the strongest and most widespread effects on terrestrial carbon cycling. Comparing impacts of climate extremes identified via remote sensing vs. ground-based observational case studies reveals that many regions in the (sub-)tropics are understudied. 
Hence, regional investigations are needed to allow a global upscaling of the impacts of climate extremes on global carbon-climate feedbacks. © 2015 The Authors. Global Change Biology published by John Wiley & Sons Ltd.

  4. Extreme seasonal droughts and floods in Amazonia: causes, trends and impacts

    NASA Astrophysics Data System (ADS)

    Marengo, J. A.

    2015-12-01

    J. A. Marengo* and J. C. Espinoza** (*Centro Nacional de Monitoramento e Alerta de Desastres Naturais, Ministério da Ciência, Tecnologia e Inovação, Sao Paulo, Brazil; **Subdirección de Ciencias de la Atmósfera e Hidrósfera (SCAH), Instituto Geofísico del Perú, Lima, Peru). This paper reviews recent progress in the study and understanding of extreme seasonal events in the Amazon region, focusing on droughts and floods. The review covers the history of droughts and floods in the past and present, and discusses future extremes in the context of climate change and its impacts on the Amazon region. Several extreme hydrological events, some characterized as 'once in a century', have been reported in the Amazon region during the last decade. While abundant rainfall in various sectors of the basin drove extreme floods along the river's main stem in 1953, 1989, 1999, 2009 and 2012-2015, deficient rainfall in 1912, 1926, 1963, 1980, 1983, 1995, 1997, 1998, 2005 and 2010 caused anomalously low river levels and an increase in the risk and number of fires in the region, with consequences for humans. This is consistent with changes in the variability of the basin's hydrometeorology and suggests that extreme hydrological events have become more frequent in the last two decades. Some of these rainfall anomalies and the subsequent floods and droughts were associated (though not exclusively) with La Niña/El Niño events. In addition, moisture transport anomalies from the tropical Atlantic into Amazonia, and from northern to southern Amazonia, alter the water cycle in the region from year to year. We also assess the impacts of such extremes on natural and human systems in the region, considering ecological, economic and societal impacts in urban and rural areas, particularly during recent decades. In the context of future climate change, studies show a large range of uncertainty but suggest that drought might intensify through the 21st century.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Gang

    Mid-latitude extreme weather events are responsible for a large part of climate-related damage. Yet large uncertainties remain in climate model projections of heat waves, droughts, and heavy rain/snow events on regional scales, limiting our ability to effectively use these projections for climate adaptation and mitigation. These uncertainties can be attributed both to the lack of spatial resolution in the models and to the lack of a dynamical understanding of these extremes. The approach of this project is to relate the fine-scale features to the large scales in current climate simulations, seasonal re-forecasts, and climate change projections in a very wide range of models, including the atmospheric and coupled models of ECMWF over a range of horizontal resolutions (125 to 10 km), aqua-planet configurations of the Model for Prediction Across Scales and High Order Method Modeling Environments (resolutions ranging from 240 km to 7.5 km) with various physics suites, and selected CMIP5 model simulations. The large-scale circulation will be quantified both on the basis of the well-tested preferred circulation regime approach and with very recently developed measures, the finite-amplitude Wave Activity (FAWA) and its spectrum. The fine-scale structures related to extremes will be diagnosed following the latest approaches in the literature. The goal is to use the large-scale measures as indicators of the probability of occurrence of the finer-scale structures, and hence extreme events. These indicators will then be applied to the CMIP5 models and time-slice projections of a future climate.

  6. Large uncertainties in observed daily precipitation extremes over land

    NASA Astrophysics Data System (ADS)

    Herold, Nicholas; Behrangi, Ali; Alexander, Lisa V.

    2017-01-01

    We explore uncertainties in observed daily precipitation extremes over the terrestrial tropics and subtropics (50°S-50°N) based on five commonly used products: the Climate Hazards Group InfraRed Precipitation with Stations (CHIRPS) dataset, the Global Precipitation Climatology Centre-Full Data Daily (GPCC-FDD) dataset, the Tropical Rainfall Measuring Mission (TRMM) multi-satellite research product (T3B42 v7), the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Climate Data Record (PERSIANN-CDR), and the Global Precipitation Climatology Project's One-Degree Daily (GPCP-1DD) dataset. We use the precipitation indices R10mm and Rx1day, developed by the Expert Team on Climate Change Detection and Indices, to explore the behavior of "moderate" and "extreme" extremes, respectively. In order to assess the sensitivity of extreme precipitation to different grid sizes we perform our calculations on four common spatial resolutions (0.25° × 0.25°, 1° × 1°, 2.5° × 2.5°, and 3.75° × 2.5°). The impact of the chosen "order of operation" in calculating these indices is also determined. Our results show that moderate extremes are relatively insensitive to product and resolution choice, while extreme extremes can be very sensitive. For example, at 0.25° × 0.25° quasi-global mean Rx1day values vary from 37 mm in PERSIANN-CDR to 62 mm in T3B42. We find that the interproduct spread becomes prominent at resolutions of 1° × 1° and finer, thus establishing a minimum effective resolution at which observational products agree. Without improvements in interproduct spread, these exceedingly large observational uncertainties at high spatial resolution may limit the usefulness of model evaluations. As has been found previously, resolution sensitivity can be largely eliminated by applying an order of operation where indices are calculated prior to regridding. 
However, this approach is not appropriate when true area averages are desired (e.g., for model evaluations).
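
    The two ETCCDI indices used above have simple definitions that can be sketched directly from a daily precipitation series (the synthetic 'year' below is illustrative):

```python
import numpy as np

def r10mm(daily_mm):
    """R10mm: annual count of 'heavy precipitation' days, i.e. days
    with precipitation >= 10 mm (the 'moderate' extreme in the text)."""
    return int(np.sum(np.asarray(daily_mm, dtype=float) >= 10.0))

def rx1day(daily_mm):
    """Rx1day: annual maximum 1-day precipitation in mm
    (the 'extreme' extreme in the text)."""
    return float(np.max(np.asarray(daily_mm, dtype=float)))

# One synthetic 'year' of daily precipitation (mm): mostly dry,
# with a short wet spell at the end.
year = [0.0] * 360 + [12.5, 3.0, 48.0, 10.0, 7.2]
```

    The order-of-operation point in the abstract corresponds to computing these indices on each product's native grid first and regridding the resulting index fields afterwards, rather than regridding the daily precipitation before computing the indices.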

  7. Dynamical properties and extremes of Northern Hemisphere climate fields over the past 60 years

    NASA Astrophysics Data System (ADS)

    Faranda, Davide; Messori, Gabriele; Alvarez-Castro, M. Carmen; Yiou, Pascal

    2017-12-01

    Atmospheric dynamics are described by a set of partial differential equations yielding an infinite-dimensional phase space. However, the actual trajectories followed by the system appear to be constrained to a finite-dimensional phase space, i.e. a strange attractor. The dynamical properties of this attractor are difficult to determine due to the complex nature of atmospheric motions. A first step to simplify the problem is to focus on observables which affect - or are linked to phenomena which affect - human welfare and activities, such as sea-level pressure, 2 m temperature, and precipitation frequency. We make use of recent advances in dynamical systems theory to estimate two instantaneous dynamical properties of the above fields for the Northern Hemisphere: local dimension and persistence. We then use these metrics to characterize the seasonality of the different fields and their interplay. We further analyse the large-scale anomaly patterns corresponding to phase-space extremes - namely time steps at which the fields display extremes in their instantaneous dynamical properties. The analysis is based on the NCEP/NCAR reanalysis data, over the period 1948-2013. The results show that (i) despite the high dimensionality of atmospheric dynamics, the Northern Hemisphere sea-level pressure and temperature fields can on average be described by roughly 20 degrees of freedom; (ii) the precipitation field has a higher dimensionality; and (iii) the seasonal forcing modulates the variability of the dynamical indicators and affects the occurrence of phase-space extremes. We further identify a number of robust correlations between the dynamical properties of the different variables.
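
    The local (instantaneous) dimension used in such analyses can be estimated from recurrences of the trajectory near a reference state: exceedances of the negative log-distance above a high threshold are approximately exponentially distributed, with rate equal to the dimension. A minimal sketch of this extreme-value recipe on synthetic data follows; the threshold quantile, sample size and toy attractor are illustrative choices, not the paper's configuration.

```python
import numpy as np

def local_dimension(traj, i, quantile=0.98):
    """Estimate the local dimension at state traj[i]: exceedances of
    g = -log(distance to the reference state) over a high threshold
    are approximately exponential with rate equal to the dimension,
    so the exponential MLE 1/mean(excess) estimates it."""
    traj = np.asarray(traj, dtype=float)
    dist = np.linalg.norm(traj - traj[i], axis=1)
    g = -np.log(dist[dist > 0])          # drop the zero self-distance
    u = np.quantile(g, quantile)         # high threshold on closeness
    excess = g[g > u] - u
    return 1.0 / np.mean(excess)

# Sanity check on a 'system' whose attractor is the 2D unit square:
# the local dimension at an interior point should be close to 2.
rng = np.random.default_rng(1)
pts = rng.random((20000, 2))
center_idx = int(np.argmin(np.linalg.norm(pts - 0.5, axis=1)))
d = local_dimension(pts, center_idx)
```

    Applied to reanalysis fields, each time step plays the role of the reference state, yielding the instantaneous dimension series whose extremes are analysed in the study.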

  8. Global Distribution of Extreme Precipitation and High-Impact Landslides in 2010 Relative to Previous Years

    NASA Technical Reports Server (NTRS)

    Kirschbaum, Dalia; Adler, Robert; Adler, David; Peters-Lidard, Christa; Huffman, George

    2012-01-01

    It is well known that extreme or prolonged rainfall is the dominant trigger of landslides worldwide. While research has evaluated the spatiotemporal distribution of extreme rainfall and landslides at local or regional scales using in situ data, few studies have mapped rainfall-triggered landslide distribution globally due to the dearth of landslide data and consistent precipitation information. This study uses a newly developed Global Landslide Catalog (GLC) and a 13-year satellite-based precipitation record from TRMM data. For the first time, these two unique products provide the foundation to quantitatively evaluate the co-occurrence of precipitation and landslides globally. Evaluation of the GLC indicates that 2010 had a large number of high-impact landslide events relative to previous years. This study considers how variations in extreme and prolonged satellite-based rainfall are related to the distribution of landslides over the same time scales for three active landslide areas: Central America, the Himalayan Arc, and central-eastern China. Several test statistics confirm that TRMM rainfall generally scales with the observed increase in landslide reports and fatal events for 2010 and previous years over each region. These findings suggest that the co-occurrence of satellite precipitation and landslide reports may serve as a valuable indicator for characterizing the spatiotemporal distribution of landslide-prone areas in order to establish a global rainfall-triggered landslide climatology. This study characterizes the variability of satellite precipitation data and reported landslide activity at the global scale in order to improve landslide cataloging and forecasting and to quantify potential triggering sources at daily, monthly and yearly time scales.

  9. Increasing precipitation volatility in twenty-first-century California

    NASA Astrophysics Data System (ADS)

    Swain, Daniel L.; Langenbrunner, Baird; Neelin, J. David; Hall, Alex

    2018-05-01

    Mediterranean climate regimes are particularly susceptible to rapid shifts between drought and flood—of which, California's rapid transition from record multi-year dryness between 2012 and 2016 to extreme wetness during the 2016-2017 winter provides a dramatic example. Projected future changes in such dry-to-wet events, however, remain inadequately quantified, which we investigate here using the Community Earth System Model Large Ensemble of climate model simulations. Anthropogenic forcing is found to yield large twenty-first-century increases in the frequency of wet extremes, including a more than threefold increase in sub-seasonal events comparable to California's `Great Flood of 1862'. Smaller but statistically robust increases in dry extremes are also apparent. As a consequence, a 25% to 100% increase in extreme dry-to-wet precipitation events is projected, despite only modest changes in mean precipitation. Such hydrological cycle intensification would seriously challenge California's existing water storage, conveyance and flood control infrastructure.
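
    Counting 'dry-to-wet' transition events of the kind projected above can be sketched with quantile thresholds on an annual precipitation series. The 20th/80th-percentile thresholds and the toy series are illustrative assumptions, not the study's definition.

```python
import numpy as np

def dry_to_wet_events(annual_precip, dry_q=0.2, wet_q=0.8):
    """Count year-to-year swings from a dry extreme (at or below the
    dry_q quantile) directly to a wet extreme (at or above the wet_q
    quantile) in consecutive years."""
    p = np.asarray(annual_precip, dtype=float)
    lo, hi = np.quantile(p, [dry_q, wet_q])
    return int(np.sum((p[:-1] <= lo) & (p[1:] >= hi)))

# A toy record with two whiplash transitions (30 -> 200 and 25 -> 210):
events = dry_to_wet_events([100, 30, 200, 90, 95, 25, 210, 100, 100, 100])
```

    In the study's setting, the same counting would be applied to each ensemble member of the climate simulations, with the projected increase read off as the change in event frequency between historical and future periods.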

  10. Instability of Poiseuille flow at extreme Mach numbers: linear analysis and simulations.

    PubMed

    Xie, Zhimin; Girimaji, Sharath S

    2014-04-01

    We develop the perturbation equations to describe instability evolution in Poiseuille flow at the limit of very high Mach numbers. At this limit the equation governing the flow is the pressure-released Navier-Stokes equation. The ensuing semianalytical solution is compared against simulations performed using the gas-kinetic method (GKM), resulting in excellent agreement. A similar comparison between analytical and computational results of small perturbation growth is performed at the incompressible (zero Mach number) limit, again leading to excellent agreement. The study accomplishes two important goals: it (i) contrasts the small perturbation evolution in Poiseuille flows at extreme Mach numbers and (ii) provides important verification of the GKM simulation scheme.

  11. DETERMINING THE LARGE-SCALE ENVIRONMENTAL DEPENDENCE OF GAS-PHASE METALLICITY IN DWARF GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Douglass, Kelly A.; Vogeley, Michael S., E-mail: kelly.a.douglass@drexel.edu

    2017-01-10

    We study how the cosmic environment affects galaxy evolution in the universe by comparing the metallicities of dwarf galaxies in voids with dwarf galaxies in more dense regions. Ratios of the fluxes of emission lines, particularly those of the forbidden [O iii] and [S ii] transitions, provide estimates of a region’s electron temperature and number density. From these two quantities and the emission line fluxes [O ii] λ 3727, [O iii] λ 4363, and [O iii] λλ 4959, 5007, we estimate the abundance of oxygen with the direct Te method. We estimate the metallicity of 42 blue, star-forming void dwarf galaxies and 89 blue, star-forming dwarf galaxies in more dense regions using spectroscopic observations from the Sloan Digital Sky Survey Data Release 7, as reprocessed in the MPA-JHU value-added catalog. We find very little difference between the two sets of galaxies, indicating little influence from the large-scale environment on their chemical evolution. Of particular interest are a number of extremely metal-poor dwarf galaxies that are less prevalent in voids than in the denser regions.

  12. Single stock dynamics on high-frequency data: from a compressed coding perspective.

    PubMed

    Fushing, Hsieh; Chen, Shu-Chun; Hwang, Chii-Ruey

    2014-01-01

    High-frequency return, trading volume and transaction number are digitally coded via a nonparametric computing algorithm, called hierarchical factor segmentation (HFS), and then are coupled together to reveal a single stock's dynamics without global state-space structural assumptions. The base-8 digital coding sequence, which is capable of revealing contrasting aggregation against sparsity of extreme events, is further compressed into a shortened sequence of state transitions. This compressed digital code sequence vividly demonstrates that the aggregation of large absolute returns is the primary driving force stimulating both the aggregation of large trading volumes and that of transaction numbers. The state of system-wise synchrony recurs very frequently in the stock dynamics. This data-driven dynamic mechanism is seen to vary correspondingly as the global market transits in and out of contraction-expansion cycles. These results not only describe the stock dynamics of interest more fully, but also contradict some classical theories in finance. Overall this version of stock dynamics is potentially more coherent and realistic, especially now that the financial market is increasingly powered by high-frequency trading via computer algorithms rather than by individual investors.
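
    The base-8 coding idea — three binary indicators packed into one octal digit per time step — can be sketched as follows. The single quantile threshold and the simple per-series flagging are illustrative stand-ins for the paper's hierarchical factor segmentation, not its actual algorithm.

```python
import numpy as np

def base8_code(returns, volume, n_trades, q=0.8):
    """Code each time step into one of 8 states by flagging whether
    |return|, trading volume and transaction number each exceed their
    own high quantile, then packing the three flags as a base-8 digit
    (|return| is the most significant bit)."""
    series = [np.abs(np.asarray(returns, dtype=float)),
              np.asarray(volume, dtype=float),
              np.asarray(n_trades, dtype=float)]
    bits = [(s > np.quantile(s, q)).astype(int) for s in series]
    return 4 * bits[0] + 2 * bits[1] + bits[2]

# A step where all three series spike lands in state 7; quiet steps in state 0.
code = base8_code(returns=[0.01, -0.05, 0.0, 0.02, -0.01],
                  volume=[100, 500, 90, 110, 105],
                  n_trades=[10, 50, 9, 11, 10])
```

    Runs of state 7 in such a sequence are exactly the co-aggregation of extreme returns, volumes and transaction numbers that the abstract describes; compressing the sequence to its state transitions yields the shortened code studied in the paper.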

  13. Toxicity assessment of industrial chemicals and airborne contaminants: transition from in vivo to in vitro test methods: a review.

    PubMed

    Bakand, S; Winder, C; Khalil, C; Hayes, A

    2005-12-01

    Exposure to occupational and environmental contaminants is a major contributor to human health problems. Inhalation of gases, vapors, aerosols, and mixtures of these can cause a wide range of adverse health effects, ranging from simple irritation to systemic diseases. Despite significant achievements in the risk assessment of chemicals, the toxicological database, particularly for industrial chemicals, remains limited. Considering that there are approximately 80,000 chemicals in commerce, plus an extremely large number of chemical mixtures, in vivo testing on this scale is unachievable from both economic and practical perspectives. While in vitro methods are capable of rapidly providing toxicity information, regulatory agencies in general are still cautious about replacing whole-animal methods with new in vitro techniques. Although studying the toxic effects of inhaled chemicals is a complex subject, recent studies demonstrate that in vitro methods may have significant potential for assessing the toxicity of airborne contaminants. In this review, current toxicity test methods for risk evaluation of industrial chemicals and airborne contaminants are presented. To evaluate the potential applications of in vitro methods for studying respiratory toxicity, more recent models developed for toxicity testing of airborne contaminants are discussed.

  14. Single Stock Dynamics on High-Frequency Data: From a Compressed Coding Perspective

    PubMed Central

    Fushing, Hsieh; Chen, Shu-Chun; Hwang, Chii-Ruey

    2014-01-01

    High-frequency return, trading volume and transaction number are digitally coded via a nonparametric computing algorithm, called hierarchical factor segmentation (HFS), and then are coupled together to reveal a single stock dynamics without global state-space structural assumptions. The base-8 digital coding sequence, which is capable of revealing contrasting aggregation against sparsity of extreme events, is further compressed into a shortened sequence of state transitions. This compressed digital code sequence vividly demonstrates that the aggregation of large absolute returns is the primary driving force for stimulating both the aggregations of large trading volumes and transaction numbers. The state of system-wise synchrony is manifested with very frequent recurrence in the stock dynamics. And this data-driven dynamic mechanism is seen to correspondingly vary as the global market transiting in and out of contraction-expansion cycles. These results not only elaborate the stock dynamics of interest to a fuller extent, but also contradict some classical theories in finance. Overall this version of stock dynamics is potentially more coherent and realistic, especially when the current financial market is increasingly powered by high-frequency trading via computer algorithms, rather than by individual investors. PMID:24586235

  15. Spatiotemporal analysis of the precipitation extremes affecting rice yield in Jiangsu province, southeast China

    NASA Astrophysics Data System (ADS)

    Huang, Jin; Islam, A. R. M. Towfiqul; Zhang, Fangmin; Hu, Zhenghua

    2017-10-01

    With the increasing risk of meteorological disasters, it is of great importance to analyze the spatiotemporal changes of precipitation extremes and their possible impact on rice productivity, especially in Jiangsu province, southeast China. In this study, we explored the relationships between rice yield and extreme precipitation indices using the Mann-Kendall trend test, Pettitt's test, and K-means clustering. The study used 10 extreme precipitation indices for the rice growing season (May to October), based on daily precipitation records and rice yield data at 52 meteorological stations in Jiangsu province during 1961-2012. The main findings were as follows: (1) correlation results indicated that precipitation extremes in July, August, and October had noticeable adverse effects on rice yield; (2) the maximum 7-day precipitation of July and the number of rainy days in August and October should be considered three key indicators of precipitation-induced rice meteorological disasters; and (3) most stations showed increasing trends in the maximum 7-day precipitation of July and the number of rainy days in August, while the number of rainy days in October showed a decreasing trend at all stations. Moreover, Jiangsu province could be divided into two major sub-regions, north and south, with different temporal variations in the three key indicators.
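
    The Mann-Kendall trend test used in the study has a compact closed form; a minimal sketch (without the tie correction) is:

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test: S counts concordant minus discordant
    pairs; Z is the continuity-corrected normal approximation
    (no tie correction, for brevity)."""
    n = len(x)
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing series gives the maximal S and a significant Z:
s, z = mann_kendall(list(range(10)))
```

    An increasing trend in, say, the July maximum 7-day precipitation at a station would show up as |Z| > 1.96 at the 5% level, which is how the trend maps in the study are typically constructed.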

  16. Variability of rRNA Operon Copy Number and Growth Rate Dynamics of Bacillus Isolated from an Extremely Oligotrophic Aquatic Ecosystem

    PubMed Central

    Valdivia-Anistro, Jorge A.; Eguiarte-Fruns, Luis E.; Delgado-Sapién, Gabriela; Márquez-Zacarías, Pedro; Gasca-Pineda, Jaime; Learned, Jennifer; Elser, James J.; Olmedo-Alvarez, Gabriela; Souza, Valeria

    2016-01-01

    The ribosomal RNA (rrn) operon is a key suite of genes related to the production of protein synthesis machinery and thus to bacterial growth physiology. Experimental evidence has suggested an intrinsic relationship between the number of copies of this operon and environmental resource availability, especially the availability of phosphorus (P), because bacteria that live in oligotrophic ecosystems usually have few rrn operons and a slow growth rate. The Cuatro Ciénegas Basin (CCB) is a complex aquatic ecosystem that contains an unusually high microbial diversity that is able to persist under highly oligotrophic conditions. These environmental conditions impose a variety of strong selective pressures that shape the genome dynamics of their inhabitants. The genus Bacillus is one of the most abundant cultivable bacterial groups in the CCB and usually possesses a relatively large number of rrn operon copies (6–15 copies). The main goal of this study was to analyze the variation in the number of rrn operon copies of Bacillus in the CCB and to assess their growth-related properties as well as their stoichiometric balance (N and P content). We defined 18 phylogenetic groups within the Bacilli clade and documented a range of from six to 14 copies of the rrn operon. The growth dynamics of these Bacilli were heterogeneous and showed no direct relation to the number of operon copies. Physiologically, our results were not consistent with the Growth Rate Hypothesis, since the number of rrn operon copies was decoupled from growth rate. However, we speculate that the diversity of growth properties of these Bacilli, together with the low P content of their cells across a wide range of rrn copy numbers, is an adaptive response to the oligotrophy of the CCB and could represent an ecological mechanism that allows these taxa to coexist. 
These findings increase the knowledge of the variability in the number of copies of the rrn operon in the genus Bacillus and give insights about the physiology of this bacterial group under extreme oligotrophic conditions. PMID:26779143

  17. The role of Natural Flood Management in managing floods in large scale basins during extreme events

    NASA Astrophysics Data System (ADS)

    Quinn, Paul; Owen, Gareth; ODonnell, Greg; Nicholson, Alex; Hetherington, David

    2016-04-01

    There is a strong evidence base showing the negative impacts of land use intensification and soil degradation in NW European river basins on hydrological response and on flood impact downstream. However, our ability to target zones of high runoff production, and the extent to which we can manage flood risk using nature-based flood management solutions, are less well known. A move to planting more trees and farming landscapes less intensively is part of natural flood management (NFM), and these methods suggest that flood risk can be managed in alternative and more holistic ways. So which local NFM methods should be used, where in a large-scale basin should they be deployed, and how does flow propagate to any point downstream? More generally, how much intervention is needed, and will it compromise food production systems? If we are observing record levels of rainfall and flow, as during Storm Desmond in December 2015 in the North West of England, what other flood management options are needed to complement our traditional defences in large basins in the future? In this paper we show examples of NFM interventions in the UK that have had an impact at local-scale sites. We demonstrate the impact of interventions at the local, sub-catchment (meso-) and, finally, large scales, using observations, process-based models and more generalised Flood Impact Models. Issues of synchronisation and the design level of protection will be debated. By reworking observed rainfall and discharge (runoff) for extreme events in the River Eden and River Tyne during Storm Desmond, we show how much flood protection is needed in large-scale basins. The research thus poses a number of key questions as to how floods may have to be managed in large-scale basins in the future. We advocate a method of catchment systems engineering that holds water back across the whole landscape as a major opportunity to manage water in large-scale basins in the future. The broader benefits of engineering landscapes to hold water, for pollution control, sediment loss and drought minimisation, will also be shown.

  18. Promoting transportation flexibility in extreme events through multi-modal connectivity.

    DOT National Transportation Integrated Search

    2014-06-01

    Extreme events of all kinds are increasing in number, severity, or impact. Transportation provides a vital support service for people in such circumstances in the short-term for evacuation and providing supplies where evacuation is not undertake...

  19. Influence of climate variability versus change at multi-decadal time scales on hydrological extremes

    NASA Astrophysics Data System (ADS)

    Willems, Patrick

    2014-05-01

    Recent studies have shown that rainfall and hydrological extremes do not randomly occur in time, but are subject to multidecadal oscillations. In addition to these oscillations, there are temporal trends due to climate change. Design statistics, such as intensity-duration-frequency (IDF) for extreme rainfall or flow-duration-frequency (QDF) relationships, are affected by both types of temporal changes (short term and long term). This presentation discusses these changes, how they influence water engineering design and decision making, and how this influence can be assessed and taken into account in practice. The multidecadal oscillations in rainfall and hydrological extremes were studied based on a technique for the identification and analysis of changes in extreme quantiles. The statistical significance of the oscillations was evaluated by means of a non-parametric bootstrapping method. Oscillations in large scale atmospheric circulation were identified as the main drivers for the temporal oscillations in rainfall and hydrological extremes. They also explain why spatial phase shifts (e.g. north-south variations in Europe) exist between the oscillation highs and lows. Next to the multidecadal climate oscillations, several stations show trends during the most recent decades, which may be attributed to climate change as a result of anthropogenic global warming. Such attribution to anthropogenic global warming is, however, uncertain. It can be done based on simulation results with climate models, but it is shown that the climate model results are too uncertain to enable a clear attribution. Water engineering design statistics, such as extreme rainfall IDF or peak or low flow QDF statistics, obviously are influenced by these temporal variations (oscillations, trends). 
    It is shown in the paper, based on the Brussels 10-minute rainfall data, that rainfall design values may be biased by about 20% when based on short rainfall series of 10 to 15 years in length, and still by about 8% for series of 25 years. Methods for bias correction are demonstrated. The definition of "bias" depends on a number of factors, which needs further debate in the hydrological and water engineering community. References: Willems, P. (2013), 'Multidecadal oscillatory behaviour of rainfall extremes in Europe', Climatic Change, 120(4), 931-944; Willems, P. (2013), 'Adjustment of extreme rainfall statistics accounting for multidecadal climate oscillations', Journal of Hydrology, 490, 126-133; Willems, P., Olsson, J., Arnbjerg-Nielsen, K., Beecham, S., Pathirana, A., Bülow Gregersen, I., Madsen, H., Nguyen, V-T-V. (2012), 'Impacts of climate change on rainfall extremes and urban drainage', IWA Publishing, 252p., Paperback Print ISBN 9781780401256; Ebook ISBN 9781780401263
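    The significance testing described above rests on non-parametric bootstrapping of extreme quantiles. As a minimal illustration (not the author's code, and with made-up annual-maximum rainfall values), the sketch below attaches a percentile-bootstrap confidence interval to the 90th-percentile annual maximum, the quantile corresponding roughly to a 10-year return level:

```python
import random

def empirical_quantile(data, q):
    """Linearly interpolated empirical quantile of a sample."""
    s = sorted(data)
    pos = q * (len(s) - 1)
    i = int(pos)
    j = min(i + 1, len(s) - 1)
    frac = pos - i
    return s[i] * (1 - frac) + s[j] * frac

def bootstrap_quantile_ci(sample, q=0.9, n_boot=2000, alpha=0.05, seed=1):
    """Percentile-bootstrap confidence interval for an extreme quantile:
    resample the series with replacement, re-estimate the quantile each
    time, and read off the alpha/2 and 1-alpha/2 percentiles."""
    rng = random.Random(seed)
    n = len(sample)
    estimates = sorted(
        empirical_quantile([rng.choice(sample) for _ in range(n)], q)
        for _ in range(n_boot)
    )
    return (estimates[int(alpha / 2 * n_boot)],
            estimates[int((1 - alpha / 2) * n_boot) - 1])

# Hypothetical annual-maximum daily rainfall totals (mm) -- illustrative only.
maxima = [42.0, 55.3, 38.1, 61.0, 47.5, 70.2, 52.8, 44.9, 58.6, 49.1,
          66.4, 41.7, 53.0, 59.8, 45.2, 63.5, 50.4, 48.0, 56.7, 62.3]
lo, hi = bootstrap_quantile_ci(maxima, q=0.9)
```

    The width of the resulting interval is one way to see why design values from short series (10-15 years) carry the large sampling bias and uncertainty discussed above.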

  20. Predictability and possible earlier awareness of extreme precipitation across Europe

    NASA Astrophysics Data System (ADS)

    Lavers, David; Pappenberger, Florian; Richardson, David; Zsoter, Ervin

    2017-04-01

    Extreme hydrological events can cause large socioeconomic damages in Europe. In winter, a large proportion of these flood episodes are associated with atmospheric rivers, a region of intense water vapour transport within the warm sector of extratropical cyclones. When preparing for such extreme events, forecasts of precipitation from numerical weather prediction models or river discharge forecasts from hydrological models are generally used. Given the strong link between water vapour transport (integrated vapour transport IVT) and heavy precipitation, it is possible that IVT could be used to warn of extreme events. Furthermore, as IVT is located in extratropical cyclones, it is hypothesized to be a more predictable variable due to its link with synoptic-scale atmospheric dynamics. In this research, we firstly provide an overview of the predictability of IVT and precipitation forecasts, and secondly introduce and evaluate the ECMWF Extreme Forecast Index (EFI) for IVT. The EFI is a tool that has been developed to evaluate how ensemble forecasts differ from the model climate, thus revealing the extremeness of the forecast. The ability of the IVT EFI to capture extreme precipitation across Europe during winter 2013/14, 2014/15, and 2015/16 is presented. The results show that the IVT EFI is more capable than the precipitation EFI of identifying extreme precipitation in forecast week 2 during forecasts initialized in a positive North Atlantic Oscillation (NAO) phase. However, the precipitation EFI is superior during the negative NAO phase and at shorter lead times. An IVT EFI example is shown for storm Desmond in December 2015 highlighting its potential to identify upcoming hydrometeorological extremes.
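    The EFI measures how far the ensemble forecast distribution departs from the model climate. A minimal sketch of its commonly quoted integral form, EFI = (2/pi) * integral over p of (p - F_f(p)) / sqrt(p(1-p)) dp, is below; this is an illustrative reimplementation, not ECMWF code, and the quantile interpolation and discretization details are assumptions:

```python
import math

def _quantile(sorted_vals, p):
    """Linearly interpolated quantile of an already-sorted sample."""
    pos = p * (len(sorted_vals) - 1)
    i = int(pos)
    j = min(i + 1, len(sorted_vals) - 1)
    return sorted_vals[i] * (1 - (pos - i)) + sorted_vals[j] * (pos - i)

def extreme_forecast_index(climate_sample, ensemble, n_steps=999):
    """EFI = (2/pi) * sum over interior p of (p - F_f(p)) / sqrt(p(1-p)),
    where F_f(p) is the fraction of ensemble members below the
    model-climate quantile with non-exceedance probability p.
    Roughly 0 when the ensemble matches the climate; approaches +1 (-1)
    when every member lies above (below) the climate record."""
    clim = sorted(climate_sample)
    ens = sorted(ensemble)
    total = 0.0
    for i in range(1, n_steps + 1):   # interior grid avoids p = 0 and p = 1
        p = i / (n_steps + 1)
        q = _quantile(clim, p)
        f_f = sum(1 for e in ens if e < q) / len(ens)
        total += (p - f_f) / math.sqrt(p * (1 - p))
    return (2 / math.pi) * total / (n_steps + 1)
```

    The same machinery applies whether the input variable is precipitation or IVT; only the climate and ensemble samples change, which is what makes the IVT EFI comparison in the study possible.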

  1. Changes in extreme events and the potential impacts on human health.

    PubMed

    Bell, Jesse E; Brown, Claudia Langford; Conlon, Kathryn; Herring, Stephanie; Kunkel, Kenneth E; Lawrimore, Jay; Luber, George; Schreck, Carl; Smith, Adam; Uejio, Christopher

    2018-04-01

    Extreme weather and climate-related events affect human health by causing death, injury, and illness, as well as having large socioeconomic impacts. Climate change has caused changes in extreme event frequency, intensity, and geographic distribution, and will continue to be a driver for change in the future. Some of these events include heat waves, droughts, wildfires, dust storms, flooding rains, coastal flooding, storm surges, and hurricanes. The pathways connecting extreme events to health outcomes and economic losses can be diverse and complex. The difficulty in predicting these relationships comes from the local societal and environmental factors that affect disease burden. More information is needed about the impacts of climate change on public health and economies to effectively plan for and adapt to climate change. This paper describes some of the ways extreme events are changing and provides examples of the potential impacts on human health and infrastructure. It also identifies key research gaps to be addressed to improve the resilience of public health to extreme events in the future.

  2. Extreme temperatures and out-of-hospital coronary deaths in six large Chinese cities.

    PubMed

    Chen, Renjie; Li, Tiantian; Cai, Jing; Yan, Meilin; Zhao, Zhuohui; Kan, Haidong

    2014-12-01

    The seasonal trend of out-of-hospital coronary death (OHCD) and sudden cardiac death has been observed, but whether extreme temperature serves as a risk factor is rarely investigated. We therefore aimed to evaluate the impact of extreme temperatures on OHCDs in China. We obtained death records of 126,925 OHCDs from six large Chinese cities (Harbin, Beijing, Tianjin, Nanjing, Shanghai and Guangzhou) during the period 2009-2011. The short-term associations between extreme temperature and OHCDs were analysed with time-series methods in each city, using generalised additive Poisson regression models. We specified distributed lag non-linear models in studying the delayed effects of extreme temperature. We then applied Bayesian hierarchical models to combine the city-specific effect estimates. The associations between extreme temperature and OHCDs were almost U-shaped or J-shaped. The pooled relative risks (RRs) of extreme cold temperatures over the lags 0-14 days comparing the 1st and 25th centile temperatures were 1.49 (95% posterior interval (PI) 1.26-1.76); the pooled RRs of extreme hot temperatures comparing the 99th and 75th centile temperatures were 1.53 (95% PI 1.27-1.84) for OHCDs. The RRs of extreme temperature on OHCD were higher if the patients with coronary heart disease were old, male and less educated. This multicity epidemiological study suggested that both extreme cold and hot temperatures posed significant risks on OHCDs, and might have important public health implications for the prevention of OHCD or sudden cardiac death. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
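    The study combines city-specific effect estimates with Bayesian hierarchical models; as a simpler stand-in for that pooling step, the sketch below shows fixed-effect inverse-variance pooling of relative risks on the log scale. The input RRs are hypothetical, and recovering the standard error from the 95% CI assumes the interval is symmetric on the log scale:

```python
import math

def pool_relative_risks(rrs_with_ci):
    """Fixed-effect inverse-variance pooling of relative risks.
    Input: list of (RR, lower95, upper95) tuples, one per city.
    Works on the log scale; the SE of each log RR is recovered from
    the CI width via log RR +/- 1.96 * SE."""
    num = den = 0.0
    for rr, lo, hi in rrs_with_ci:
        log_rr = math.log(rr)
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2          # inverse-variance weight
        num += w * log_rr
        den += w
    mean_log = num / den
    se_pooled = math.sqrt(1.0 / den)
    return (math.exp(mean_log),
            (math.exp(mean_log - 1.96 * se_pooled),
             math.exp(mean_log + 1.96 * se_pooled)))

# Hypothetical city-specific cold-extreme RRs (RR, lower95, upper95).
pooled, ci = pool_relative_risks([(1.4, 1.1, 1.78), (1.6, 1.2, 2.13)])
```

    Unlike a hierarchical model, this fixed-effect sketch ignores between-city heterogeneity, which is precisely what the Bayesian approach in the study is designed to capture.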

  3. Intra-seasonal Characteristics of Wintertime Extreme Cold Events over South Korea

    NASA Astrophysics Data System (ADS)

    Park, Taewon; Jeong, Jeehoon; Choi, Jahyun

    2017-04-01

    The present study reveals the changes in the characteristics of extreme cold events over South Korea for boreal winter (November to March) in terms of the intra-seasonal variability of frequency, duration, and atmospheric circulation pattern. Influences of large-scale variabilities such as the Siberian High activity, the Arctic Oscillation (AO), and the Madden-Julian Oscillation (MJO) on extreme cold events are also investigated. In the early and late winter months of November and March, the upper-tropospheric wave-train associated with the life cycle of extreme cold events tends to pass quickly over East Asia. In addition, compared with the other months, the Siberian High is weaker and strong negative AO phases occur less frequently, leading to events of weak amplitude and short duration. In midwinter, from December to February, on the other hand, an amplified Siberian High and a strong negative AO occur more frequently. The extreme cold events are mainly characterized by well-organized anticyclonic blocking around the Ural Mountains and the Subarctic. These large-scale circulations make the midwinter extreme cold events last longer, with stronger amplitude. MJO phases 2-3, which provide suitable conditions for the amplification of extreme cold events, occur frequently from November to January, when their frequencies are more than twice those of February and March. While the extreme cold events during March are the least frequent, weakest, and shortest in duration owing to the weak influence of the abovementioned factors, the strong activity of these factors in January makes the extreme cold events the most frequent, strongest, and longest of the boreal winter. Keywords: extreme cold event, wave-train, blocking, Siberian High, AO, MJO

  4. Instructive Biologic Scaffold for Functional Tissue Regeneration Following Trauma to the Extremities

    DTIC Science & Technology

    2015-09-01

    Award Number: W81XWH-12-2-0128 TITLE: Instructive Biologic Scaffold for Functional Tissue Regeneration Following Trauma to the Extremities...2014 - 29 Aug 2015 4. TITLE AND SUBTITLE Instructive Biologic Scaffold for Functional Tissue Regeneration Following Trauma to the Extremities 5a...effectiveness of a regenerative scaffold for the restoration of functional musculotendinous tissue , including the restoration of blood supply and innervation

  5. Extremal entanglement witnesses

    NASA Astrophysics Data System (ADS)

    Hansen, Leif Ove; Hauge, Andreas; Myrheim, Jan; Sollid, Per Øyvind

    2015-02-01

    We present a study of extremal entanglement witnesses on a bipartite composite quantum system. We define the cone of witnesses as the dual of the set of separable density matrices, thus TrΩρ≥0 when Ω is a witness and ρ is a pure product state, ρ=ψψ† with ψ=ϕ⊗χ. The set of witnesses of unit trace is a compact convex set, uniquely defined by its extremal points. The expectation value f(ϕ,χ)=TrΩρ as a function of vectors ϕ and χ is a positive semidefinite biquadratic form. Every zero of f(ϕ,χ) imposes strong real-linear constraints on f and Ω. The real and symmetric Hessian matrix at the zero must be positive semidefinite. Its eigenvectors with zero eigenvalue, if such exist, we call Hessian zeros. A zero of f(ϕ,χ) is quadratic if it has no Hessian zeros, otherwise it is quartic. We call a witness quadratic if it has only quadratic zeros, and quartic if it has at least one quartic zero. A main result we prove is that a witness is extremal if and only if no other witness has the same, or a larger, set of zeros and Hessian zeros. A quadratic extremal witness has a minimum number of isolated zeros depending on dimensions. If a witness is not extremal, then the constraints defined by its zeros and Hessian zeros determine all directions in which we may search for witnesses having more zeros or Hessian zeros. A finite number of iterated searches in random directions, by numerical methods, leads to an extremal witness which is nearly always quadratic and has the minimum number of zeros. We discuss briefly some topics related to extremal witnesses, in particular the relation between the facial structures of the dual sets of witnesses and separable states. We discuss the relation between extremality and optimality of witnesses, and a conjecture of separability of the so-called structural physical approximation (SPA) of an optimal witness. 
Finally, we discuss how to treat the entanglement witnesses on a complex Hilbert space as a subset of the witnesses on a real Hilbert space.
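    To make the objects above concrete, here is a small numerical check (illustrative only, not from the paper): the flip (SWAP) operator F on C^2 tensor C^2 is a standard example of a witness, since f(phi, chi) = Tr(F rho) = |<phi|chi>|^2 >= 0 on every pure product state, yet F has eigenvalue -1 on the antisymmetric singlet state:

```python
import random

# Flip (SWAP) operator F on C^2 (x) C^2: F (a (x) b) = b (x) a.
F = [[1, 0, 0, 0],
     [0, 0, 1, 0],
     [0, 1, 0, 0],
     [0, 0, 0, 1]]

def kron(phi, chi):
    """Product-state vector psi = phi (x) chi."""
    return [a * b for a in phi for b in chi]

def expval(omega, psi):
    """psi^dagger Omega psi (real for Hermitian Omega)."""
    return sum(psi[i].conjugate() * sum(omega[i][j] * psi[j] for j in range(4))
               for i in range(4)).real

def witness_value(omega, phi, chi):
    """f(phi, chi) = Tr(Omega rho) for rho = psi psi^dagger, psi = phi (x) chi."""
    return expval(omega, kron(phi, chi))

def normalize(v):
    n = sum(abs(x) ** 2 for x in v) ** 0.5
    return [x / n for x in v]

rng = random.Random(0)
def rand_state():
    return normalize([complex(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(2)])

# f is nonnegative on product states (here: 200 random ones) ...
vals = [witness_value(F, rand_state(), rand_state()) for _ in range(200)]
# ... but F is negative on the entangled singlet, so F detects entanglement.
singlet_val = expval(F, normalize([0, 1, -1, 0]))
```

    Here the product states phi (x) chi with <phi|chi> = 0 are exactly the zeros of f discussed above; the search for extremal witnesses in the paper proceeds by analysing such zeros and their Hessians.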

  6. The Joint Statistics of California Temperature and Precipitation as a Function of the Large-scale State of the Climate

    NASA Astrophysics Data System (ADS)

    O'Brien, J. P.; O'Brien, T. A.

    2015-12-01

    Single climatic extremes have a strong and disproportionate effect on society and the natural environment. However, the joint occurrence of two or more concurrent extremes has the potential to negatively impact these areas of life in ways far greater than any single event could. California, USA, home to nearly 40 million people and the largest agricultural producer in the United States, is currently experiencing an extreme drought, which has persisted for several years. While drought is commonly thought of in terms of only precipitation deficits, above average temperatures co-occurring with precipitation deficits greatly exacerbate drought conditions. The 2014 calendar year in California was characterized both by extremely low precipitation and extremely high temperatures, which has significantly deepened the already extreme drought conditions leading to severe water shortages and wildfires. While many studies have shown the statistics of 2014 temperature and precipitation anomalies as outliers, none have demonstrated a connection with large-scale, long-term climate trends, which would provide useful relationships for predicting the future trajectory of California climate and water resources. We focus on understanding non-stationarity in the joint distribution of California temperature and precipitation anomalies in terms of large-scale, low-frequency trends in climate such as global mean temperature rise and oscillatory indices such as ENSO and the Pacific Decadal Oscillation among others. We consider temperature and precipitation data from the seven distinct climate divisions in California and employ a novel, high-fidelity kernel density estimation method to directly infer the multivariate distribution of temperature and precipitation anomalies conditioned on the large-scale state of the climate. We show that the joint distributions and associated statistics of temperature and precipitation are non-stationary and vary regionally in California. 
Further, we show that recurrence intervals of extreme concurrent events vary as a function of time and of teleconnections. This research has implications for predicting and forecasting future temperature and precipitation anomalies, which is critically important for city, water, and agricultural planning in California.
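    The study uses a high-fidelity kernel density estimation method; the sketch below is only a bare-bones product-Gaussian KDE for a joint temperature-precipitation anomaly density, with fixed bandwidths and without the conditioning on large-scale climate state, to illustrate the basic idea:

```python
import math

def kde2d(samples, x, y, hx=1.0, hy=1.0):
    """Product-Gaussian kernel density estimate at the point (x, y).
    samples: list of (temperature anomaly, precipitation anomaly) pairs;
    hx, hy: fixed bandwidths (a real application would select these,
    e.g. by cross-validation)."""
    total = sum(
        math.exp(-0.5 * (((x - sx) / hx) ** 2 + ((y - sy) / hy) ** 2))
        for sx, sy in samples
    )
    return total / (len(samples) * 2 * math.pi * hx * hy)
```

    Evaluating `kde2d` on a grid of anomaly pairs maps the joint density; conditioning on ENSO or PDO phase, as in the study, would amount to restricting or weighting the `samples` used in the estimate.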

  7. Simulated trends of extreme climate indices for the Carpathian basin using outputs of different regional climate models

    NASA Astrophysics Data System (ADS)

    Pongracz, R.; Bartholy, J.; Szabo, P.; Pieczka, I.; Torma, C. S.

    2009-04-01

    Regional climatological effects of global warming may be recognized not only in shifts of mean temperature and precipitation, but in the frequency or intensity changes of different climate extremes. Several climate extreme indices are analyzed and compared for the Carpathian basin (located in Central/Eastern Europe) following the guidelines suggested by the joint WMO-CCl/CLIVAR Working Group on climate change detection. Our statistical trend analysis includes the evaluation of several extreme temperature and precipitation indices, e.g., the numbers of severe cold days, winter days, frost days, cold days, warm days, summer days, hot days, extremely hot days, cold nights, warm nights, the intra-annual extreme temperature range, the heat wave duration, the growing season length, the number of wet days (using several threshold values defining extremes), the maximum number of consecutive dry days, the highest 1-day precipitation amount, the greatest 5-day rainfall total, the annual fraction due to extreme precipitation events, etc. In order to evaluate the future trends (2071-2100) in the Carpathian basin, daily values of meteorological variables are obtained from the outputs of various regional climate model (RCM) experiments carried out in the framework of the completed EU project PRUDENCE (Prediction of Regional scenarios and Uncertainties for Defining EuropeaN Climate change risks and Effects). Horizontal resolution of the applied RCMs is 50 km. Both scenarios A2 and B2 are used to compare past and future trends of the extreme climate indices for the Carpathian basin. Furthermore, fine-resolution climate experiments of two additional RCMs adapted and run at the Department of Meteorology, Eotvos Lorand University are used to extend the trend analysis of climate extremes for the Carpathian basin. (1) Model PRECIS (run at 25 km horizontal resolution) was developed at the UK Met Office, Hadley Centre, and it uses the boundary conditions from the HadCM3 GCM. 
(2) Model RegCM3 (run at 10 km horizontal resolution) was developed by Giorgi et al. and it is available from the ICTP (International Centre for Theoretical Physics). Analysis of the simulated daily temperature datasets suggests that the detected regional warming is expected to continue in the 21st century. Cold temperature extremes are projected to decrease while warm extremes tend to increase significantly. Expected changes of annual precipitation indices are small, but generally consistent with the detected trends of the 20th century. Based on the simulations, extreme precipitation events are expected to become more intense and more frequent in winter, while a general decrease of extreme precipitation indices is expected in summer.
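    Two of the precipitation indices listed above (maximum number of consecutive dry days, and the greatest n-day rainfall total) can be computed directly from a daily series. The sketch below is illustrative; the 1 mm wet-day threshold is a common convention, not necessarily the one used in this study:

```python
def max_consecutive_dry_days(daily_precip, wet_threshold=1.0):
    """CDD index: longest run of days with precipitation (mm) below
    the wet-day threshold."""
    longest = run = 0
    for p in daily_precip:
        run = run + 1 if p < wet_threshold else 0
        longest = max(longest, run)
    return longest

def max_nday_total(daily_precip, n=5):
    """RxNday index: greatest precipitation total over any n consecutive
    days (n=1 gives the highest 1-day amount, n=5 the 5-day total)."""
    return max(sum(daily_precip[i:i + n])
               for i in range(len(daily_precip) - n + 1))
```

    Trend analysis then reduces to computing such indices per year (or season) from each RCM's daily output and fitting trends to the resulting annual series.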

  8. From ratites to rats: the size of fleshy fruits shapes species' distributions and continental rainforest assembly

    PubMed Central

    Rossetto, Maurizio; Kooyman, Robert; Yap, Jia-Yee S.; Laffan, Shawn W.

    2015-01-01

    Seed dispersal is a key process in plant spatial dynamics. However, consistently applicable generalizations about dispersal across scales are mostly absent because of the constraints on measuring propagule dispersal distances for many species. Here, we focus on fleshy-fruited taxa, specifically taxa with large fleshy fruits and their dispersers across an entire continental rainforest biome. We compare species-level results of whole-chloroplast DNA analyses in sister taxa with large and small fruits, to regional plot-based samples (310 plots), and whole-continent patterns for the distribution of woody species with either large (more than 30 mm) or smaller fleshy fruits (1093 taxa). The pairwise genomic comparison found higher genetic distances between populations and between regions in the large-fruited species (Endiandra globosa), but higher overall diversity within the small-fruited species (Endiandra discolor). Floristic comparisons among plots confirmed lower numbers of large-fruited species in areas where more extreme rainforest contraction occurred, and re-colonization by small-fruited species readily dispersed by the available fauna. Species' distribution patterns showed that larger-fruited species had smaller geographical ranges than smaller-fruited species and locations with stable refugia (and high endemism) aligned with concentrations of large fleshy-fruited taxa, making them a potentially valuable conservation-planning indicator. PMID:26645199

  9. From ratites to rats: the size of fleshy fruits shapes species' distributions and continental rainforest assembly.

    PubMed

    Rossetto, Maurizio; Kooyman, Robert; Yap, Jia-Yee S; Laffan, Shawn W

    2015-12-07

    Seed dispersal is a key process in plant spatial dynamics. However, consistently applicable generalizations about dispersal across scales are mostly absent because of the constraints on measuring propagule dispersal distances for many species. Here, we focus on fleshy-fruited taxa, specifically taxa with large fleshy fruits and their dispersers across an entire continental rainforest biome. We compare species-level results of whole-chloroplast DNA analyses in sister taxa with large and small fruits, to regional plot-based samples (310 plots), and whole-continent patterns for the distribution of woody species with either large (more than 30 mm) or smaller fleshy fruits (1093 taxa). The pairwise genomic comparison found higher genetic distances between populations and between regions in the large-fruited species (Endiandra globosa), but higher overall diversity within the small-fruited species (Endiandra discolor). Floristic comparisons among plots confirmed lower numbers of large-fruited species in areas where more extreme rainforest contraction occurred, and re-colonization by small-fruited species readily dispersed by the available fauna. Species' distribution patterns showed that larger-fruited species had smaller geographical ranges than smaller-fruited species and locations with stable refugia (and high endemism) aligned with concentrations of large fleshy-fruited taxa, making them a potentially valuable conservation-planning indicator. © 2015 The Author(s).

  10. Evolutionary dynamics of olfactory receptor genes in chordates: interaction between environments and genomic contents

    PubMed Central

    2009-01-01

    Olfaction is essential for the survival of animals. Versatile odour molecules in the environment are received by olfactory receptors (ORs), which form the largest multigene family in vertebrates. Identification of the entire repertoires of OR genes using bioinformatics methods from the whole-genome sequences of diverse organisms revealed that the numbers of OR genes vary enormously, ranging from ~1,200 in rats and ~400 in humans to ~150 in zebrafish and ~15 in pufferfish. Most species have a considerable fraction of pseudogenes. Extensive phylogenetic analyses have suggested that the numbers of gene gains and losses are extremely large in the OR gene family, which is a striking example of the birth-and-death evolution. It appears that OR gene repertoires change dynamically, depending on each organism's living environment. For example, higher primates equipped with a well-developed vision system have lost a large number of OR genes. Moreover, two groups of OR genes for detecting airborne odorants greatly expanded after the time of terrestrial adaption in the tetrapod lineage, whereas fishes retain diverse repertoires of genes that were present in aquatic ancestral species. The origin of vertebrate OR genes can be traced back to the common ancestor of all chordate species, but insects, nematodes and echinoderms utilise distinctive families of chemoreceptors, suggesting that chemoreceptor genes have evolved many times independently in animal evolution. PMID:20038498

  11. Jimsphere wind and turbulence exceedance statistic

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.; Court, A.

    1972-01-01

    Exceedance statistics of winds and gusts observed over Cape Kennedy with Jimsphere balloon sensors are described. Gust profiles containing positive and negative departures from smoothed profiles, in the wavelength ranges 100-2500, 100-1900, 100-860, and 100-460 meters, were computed from 1578 profiles with four 41-weight digital high-pass filters. Extreme values of the square root of gust speed are normally distributed. Monthly and annual exceedance probability distributions of normalized rms gust speeds in three altitude bands (2-7, 6-11, and 9-14 km) are log-normal. The rms gust speeds are largest in the 100-2500 m wavelength band between 9 and 14 km in late winter and early spring. A study of monthly and annual exceedance probabilities and the number of occurrences per kilometer of level crossings with positive slope indicates significant variability with season, altitude, and filter configuration. A decile sampling scheme is tested and an optimum approach is suggested for drawing a relatively small random sample that represents the characteristic extreme wind speeds and shears of a large parent population of Jimsphere wind profiles.
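    Counting level crossings with positive slope (upcrossings), as used in the analysis above, can be sketched as follows; this is a generic upcrossing counter for a sampled profile, not the authors' processing code, and the sample spacing is a placeholder:

```python
def upcrossings(profile, level):
    """Count level crossings with positive slope: consecutive sample
    pairs where the profile passes from below `level` to at or above it."""
    return sum(1 for prev, cur in zip(profile, profile[1:])
               if prev < level <= cur)

def upcrossing_rate_per_km(profile, level, dz_m):
    """Upcrossings per kilometre for samples spaced dz_m metres apart
    (the per-kilometre occurrence statistic described above)."""
    span_km = (len(profile) - 1) * dz_m / 1000.0
    return upcrossings(profile, level) / span_km
```

    Applied per altitude band and per filtered wavelength range, such counts give the occurrences-per-kilometre statistics whose seasonal and altitudinal variability the study reports.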

  12. The wasteland of random supergravities

    NASA Astrophysics Data System (ADS)

    Marsh, David; McAllister, Liam; Wrase, Timm

    2012-03-01

    We show that in a general 𝒩 = 1 supergravity with N ≫ 1 scalar fields, an exponentially small fraction of the de Sitter critical points are metastable vacua. Taking the superpotential and Kähler potential to be random functions, we construct a random matrix model for the Hessian matrix, which is well-approximated by the sum of a Wigner matrix and two Wishart matrices. We compute the eigenvalue spectrum analytically from the free convolution of the constituent spectra and find that in typical configurations, a significant fraction of the eigenvalues are negative. Building on the Tracy-Widom law governing fluctuations of extreme eigenvalues, we determine the probability P of a large fluctuation in which all the eigenvalues become positive. Strong eigenvalue repulsion makes this extremely unlikely: we find P ∝ exp(-c N^p), with c and p constants. For generic critical points we find p ≈ 1.5, while for approximately supersymmetric critical points, p ≈ 1.3. Our results have significant implications for the counting of de Sitter vacua in string theory, but the number of vacua remains vast.

  13. Extreme Pressure Synergistic Mechanism of Bismuth Naphthenate and Sulfurized Isobutene Additives

    NASA Astrophysics Data System (ADS)

    Xu, Xin; Hu, Jianqiang; Yang, Shizhao; Xie, Feng; Guo, Li

    A four-ball tester was used to evaluate the tribological performances of bismuth naphthenate (BiNap), sulfurized isobutene (VSB), and their combinations. The results show that the antiwear properties of BiNap and VSB are not very pronounced, but they possess good extreme pressure (EP) properties, particularly sulfur-containing bismuth additives. Synergistic EP properties of BiNap with various sulfur-containing additives were investigated. The results indicate that BiNap exhibits good EP synergism with sulfur-containing additives. Surface analytical tools, such as X-ray photoelectron spectroscopy (XPS), scanning electron microscopy (SEM) and energy-dispersive X-ray spectroscopy (EDX), were used to investigate the topography, composition, and depth profile of some typical elements on the rubbing surface. The smooth topography of the wear scar further confirms that the additive showed good EP capacity, and XPS and EDX analyses indicate that tribochemical mixed protective films composed of bismuth, bismuth oxides, sulfides, and sulfates are formed on the rubbing surface, which improves the tribological properties of lubricants. In particular, a large number of bismuth atoms and bismuth sulfides play an important role in improving the EP properties of oils.

  14. Controversial cytogenetic observations in mammalian somatic cells exposed to extremely low frequency electromagnetic radiation: a review and future research recommendations.

    PubMed

    Vijayalaxmi; Obe, Guenter

    2005-07-01

    During the years 1990-2003, a large number of investigations were conducted using animals, cultured rodent and human cells as well as freshly collected human blood lymphocytes to determine the genotoxic potential of exposure to nonionizing radiation emitted from extremely low frequency electromagnetic fields (EMF). Among the 63 peer reviewed scientific reports, the conclusions from 29 studies (46%) did not indicate increased damage to the genetic material, as assessed from DNA strand breaks, incidence of chromosomal aberrations (CA), micronuclei (MN), and sister chromatid exchanges (SCE), in EMF exposed cells as compared with sham exposed and/or unexposed cells, while those from 14 investigations (22%) have suggested an increase in such damage in EMF exposed cells. The observations from 20 other studies (32%) were inconclusive. This study reviews the investigations published in peer reviewed scientific journals during 1990-2003 and attempts to identify probable reason(s) for the conflicting results. Recommendations are made for future research to address some of the controversial observations. Copyright 2005 Wiley-Liss, Inc.

  15. Exploring the extremely low surface brightness sky: distances to 23 newly discovered objects in Dragonfly fields

    NASA Astrophysics Data System (ADS)

    van Dokkum, Pieter

    2016-10-01

    We are obtaining deep, wide field images of nearby galaxies with the Dragonfly Telephoto Array. This telescope is optimized for low surface brightness imaging, and we are finding many low surface brightness objects in the Dragonfly fields. In Cycle 22 we obtained ACS imaging for 7 galaxies that we had discovered in a Dragonfly image of the galaxy M101. Unexpectedly, the ACS data show that only 3 of the galaxies are members of the M101 group, and the other 4 are very large Ultra Diffuse Galaxies (UDGs) at much greater distance. Building on our Cycle 22 program, here we request ACS imaging for 23 newly discovered low surface brightness objects in four Dragonfly fields centered on the galaxies NGC 1052, NGC 1084, NGC 3384, and NGC 4258. The immediate goals are to construct the satellite luminosity functions in these four fields and to constrain the number density of UDGs that are not in rich clusters. More generally, this complete sample of extremely low surface brightness objects provides the first systematic insight into galaxies whose brightness peaks at >25 mag/arcsec^2.

  16. SAChES: Scalable Adaptive Chain-Ensemble Sampling.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Swiler, Laura Painton; Ray, Jaideep; Ebeida, Mohamed Salah

    We present the development of a parallel Markov Chain Monte Carlo (MCMC) method called SAChES, Scalable Adaptive Chain-Ensemble Sampling. This capability is targeted to Bayesian calibration of computationally expensive simulation models. SAChES involves a hybrid of two methods: Differential Evolution Monte Carlo followed by Adaptive Metropolis. Both methods involve parallel chains. Differential evolution allows one to explore high-dimensional parameter spaces using loosely coupled (i.e., largely asynchronous) chains. Loose coupling allows the use of large chain ensembles, with far more chains than the number of parameters to explore. This reduces per-chain sampling burden and enables high-dimensional inversions and the use of computationally expensive forward models. The large number of chains can also ameliorate the impact of silent errors, which may affect only a few chains. The chain ensemble can also be sampled to provide an initial condition when an aberrant chain is re-spawned. Adaptive Metropolis takes the best points from the differential evolution and efficiently homes in on the posterior density. The multitude of chains in SAChES is leveraged to (1) enable efficient exploration of the parameter space; and (2) ensure robustness to silent errors which may be unavoidable in extreme-scale computational platforms of the future. This report outlines SAChES, describes four papers that are the result of the project, and discusses some additional results.
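The Differential Evolution Monte Carlo stage described in this abstract can be sketched in miniature: each chain proposes a move by scaling the difference between two other randomly chosen chains, then accepts or rejects it with a Metropolis test. The sketch below is a minimal, hedged illustration on a toy Gaussian target; the function names, chain count, and target density are assumptions for demonstration, not the SAChES implementation.

```python
# Minimal sketch of Differential Evolution MCMC (ter Braak-style), the
# first stage of the hybrid described above. The toy standard-normal
# target and all parameter choices are illustrative assumptions.
import numpy as np

def log_post(theta):
    # Toy target: unnormalized log-density of a standard normal.
    return -0.5 * np.sum(theta ** 2)

def de_mc(n_chains=16, n_dim=2, n_steps=500, eps=1e-4, seed=0):
    rng = np.random.default_rng(seed)
    gamma = 2.38 / np.sqrt(2 * n_dim)   # standard DE-MC jump scale
    chains = rng.normal(size=(n_chains, n_dim))
    logp = np.array([log_post(c) for c in chains])
    samples = []
    for _ in range(n_steps):
        for i in range(n_chains):
            # Propose by differencing two other randomly chosen chains:
            a, b = rng.choice([j for j in range(n_chains) if j != i],
                              size=2, replace=False)
            prop = (chains[i] + gamma * (chains[a] - chains[b])
                    + eps * rng.normal(size=n_dim))
            lp = log_post(prop)
            if np.log(rng.uniform()) < lp - logp[i]:   # Metropolis accept
                chains[i], logp[i] = prop, lp
        samples.append(chains.copy())
    return np.concatenate(samples)

samples = de_mc()
```

Because the chains interact only through occasional difference vectors, they can run largely asynchronously, which is what makes the large, loosely coupled ensembles described above practical.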

  17. A scheme for a high-power, low-cost transmitter for deep space applications

    NASA Astrophysics Data System (ADS)

    Scheffer, L. K.

    2005-10-01

    Applications such as planetary radars and spacecraft communications require transmitters with extremely high effective isotropic radiated power. Until now, this has been done by combining a high-power microwave source with a large reflective antenna. However, this arrangement has a number of disadvantages. It is costly, since the steerable reflector alone is quite expensive, and for spacecraft communications, the need to transmit hurts the receive performance. For planetary radars, the utilization is very low since the antenna must be shared with other applications such as radio astronomy or spacecraft communications. This paper describes a potential new way of building such transmitters with lower cost, greater versatility, higher reliability, and potentially higher power. The basic idea is a phased array with a very large number of low-power elements, built with mass production techniques that have been optimized for consumer markets. The antennas are built en masse on printed circuit boards and are driven by chips, built with consumer complementary metal-oxide-semiconductor technology, that adjust the phase of each element. Assembly and maintenance should be comparatively inexpensive since the boards need only be attached to large, flat, unmoving, ground-level infrastructure. Applications to planetary radar and spacecraft communications are examined. Although we would be unlikely to use such a facility in this way, an implication for Search for Extraterrestrial Intelligence (SETI) is that high-power beacons are easier to build than had been thought.
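The reason many low-power elements can rival one large dish is that a coherently driven array's effective isotropic radiated power grows as N squared: N elements of power P radiate N·P total, and the array gain is N times the element gain. A back-of-the-envelope sketch, with all numbers chosen for illustration rather than taken from the paper:

```python
# Hedged sketch of phased-array EIRP scaling: for N coherently combined
# elements, EIRP = N^2 * P * G (total power N*P times array gain N*G).
# The element count, power, and gain below are illustrative assumptions.
import math

def eirp_dbw(n_elements, p_watts, elem_gain_dbi):
    g = 10 ** (elem_gain_dbi / 10)          # element gain, linear scale
    eirp = n_elements ** 2 * p_watts * g    # coherent combining: N^2 scaling
    return 10 * math.log10(eirp)            # express in dBW

# e.g. one million 1 W elements with hypothetical 5 dBi patch antennas:
print(round(eirp_dbw(1_000_000, 1.0, 5.0), 1))  # prints 125.0
```

The quadratic scaling is why trading one expensive high-power source for a very large number of cheap elements can come out ahead, provided phase alignment across the array is maintained.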

  18. United States Air Force Quality of Air Force Life Active Duty Air Force Personnel Survey: 1980 Quick Look Report

    DTIC Science & Technology

    1980-07-01

    number) Quality of life; job satisfaction. ABSTRACT (Continue on reverse side if necessary and identify by block number): Report summarizes results of... following description: WORK: Doing work that is personally meaningful and important; pride in my work; job satisfaction; recognition for my efforts and... family (if married) or from home and friends (if unmarried). EXTREMELY UNDESIRABLE / INDIFFERENT / EXTREMELY DESIRABLE. 68. A favorable attitude on the

  19. Wildfire Activity Across the Triassic-Jurassic Boundary in the Polish Basin: Evidence from New Fossil Charcoal & Carbon-isotope Data

    NASA Astrophysics Data System (ADS)

    Pointer, R.; Belcher, C.; Hesselbo, S. P.; Hodbod, M.; Pieńkowski, G.

    2017-12-01

    New fossil charcoal abundance and carbon-isotope data from two sedimentary cores provide evidence of extreme environmental conditions in the Polish Basin during the latest Triassic to earliest Jurassic. Sedimentary cores from the Polish Basin provide an excellent record of terrestrial environmental conditions across the Triassic-Jurassic Boundary, a time of climatic extremes. Previous work has shown that the marine realm was affected by a large perturbation to the carbon cycle across the Triassic-Jurassic Boundary (manifested by large negative and positive carbon-isotope excursions), and limited records of charcoal abundance and organic geochemistry have indicated important changes in fire regime in the coeval ecosystems. Here we present two new carbon-isotope records generated from fossil plant matter across the Triassic-Jurassic boundary, together with new charcoal records. The charcoal abundance data confirm that there was variation in wildfire activity during the Late Triassic-Early Jurassic in the Polish Basin. Peaks in the number of fossil charcoal fragments occur in both sedimentary cores, and increases in fossil charcoal abundance are linked to wildfires, signalling short-lived rises in wildfire activity. Fossil charcoal abundance does not appear to be fully controlled by total organic matter content, depositional environment or bioturbation. We argue that increased wildfire activity was likely caused by an increase in ignition of plant material as a result of an elevated number of lightning strikes. Global warming (caused by a massive input of carbon into the atmosphere, as indicated by carbon-isotope data) can increase storm activity, leading to increased numbers of lightning strikes. Previous Triassic-Jurassic Boundary wildfire studies have found fossil charcoal abundance peaks at other northern hemisphere sites (Denmark & Greenland) and concluded that they represent increases in wildfire activity in the earliest Jurassic. Our new charcoal and carbon-isotope data confirm that there was a peak in wildfire activity in the Polish Basin in the earliest Jurassic, and support previous suggestions of widespread increased wildfire activity at the Triassic-Jurassic Boundary.

  20. Large optical glass blanks for the ELT generation

    NASA Astrophysics Data System (ADS)

    Jedamzik, Ralf; Petzold, Uwe; Dietrich, Volker; Wittmer, Volker; Rexius, Olga

    2016-07-01

    The upcoming extremely large telescope projects like the E-ELT, TMT or GMT require not only large numbers of mirror blank substrates but also sophisticated instrument setups. Common instrument components are atmospheric dispersion correctors that compensate for the varying atmospheric path length, which depends on the telescope inclination angle. These elements usually consist of optical glass blanks that have to be large due to the increased size of the focal beam of the extremely large telescopes. SCHOTT has long experience in producing and delivering large optical glass blanks for astronomical applications up to 1 m in diameter and in homogeneity grades up to H3 quality. The most common optical glass available in large formats is SCHOTT N-BK7, but other glass types like F2 or LLF1 can also be produced in formats up to 1 m. The extremely large telescope projects partly demand atmospheric dispersion components in sizes beyond 1 m, up to 1.5 m diameter. The production of such large homogeneous optical glass blanks requires tight control of all process steps. To cover this demand in the future, SCHOTT initiated a research project to improve the large optical blank production process steps from melting to annealing and measurement. Large optical glass blanks are measured in several sub-apertures that cover the total clear aperture of the application. With SCHOTT's new stitching software it is now possible to combine individual sub-aperture measurements into a total homogeneity map of the blank. In this presentation first results will be demonstrated.
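The sub-aperture stitching step described above can be illustrated with a minimal sketch: each sub-aperture homogeneity map is accumulated into a full-aperture grid at its measured offset, and overlapping regions are averaged. This is a simplified assumption-laden illustration; in practice stitching software must also solve for alignment terms (piston, tilt) between sub-apertures, which are omitted here, and all names are hypothetical rather than SCHOTT's.

```python
# Hedged sketch of combining sub-aperture maps into one full-aperture
# homogeneity map by overlap averaging. Assumes the maps are already
# registered and alignment-corrected (piston/tilt solving is omitted).
import numpy as np

def stitch(subapertures, full_shape):
    # subapertures: list of (row, col, 2-D map) giving each map's offset
    # in the full-aperture grid; overlapping pixels are averaged.
    acc = np.zeros(full_shape)
    cnt = np.zeros(full_shape)
    for r, c, m in subapertures:
        h, w = m.shape
        acc[r:r+h, c:c+w] += m
        cnt[r:r+h, c:c+w] += 1
    # Average where measured; leave unmeasured pixels as NaN.
    return np.where(cnt > 0, acc / np.maximum(cnt, 1), np.nan)
```

For example, two 2x2 maps offset by one column on a 2x3 grid share their middle column, which comes out as the mean of the two measurements.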
