Science.gov

Sample records for future large-scale observatories

  1. Large Scale Observatories for Changing Cold Regions - Recent Progress and Future Vision

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.; Pomeroy, J. W.; Carey, S. K.; DeBeer, C. M.

    2016-12-01

    Observatories are at the core of hydrological science and a critical resource for the detection and analysis of environmental change. The combination of multiple pressures on the water environment and new scientific opportunities provides a context in which a broader vision is urgently needed. Human activities are increasingly affecting land and water management at multiple scales, so our observatories now need to include the human dimensions of water more fully, including their integration across jurisdictional boundaries and at large basin scales. Large scales are also needed to diagnose and predict the impacts of climate change at regional and continental levels, and to address land-water-atmosphere interactions and feedbacks. We argue for the need to build on the notable past successes of the World Climate Research Programme and move forward to a new era of globally distributed large-scale observatories. This paper introduces two such observatories in rapidly warming western Canada - the 405,000 km² Saskatchewan and the 1.8 million km² Mackenzie river basins. We review progress in these multi-scale observatories, including the use of point and small basin-scale observatory sites to observe and diagnose complex regional patterns of hydrological change. Building on new opportunities for observational systems and data assimilation, we present a vision for a pan-Canadian observing system to support the science needed for the management of future societal risk from extreme events and environmental change.

  2. Large scale scientific computing - future directions

    NASA Astrophysics Data System (ADS)

    Patterson, G. S.

    1982-06-01

    Every new generation of scientific computers has opened up new areas of science for exploration through the use of more realistic numerical models or the ability to process ever larger amounts of data. Concomitantly, scientists, because of the success of past models and the wide range of physical phenomena left unexplored, have pressed computer designers to strive for the maximum performance that current technology will permit. This encompasses not only increased processor speed, but also substantial improvements in processor memory, I/O bandwidth, secondary storage and facilities to augment the scientist's ability both to program and to understand the results of a computation. Over the past decade, performance improvements for scientific calculations have come from algorithm development and a major change in the underlying architecture of the hardware, not from significantly faster circuitry. It appears that this trend will continue for another decade. A future architectural change for improved performance will most likely be multiple processors coupled together in some fashion. Because the demand for a significantly more powerful computer system comes from users with single large applications, it is essential that an application be efficiently partitionable over a set of processors; otherwise, a multiprocessor system will not be effective. This paper explores some of the constraints on multiple-processor architecture posed by these large applications. In particular, the trade-offs between large numbers of slow processors and small numbers of fast processors are examined. Strategies for partitioning range from partitioning at the language-statement level (in-the-small) to partitioning at the program-module level (in-the-large). Some examples of partitioning in-the-large are given and a strategy for efficiently executing a partitioned program is explored.
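
    The abstract contrasts partitioning "in-the-small" (language-statement level) with partitioning "in-the-large" (program-module level). As a rough modern illustration of the in-the-large idea only, the hedged sketch below splits one large, independent workload across a small pool of worker processes; the work function, chunking scheme, and processor count are illustrative assumptions, not details taken from the paper.

```python
# Hedged illustration of partitioning "in-the-large": coarse-grained pieces of
# a single large application are distributed over a pool of processors.
# The work function and chunk sizes are hypothetical placeholders.
from multiprocessing import Pool

def process_chunk(chunk):
    """Stand-in for one coarse-grained module, e.g. a sweep over one sub-domain."""
    lo, hi = chunk
    return sum(i * i for i in range(lo, hi))  # placeholder computation

def partition(n_points, n_chunks):
    """Split the index range [0, n_points) into contiguous chunks."""
    step = n_points // n_chunks
    return [(i * step, n_points if i == n_chunks - 1 else (i + 1) * step)
            for i in range(n_chunks)]

if __name__ == "__main__":
    chunks = partition(n_points=1_000_000, n_chunks=8)
    with Pool(processes=8) as pool:      # few fast vs. many slow processors
        partials = pool.map(process_chunk, chunks)
    print(sum(partials))
```

    The trade-off the abstract raises (many slow processors versus a few fast ones) shows up here as the choice of the number of chunks relative to the per-chunk work and the cost of combining the partial results.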

  3. The Saskatchewan River Basin - a large scale observatory for transdisciplinary science

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2012-12-01

    Water resources are under pressure world-wide and face unprecedented challenges - from population growth, economic development, pollution and environmental change. Further, effective water management is becoming increasingly complex, requiring deep understanding of aquatic and terrestrial environments, their vulnerabilities to environmental change, and water management and protection challenges. Important science challenges arise in understanding and managing environmental change. However, with increasing pressures on the environment, it is necessary to recognise the effects of human interventions; flows in many major rivers are strongly affected by operational water management, and large-scale agricultural land management change can affect hydrology, land-atmosphere feedbacks, water quality and habitats. There is a need to represent the effects of management decisions on river flows and groundwater, and more generally to understand impacts of policy, governance and societal values on water futures. This research agenda poses important challenges to the science community. Observational data are necessary, across multiple scales, to understand environmental change. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The 340,000 km² Saskatchewan River Basin (SRB) is being developed as a large-scale observatory to support a new level of integration of interdisciplinary science. In one of the most extreme and variable climates in the world, we are developing state-of-the-art hydro-ecological experimental sites in the

  4. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    NASA Astrophysics Data System (ADS)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

    With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey mode of observation, rather than a hypothesis-driven one, to generate comprehensive, high-quality data at large scale for exploration and discovery; making data freely and openly available to any investigator from the very outset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. Finally, community

  5. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2013-12-01

    The 336,000 km² Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of floods and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and

  6. Large-scale weather systems: A future research priority

    NASA Astrophysics Data System (ADS)

    Davies, Huw C.

    2006-12-01

    A brief assessment is provided of both the case against and the case for assigning priority to research on large-scale weather systems (LSWS). The three-fold case against is based upon: the emergence of new overarching themes in environmental science; the fresh emphasis upon other sub-disciplines of atmospheric science; and the mature state of research and prediction of LSWS. The case for is also supported by three arguments. First is the assertion that LSWS research should be not merely an integral but a major component of future research related to both the new overarching themes and the other sub-disciplines. Second, recent major developments in LSWS research, as epitomized by the paradigm shifts in the prediction strategy for LSWS and the emergence of the potential vorticity perspective, testify to the theme’s on-going vibrancy. Third, the field’s future development, as exemplified by the new international THORPEX (The Observing System Research and Predictability Experiment) programme, embodies a perceptive dovetailing of intellectually challenging fundamental research with directed application(s) of societal and economic benefit. It is thus inferred that LSWS research, far from being in decline, will feature at the forefront of the new relationship between science and society.

  7. Large-Scale Data Challenges in Future Power Grids

    SciTech Connect

    Yin, Jian; Sharma, Poorva; Gorton, Ian; Akyol, Bora A.

    2013-03-25

    This paper describes technical challenges in supporting large-scale real-time data analysis for future power grid systems and discusses various design options to address these challenges. Even though the existing U.S. power grid has served the nation remarkably well over the last 120 years, big changes are on the horizon. The widespread deployment of renewable generation, smart grid controls, energy storage, plug-in hybrids, and new conducting materials will require fundamental changes in the operational concepts and principal components. The whole system becomes highly dynamic and needs constant adjustments based on real-time data. Even though millions of sensors such as phasor measurement units (PMUs) and smart meters are being widely deployed, a data layer that can support this amount of data in real time is needed. Unlike the data fabric in cloud services, the data layer for smart grids must address some unique challenges. This layer must be scalable to support millions of sensors and a large number of diverse applications and still provide real-time guarantees. Moreover, the system needs to be highly reliable and highly secure because the power grid is a critical piece of infrastructure. No existing systems can satisfy all the requirements at the same time. We examine various design options. In particular, we explore the special characteristics of power grid data to meet both scalability and quality of service requirements. Our initial prototype can improve performance by orders of magnitude over existing general-purpose systems. The prototype was demonstrated with several use cases from PNNL’s FPGI and was shown to be able to integrate huge amounts of data from a large number of sensors and a diverse set of applications.
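
    As a rough, hedged back-of-envelope check on what "millions of sensors" implies for such a data layer, the sketch below estimates the aggregate ingest rate; the sensor counts, reporting rates, and message sizes are illustrative assumptions, not figures from the paper.

```python
# Back-of-envelope sizing for a power-grid data layer.
# Sensor counts, reporting rates, and message sizes are assumed, illustrative values.
sensors = {
    # name: (count, messages per second, bytes per message)
    "pmu":         (1_000_000, 30, 128),      # assumed ~30 frames/s per PMU
    "smart_meter": (5_000_000, 1 / 900, 64),  # assumed one reading per 15 min
}

total_bps = 0.0
for name, (count, rate_hz, size_bytes) in sensors.items():
    bps = count * rate_hz * size_bytes * 8
    total_bps += bps
    print(f"{name:12s}: {bps / 1e9:8.3f} Gbit/s")

print(f"{'total':12s}: {total_bps / 1e9:8.3f} Gbit/s sustained ingest")
```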

  8. A large-scale solar dynamics observatory image dataset for computer vision applications.

    PubMed

    Kucuk, Ahmet; Banda, Juan M; Angryk, Rafal A

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Solar Dynamics Observatory (SDO) mission has given us unprecedented insight into the Sun's activity. By capturing approximately 70,000 images a day, this mission has created one of the richest and biggest repositories of solar image data available to mankind. With such massive amounts of information, researchers have been able to produce great advances in detecting solar events. In this resource, we compile SDO solar data into a single repository in order to provide the computer vision community with a standardized and curated large-scale dataset of several hundred thousand solar events found in high-resolution solar images. This publicly available resource, along with the generation source code, will accelerate computer vision research on NASA's solar image data by reducing the amount of time spent performing data acquisition and curation from the multiple sources we have compiled. By improving the quality of the data with thorough curation, we anticipate wider adoption and interest from both the computer vision and solar physics communities.

  9. The beginning of observations of large-scale solar magnetic fields at the Sayan Observatory - Instrument, plans, preliminary results

    NASA Astrophysics Data System (ADS)

    Grigoryev, V. M.; Peshcherov, V. S.; Demidov, M. L.

    A telescope and a system for measuring large-scale magnetic fields and the large-scale field of line-of-sight velocities in the solar photosphere have been constructed at the Sayan Observatory (USSR). The instrument permits the following synoptic observations of large-scale structures: (1) magnetograms of a large-scale magnetic field with a 3-arcmin resolution and 0.1-0.2 Gs sensitivity; (2) solar disk magnetograms in the form of half-tone images of the magnetic field distribution with 15 Gs sensitivity and 8 × 8 arcsec resolution; and (3) measurement of the mean magnetic field of the Sun as a star with about 0.1 Gs sensitivity. Preliminary results of toroidal magnetic field observations are briefly discussed.

  10. Large-Scale Sequencing: The Future of Genomic Sciences Colloquium

    SciTech Connect

    Margaret Riley; Merry Buckley

    2009-01-01

    Genetic sequencing and the various molecular techniques it has enabled have revolutionized the field of microbiology. Examining and comparing the genetic sequences borne by microbes - including bacteria, archaea, viruses, and microbial eukaryotes - provides researchers with insights into the processes microbes carry out, their pathogenic traits, and new ways to use microorganisms in medicine and manufacturing. Until recently, sequencing entire microbial genomes has been laborious and expensive, and the decision to sequence the genome of an organism was made on a case-by-case basis by individual researchers and funding agencies. Now, thanks to new technologies, sequencing is within reach of even the smallest facilities in terms of cost and effort, and it may be possible to sequence the genomes of a significant fraction of microbial life. The availability of numerous microbial genomes will enable unprecedented insights into microbial evolution, function, and physiology. However, the current ad hoc approach to gathering sequence data has resulted in an unbalanced and highly biased sampling of microbial diversity. A well-coordinated, large-scale effort to target the breadth and depth of microbial diversity would result in the greatest impact. The American Academy of Microbiology convened a colloquium to discuss the scientific benefits of engaging in a large-scale, taxonomically-based sequencing project. A group of individuals with expertise in microbiology, genomics, informatics, ecology, and evolution deliberated on the issues inherent in such an effort and generated a set of specific recommendations for how best to proceed. The vast majority of microbes are presently uncultured and, thus, pose significant challenges to such a taxonomically-based approach to sampling genome diversity. However, we have yet to even scratch the surface of the genomic diversity among cultured microbes. A coordinated sequencing effort of cultured organisms is an appropriate place to begin.

  11. Eliminating large-scale magnetospheric current perturbations from long-term geomagnetic observatory data

    NASA Astrophysics Data System (ADS)

    Pick, L.; Korte, M. C.

    2016-12-01

    Magnetospheric currents generate the largest external contribution to the geomagnetic field observed on Earth. Of particular importance is the solar-driven effect of the ring current, whose fluctuations overlap with internal field secular variation (SV). Recent core field models thus co-estimate this effect, but their validity is limited to the last 15 years, for which satellite data are available. We aim at eliminating magnetospheric modulation from the whole geomagnetic observatory record from 1840 onwards in order to obtain clean long-term SV that will enhance core flow and geodynamo studies. The ring-current effect takes the form of a southward-directed external dipole field aligned with the geomagnetic main field axis. Commonly, the Dst index (Sugiura, 1964) is used to parametrize temporal variations of this dipole term. Because of baseline instabilities, the alternative RC index was derived from hourly means of 21 stations spanning 1997-2013 (Olsen et al., 2014). We follow their methodology, but based on annual means from a reduced station set spanning 1960-2010. The absolute level of the variation so determined is "hidden" in the static lithospheric offsets taken as quiet-time means. We tackle this issue by subtracting crustal biases independently calculated for each observatory from an inversion of combined Swarm satellite and observatory data. Our index reproduces the original annual RC index variability with a reasonable offset of -10 nT in the reference time window 2000-2010. Prior to that, it depicts a long-term trend consistent with the external dipole term from COV-OBS (Gillet et al., 2013), being the only long-term field model available for comparison. Sharper variations that are better correlated with the Ap index than the COV-OBS solution lend support to the usefulness of our initial modeling approach. Following a detailed sensitivity study of station choice, future work will focus on increasing the resolution from annual to hourly means.
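
    A minimal sketch of the processing chain the abstract describes: observatory values reduced to annual means, static crustal (lithospheric) biases subtracted per station, and the station average taken as a ring-current proxy. The station codes, bias values, and synthetic data below are assumptions for illustration; this is not the authors' processing code.

```python
# Hedged sketch: derive an annual ring-current proxy from observatory records.
# Data layout, station list, and crustal-bias values are hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
hours = pd.date_range("1960-01-01", "2010-12-31 23:00", freq="h")

# Synthetic hourly means of the field component aligned with the dipole axis,
# for a reduced station set (stand-in for real observatory records).
stations = ["NGK", "HER", "KAK"]
data = pd.DataFrame(
    {st: -20.0 * np.sin(np.arange(len(hours)) / 8766.0)
         + rng.normal(0.0, 5.0, len(hours))
     for st in stations},
    index=hours,
)

crustal_bias = {"NGK": 12.0, "HER": -7.5, "KAK": 3.2}  # nT, assumed values

annual = data.groupby(data.index.year).mean()          # annual means per station
annual_corrected = annual - pd.Series(crustal_bias)    # remove static lithospheric offsets
rc_proxy = annual_corrected.mean(axis=1)               # average over stations

print(rc_proxy.head())
```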

  12. Important aspects of Eastern Mediterranean large-scale variability revealed from data of three fixed observatories

    NASA Astrophysics Data System (ADS)

    Bensi, Manuel; Velaoras, Dimitris; Cardin, Vanessa; Perivoliotis, Leonidas; Pethiakis, George

    2015-04-01

    Long-term variations of temperature and salinity observed in the Adriatic and Aegean Seas seem to be regulated by larger-scale circulation modes of the Eastern Mediterranean (EMed) Sea, such as the recently discovered feedback mechanisms, namely the BiOS (Bimodal Oscillating System) and the internal thermohaline pump theories. These theories are the result of interpreting many years of observations, highlighting possible interactions between two key regions of the EMed. Although repeated oceanographic cruises carried out in the past or planned for the future are a very useful tool for understanding the interaction between the two basins (e.g. alternating dense water formation, salt ingressions), recent long time-series of high frequency (up to 1h) sampling have added valuable information to the interpretation of internal mechanisms for both areas (e.g. mesoscale eddies, the evolution of fast internal processes). During the last 10 years, three deep observatories were deployed and maintained in the Adriatic, Ionian, and Aegean Seas: they are, respectively, the E2-M3A, the Pylos, and the E1-M3A. All are part of the largest European network of Fixed Point Open Ocean Observatories (FixO3, http://www.fixo3.eu/). Herein, from the analysis of temperature, salinity, and potential density time series collected at the three sites from the surface down to the intermediate and deep layers, we will discuss the almost perfectly anti-correlated behavior between the Adriatic and the Aegean Seas. Our data, collected almost continuously since 2006, reveal that these observatories represent well the thermohaline variability of their respective areas. Interestingly, temperature and salinity in the intermediate layer suddenly increased in the South Adriatic from the end of 2011, exactly when they started decreasing in the Aegean Sea. Moreover, Pylos data used together with additional ones (e.g. absolute dynamic topography, temperature and salinity data from other platforms) collected

  13. New ultracool subdwarfs identified in large-scale surveys using Virtual Observatory tools

    NASA Astrophysics Data System (ADS)

    Lodieu, N.; Espinoza Contreras, M.; Zapatero Osorio, M. R.; Solano, E.; Aberasturi, M.; Martín, E. L.; Rodrigo, C.

    2017-02-01

    Our new late-type M discoveries include 49 subdwarfs, 25 extreme subdwarfs, six ultrasubdwarfs, one subdwarf/extreme subdwarf, and two dwarfs/subdwarfs. In addition, we discovered three early-L subdwarfs to add to the current compendium of L-type subdwarfs known to date. We doubled the numbers of cool subdwarfs (11 new from SDSS vs. 2MASS and 50 new from SDSS vs. UKIDSS). We derived a surface density of late-type subdwarfs of 0.040 per square degree in the SDSS DR7 vs. UKIDSS LAS DR10 cross-match (J = 15.9-18.8 mag) after correcting for incompleteness. The density of M dwarfs decreases with decreasing metallicity. We also checked the Wide-field Infrared Survey Explorer (AllWISE) photometry of known and new subdwarfs and found that mid-infrared colours of M subdwarfs do not appear to differ from their solar-metallicity counterparts of similar spectral types. However, the near-to-mid-infrared colours J-W2 and J-W1 are bluer for lower metallicity dwarfs, results that may be used as a criterion to look for late-type subdwarfs in future searches. Based on observations made with ESO Telescopes at the La Silla Paranal Observatory under programme IDs 088.C-0250(A), 090.C-0832(A). Based on observations made with the Nordic Optical Telescope, operated by the Nordic Optical Telescope Scientific Association at the Observatorio del Roque de los Muchachos, La Palma, Spain, of the Instituto de Astrofísica de Canarias. Based on observations made with the Gran Telescopio Canarias (GTC), installed in the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofísica de Canarias, on the island of La Palma (programs GTC44-09B, GTC53-10B, GTC31-MULTIPLE-11B, GTC36/12B, and GTC79-14A). The data presented in this paper are gathered in a VO-compliant archive at http://svo2.cab.inta-csic.es/vocats/ltsa/. The photometric and spectroscopic data are available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc

  14. Prospects for strangelet detection with large-scale cosmic ray observatories

    NASA Astrophysics Data System (ADS)

    Pshirkov, M. S.

    Quark matter that contains s-quarks in addition to u- and d-quarks could be stable or metastable. In this case, lumps made of this strange matter, called strangelets, could occasionally hit the Earth. When travelling through the atmosphere they would behave much like ordinary high-velocity meteors, with the only exception that strangelets eventually reach the surface. As these encounters are expected to be extremely rare events, very large exposure is needed for their observation. Fluorescence detectors utilized in large ultra-high-energy cosmic ray observatories, such as the Pierre Auger Observatory and the Telescope Array, are well suited to the task of detecting these events. The flux limits that can be obtained with the Telescope Array fluorescence detectors could be as low as 2.5 × 10⁻²² cm⁻² s⁻¹ sr⁻¹, which would improve on the strongest present limits, obtained from ancient mica crystals, by two orders of magnitude.

  15. Heliophysics/Geospace System Observatory: System level science by large-scale space-ground coordination

    NASA Astrophysics Data System (ADS)

    Nishimura, T.; Angelopoulos, V.; Moore, T. E.; Samara, M.

    2015-12-01

    Recent multi-satellite and ground-based network measurements have revealed the importance of cross-scale and cross-regional coupling processes for understanding key issues in geospace, such as magnetic reconnection, substorms and particle acceleration. In particular, localized and fast plasma transport on a global scale has been recognized to play a fundamental role in regulating the evolution of magnetosphere-ionosphere-thermosphere coupling. These results call for coordinated measurements by multiple missions and facilities on a global scale to understand coupling processes at the system level. In fact, the National Research Council recommends using NASA's existing heliophysics flight missions and NSF's ground-based facilities to form a network of observing platforms that operate simultaneously to investigate the solar system. This array can be thought of as a single observatory, the Heliophysics/Geospace System Observatory (H/GSO). Motivated by the successful launch of MMS and the healthy status of THEMIS, Van Allen Probes and other missions, we plan a strategic use of existing and upcoming assets in space and on the ground over the next two years. In the 2015-2016 and 2016-2017 northern winter seasons, MMS will be in the dayside over northern Europe, and THEMIS will be in the nightside over North America. In the 2016 and 2017 southern winter seasons, THEMIS will be in the dayside over the South Pole, and MMS will be in the nightside in the Australian sector. These are favorable configurations for simultaneous day-night coupling measurements of magnetic reconnection and related plasma transport both in space and on the ground, and also provide excellent opportunities for studying cross-scale coupling, the global effects of dayside transients, tail-inner magnetosphere coupling, and other global processes. This presentation will give the current status and plans of the H/GSO and these science targets.

  16. The Landscape Evolution Observatory: A large-scale controllable infrastructure to study coupled Earth-surface processes

    NASA Astrophysics Data System (ADS)

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferre, T. P. A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-09-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each with 330 m² of surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed is possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  17. The Landscape Evolution Observatory: a large-scale controllable infrastructure to study coupled Earth-surface processes

    USGS Publications Warehouse

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferre, T. P. A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-01-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each with 330 m² of surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed is possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  18. Prototyping a large-scale distributed system for the Great Observatories era - NASA Astrophysics Data System (ADS)

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1990-01-01

    The NASA Astrophysics Data System (ADS) is a distributed information system intended to support research in the Great Observatories era, to simplify access to data, and to enable simultaneous analyses of multispectral data sets. Here, the user agent and interface, its functions, and system components are examined, and the system architecture and infrastructure are addressed. The present status of the system and related future activities are also discussed.

  19. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe.

    PubMed

    Blaas, Harry; Kroeze, Carolien

    2014-10-15

    Biodiesel is increasingly considered an alternative to fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed. To this end, scenarios for the year 2050 are analysed, assuming that in the 27 countries of the European Union fossil diesel will be replaced by biodiesel from algae. Estimates are made of the required fertiliser inputs to algae parks and of how these may increase concentrations of nitrogen and phosphorus in coastal waters, potentially leading to eutrophication. The Global NEWS (Nutrient Export from WaterSheds) model has been used to estimate the transport of nitrogen and phosphorus to the European coastal waters. The results indicate that the amount of nitrogen and phosphorus in the coastal waters may increase considerably in the future as a result of large-scale production of algae for the production of biodiesel, even in scenarios assuming effective waste water treatment and recycling of waste water in algae production. To ensure sustainable production of biodiesel from micro-algae, it is important to develop cultivation systems with low nutrient losses to the environment.

  1. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    SciTech Connect

    Kenison, LaVesta; Flanigan, Thomas; Hagerty, Gregg; Gorrie, James; Leclerc, Mathieu; Lockwood, Frederick; Falla, Lyle; Macinnis, Jim; Fedak, Mathew; Yakle, Jeff; Williford, Mark; Wood, Paul

    2016-04-01

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168 MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant; thereby resulting in near-zero emissions of all commonly regulated air emissions, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store and monitor the CO2 captured by the Oxy-combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives inclusive of front-end engineering and design, and advanced design required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage project. Ultimately, the project did not proceed to construction due to insufficient time to complete necessary EPC contract negotiations and commercial financing prior to expiration of federal co-funding, which triggered a DOE decision to close out its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned. This project has significantly advanced the development of near-zero emission technology and will

  2. The future of primordial features with large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Chen, Xingang; Dvorkin, Cora; Huang, Zhiqi; Namjoo, Mohammad Hossein; Verde, Licia

    2016-11-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios and new particle detection to fine structures in the inflationary potential. We study the prospects for detecting and constraining these features with future large-scale structure (LSS) surveys. We classify primordial feature models into several classes, and for each class we present a simple power spectrum template that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, owing to the 3D information. For a broad range of models, these surveys will be able to reduce the errors on the feature amplitudes by a factor of 5 or more, including for several interesting candidates identified in the recent Planck data. Therefore, LSS surveys offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of the two types of surveys.
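
    For orientation, two commonly used classes of feature template can be written as oscillatory corrections to the smooth power spectrum P_0(k); the exact parameterizations differ between papers, so the hedged forms below are representative of the idea rather than the specific templates constructed in this work.

```latex
% Representative (not paper-specific) primordial-feature templates:
% a "linear" oscillation characteristic of sharp features and a "log"
% oscillation characteristic of resonance features, both written as
% small corrections to the smooth spectrum P_0(k).
\begin{align}
  \frac{\Delta P(k)}{P_0(k)} &\simeq A_{\mathrm{lin}}
      \sin\left(\omega_{\mathrm{lin}}\, k + \varphi_{\mathrm{lin}}\right)
      && \text{(sharp feature)} \\
  \frac{\Delta P(k)}{P_0(k)} &\simeq A_{\mathrm{log}}
      \sin\left(\omega_{\mathrm{log}} \ln\frac{k}{k_*} + \varphi_{\mathrm{log}}\right)
      && \text{(resonance feature)}
\end{align}
```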

  3. Constraints on the Origin of Cosmic Rays above 10¹⁸ eV from Large-scale Anisotropy Searches in Data of the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antiči'c, T.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Buroker, L.; Burton, R. E.; Caballero-Mora, K. S.; Caccianiga, B.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chirinos Diaz, J.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; del Peral, L.; del Río, M.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Díaz Castro, M. L.; Diep, P. N.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gemmeke, H.; Ghia, P. L.; Giller, M.; Gitto, J.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; Gookin, B.; Gorgi, A.; Gouffon, P.; Grashorn, E.; Grebe, S.; Griffith, N.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jansen, S.; Jarne, C.; Jiraskova, S.; Josebachuili, M.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; LaHurd, D.; Latronico, L.; Lauer, R.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. 
A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Messina, S.; Meurer, C.; Meyhandan, R.; Mi'canovi'c, S.; Micheletti, M. I.; Minaya, I. A.; Miramonti, L.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nhung, P. T.; Niechciol, M.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Oehlschläger, J.; Olinto, A.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Pastor, S.; Paul, T.; Pech, M.; Peķala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrolini, A.; Petrov, Y.; Pfendner, C.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Ponce, V. H.; Pontz, M.; Porcelli, A.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Cabo, I.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Saftoiu, A.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. H.; Sima, O.; 'Smiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tapia, A.; Tartare, M.; Taşcău, O.; Tcaciuc, R.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tkaczyk, W.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Westerhoff, S.; Whelan, B. 
J.; Widom, A.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano Garcia, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.

    2013-01-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10¹⁸ eV at the Pierre Auger Observatory is reported. For the first time, these large-scale anisotropy searches are performed as a function of both the right ascension and the declination and expressed in terms of dipole and quadrupole moments. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Upper limits on dipole and quadrupole amplitudes are derived under the hypothesis that any cosmic ray anisotropy is dominated by such moments in this energy range. These upper limits provide constraints on the production of cosmic rays above 10¹⁸ eV, since they allow us to challenge an origin from stationary galactic sources densely distributed in the galactic disk and emitting predominantly light particles in all directions.
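
    The dipole and quadrupole moments referred to here come from expanding the flux over arrival directions; a standard form of that expansion (unit arrival-direction vector, dipole vector, symmetric traceless quadrupole tensor) is sketched below for orientation and is not quoted from the paper.

```latex
% Standard dipole+quadrupole expansion of the cosmic-ray flux over arrival
% directions \hat{u}; right ascension and declination enter through \hat{u}.
% Notation is illustrative, not quoted from the paper.
\Phi(\hat{u}) \simeq \frac{\Phi_0}{4\pi}
  \left( 1 + \vec{d}\cdot\hat{u}
           + \frac{1}{2}\,\hat{u}^{\mathsf{T}} Q\, \hat{u} + \cdots \right)
```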

  4. Constraints on the Origin of Cosmic Rays above 10¹⁸ eV from Large-scale Anisotropy Searches in Data of the Pierre Auger Observatory

    SciTech Connect

    Abreu, P.; Andringa, S.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Antičić, T.; Arganda, E.; Collaboration: Pierre Auger Collaboration; and others

    2013-01-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10¹⁸ eV at the Pierre Auger Observatory is reported. For the first time, these large-scale anisotropy searches are performed as a function of both the right ascension and the declination and expressed in terms of dipole and quadrupole moments. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Upper limits on dipole and quadrupole amplitudes are derived under the hypothesis that any cosmic ray anisotropy is dominated by such moments in this energy range. These upper limits provide constraints on the production of cosmic rays above 10¹⁸ eV, since they allow us to challenge an origin from stationary galactic sources densely distributed in the galactic disk and emitting predominantly light particles in all directions.

  5. S-net project: Construction of large scale seafloor observatory network for tsunamis and earthquakes in Japan

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Kanazawa, T.; Uehira, K.; Shimbo, T.; Shiomi, K.; Kunugi, T.; Aoi, S.; Matsumoto, T.; Sekiguchi, S.; Yamamoto, N.; Takahashi, N.; Shinohara, M.; Yamada, T.

    2016-12-01

    The National Research Institute for Earth Science and Disaster Resilience (NIED) has launched a project to construct an observatory network for tsunamis and earthquakes on the seafloor. The observatory network was named "S-net, Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench". The S-net consists of 150 seafloor observatories which are connected in line with submarine optical cables. The total length of submarine optical cable is about 5,700 km. The S-net system extends along the Kuril and Japan trenches around the Japanese islands from north to south, covering the area between southeast of the island of Hokkaido and off the Boso Peninsula, Chiba Prefecture. The project has been financially supported by MEXT Japan. An observatory package is 34 cm in diameter and 226 cm long. Each observatory is equipped with two high-sensitivity water-depth sensors serving as tsunami meters and four sets of three-component seismometers. The water-depth sensors have sub-centimeter measurement resolution. The combination of multiple seismometers secures the wide dynamic range and robustness of observation needed for early earthquake warning. The S-net is composed of six segment networks, each consisting of about 25 observatories and 800-1,600 km of submarine optical cable. Five of the six segment networks, all except the one covering the outer-rise area of the Japan Trench, have already been installed. The data from the observatories on those five segment networks are being transferred to the data center at NIED on a real-time basis, and verification of data integrity is being carried out at the present moment. Installation of the last segment network of the S-net, that is, the outer-rise one, is scheduled to be finished within FY2016. Full-scale operation of the S-net will start in FY2017. We will report on the construction and operation of the S-net submarine cable system as well as an outline of the obtained data in this presentation.

  6. Using cellular network diagrams to interpret large-scale datasets: past progress and future challenges

    NASA Astrophysics Data System (ADS)

    Karp, Peter D.; Latendresse, Mario; Paley, Suzanne

    2011-03-01

    Cellular networks are graphs of molecular interactions within the cell. Thanks to the confluence of genome sequencing and bioinformatics, scientists are now able to reconstruct cellular network models for more than 1,000 organisms. A variety of bioinformatics tools have been developed to support the visualization and navigation of cellular network data. Another important application is the use of cellular network diagrams to visualize and interpret large-scale datasets, such as gene-expression data. We present the Cellular Overview, a network visualization tool developed at SRI International (SRI) to support visualization, navigation, and interpretation of large-scale datasets on metabolic networks. Different variations of the diagram have been generated algorithmically for more than 1,000 organisms. We discuss the graphical design of the diagram and its interactive capabilities.

  7. The Plate Boundary Observatory Cascadia Network: Development and Installation of a Large Scale Real-time GPS Network

    NASA Astrophysics Data System (ADS)

    Austin, K. E.; Blume, F.; Berglund, H. T.; Feaux, K.; Gallaher, W. W.; Hodgkinson, K. M.; Mattioli, G. S.; Mencin, D.

    2014-12-01

    The EarthScope Plate Boundary Observatory (PBO), through an NSF-ARRA supplement, has enhanced the geophysical infrastructure in the Pacific Northwest by upgrading a total of 282 Plate Boundary Observatory GPS stations to allow the collection and distribution of high-rate (1 Hz), low-latency (<1 s) data streams (RT-GPS). These upgraded stations supplemented the original 100 RT-GPS stations in the PBO GPS network. The addition of the new RT-GPS sites in Cascadia should spur new volcano and earthquake research opportunities in an area of great scientific interest and high geophysical hazard. Streaming RT-GPS data will enable researchers to detect and investigate strong ground motion during large geophysical events, including a possible plate-interface earthquake, which has implications for earthquake hazard mitigation. A Mw 6.9 earthquake occurred on March 10, 2014, off the coast of northern California. In response, UNAVCO downloaded high-rate GPS data from Plate Boundary Observatory stations within 500 km of the epicenter of the event, providing a good test of network performance. In addition to the 282 stations upgraded to real-time, 22 new meteorological instruments were added to existing PBO stations. Extensive testing of BGAN satellite communications systems has been conducted to support the Cascadia RT-GPS upgrades, and the installation of three BGAN satellite failover systems along the Cascadia margin will allow for the continuation of data flow in the event of a loss of primary communications during a large geophysical event or other interruptions in commercial cellular networks. In summary, with these additional upgrades in the Cascadia region, the PBO RT-GPS network will increase to 420 stations. Upgrades to the UNAVCO data infrastructure included evaluation and purchase of the Trimble Pivot Platform, servers, and additional hardware for archiving the high-rate data, as well as testing and implementation of GLONASS and Trimble RTX positioning on the

  8. The future of VLBI observatories in space

    NASA Technical Reports Server (NTRS)

    Preston, R. A.; Jordan, J. F.; Burke, B. F.; Doxsey, R.; Morgan, S. H.; Roberts, D. H.; Shapiro, I. I.

    1983-01-01

    The angular resolution of radio maps made by earth-based VLBI observations can be exceeded by placing at least one element of a VLBI array into earth orbit. A VLBI observatory in space can offer the additional advantages of increased sky coverage, higher density sampling of Fourier components, and rapid mapping of objects whose structure changes in less than a day. This paper explores the future of this technique.

  9. Methods for large-scale production of AM fungi: past, present, and future.

    PubMed

    Ijdo, Marleen; Cranenbrouck, Sylvie; Declerck, Stéphane

    2011-01-01

    Many different cultivation techniques and inoculum products of the plant-beneficial arbuscular mycorrhizal (AM) fungi have been developed in the last decades. Soil- and substrate-based production techniques as well as substrate-free culture techniques (hydroponics and aeroponics) and in vitro cultivation methods have all been attempted for the large-scale production of AM fungi. In this review, we describe the principal in vivo and in vitro production methods that have been developed so far. We present the parameters that are critical for optimal production, discuss the advantages and disadvantages of the methods, and highlight their most probable sectors of application.

  10. The Plate Boundary Observatory Cascadia Network: Development and Installation of a Large Scale Real-time GPS Network

    NASA Astrophysics Data System (ADS)

    Austin, K. E.; Blume, F.; Berglund, H. T.; Dittman, T.; Feaux, K.; Gallaher, W. W.; Mattioli, G. S.; Mencin, D.; Walls, C. P.

    2013-12-01

    The EarthScope Plate Boundary Observatory (PBO), through an NSF-ARRA supplement, has enhanced the geophysical infrastructure in the Pacific Northwest by upgrading 232 Plate Boundary Observatory GPS stations to allow the collection and distribution of high-rate (1 Hz), low-latency (<1 s) data streams (RT-GPS). These upgraded stations supplemented the original 100 RT-GPS stations in the PBO GPS network. The addition of the new RT-GPS sites in the Pacific Northwest should spur new volcano and earthquake research opportunities in an area of great scientific interest and high geophysical hazard. Streaming RT-GPS data will enable researchers to detect and investigate strong ground motion during large geophysical events, including a possible plate-interface earthquake, which has implications for earthquake hazard mitigation. A total of 282 PBO stations were upgraded and added to the UNAVCO real-time GPS system, along with the addition of 22 new meteorological instruments to existing PBO stations. Extensive testing of BGAN satellite communications systems has been conducted to support the Cascadia RT-GPS upgrades, and the installation of three BGAN satellite failover systems along the Cascadia margin will allow for the continuation of data flow in the event of a loss of primary communications during a large geophysical event or other interruptions in commercial cellular networks. In summary, with these additional upgrades in the Cascadia region, the PBO RT-GPS network will increase to 420 stations. Upgrades to UNAVCO's data infrastructure included evaluation and purchase of the Trimble Pivot Platform, servers, and additional hardware for archiving the high-rate data. UNAVCO staff is working closely with the UNAVCO community to develop data standards, protocols, and a science plan for the use of RT-GPS data.

  11. Hydrological response of karst systems to large-scale climate variability for different catchments of the French karst observatory network INSU/CNRS SNO KARST

    NASA Astrophysics Data System (ADS)

    Massei, Nicolas; Labat, David; Jourde, Hervé; Lecoq, Nicolas; Mazzilli, Naomi

    2017-04-01

    The French karst observatory network SNO KARST is a national initiative of the National Institute for Earth Sciences and Astronomy (INSU) of the National Center for Scientific Research (CNRS). It is also part of OZCAR, the new French research infrastructure for observation of the critical zone. SNO KARST is composed of several karst sites distributed across conterminous France, located in different physiographic and climatic contexts (Mediterranean, Pyrenean, Jura mountains, and the western and northwestern margins near the Atlantic or the English Channel). This allows the scientific community to develop advanced research and experiments dedicated to improving understanding of the hydrological functioning of karst catchments. Here we used several sites of SNO KARST to assess the hydrological response of karst catchments to long-term variations of large-scale atmospheric circulation. Using NCEP reanalysis products and karst discharge, we analyzed the links between large-scale circulation and karst water resources variability. As karst hydrosystems are highly heterogeneous media, they behave differently across different time-scales: we explore the large-scale/local-scale relationships as a function of time-scale using a wavelet multiresolution approach applied to both karst hydrological variables and large-scale climate fields such as sea level pressure (SLP). The different wavelet components of karst discharge in response to the corresponding wavelet components of the climate fields are either 1) compared to physico-chemical/geochemical responses at karst springs, or 2) interpreted in terms of hydrological functioning by comparing discharge wavelet components to internal components obtained from precipitation/discharge models using the KARSTMOD conceptual modeling platform of SNO KARST.
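
    A toy sketch of the scale-by-scale comparison described above, assuming PyWavelets and purely synthetic series; this is not the SNO KARST code, and the series, wavelet choice, and decomposition level are invented for illustration.

      # Illustrative sketch: wavelet multiresolution components of a karst
      # discharge series correlated with the same-scale components of a
      # large-scale climate index (e.g. an SLP-based index). Synthetic data.
      import numpy as np
      import pywt

      rng = np.random.default_rng(0)
      n = 1024
      t = np.arange(n)
      slow = np.sin(2 * np.pi * t / 365.0)             # shared slow mode
      discharge = slow + 0.5 * rng.standard_normal(n)
      climate_index = slow + 0.5 * rng.standard_normal(n)

      def mra_components(x, wavelet="db4", level=5):
          """Additive multiresolution components (smooth first, then details)."""
          coeffs = pywt.wavedec(x, wavelet, level=level)
          comps = []
          for i in range(len(coeffs)):
              keep = [c if j == i else np.zeros_like(c)
                      for j, c in enumerate(coeffs)]
              comps.append(pywt.waverec(keep, wavelet)[: len(x)])
          return comps

      q_comps = mra_components(discharge)
      c_comps = mra_components(climate_index)
      for k, (qc, cc) in enumerate(zip(q_comps, c_comps)):
          r = np.corrcoef(qc, cc)[0, 1]
          print(f"component {k}: scale-wise correlation r = {r:+.2f}")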

  12. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    PubMed

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate that image retrieval systems should be scaled up significantly, to the point at which interactive systems become effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and can enable novel methods of analysis at much larger scales in an efficient, integrated fashion.

  13. Vulnerability of the large-scale future smart electric power grid

    NASA Astrophysics Data System (ADS)

    Nasiruzzaman, A. B. M.; Pota, H. R.; Akter, Most. Nahida

    2014-11-01

    The changing power flow pattern of the power system, with the inclusion of large-scale renewable energy sources on the distribution side of the network, has been modeled within a complex network framework using a bidirectional graph. The bidirectional graph accommodates the power flowing back from the distribution side to the grid by adding a reverse edge between two nodes, with a capacity equal to that of the existing edge between the nodes in the forward-directional nominal graph. The additional paths in the combined model, built to facilitate grid reliability and efficiency, may become a bottleneck in practice when a certain percentage of nodes or edges is removed. The effect of removing critical elements has been analyzed in terms of increased path length, connectivity loss, load loss, and number of overloaded lines.
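
    A toy sketch of the bidirectional-graph idea and a node-removal connectivity-loss measure, using networkx on an invented miniature topology; this is not the authors' model or data.

      # Illustrative sketch: forward edges carry nominal power flow, reverse
      # edges (same capacity) model power fed back from the distribution side;
      # connectivity loss is measured after removing the most central node.
      import networkx as nx

      def build_grid():
          g = nx.DiGraph()
          forward = [("G1", "T1"), ("T1", "D1"), ("T1", "D2"),
                     ("G2", "T2"), ("T2", "D2"), ("T2", "D3"), ("T1", "T2")]
          for u, v in forward:
              g.add_edge(u, v, capacity=1.0)
              g.add_edge(v, u, capacity=1.0)   # reverse edge, equal capacity
          return g

      def connectivity_loss(g, sources, sinks):
          reachable = sum(1 for s in sources for d in sinks
                          if nx.has_path(g, s, d))
          return 1.0 - reachable / (len(sources) * len(sinks))

      g = build_grid()
      sources, sinks = ["G1", "G2"], ["D1", "D2", "D3"]
      print("baseline loss:", connectivity_loss(g, sources, sinks))

      # Remove the highest-betweenness node (excluding sources and sinks)
      bc = nx.betweenness_centrality(g)
      target = max((n for n in g if n not in sources + sinks), key=bc.get)
      g.remove_node(target)
      print(f"after removing {target}:", connectivity_loss(g, sources, sinks))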

  14. Inclusive constraints on unified dark matter models from future large-scale surveys

    SciTech Connect

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro E-mail: carmelita.carbone@unibo.it

    2012-03-01

    In recent years, cosmological models where the properties of the dark components of the Universe — dark matter and dark energy — are accounted for by a single "dark fluid" have drawn increasing attention and interest. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also present peculiar behaviours, the most interesting one being that the perturbations in the dark-matter component of the scalar field have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid does not cluster any more. This implies a growth of structures fairly different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. To this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing CMB Planck experiment. Although our approach is the most conservative, including only well-understood, linear dynamics, we also show what could be done if some amount of non-linear information were included.
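
    For orientation, the usual heuristic behind such an effective Jeans scale can be written as follows; this is a hedged sketch of the standard sound-horizon argument, not an equation quoted from the paper.

      % Clustering of the dark fluid is suppressed once pressure support
      % (sound speed c_s) wins over gravity at a given comoving wavenumber k:
      \[
        c_s(a)\,k \gtrsim a\,H(a)
        \quad\Longrightarrow\quad
        k_J(a) \simeq \frac{a\,H(a)}{c_s(a)},
        \qquad
        \lambda_J(a) = \frac{2\pi}{k_J(a)} .
      \]

    For comoving wavenumbers k ≫ k_J the Newtonian potential oscillates and decays rather than growing, which is the origin of the departure from ΛCDM structure growth mentioned in the abstract.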

  15. Inclusive constraints on unified dark matter models from future large-scale surveys

    NASA Astrophysics Data System (ADS)

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro

    2012-03-01

    In recent years, cosmological models where the properties of the dark components of the Universe — dark matter and dark energy — are accounted for by a single "dark fluid" have drawn increasing attention and interest. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also present peculiar behaviours, the most interesting one being that the perturbations in the dark-matter component of the scalar field have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid does not cluster any more. This implies a growth of structures fairly different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. To this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing CMB Planck experiment. Although our approach is the most conservative, including only well-understood, linear dynamics, we also show what could be done if some amount of non-linear information were included.

  16. Large-scale Distribution of Arrival Directions of Cosmic Rays Detected Above 10^18 eV at the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antiči'c, T.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Buroker, L.; Burton, R. E.; Caballero-Mora, K. S.; Caccianiga, B.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chirinos Diaz, J.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; del Peral, L.; del Río, M.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Díaz Castro, M. L.; Diep, P. N.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gemmeke, H.; Ghia, P. L.; Giller, M.; Gitto, J.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; Gookin, B.; Gorgi, A.; Gouffon, P.; Grashorn, E.; Grebe, S.; Griffith, N.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jansen, S.; Jarne, C.; Jiraskova, S.; Josebachuili, M.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; LaHurd, D.; Latronico, L.; Lauer, R.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. 
A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Messina, S.; Meurer, C.; Meyhandan, R.; Mi'canovi'c, S.; Micheletti, M. I.; Minaya, I. A.; Miramonti, L.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nhung, P. T.; Niechciol, M.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Oehlschläger, J.; Olinto, A.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Pastor, S.; Paul, T.; Pech, M.; Peķala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrolini, A.; Petrov, Y.; Pfendner, C.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Ponce, V. H.; Pontz, M.; Porcelli, A.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Cabo, I.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Saftoiu, A.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. H.; Sima, O.; 'Smiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tapia, A.; Tartare, M.; Taşcău, O.; Tcaciuc, R.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tkaczyk, W.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Westerhoff, S.; Whelan, B. 
J.; Widom, A.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano Garcia, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.

    2012-12-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10^18 eV at the Pierre Auger Observatory is presented. This search is performed as a function of both declination and right ascension in several energy ranges above 10^18 eV, and reported in terms of dipolar and quadrupolar coefficients. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Assuming that any cosmic-ray anisotropy is dominated by dipole and quadrupole moments in this energy range, upper limits on their amplitudes are derived. These upper limits allow us to test the origin of cosmic rays above 10^18 eV from stationary Galactic sources densely distributed in the Galactic disk and predominantly emitting light particles in all directions.
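
    A toy sketch of how a dipole amplitude can be reconstructed from arrival directions, ignoring the non-uniform exposure and the spherical-harmonic machinery of the actual Auger analysis; the injected dipole and event count are invented.

      # Illustrative sketch: with full, uniform sky exposure the dipole vector
      # is estimated as three times the mean arrival unit vector.
      import numpy as np

      rng = np.random.default_rng(1)

      def sample_directions(n, dipole):
          """Draw unit vectors from a sky with flux proportional to 1 + dipole.n."""
          dirs = []
          while len(dirs) < n:
              v = rng.normal(size=3)
              v /= np.linalg.norm(v)
              if rng.uniform() < 0.5 * (1.0 + v @ dipole):   # accept-reject
                  dirs.append(v)
          return np.array(dirs)

      events = sample_directions(20000, dipole=np.array([0.0, 0.0, 0.05]))
      d_est = 3.0 * events.mean(axis=0)      # unbiased for uniform exposure
      print("reconstructed dipole amplitude:", np.linalg.norm(d_est))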

  17. The effect of the geomagnetic field on cosmic ray energy estimates and large scale anisotropy searches on data from the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antičić, T.; Anzalone, A.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Bäcker, T.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Beatty, J. J.; Becker, B. R.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Burton, R. E.; Caballero-Mora, K. S.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chou, A.; Chudoba, J.; Clay, R. W.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; Decerprit, G.; del Peral, L.; del Río, M.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Diaz, J. C.; Díaz Castro, M. L.; Diep, P. N.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Fajardo Tapia, I.; Falcke, H.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Ferrero, A.; Fick, B.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; García Gámez, D.; Garcia-Pinto, D.; Gascon, A.; Gemmeke, H.; Gesterling, K.; Ghia, P. L.; Giaccari, U.; Giller, M.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gonçalves, P.; Gonzalez, D.; Gonzalez, J. G.; Gookin, B.; Góra, D.; Gorgi, A.; Gouffon, P.; Gozzini, S. R.; Grashorn, E.; Grebe, S.; Griffith, N.; Grigat, M.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Guzman, A.; Hague, J. D.; Hansen, P.; Harari, D.; Harmsma, S.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horneffer, A.; Horvath, P.; Hrabovský, M.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jarne, C.; Jiraskova, S.; Josebachuili, M.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuehn, F.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; Lautridou, P.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Lemiere, A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Mandat, D.; Mantsch, P.; Mariazzi, A. 
G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Mathes, H. J.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Meurer, C.; Mićanović, S.; Micheletti, M. I.; Miller, W.; Miramonti, L.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Morris, C.; Mostafá, M.; Moura, C. A.; Mueller, S.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nhung, P. T.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Nyklicek, M.; Oehlschläger, J.; Olinto, A.; Oliva, P.; Olmos-Gilbaja, V. M.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Parsons, R. D.; Pastor, S.; Paul, T.; Pech, M.; Pękala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrinca, P.; Petrolini, A.; Petrov, Y.; Petrovic, J.; Pfendner, C.; Phan, N.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Ponce, V. H.; Pontz, M.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Robledo, C.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodriguez-Cabo, I.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Schmidt, F.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schulte, S.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. H.; Śacute; Smiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Strazzeri, E.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tamashiro, A.; Tapia, A.; Tartare, M.; Taşąu, O.; Tavera Ruiz, C. G.; Tcaciuc, R.; Tegolo, D.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tiwari, D. K.; Tkaczyk, W.; Todero Peixoto, C. J.; Tomé, B.; Tonachini, A.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van den Berg, A. M.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Walz, D.; Warner, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Westerhoff, S.; Whelan, B. J.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Winnick, M. G.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zimbres Silva, M.; Ziolkowski, M.

    2011-11-01

    We present a comprehensive study of the influence of the geomagnetic field on the energy estimation of extensive air showers with a zenith angle smaller than 60°, detected at the Pierre Auger Observatory. The geomagnetic field induces an azimuthal modulation of the estimated energy of cosmic rays up to the ~ 2% level at large zenith angles. We present a method to account for this modulation of the reconstructed energy. We analyse the effect of the modulation on large scale anisotropy searches in the arrival direction distributions of cosmic rays. At a given energy, the geomagnetic effect is shown to induce a pseudo-dipolar pattern at the percent level in the declination distribution that needs to be accounted for.

  18. LARGE-SCALE DISTRIBUTION OF ARRIVAL DIRECTIONS OF COSMIC RAYS DETECTED ABOVE 10^18 eV AT THE PIERRE AUGER OBSERVATORY

    SciTech Connect

    Abreu, P.; Andringa, S.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Antičić, T.; Arganda, E.; Collaboration: Pierre Auger Collaboration; and others

    2012-12-15

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10^18 eV at the Pierre Auger Observatory is presented. This search is performed as a function of both declination and right ascension in several energy ranges above 10^18 eV, and reported in terms of dipolar and quadrupolar coefficients. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Assuming that any cosmic-ray anisotropy is dominated by dipole and quadrupole moments in this energy range, upper limits on their amplitudes are derived. These upper limits allow us to test the origin of cosmic rays above 10^18 eV from stationary Galactic sources densely distributed in the Galactic disk and predominantly emitting light particles in all directions.

  19. The effect of the geomagnetic field on cosmic ray energy estimates and large scale anisotropy searches on data from the Pierre Auger Observatory

    SciTech Connect

    Collaboration: Pierre Auger Collaboration

    2011-11-01

    We present a comprehensive study of the influence of the geomagnetic field on the energy estimation of extensive air showers with a zenith angle smaller than 60°, detected at the Pierre Auger Observatory. The geomagnetic field induces an azimuthal modulation of the estimated energy of cosmic rays up to the ∼ 2% level at large zenith angles. We present a method to account for this modulation of the reconstructed energy. We analyse the effect of the modulation on large scale anisotropy searches in the arrival direction distributions of cosmic rays. At a given energy, the geomagnetic effect is shown to induce a pseudo-dipolar pattern at the percent level in the declination distribution that needs to be accounted for.

  20. The effect of the geomagnetic field on cosmic ray energy estimates and large scale anisotropy searches on data from the Pierre Auger Observatory

    SciTech Connect

    Abreu, P.; Aglietta, M.; Ahn, E.J.; Albuquerque, I.F.M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Ambrosio, M.; /Naples U. /INFN, Naples /Nijmegen U., IMAPP

    2011-11-01

    We present a comprehensive study of the influence of the geomagnetic field on the energy estimation of extensive air showers with a zenith angle smaller than 60°, detected at the Pierre Auger Observatory. The geomagnetic field induces an azimuthal modulation of the estimated energy of cosmic rays up to the ~2% level at large zenith angles. We present a method to account for this modulation of the reconstructed energy. We analyse the effect of the modulation on large scale anisotropy searches in the arrival direction distributions of cosmic rays. At a given energy, the geomagnetic effect is shown to induce a pseudo-dipolar pattern at the percent level in the declination distribution that needs to be accounted for. In this work, we have identified and quantified a systematic uncertainty affecting the energy determination of cosmic rays detected by the surface detector array of the Pierre Auger Observatory. This systematic uncertainty, induced by the influence of the geomagnetic field on the shower development, has a strength which depends on both the zenith and the azimuthal angles. Consequently, we have shown that it induces distortions of the estimated cosmic ray event rate at a given energy at the percent level in both the azimuthal and the declination distributions, the latter of which mimics an almost dipolar pattern. We have also shown that the induced distortions are already at the level of the statistical uncertainties for a number of events N ≈ 32 000 (we note that the full Auger surface detector array collects about 6500 events per year with energies above 3 EeV). Accounting for these effects is thus essential for the correct interpretation of large scale anisotropy measurements that explicitly exploit the declination distribution.
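
    A schematic sketch of the kind of energy correction described above; the functional form, amplitude scaling, and phase below are placeholders, not the Auger parametrization.

      # Illustrative sketch: divide out an azimuth-dependent modulation of the
      # reconstructed energy so that it no longer imprints a spurious
      # declination pattern on the event rate above a fixed energy threshold.
      import numpy as np

      def geomagnetic_correction(e_est, zenith_rad, azimuth_rad,
                                 amp_at_60deg=0.02, phi0_rad=0.0):
          """Assumed modulation growing with zenith angle (about 2% at 60 deg)."""
          amp = amp_at_60deg * (np.sin(zenith_rad) / np.sin(np.radians(60))) ** 2
          modulation = 1.0 + amp * np.cos(azimuth_rad - phi0_rad)
          return e_est / modulation

      # Example: a 10 EeV energy estimate at 55 deg zenith, azimuth 180 deg
      print(geomagnetic_correction(10.0, np.radians(55), np.radians(180)))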

  1. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    NASA Astrophysics Data System (ADS)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half-century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there is significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means of meeting these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damage to ecosystems (e.g. river environment and species) and the socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still more large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and the absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. Then, it discusses some major challenges in future water planning

  2. Precision measurements of large scale structure with future type Ia supernova surveys

    SciTech Connect

    Hannestad, Steen; Haugbolle, Troels; Thomsen, Bjarne E-mail: haugboel@phys.au.dk

    2008-02-15

    Type Ia supernovae are currently the best known standard candles at cosmological distances. In addition to providing a powerful probe of dark energy they are an ideal source of information about the peculiar velocity field of the local universe. Even with the very small number of supernovae presently available it has been possible to measure the dipole and quadrupole of the local velocity field out to z ≈ 0.025. With future continuous all-sky surveys like the Large Synoptic Survey Telescope (LSST) project the luminosity distances of tens of thousands of nearby supernovae will be measured accurately. This will allow for a determination of the local velocity structure of the universe as a function of redshift with unprecedented accuracy, provided the redshifts of the host galaxies are known. Using catalogues of mock surveys we estimate that future low redshift supernova surveys will be able to probe σ8 to a precision of roughly 5% at 95% C.L. This is comparable to the precision in future galaxy and weak lensing surveys, and with a relatively modest observational effort it will provide a crucial cross-check on future measurements of the matter power spectrum.

  3. Observing trans-Planckian ripples in the primordial power spectrum with future large scale structure probes

    SciTech Connect

    Hamann, Jan; Hannestad, Steen; Sloth, Martin S; Wong, Yvonne Y Y E-mail: sth@phys.au.dk E-mail: ywong@mppmu.mpg.de

    2008-09-15

    We revisit the issue of ripples in the primordial power spectra caused by trans-Planckian physics, and the potential for their detection by future cosmological probes. We find that for reasonably large values of the first slow-roll parameter ε (≳ 0.001), a positive detection of trans-Planckian ripples can be made even if the amplitude is as low as 10^-4. Data from the Large Synoptic Survey Telescope (LSST) and the proposed future 21 cm survey with the Fast Fourier Transform Telescope (FFTT) will be particularly useful in this regard. If the scale of inflation is close to its present upper bound, a scale of new physics as high as ~0.2 M_P could lead to observable signatures.

  4. Efficiency and economics of large scale hydrogen liquefaction. [for future generation aircraft requirements

    NASA Technical Reports Server (NTRS)

    Baker, C. R.

    1975-01-01

    Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.

  5. Efficiency and economics of large scale hydrogen liquefaction. [for future generation aircraft requirements

    NASA Technical Reports Server (NTRS)

    Baker, C. R.

    1975-01-01

    Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.

  6. New ultracool subdwarfs identified in large-scale surveys using Virtual Observatory tools. I. UKIDSS LAS DR5 vs. SDSS DR7

    NASA Astrophysics Data System (ADS)

    Lodieu, N.; Espinoza Contreras, M.; Zapatero Osorio, M. R.; Solano, E.; Aberasturi, M.; Martín, E. L.

    2012-06-01

    Aims: The aim of the project is to improve our knowledge of the low-mass and low-metallicity population to investigate the influence of metallicity on the stellar (and substellar) mass function. Methods: We present the results of a photometric and proper motion search aimed at discovering ultracool subdwarfs in large-scale surveys. We employed and combined the Fifth Data Release (DR5) of the UKIRT Infrared Deep Sky Survey (UKIDSS) Large Area Survey (LAS) and the Sloan Digital Sky Survey (SDSS) Data Release 7, complemented with ancillary data from the Two Micron All-Sky Survey (2MASS), the DEep Near-Infrared Survey (DENIS) and the SuperCOSMOS Sky Surveys (SSS). Results: The SDSS DR7 vs. UKIDSS LAS DR5 search returned a total of 32 ultracool subdwarf candidates, only two of which are recognised as subdwarfs in the literature. Twenty-seven candidates, including the two known ones, were followed up spectroscopically in the optical between 600 and 1000 nm, thus covering strong spectral features indicative of low metallicity (e.g., CaH): 21 with the Very Large Telescope, one with the Nordic Optical Telescope, and five were extracted from the Sloan spectroscopic database to assess (or refute) their low metal content. We confirm 20 candidates as subdwarfs, extreme subdwarfs, or ultra-subdwarfs with spectral types later than M5; this represents a success rate of ≥ 60%. Among those 20 new subdwarfs, we identify two early-L subdwarfs that are very likely located within 100 pc, which we propose as templates for future searches because they are the first examples of their subclass. Another seven sources are solar-metallicity M dwarfs with spectral types between M4 and M7 without Hα emission, suggesting that they are old M dwarfs. The remaining five candidates do not yet have spectroscopic follow-up; only one remains a bona fide ultracool subdwarf after revision of their proper motions. We assigned spectral types based on the current classification schemes and, when
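
    A much-simplified sketch of turning two-epoch catalogue positions into proper motions, as in a combined photometric and proper-motion search; the flat-sky approximation, index-aligned toy catalogues, and matching radius are assumptions, not the paper's pipeline.

      # Illustrative sketch: positional offsets between two epochs, converted
      # to proper motions for pairs that match within a small radius.
      import numpy as np

      sdss = {"ra": np.array([150.0010, 150.0500]),      # degrees (invented)
              "dec": np.array([2.0000, 2.1000]), "epoch": 2004.3}
      ukidss = {"ra": np.array([150.0013, 150.0507]),
                "dec": np.array([2.0002, 2.0996]), "epoch": 2009.1}

      def proper_motions(cat1, cat2, match_radius_arcsec=2.0):
          dt = cat2["epoch"] - cat1["epoch"]                # years
          cosd = np.cos(np.radians(cat1["dec"]))
          d_ra = (cat2["ra"] - cat1["ra"]) * cosd * 3600.0  # arcsec on the sky
          d_dec = (cat2["dec"] - cat1["dec"]) * 3600.0
          sep = np.hypot(d_ra, d_dec)
          ok = sep < match_radius_arcsec
          return 1000.0 * d_ra[ok] / dt, 1000.0 * d_dec[ok] / dt   # mas/yr

      mu_ra, mu_dec = proper_motions(sdss, ukidss)
      print("mu_alpha* [mas/yr]:", mu_ra, " mu_delta [mas/yr]:", mu_dec)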

  7. Large-Scale Cooperative Dissemination of Governmental Information in Emergency — An Experiment and Future Strategies

    NASA Astrophysics Data System (ADS)

    Horiba, Katsuhiro; Okawa, Keiko; Murai, Jun

    On the 11th of March, 2011, a massive earthquake hit the northeast region of Japan. The government of Japan needed to publish information regarding the earthquake and its effects, but the capacity of its Web services was overwhelmed. The government called on industry and academia for help in providing a stable information service to the people. Industry and academia formed a team to answer the call and named themselves the “EQ Project”. This paper describes how the EQ Project was organized and operated, and presents analyses of the access statistics. An academic organization took the lead in the EQ Project. Ten organizations from the commercial IT industry and from academic groups specializing in Internet technology participated in the EQ Project, structured into three clusters based on their relationships and technological approach. In the WIDE Cluster, one of the three clusters of the EQ Project, the peak number of file accesses per day exceeded 90 thousand, 3.4% of accesses came from mobile browsers, and translated (foreign-language) content accounted for 35% of references. We also discuss future information-distribution strategies for emergency situations based on the experiences of the EQ Project, and propose nine suggestions to MEXT as a future strategy.

  8. The Design of Large-Scale Complex Engineered Systems: Present Challenges and Future Promise

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.; McGowan, Anna-Maria Rivas

    2012-01-01

    Model-Based Systems Engineering techniques are used in the SE community to address the need for managing the development of complex systems. A key feature of the MBSE approach is the use of a model to capture the requirements, architecture, behavior, operating environment and other key aspects of the system. The focus on the model differentiates MBSE from traditional SE techniques that may have a document-centric approach. In an effort to assess the benefit of utilizing MBSE on its flight projects, NASA Langley has implemented a pilot program to apply MBSE techniques during the early phase of the Materials International Space Station Experiment-X (MISSE-X). MISSE-X is a Technology Demonstration Mission being developed by the NASA Office of the Chief Technologist. Designed to be installed on the exterior of the International Space Station (ISS), MISSE-X will host experiments that advance the technology readiness of materials and devices needed for future space exploration. As a follow-on to the highly successful series of previous MISSE experiments on ISS, MISSE-X benefits from a significant interest by the

  9. Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate.

    PubMed

    Dotson, Nicholas M; Goodell, Baldwin; Salazar, Rodrigo F; Hoffman, Steven J; Gray, Charles M

    2015-01-01

    Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field.

  10. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    PubMed

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in the educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrant a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  11. Large-scale atmospheric circulation and local particulate matter concentrations in Bavaria - from current observations to future projections

    NASA Astrophysics Data System (ADS)

    Beck, Christoph; Weitnauer, Claudia; Brosy, Caroline; Hald, Cornelius; Lochbihler, Kai; Siegmund, Stefan; Jacobeit, Jucundus

    2016-04-01

    Particulate matter with an aerodynamic diameter of 10 μm or less (PM10) may have distinct adverse effects on human health. Spatial and temporal variations in PM10 concentrations reflect local emission rates, but are also influenced by local and synoptic-scale atmospheric conditions. Against this background, it can furthermore be argued that potential future climate change, and associated variations in large-scale atmospheric circulation and local meteorological parameters, will probably provoke corresponding changes in future PM10 concentration levels. The DFG-funded research project "Particulate matter and climate change in Bavaria" aimed at establishing quantitative relationships between daily and monthly PM10 indices at different Bavarian urban stations and the corresponding large-scale atmospheric circulation as well as local meteorological conditions. To this end, several statistical downscaling approaches have been developed for the period 1980 to 2011. PM10 data from 19 stations of the air quality monitoring network (LÜB) of the Bavarian Environmental Agency (LfU) have been utilized as predictands. Large-scale atmospheric gridded data from the NCEP/NCAR reanalysis data base and local meteorological observational data provided by the German Meteorological Service (DWD) served as predictors. The downscaling approaches encompass the synoptic downscaling of daily PM10 concentrations and several multivariate statistical models for the estimation of daily and monthly PM10 indices, i.e. the monthly mean and the number of days exceeding a certain PM10 concentration threshold. Both techniques utilize objective circulation type classifications, which have been optimized with respect to their synoptic skill for the target variable PM10. All downscaling approaches have been evaluated via cross validation, using varying subintervals of the 1980-2011 period as calibration and validation periods respectively. The most suitable - in terms of model skill determined from cross
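
    A toy sketch of a circulation-type-conditioned PM10 estimate evaluated with split-sample cross-validation; the synthetic data, number of types, and skill score are illustrative assumptions, not the project's statistical models.

      # Illustrative sketch: predict daily PM10 from an objective circulation
      # type classification using type-conditional means, and cross-validate.
      import numpy as np

      rng = np.random.default_rng(2)
      n_days, n_types = 4000, 8
      ct = rng.integers(0, n_types, n_days)            # daily circulation type
      type_effect = rng.uniform(15.0, 45.0, n_types)   # hypothetical type means
      pm10 = type_effect[ct] + 10.0 * rng.standard_normal(n_days)

      def downscale_cv(ct, pm10, n_folds=5):
          folds = np.array_split(np.arange(len(ct)), n_folds)
          scores = []
          for k in range(n_folds):
              test = folds[k]
              train = np.setdiff1d(np.arange(len(ct)), test)
              means = np.array([pm10[train][ct[train] == t].mean()
                                for t in range(n_types)])
              pred = means[ct[test]]
              ss_res = np.sum((pm10[test] - pred) ** 2)
              ss_tot = np.sum((pm10[test] - pm10[test].mean()) ** 2)
              scores.append(1.0 - ss_res / ss_tot)     # explained variance
          return float(np.mean(scores))

      print("cross-validated skill (R^2):", round(downscale_cv(ct, pm10), 2))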

  12. The ecological future of the North American bison: Conceiving long-term, large-scale conservation of a species

    USGS Publications Warehouse

    Sanderson, E.W.; Redford, K.; Weber, Bill; Aune, K.; Baldes, Dick; Berger, J.; Carter, Dave; Curtin, C.; Derr, James N.; Dobrott, S.J.; Fearn, Eva; Fleener, Craig; Forrest, Steven C.; Gerlach, Craig; Gates, C. Cormack; Gross, J.E.; Gogan, P.; Grassel, Shaun M.; Hilty, Jodi A.; Jensen, Marv; Kunkel, Kyran; Lammers, Duane; List, R.; Minkowski, Karen; Olson, Tom; Pague, Chris; Robertson, Paul B.; Stephenson, Bob

    2008-01-01

    Many wide-ranging mammal species have experienced significant declines over the last 200 years; restoring these species will require long-term, large-scale recovery efforts. We highlight 5 attributes of a recent range-wide vision-setting exercise for ecological recovery of the North American bison (Bison bison) that are broadly applicable to other species and restoration targets. The result of the exercise, the “Vermejo Statement” on bison restoration, is explicitly (1) large scale, (2) long term, (3) inclusive, (4) fulfilling of different values, and (5) ambitious. It reads, in part, “Over the next century, the ecological recovery of the North American bison will occur when multiple large herds move freely across extensive landscapes within all major habitats of their historic range, interacting in ecologically significant ways with the fullest possible set of other native species, and inspiring, sustaining and connecting human cultures.” We refined the vision into a scorecard that illustrates how individual bison herds can contribute to the vision. We also developed a set of maps and analyzed the current and potential future distributions of bison on the basis of expert assessment. Although more than 500,000 bison exist in North America today, we estimated they occupy <1% of their historical range and in no place express the full range of ecological and social values of previous times. By formulating an inclusive, affirmative, and specific vision through consultation with a wide range of stakeholders, we hope to provide a foundation for conservation of bison, and other wide-ranging species, over the next 100 years.

  13. Impact of large-scale circulation changes in the North Atlantic sector on the current and future Mediterranean winter hydroclimate

    NASA Astrophysics Data System (ADS)

    Barcikowska, Monika J.; Kapnick, Sarah B.; Feser, Frauke

    2017-06-01

    The Mediterranean region, located in the transition zone between the dry subtropical and wet European mid-latitude climate, is very sensitive to changes in the global mean climate state. Projecting future changes of the Mediterranean hydroclimate under global warming therefore requires dynamic climate models that reproduce the main mechanisms controlling regional hydroclimate with sufficiently high resolution to realistically simulate climate extremes. To assess future winter precipitation changes in the Mediterranean region we use the Geophysical Fluid Dynamics Laboratory high-resolution general circulation model for control simulations with pre-industrial greenhouse gas and aerosol concentrations, which are compared to future scenario simulations. Here we show that the coupled model is able to reliably simulate the large-scale winter circulation, including the North Atlantic Oscillation and Eastern Atlantic patterns of variability, and its associated impacts on the mean Mediterranean hydroclimate. The model also realistically reproduces the regional features of daily heavy rainfall, which are absent in lower-resolution simulations. A five-member future projection ensemble, which assumes comparatively high greenhouse gas emissions (RCP8.5) until 2100, indicates a strong winter decline in Mediterranean precipitation for the coming decades. Consistent with the dynamical and thermodynamical consequences of a warming atmosphere, the derived changes feature a distinct bipolar pattern: wetting in the north and drying in the south. Changes are most pronounced over the northwest African coast, where the projected winter precipitation decline reaches 40% of present values. Despite the decrease in mean precipitation, heavy rainfall indices show drastic increases across most of the Mediterranean, except the North African coast, which is under the strong influence of the cold Canary Current.

  14. Challenges and Opportunities for Promoting Student Achievement through Large-Scale Assessment Results: Research, Reflections, and Future Directions

    ERIC Educational Resources Information Center

    Decker, Dawn M.; Bolt, Sara E.

    2008-01-01

    The intent of large-scale assessment systems is to promote student achievement toward specific standards by holding schools accountable for the performance of all students. However, it is difficult to know whether large-scale assessment systems are having this intended effect as they are currently implemented. In this article, the authors examine…

  15. Future Astronomical Observatories on the Moon

    NASA Technical Reports Server (NTRS)

    Burns, Jack O. (Editor); Mendell, Wendell W. (Editor)

    1988-01-01

    Papers from a workshop on astronomical observations from a lunar base are presented. In part 1, the rationale for performing astronomy on the Moon is established and economic factors are considered. Part 2 includes concepts for individual lunar-based telescopes at the shortest X-ray and gamma-ray wavelengths, for high-energy cosmic rays, and at optical and infrared wavelengths. Lunar radio frequency telescopes are considered in part 3, and engineering considerations for lunar base observatories are discussed in part 4. Throughout, the advantages and disadvantages of lunar basing compared to terrestrial and orbital basing of observatories are weighed. The participants concluded that the Moon is very possibly the best location within the inner solar system from which to perform front-line astronomical research.

  16. Moving Water on a Malleable Planet - Large Scale Inter-basin Hydrological Transfers Now and in the Future

    NASA Astrophysics Data System (ADS)

    Lammers, R. B.; Proussevitch, A. A.; Frolking, S.; Grogan, D. S.

    2012-12-01

    Humans have been reorganizing the land surface components of the hydrological cycle for some time. One of the more hydrologically important changes is the diversion of river water from one watershed to another. Such diversions are often initiated to mitigate water shortages in neighboring drainage basins or to increase the output of hydroelectric energy production. We describe a database of macro-scale inter-basin hydrological transfers covering all parts of the globe. The focus is on large-scale changes in the flow of water from one drainage basin to another or between sub-basins within the same watershed. Current counts from the database show several hundred identified diversions across five continents. Engineering works under construction as well as those that have been planned or proposed were included. Large projects now under construction in China (South-North Water Transfer Project) and planned in India (Himalayan and Peninsular components of the National River Linking Project) represent some of the largest human-created movements of water on the planet. We also explore the implications of the more speculative plans that have emerged over the last half century, such as the Northern River Reversal, designed to deliver water from Siberia to Central Asia, and the North American Water and Power Alliance (NAWAPA) project linking high-latitude rivers in Alaska and Canada to areas of the American southwest. The database has allowed us to explore, through hydrological modeling, the global impact of these engineering works. We also explore scenarios of changes in future water resources by examining the inter-basin transfers against a suite of anticipated climate changes from model output driven by the IPCC AR5 data set.

  17. Wetlands as large-scale nature-based solutions: status and future challenges for research and management

    NASA Astrophysics Data System (ADS)

    Thorslund, Josefin; Jarsjö, Jerker; Destouni, Georgia

    2017-04-01

    Wetlands are often considered nature-based solutions that can provide a multitude of services of great social, economic and environmental value to humankind. These services may include recreation, greenhouse gas sequestration, contaminant retention, coastal protection, groundwater level and soil moisture regulation, flood regulation and biodiversity support. Changes in land use, water use and climate can all impact wetland functions and occur at scales extending well beyond the local scale of an individual wetland. However, in practical applications, management decisions usually focus on individual wetland sites and local conditions. To understand the potential usefulness and services of wetlands as larger-scale nature-based solutions, e.g. for mitigating negative impacts from large-scale change pressures, one needs to understand the combined function of multiple wetlands at the relevant large scales. We here systematically investigate whether, and to what extent, research so far has addressed the large-scale dynamics of landscape systems with multiple wetlands, which are likely to be relevant for understanding impacts of regional to global change. Our investigation regards key changes and impacts of relevance for nature-based solutions, such as large-scale nutrient and pollution retention, flow regulation and coastal protection. Although such large-scale knowledge is still limited, evidence suggests that the aggregated functions and effects of multiple wetlands in the landscape can differ considerably from those observed at individual wetlands. Such scale differences may have important implications for wetland function-effect predictability and management under large-scale change pressures and impacts, such as those of climate change.

  18. X6.9-CLASS FLARE-INDUCED VERTICAL KINK OSCILLATIONS IN A LARGE-SCALE PLASMA CURTAIN AS OBSERVED BY THE SOLAR DYNAMICS OBSERVATORY/ATMOSPHERIC IMAGING ASSEMBLY

    SciTech Connect

    Srivastava, A. K.; Goossens, M.

    2013-11-01

    We present rare observational evidence of vertical kink oscillations in a laminar and diffused large-scale plasma curtain as observed by the Atmospheric Imaging Assembly on board the Solar Dynamics Observatory. The X6.9-class flare in active region 11263 on 2011 August 9 induces a global large-scale disturbance that propagates in a narrow lane above the plasma curtain and creates a low density region that appears as a dimming in the observational image data. This large-scale propagating disturbance acts as a non-periodic driver that interacts asymmetrically and obliquely with the top of the plasma curtain and triggers the observed oscillations. In the deeper layers of the curtain, we find evidence of vertical kink oscillations with two periods (795 s and 530 s). On the magnetic surface of the curtain where the density is inhomogeneous due to coronal dimming, non-decaying vertical oscillations are also observed (period ≈ 763-896 s). We infer that the global large-scale disturbance triggers vertical kink oscillations in the deeper layers as well as on the surface of the large-scale plasma curtain. The properties of the excited waves strongly depend on the local plasma and magnetic field conditions.
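
    For reference, the textbook relation used in coronal seismology to connect such periods to the oscillating structure is, for the fundamental mode in the thin-tube approximation with equal internal and external field strength (a standard estimate, not a result taken from this paper):

      \[
        P \simeq \frac{2L}{c_k},
        \qquad
        c_k = v_{A,\mathrm{i}}\,\sqrt{\frac{2\rho_\mathrm{i}}{\rho_\mathrm{i}+\rho_\mathrm{e}}} ,
      \]

    where L is the length of the oscillating structure, c_k the kink speed, v_A,i the internal Alfvén speed, and ρ_i, ρ_e the internal and external densities. Periods of roughly 530-900 s, as reported here, therefore constrain the combination of field strength and density contrast in the plasma curtain.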

  19. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    ERIC Educational Resources Information Center

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  20. On-orbit assembly and servicing of future space observatories

    NASA Astrophysics Data System (ADS)

    Lillie, C. F.

    2006-06-01

    NASA's experience servicing the Hubble Space Telescope, including the installation of optical elements to compensate for a mirror manufacturing error; replacement of failed avionics and worn-out batteries, gyros, thermal insulation and solar arrays; upgrades to the data handling subsystem; installation of far more capable instruments; and retrofitting the NICMOS experiment with a mechanical cryocooler has clearly demonstrated the advantages of on-orbit servicing. This effort has produced a unique astronomical observatory that is orders of magnitude more capable than when it was launched and can be operated for several times its original design life. The in-space operations capabilities that are developed for NASA's Exploration Program will make it possible to assemble spacecraft in space and to service them in cis-lunar and L2 orbits. Future space observatories should be designed to utilize these capabilities. This paper discusses the application of the lessons learned from HST and our plans for servicing the Advanced X-ray Astrophysical Observatory with the Orbital Maneuvering Vehicle and the Space Station Freedom Customer Servicing Facility to future space observatories, such as SAFIR and LifeFinder, that are designed to operate in heliocentric orbits. It addresses the use of human and robotic in-space capabilities that would be required for on-orbit assembly and servicing of future space observatories, and describes some of our design concepts for these activities.

  1. New ultracool subdwarfs identified in large-scale surveys using Virtual Observatory tools (Corrigendum). I. UKIDSS LAS DR5 vs. SDSS DR7

    NASA Astrophysics Data System (ADS)

    Lodieu, N.; Espinoza Contreras, M.; Zapatero Osorio, M. R.; Solano, E.; Aberasturi, M.; Martín, E. L.

    2017-01-01

    Based on observations made with ESO Telescopes at the La Silla Paranal Observatory under programme ID 084.C-0928A. Based on observations made with the Nordic Optical Telescope, operated on the island of La Palma jointly by Denmark, Finland, Iceland, Norway, and Sweden, in the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofísica de Canarias.

  2. Acute psychological impact of disaster and large-scale trauma: limitations of traditional interventions and future practice recommendations.

    PubMed

    Gray, Matt J; Maguen, Shira; Litz, Brett T

    2004-01-01

    Nearly everyone will experience emotional and psychological distress in the immediate aftermath of a disaster or other large-scale traumatic event. Although extremely upsetting and disruptive, the reaction is understood best as a human response to inordinate adversity, which in the majority of cases remits over time without formal intervention. Nevertheless, some people experience sustained difficulties. To prevent chronic post-traumatic difficulties, mental health professionals provide early interventions soon after traumatic exposure. These interventions typically take the form of single-session debriefings, which have been applied routinely following disasters. The research bearing on these traditional forms of early crisis interventions has shown that, although well-received by victims, there is no empirical support for their continued use. However, promising evidence-based, early interventions have been developed, which are highlighted. Finally, traumatic bereavement and complicated grief in survivors of disasters, an area largely neglected in the field, is discussed.

  3. The application of LiF:Mg,Cu,P to large scale personnel dosimetry: current status and future directions.

    PubMed

    Moscovitch, M; St John, T J; Cassata, J R; Blake, P K; Rotunda, J E; Ramlo, M; Velbeck, K J; Luo, L Z

    2006-01-01

    LiF:Mg,Cu,P is starting to replace LiF:Mg,Ti in a variety of personnel dosimetry applications. LiF:Mg,Cu,P has superior characteristics as compared to LiF:Mg,Ti, including higher sensitivity, improved energy response for photons, lack of supralinearity and insignificant fading. The use of LiF:Mg,Cu,P in large scale dosimetry programs is of particular interest due to the extreme sensitivity of this material to the maximum readout temperature, and the variety of different dosimetry aspects and details that must be considered for a successful implementation in routine dosimetry. Here we discuss and explain the various aspects of large scale LiF:Mg,Cu,P based dosimetry programs including the properties of the TL material, a new generation of TLD readers, calibration methodologies, a new generation of dose calculation algorithms based on the use of artificial neural networks, and the overall uncertainty of the dose measurement. The United States Navy (USN) will be the first US dosimetry processor to use this new material for routine applications. Until June 2002, the Navy used two types of thermoluminescent materials for personnel dosimetry, CaF2:Mn and LiF:Mg,Ti. A program to upgrade the system and to implement LiF:Mg,Cu,P started in the mid-1990s and was recently concluded. In 2002, the new system replaced the LiF:Mg,Ti system and is scheduled to start replacing the CaF2:Mn system in 2006. A pilot study to determine the dosimetric performance of the new LiF:Mg,Cu,P based dosimetry system was recently completed, and the results show the new system to be as good as or better than the current system in all areas tested. As a result, LiF:Mg,Cu,P is scheduled to become the primary personnel dosimeter for the entire US Navy in 2006.
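
    The neural-network dose algorithms are only mentioned above, not specified. As a purely conceptual sketch of the idea, the toy example below fits a small multilayer perceptron that maps readout features to a dose value; the feature meanings, the synthetic dose relationship, and the use of scikit-learn are illustrative assumptions, not the Navy's algorithm.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Conceptual sketch only: a small neural network mapping thermoluminescence
# readout features to an estimated dose. The "features" and the dose
# relationship below are invented for illustration.
rng = np.random.default_rng(1)
n = 500
features = rng.uniform(0.0, 1.0, size=(n, 4))          # e.g. element response ratios (hypothetical)
dose_mSv = 5.0 * features[:, 0] + 2.0 * features[:, 1] + rng.normal(0.0, 0.05, n)

model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
model.fit(features[:400], dose_mSv[:400])               # train on the first 400 samples
print("held-out R^2:", model.score(features[400:], dose_mSv[400:]))
```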

  4. The Recent Rejuvenation of the Sun's Large-scale Magnetic Field: A Clue for Understanding Past and Future Sunspot Cycles

    NASA Astrophysics Data System (ADS)

    Sheeley, N. R., Jr.; Wang, Y.-M.

    2015-08-01

    The quiet nature of sunspot cycle 24 was disrupted during the second half of 2014 when the Sun’s large-scale field underwent a sudden rejuvenation: the solar mean field reached its highest value since 1991, the interplanetary field strength doubled, and galactic cosmic rays showed their strongest 27-day modulation since neutron-monitor observations began in 1957; in the outer corona, the large increase of field strength was reflected by unprecedentedly large numbers of coronal loops collapsing inward along the heliospheric current sheet. Here, we show that this rejuvenation was not caused by a significant increase in the level of solar activity as measured by the smoothed sunspot number and CME rate, but instead was caused by the systematic emergence of flux in active regions whose longitudinal distribution greatly increased the Sun’s dipole moment. A similar post-maximum increase in the dipole moment occurred during each of the previous three sunspot cycles, and marked the start of the declining phase of each cycle. We note that the north-south component of this peak dipole moment provides an early indicator of the amplitude of the next cycle, and conclude that the amplitude of cycle 25 may be comparable to that of cycle 24, and well above the amplitudes obtained during the Maunder Minimum.

  5. Machine learning for large-scale wearable sensor data in Parkinson's disease: Concepts, promises, pitfalls, and futures.

    PubMed

    Kubota, Ken J; Chen, Jason A; Little, Max A

    2016-09-01

    For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society.
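
    As a purely illustrative sketch of the windowed feature-extraction-plus-classifier pattern that such pipelines typically follow (the abstract above does not prescribe any specific algorithm), the toy example below derives simple summary features from two synthetic accelerometer-like signals and trains an off-the-shelf classifier; all signals, labels and features are invented and far simpler than those used in real PD studies.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def window_features(signal, win=100):
    """Split a 1-D signal into fixed-length windows and compute simple summary features."""
    windows = signal[: len(signal) // win * win].reshape(-1, win)
    return np.column_stack([windows.mean(axis=1),
                            windows.std(axis=1),
                            np.abs(np.diff(windows, axis=1)).mean(axis=1)])

# Two synthetic recordings: "tremor-like" higher-frequency content vs. smoother motion.
t = np.arange(20000) / 100.0
tremor = np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.normal(size=t.size)
smooth = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.normal(size=t.size)

X = np.vstack([window_features(tremor), window_features(smooth)])
y = np.concatenate([np.ones(200), np.zeros(200)])       # 1 = tremor-like, 0 = smooth

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```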

  6. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    SciTech Connect

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E > 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_{1}^{\alpha} = (4.4 \pm 1.0) \times 10^{-2}$, that has a chance probability $P(\geq r_{1}^{\alpha}) = 6.4 \times 10^{-5}$, reinforcing the hint previously reported with vertical events alone.
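
    As an illustration of the first-harmonic Rayleigh formalism referred to above, the sketch below computes the amplitude, phase and chance probability for a set of right ascensions; the expressions are the standard Rayleigh-analysis formulas, and the synthetic event sample is purely illustrative (it is not Auger data or Auger analysis code).

```python
import numpy as np

def rayleigh_first_harmonic(alpha):
    """First-harmonic Rayleigh analysis of right ascensions alpha (radians)."""
    n = len(alpha)
    a = 2.0 / n * np.sum(np.cos(alpha))
    b = 2.0 / n * np.sum(np.sin(alpha))
    r1 = np.hypot(a, b)                      # first-harmonic amplitude
    phase = np.degrees(np.arctan2(b, a))     # phase of the modulation
    p_chance = np.exp(-n * r1**2 / 4.0)      # probability of >= r1 under isotropy
    return r1, phase, p_chance

# Toy example: an isotropic synthetic sky (illustrative only).
rng = np.random.default_rng(0)
alpha = rng.uniform(0.0, 2.0 * np.pi, 30000)
r1, phase, p = rayleigh_first_harmonic(alpha)
print(f"r1 = {r1:.3e}, phase = {phase:.1f} deg, P(>= r1) = {p:.2e}")
```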

  7. Large Scale Distribution of Ultra High Energy Cosmic Rays Detected at the Pierre Auger Observatory with Zenith Angles up to 80°

    NASA Astrophysics Data System (ADS)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Aranda, V. M.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Awal, N.; Badescu, A. M.; Barber, K. B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blaess, S. G.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Bridgeman, A.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dorofeev, A.; Dorosti Hasankiadeh, Q.; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fox, B. D.; Fratu, O.; Freire, M. M.; Fröhlich, U.; Fuchs, B.; Fujii, T.; Gaior, R.; García, B.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; González, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. 
J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Meissner, R.; Melissas, M.; Melo, D.; Menshikov, A.; Messina, S.; Meyhandan, R.; Mićanović, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morello, C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Müller, S.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nguyen, P. H.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, L.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pȩkala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Petermann, E.; Peters, C.; Petrera, S.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Rogozin, D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, D.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schröder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanič, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Zuccarello, F.

    2015-04-01

    We present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E > 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_{1}^{\alpha} = (4.4 \pm 1.0) \times 10^{-2}$, that has a chance probability $P(\geq r_{1}^{\alpha}) = 6.4 \times 10^{-5}$, reinforcing the hint previously reported with vertical events alone.

  8. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    DOE PAGES

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E > 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_{1}^{\alpha} = (4.4 \pm 1.0) \times 10^{-2}$, that has a chance probability $P(\geq r_{1}^{\alpha}) = 6.4 \times 10^{-5}$, reinforcing the hint previously reported with vertical events alone.

  9. Climate change and large-scale land acquisitions in Africa: Quantifying the future impact on acquired water resources

    NASA Astrophysics Data System (ADS)

    Chiarelli, Davide Danilo; Davis, Kyle Frankel; Rulli, Maria Cristina; D'Odorico, Paolo

    2016-08-01

    Pressure on agricultural land has markedly increased since the start of the century, driven by demographic growth, changes in diet, increasing biofuel demand, and globalization. To better ensure access to adequate land and water resources, many investors and countries began leasing large areas of agricultural land in the global South, a phenomenon often termed "large-scale land acquisition" (LSLA). To date, this global land rush has resulted in the appropriation of 41 million hectares and about 490 km3 of freshwater resources, affecting rural livelihoods and local environments. It remains unclear to what extent land and water acquisitions contribute to the emergence of water-stress conditions in acquired areas, and how these demands for water may be impacted by climate change. Here we analyze 18 African countries - 20 Mha (or 80%) of LSLA for the continent - and estimate that under present climate 210 km3 year-1 of water would be appropriated if all acquired areas were actively under production. We also find that consumptive use of irrigation water is disproportionately contributed by water-intensive biofuel crops. Using the IPCC A1B scenario, we find only small changes in green (-1.6%) and blue (+2.0%) water demand in targeted areas. With a 3 °C temperature increase, crop yields are expected to decrease up to 20% with a consequent increase in the water footprint. When the effect of increasing atmospheric CO2 concentrations is accounted for, crop yields increase by as much as 40% with a decrease in water footprint up to 29%. The relative importance of CO2 fertilization and warming will therefore determine water appropriations and changes in water footprint under climate change scenarios.
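
    A back-of-the-envelope check of the scale of the appropriation quoted above can be done by converting an irrigated area and a crop water requirement into a volume; the sketch below does this with illustrative placeholder numbers (the ~1050 mm/yr water depth is an assumption chosen to reproduce the order of magnitude, not a value from the study).

```python
# Convert an area under production and a crop water requirement into km3/yr.
HA_TO_M2 = 1.0e4     # square metres per hectare
M3_TO_KM3 = 1.0e-9   # cubic kilometres per cubic metre

def appropriation_km3_per_year(area_ha, crop_water_mm_per_yr):
    """Volume of water (km3/yr) needed to meet a crop water requirement over an area."""
    depth_m = crop_water_mm_per_yr / 1000.0
    return area_ha * HA_TO_M2 * depth_m * M3_TO_KM3

# 20 Mha under production at ~1050 mm/yr of crop water use gives ~210 km3/yr,
# the order of magnitude quoted in the abstract.
print(appropriation_km3_per_year(20e6, 1050.0))
```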

  10. Defining biotypes for depression and anxiety based on large-scale circuit dysfunction: a theoretical review of the evidence and future directions for clinical translation.

    PubMed

    Williams, Leanne M

    2017-01-01

    Complex emotional, cognitive and self-reflective functions rely on the activation and connectivity of large-scale neural circuits. These circuits offer a relevant scale of focus for conceptualizing a taxonomy for depression and anxiety based on specific profiles (or biotypes) of neural circuit dysfunction. Here, the theoretical review first outlines the current consensus as to what constitutes the organization of large-scale circuits in the human brain identified using parcellation and meta-analysis. The focus is on neural circuits implicated in resting reflection (default mode), detection of "salience," affective processing ("threat" and "reward"), "attention," and "cognitive control." Next, the current evidence regarding which types of dysfunction in these circuits characterize depression and anxiety disorders is reviewed, with an emphasis on published meta-analyses and reviews of circuit dysfunctions that have been identified in at least two well-powered case-control studies. Grounded in the review of these topics, a conceptual framework is proposed for considering neural circuit-defined "biotypes." In this framework, biotypes are defined by profiles of extent of dysfunction on each large-scale circuit. The clinical implications of a biotype approach for guiding classification and treatment of depression and anxiety are considered. Future research directions will develop the validity and clinical utility of a neural circuit biotype model that spans diagnostic categories and helps to translate neuroscience into clinical practice in the real world.
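
    As a conceptual sketch of the biotype idea described above (profiles of dysfunction across several large-scale circuits grouped into types), the toy example below clusters synthetic per-circuit dysfunction scores; the circuit names, scores, and the choice of three clusters are illustrative assumptions, not the framework's actual definition.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
circuits = ["default_mode", "salience", "threat", "reward", "attention", "cognitive_control"]
scores = rng.normal(size=(300, len(circuits)))        # standardized dysfunction scores (synthetic)

# Group individuals into candidate "biotypes" by their circuit dysfunction profile.
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(scores)
for k in range(3):
    centroid = kmeans.cluster_centers_[k]
    print(f"biotype {k}: " + ", ".join(f"{c}={v:+.2f}" for c, v in zip(circuits, centroid)))
```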

  11. ANALYSIS OF CHARACTERISTIC PARAMETERS OF LARGE-SCALE CORONAL WAVES OBSERVED BY THE SOLAR-TERRESTRIAL RELATIONS OBSERVATORY/EXTREME ULTRAVIOLET IMAGER

    SciTech Connect

    Muhr, N.; Veronig, A. M.; Kienreich, I. W.; Temmer, M.; Vrsnak, B.

    2011-10-01

    The kinematical evolution of four extreme ultraviolet waves, well observed by the Extreme Ultraviolet Imager on board the Solar-Terrestrial Relations Observatory (STEREO), is studied by visually tracking wave fronts as well as by a semi-automatized perturbation profile method, which leads to results matching each other within the error limits. The derived mean velocities of the events under study lie in the range of 220-350 km s-1. The fastest of the events (2007 May 19) reveals a significant deceleration of approximately -190 m s-2, while the others are consistent with a constant velocity during wave propagation. The evolution of maximum-intensity values reveals initial intensification of 20%-70% and decays to original levels within 40-60 minutes, while the widths at half-maximum and full-maximum of the perturbation profiles broaden by a factor of two to four. The integral below the perturbation profile remains basically constant in two cases, while it shows a decrease by a factor of three to four in the other two cases. From the peak perturbation amplitudes, we estimate the corresponding magnetosonic Mach numbers M_ms, which range from 1.08 to 1.21. The perturbation profiles reveal three distinct features behind the propagating wave fronts: coronal dimmings, stationary brightenings, and rarefaction regions. All features appear after the wave passage and only slowly fade away. Our findings indicate that the events under study are weak-shock fast-mode magnetohydrodynamic waves initiated by the CME lateral expansion.
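
    The mean velocities and deceleration quoted above come from kinematical fits to wavefront distance-time measurements; the sketch below shows the generic quadratic-fit step with synthetic data points (the numbers are invented, and this is not the authors' tracking or perturbation-profile code).

```python
import numpy as np

# Synthetic wavefront positions versus time (illustrative only).
t = np.array([0.0, 120.0, 240.0, 360.0, 480.0])          # seconds since first detection
d = np.array([0.0, 41.0, 79.0, 113.0, 144.0]) * 1.0e3    # wavefront distance in km

# Fit d(t) = d0 + v0*t + 0.5*a*t^2 to estimate speed and (de)acceleration.
coeffs = np.polyfit(t, d, 2)
a = 2.0 * coeffs[0]      # acceleration in km/s^2; negative means deceleration
v0 = coeffs[1]           # initial speed in km/s
v_mean = (d[-1] - d[0]) / (t[-1] - t[0])
print(f"mean speed ~ {v_mean:.0f} km/s, initial speed ~ {v0:.0f} km/s, "
      f"acceleration ~ {a * 1e3:.0f} m/s^2")
```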

  12. A Future Large-Aperture UVOIR Space Observatory: Reference Designs

    NASA Technical Reports Server (NTRS)

    Thronson, Harley; Rioux, Norman; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-01-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  13. A future large-aperture UVOIR space observatory: reference designs

    NASA Astrophysics Data System (ADS)

    Rioux, Norman; Thronson, Harley; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-09-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  14. A mission architecture for future space observatories optimized for SAFIR

    NASA Astrophysics Data System (ADS)

    Lillie, C. F.; Dailey, D. R.

    2005-08-01

    We have developed a generic mission architecture with James Webb Space Telescope heritage that can accommodate a wide variety of future space observatories. This paper describes the optimization of this architecture for the Single Aperture Far InfraRed (SAFIR) mission. This mission calls for a 10-meter telescope in an L2 orbit that is actively cooled to 4 Kelvin, enabling background-limited observations of celestial objects in the 30 to 800 micron region of the spectrum. A key feature of our architecture is a boom that attaches the payload to the spacecraft, providing thermal and dynamic isolation and minimizing disturbances from the spacecraft bus. Precision mechanisms, hinges and latches enable folding the observatory into a 5-m diameter fairing for launch and a precision deployment once on orbit. Precision mechanisms also articulate the telescope to minimize solar torques and increase the field of regard. The details of our design and the trades considered during its development are also described.

  15. Large-scale modeled contemporary and future water temperature estimates for 10774 Midwestern U.S. Lakes

    NASA Astrophysics Data System (ADS)

    Winslow, Luke A.; Hansen, Gretchen J. A.; Read, Jordan S.; Notaro, Michael

    2017-04-01

    Climate change has already influenced lake temperatures globally, but understanding future change is challenging. The response of lakes to changing climate drivers is complex due to the nature of lake-atmosphere coupling, ice cover, and stratification. To better understand the diversity of lake responses to climate change and give managers insight on individual lakes, we modelled daily water temperature profiles for 10,774 lakes in Michigan, Minnesota, and Wisconsin for contemporary (1979-2015) and future (2020-2040 and 2080-2100) time periods with climate models based on the Representative Concentration Pathway 8.5, the worst-case emission scenario. In addition to lake-specific daily simulated temperatures, we derived commonly used, ecologically relevant annual metrics of thermal conditions for each lake. We include all supporting lake-specific model parameters, meteorological drivers, and archived code for the model and derived metric calculations. This unique dataset offers landscape-level insight into the impact of climate change on lakes.
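
    The derived annual thermal metrics mentioned above are summaries computed from the daily simulated temperature series; the sketch below shows the general pattern with a toy daily surface-temperature series (the series shape and the 8.9 degC threshold are illustrative assumptions, not values or code from the dataset).

```python
import numpy as np

# Toy daily surface-temperature series for one year (degC), clipped at 0.
days = np.arange(365)
surface_temp = np.clip(12.0 + 14.0 * np.sin(2.0 * np.pi * (days - 110) / 365.0), 0.0, None)

# Two example annual metrics: the peak surface temperature and the number of
# days above a fixed ecological threshold.
peak_temp = surface_temp.max()
days_above = int(np.sum(surface_temp > 8.9))
print(f"peak surface temperature: {peak_temp:.1f} degC, days above 8.9 degC: {days_above}")
```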

  16. Large-scale modeled contemporary and future water temperature estimates for 10774 Midwestern U.S. Lakes

    PubMed Central

    Winslow, Luke A.; Hansen, Gretchen J.A.; Read, Jordan S; Notaro, Michael

    2017-01-01

    Climate change has already influenced lake temperatures globally, but understanding future change is challenging. The response of lakes to changing climate drivers is complex due to the nature of lake-atmosphere coupling, ice cover, and stratification. To better understand the diversity of lake responses to climate change and give managers insight on individual lakes, we modelled daily water temperature profiles for 10,774 lakes in Michigan, Minnesota, and Wisconsin for contemporary (1979–2015) and future (2020–2040 and 2080–2100) time periods with climate models based on the Representative Concentration Pathway 8.5, the worst-case emission scenario. In addition to lake-specific daily simulated temperatures, we derived commonly used, ecologically relevant annual metrics of thermal conditions for each lake. We include all supporting lake-specific model parameters, meteorological drivers, and archived code for the model and derived metric calculations. This unique dataset offers landscape-level insight into the impact of climate change on lakes. PMID:28440790

  17. Genetic influences on schizophrenia and subcortical brain volumes: large-scale proof-of-concept and roadmap for future studies

    PubMed Central

    Anttila, Verneri; Hibar, Derrek P; van Hulzen, Kimm J E; Arias-Vasquez, Alejandro; Smoller, Jordan W; Nichols, Thomas E; Neale, Michael C; McIntosh, Andrew M; Lee, Phil; McMahon, Francis J; Meyer-Lindenberg, Andreas; Mattheisen, Manuel; Andreassen, Ole A; Gruber, Oliver; Sachdev, Perminder S; Roiz-Santiañez, Roberto; Saykin, Andrew J; Ehrlich, Stefan; Mather, Karen A; Turner, Jessica A; Schwarz, Emanuel; Thalamuthu, Anbupalam; Shugart, Yin Yao; Ho, Yvonne YW; Martin, Nicholas G; Wright, Margaret J

    2016-01-01

    Schizophrenia is a devastating psychiatric illness with high heritability. Brain structure and function differ, on average, between schizophrenia cases and healthy individuals. As common genetic associations are emerging for both schizophrenia and brain imaging phenotypes, we can now use genome-wide data to investigate genetic overlap. Here we integrated results from common variant studies of schizophrenia (33,636 cases, 43,008 controls) and volumes of several (mainly subcortical) brain structures (11,840 subjects). We did not find evidence of genetic overlap between schizophrenia risk and subcortical volume measures either at the level of common variant genetic architecture or for single genetic markers. The current study provides proof-of-concept (albeit based on a limited set of structural brain measures), and defines a roadmap for future studies investigating the genetic covariance between structural/functional brain phenotypes and risk for psychiatric disorders. PMID:26854805

  18. Phylogeny drives large scale patterns in Australian marine bioactivity and provides a new chemical ecology rationale for future biodiscovery.

    PubMed

    Evans-Illidge, Elizabeth A; Logan, Murray; Doyle, Jason; Fromont, Jane; Battershill, Christopher N; Ericson, Gavin; Wolff, Carsten W; Muirhead, Andrew; Kearns, Phillip; Abdo, David; Kininmonth, Stuart; Llewellyn, Lyndon

    2013-01-01

    Twenty-five years of Australian marine bioresources collecting and research by the Australian Institute of Marine Science (AIMS) has explored the breadth of latitudinally and longitudinally diverse marine habitats that comprise Australia's ocean territory. The resulting AIMS Bioresources Library and associated relational database integrate biodiversity with bioactivity data, and these resources were mined to retrospectively assess biogeographic, taxonomic and phylogenetic patterns in cytotoxic, antimicrobial, and central nervous system (CNS)-protective bioactivity. While the bioassays used were originally chosen to be indicative of pharmaceutically relevant bioactivity, the results have qualified ecological relevance regarding secondary metabolism. In general, metazoan phyla along the deuterostome phylogenetic pathway (eg to Chordata) and their ancestors (eg Porifera and Cnidaria) had higher percentages of bioactive samples in the assays examined. While taxonomy at the phylum level and higher-order phylogeny groupings helped account for observed trends, taxonomy to genus did not resolve the trends any further. In addition, the results did not identify any biogeographic bioactivity hotspots that correlated with biodiversity hotspots. We conclude with a hypothesis that high-level phylogeny, and therefore the metabolic machinery available to an organism, is a major determinant of bioactivity, while habitat diversity and ecological circumstance are possible drivers in the activation of this machinery and bioactive secondary metabolism. This study supports the strategy of targeting phyla from the deuterostome lineage (including ancestral phyla) from biodiverse marine habitats and ecological niches, in future biodiscovery, at least that which is focused on vertebrate (including human) health.

  19. Large-scale impact of climate change vs. land-use change on future biome shifts in Latin America.

    PubMed

    Boit, Alice; Sakschewski, Boris; Boysen, Lena; Cano-Crespo, Ana; Clement, Jan; Garcia-Alaniz, Nashieli; Kok, Kasper; Kolb, Melanie; Langerwisch, Fanny; Rammig, Anja; Sachse, René; van Eupen, Michiel; von Bloh, Werner; Clara Zemp, Delphine; Thonicke, Kirsten

    2016-11-01

    Climate change and land-use change are two major drivers of biome shifts causing habitat and biodiversity loss. What is missing is a continental-scale future projection of the estimated relative impacts of both drivers on biome shifts over the course of this century. Here, we provide such a projection for the biodiverse region of Latin America under four socio-economic development scenarios. We find that across all scenarios 5-6% of the total area will undergo biome shifts that can be attributed to climate change until 2099. The relative impact of climate change on biome shifts may overtake land-use change even under an optimistic climate scenario, if land-use expansion is halted by the mid-century. We suggest that constraining land-use change and preserving the remaining natural vegetation early during this century creates opportunities to mitigate climate-change impacts during the second half of this century. Our results may guide the evaluation of socio-economic scenarios in terms of their potential for biome conservation under global change.

  20. The large scale structure of the Universe revealed with high redshift emission-line galaxies: implications for future surveys

    NASA Astrophysics Data System (ADS)

    Antonino Orsi, Alvaro

    2015-08-01

    Nebular emission in galaxies traces their star-formation activity within the last 10 Myr or so. Hence, these objects are typically found in the outskirts of massive clusters, where otherwise environmental effects can effectively stop the star formation process. In this talk I discuss the nature of emission-line galaxies (ELGs) and its implications for their clustering properties. To account for the relevant physical ingredients that produce nebular emission, I combine semi-analytical models of galaxy formation with a radiative transfer code of Ly-alpha photons, and the photoionization and shock code MAPPINGS-III. As a result, the clustering strength of ELGs is found to correlate weakly with the line luminosities. Also, their 2-d clustering displays a weak finger-of-god effect, and the clustering on linear scales is affected by assembly bias. I review the implications of the nature of this galaxy population for future large spectroscopic surveys targeting ELGs to extract cosmological results. In particular, I present forecasts for the ELG population in J-PAS, an 8000 deg^2 survey with 54 narrow-band filters covering the optical range, expected to start in 2016.

  1. Phylogeny Drives Large Scale Patterns in Australian Marine Bioactivity and Provides a New Chemical Ecology Rationale for Future Biodiscovery

    PubMed Central

    Evans-Illidge, Elizabeth A.; Logan, Murray; Doyle, Jason; Fromont, Jane; Battershill, Christopher N.; Ericson, Gavin; Wolff, Carsten W.; Muirhead, Andrew; Kearns, Phillip; Abdo, David; Kininmonth, Stuart; Llewellyn, Lyndon

    2013-01-01

    Twenty-five years of Australian marine bioresources collecting and research by the Australian Institute of Marine Science (AIMS) has explored the breadth of latitudinally and longitudinally diverse marine habitats that comprise Australia’s ocean territory. The resulting AIMS Bioresources Library and associated relational database integrate biodiversity with bioactivity data, and these resources were mined to retrospectively assess biogeographic, taxonomic and phylogenetic patterns in cytotoxic, antimicrobial, and central nervous system (CNS)-protective bioactivity. While the bioassays used were originally chosen to be indicative of pharmaceutically relevant bioactivity, the results have qualified ecological relevance regarding secondary metabolism. In general, metazoan phyla along the deuterostome phylogenetic pathway (eg to Chordata) and their ancestors (eg Porifera and Cnidaria) had higher percentages of bioactive samples in the assays examined. While taxonomy at the phylum level and higher-order phylogeny groupings helped account for observed trends, taxonomy to genus did not resolve the trends any further. In addition, the results did not identify any biogeographic bioactivity hotspots that correlated with biodiversity hotspots. We conclude with a hypothesis that high-level phylogeny, and therefore the metabolic machinery available to an organism, is a major determinant of bioactivity, while habitat diversity and ecological circumstance are possible drivers in the activation of this machinery and bioactive secondary metabolism. This study supports the strategy of targeting phyla from the deuterostome lineage (including ancestral phyla) from biodiverse marine habitats and ecological niches, in future biodiscovery, at least that which is focused on vertebrate (including human) health. PMID:24040076

  2. Impact of idealized future stratospheric aerosol injection on the large-scale ocean and land carbon cycles

    NASA Astrophysics Data System (ADS)

    Tjiputra, J. F.; Grini, A.; Lee, H.

    2016-01-01

    Using an Earth system model, we simulate stratospheric aerosol injection (SAI) on top of the Representative Concentration Pathways 8.5 future scenario. Our idealized method prescribes aerosol concentration, linearly increasing from 2020 to 2100, and thereafter remaining constant until 2200. In the aggressive scenario, the model projects a cooling trend toward 2100 despite warming that persists in the high latitudes. Following SAI termination in 2100, a rapid global warming of 0.35 K yr-1 is simulated in the subsequent 10 years, and the global mean temperature returns to levels close to the reference state, though roughly 0.5 K cooler. In contrast to earlier findings, we show a weak response in the terrestrial carbon sink during SAI implementation in the 21st century, which we attribute to nitrogen limitation. The SAI increases the land carbon uptake in the temperate forest-, grassland-, and shrub-dominated regions. The resultant lower temperatures lead to a reduction in the heterotrophic respiration rate and increase soil carbon retention. Changes in precipitation patterns are key drivers for variability in vegetation carbon. Upon SAI termination, the level of vegetation carbon storage returns to the reference case, whereas the soil carbon remains high. The ocean absorbs nearly 10% more carbon in the geoengineered simulation than in the reference simulation, leading to a ˜15 ppm lower atmospheric CO2 concentration in 2100. The largest enhancement in uptake occurs in the North Atlantic. In both hemispheres' polar regions, SAI delays the sea ice melting and, consequently, export production remains low. In the deep water of North Atlantic, SAI-induced circulation changes accelerate the ocean acidification rate and broaden the affected area.

  3. Sudbury Neutrino Observatory: Latest results and future prospects

    NASA Astrophysics Data System (ADS)

    Tolich, N.; SNO Collaboration

    2011-08-01

    This article summarizes measurements of the 8B decay rate in the sun and neutrino oscillation parameters made with the Sudbury Neutrino Observatory (SNO), and discusses prospects for future improvements to the analysis. These improvements include a particle identification analysis for the proportional counter data obtained from the final phase of data taking, which should significantly improve the signal to noise ratio for that phase. Other analyses discussed include searches for high frequency temporal fluctuations in the solar neutrino signal, and supernovae with no optical signal, both of which resulted in a null result.

  4. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  5. Genetic Diversity and Ecological Niche Modelling of Wild Barley: Refugia, Large-Scale Post-LGM Range Expansion and Limited Mid-Future Climate Threats?

    PubMed Central

    Russell, Joanne; van Zonneveld, Maarten; Dawson, Ian K.; Booth, Allan; Waugh, Robbie; Steffenson, Brian

    2014-01-01

    Describing genetic diversity in wild barley (Hordeum vulgare ssp. spontaneum) in geographic and environmental space in the context of current, past and potential future climates is important for conservation and for breeding the domesticated crop (Hordeum vulgare ssp. vulgare). Spatial genetic diversity in wild barley was revealed by both nuclear- (2,505 SNP, 24 nSSR) and chloroplast-derived (5 cpSSR) markers in 256 widely-sampled geo-referenced accessions. Results were compared with MaxEnt-modelled geographic distributions under current, past (Last Glacial Maximum, LGM) and mid-term future (anthropogenic scenario A2, the 2080s) climates. Comparisons suggest large-scale post-LGM range expansion in Central Asia and relatively small, but statistically significant, reductions in range-wide genetic diversity under future climate. Our analyses support the utility of ecological niche modelling for locating genetic diversity hotspots and determine priority geographic areas for wild barley conservation under anthropogenic climate change. Similar research on other cereal crop progenitors could play an important role in tailoring conservation and crop improvement strategies to support future human food security. PMID:24505252

  6. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  7. Future large-aperture UVOIR space observatory: reference designs

    NASA Astrophysics Data System (ADS)

    Rioux, Norman; Thronson, Harley; Feinberg, Lee; Stahl, H. Phillip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice; Bolcar, Matthew

    2016-10-01

    Our joint NASA GSFC/JPL/MSFC and STScI study team has used community-developed science goals to derive mission needs, design parameters, notional instruments, and candidate mission architectures for a future large-aperture, noncryogenic UVOIR space observatory. We describe the feasibility assessment of system dynamic stability that supports coronagraphy. The observatory is in a Sun-Earth L2 orbit, which provides a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2-m aperture telescope that stows within a 5-m diameter launch vehicle fairing. This paper presents results from the latest cycle of integrated modeling through January 2016. The latest findings support the feasibility of secondary mirror support struts with a thickness on the order of an inch. Thin struts were found not to have a significant negative effect on wavefront error stability. Struts with a width as small as 1 in. may benefit some coronagraph designs by allowing more optical throughput.

  8. Operations of and Future Plans for the Pierre Auger Observatory

    SciTech Connect

    Abraham, : J.; Abreu, P.; Aglietta, M.; Aguirre, C.; Ahn, E.J.; Allard, D.; Allekotte, I.; Allen, J.; Alvarez-Muniz, J.; Ambrosio, M.; Anchordoqui, L.

    2009-06-01

    These are presentations to be presented at the 31st International Cosmic Ray Conference, in Lodz, Poland during July 2009. It consists of the following presentations: (1) Performance and operation of the Surface Detectors of the Pierre Auger Observatory; (2) Extension of the Pierre Auger Observatory using high-elevation fluorescence telescopes (HEAT); (3) AMIGA - Auger Muons and Infill for the Ground Array of the Pierre Auger Observatory; (4) Radio detection of Cosmic Rays at the southern Auger Observatory; (5) Hardware Developments for the AMIGA enhancement at the Pierre Auger Observatory; (6) A simulation of the fluorescence detectors of the Pierre Auger Observatory using GEANT 4; (7) Education and Public Outreach at the Pierre Auger Observatory; (8) BATATA: A device to characterize the punch-through observed in underground muon detectors and to operate as a prototype for AMIGA; and (9) Progress with the Northern Part of the Pierre Auger Observatory.

  9. Large scale scientific computing

    SciTech Connect

    Deuflhard, P. ); Engquist, B. )

    1987-01-01

    This book presents papers on large scale scientific computing. It includes: Initial value problems of ODE's and parabolic PDE's; Boundary value problems of ODE's and elliptic PDE's; Hyperbolic PDE's; Inverse problems; Optimization and optimal control problems; and Algorithm adaptation on supercomputers.

  10. Hydrological projections under climate change in the near future by RegCM4 in Southern Africa using a large-scale hydrological model

    NASA Astrophysics Data System (ADS)

    Li, Lu; Diallo, Ismaïla; Xu, Chong-Yu; Stordal, Frode

    2015-09-01

    This study aims to provide model estimates of changes in hydrological elements, such as EvapoTranspiration (ET) and runoff, in Southern Africa in the near future until 2029. The climate change scenarios are projected by a high-resolution Regional Climate Model (RCM), RegCM4, which is the latest version of this model developed by the Abdus Salam International Centre for Theoretical Physics (ICTP). The hydrological projections are performed by using a large-scale hydrological model (WASMOD-D), which has been tested and customized on this region prior to this study. The results reveal that (1) the projected temperature shows an increasing tendency over Southern Africa in the near future, especially eastward of 25°E, while the precipitation changes are varying between different months and sub-regions; (2) an increase in runoff (and ET) was found in eastern part of Southern Africa, i.e. Southern Mozambique and Malawi, while a decrease was estimated across the driest region in a wide area encompassing Kalahari Desert, Namibia, southwest of South Africa and Angola; (3) the strongest climate change signals are found over humid tropical areas, i.e. north of Angola and Malawi and south of Dem Rep of Congo; and (4) large spatial and temporal variability of climate change signals is found in the near future over Southern Africa. This study presents the main results of work-package 2 (WP2) of the 'Socioeconomic Consequences of Climate Change in Sub-equatorial Africa (SoCoCA)' project, which is funded by the Research Council of Norway.

  11. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  12. A Future Large-Aperture UVOIR Space Observatory: Study Overview

    NASA Astrophysics Data System (ADS)

    Postman, Marc; Thronson, Harley A.; Feinberg, Lee; Redding, David; Stahl, H. Philip

    2015-01-01

    The scientific drivers for very high angular resolution coupled with very high sensitivity and wavefront stability in the UV and optical wavelength regime have been well established. These include characterization of exoplanets in the habitable zones of solar type stars, probing the physical properties of the circumgalactic medium around z < 2 galaxies, and resolving stellar populations across a broad range of galactic environments. The 2010 NRC Decadal Survey and the 2013 NASA Science Mission Directorate 30-Year Roadmap identified a large-aperture UVOIR observatory as a priority future space mission. Our joint NASA GSFC/JPL/MSFC/STScI team has extended several earlier studies of the technology and engineering requirements needed to design and build a single filled aperture 10-meter class space-based telescope that can enable these ambitious scientific observations. We present here an overview of our new technical work including a brief summary of the reference science drivers as well as in-depth investigations of the viable telescope architectures, the requirements on thermal control and active wavefront control systems, and the range of possible launch configurations.

  13. Future development of the PLATO Observatory for Antarctic science

    NASA Astrophysics Data System (ADS)

    Ashley, Michael C. B.; Bonner, Colin S.; Everett, Jon R.; Lawrence, Jon S.; Luong-Van, Daniel; McDaid, Scott; McLaren, Campbell; Storey, John W. V.

    2010-07-01

    PLATO is a self-contained robotic observatory built into two 10-foot shipping containers. It has been successfully deployed at Dome A on the Antarctic plateau since January 2008, and has accumulated over 730 days of uptime at the time of writing. PLATO provides 0.5-1 kW of continuous electrical power for a year from diesel engines running on Jet-A1, supplemented during the summertime with solar panels. One of the 10-foot shipping containers houses the power system and fuel, the other provides a warm environment for instruments. Two Iridium satellite modems allow 45 MB/day of data to be transferred across the internet. Future enhancements to PLATO, currently in development, include a more modular design, using lithium iron-phosphate batteries, higher power output, and a light-weight low-power version for field deployment from a Twin Otter aircraft. Technologies used in PLATO include a CAN (Controller Area Network) bus, high-reliability PC/104 computers, ultracapacitors for starting the engines, and fault-tolerant redundant design.

  14. Present and future connection of Asian-Pacific Oscillation to large-scale atmospheric circulations and East Asian rainfall: results of CMIP5

    NASA Astrophysics Data System (ADS)

    Zhou, Botao; Xu, Ying; Shi, Ying

    2017-03-01

    The summer Asian-Pacific oscillation (APO), one of the major modes of climate variability over the Asian-Pacific sector, has a pronounced effect on variations of large-scale atmospheric circulations and climate. This study evaluated the capability of 30 state-of-the-art climate models from the Coupled Model Intercomparison Project Phase 5 (CMIP5) in simulating its association with the atmospheric circulations over the Asian-Pacific region and the precipitation over East Asia. Furthermore, their future connections under the RCP8.5 scenario were examined. The evaluation results show that 5 out of 30 climate models can well capture the observed APO-related features in a comprehensive way, including the strengthened South Asian high (SAH), deepened North Pacific trough (NPT) and northward East Asian jet (EAJ) in the upper troposphere; an intensification of the Asian low (AL) and the North Pacific subtropical high (NPSH) as well as a northward shift of the western Pacific subtropical high (WPSH) in the lower troposphere; and a decrease in East Asian summer rainfall (EASR) under the positive APO phase. Based on the five CMIP5 models' simulations, the dynamic linkages of the APO to the SAH, NPT, AL, and NPSH are projected to be maintained during the second half of the twenty-first century. However, its connection with the EASR tends to reduce significantly. Such a reduction might result from the weakening of the linkage of the APO to the meridional displacement of the EAJ and WPSH as a response to the warming scenario.

  15. Large Scale Nonlinear Programming.

    DTIC Science & Technology

    1978-06-15

    Keywords: large scale optimization; applications of nonlinear programming. LARGE SCALE NONLINEAR PROGRAMMING, by Garth P. McCormick. 1. Introduction. The general mathematical programming (optimization) problem can be stated in the following form... because the difficulty in solving a general nonlinear optimization problem has as much to do with the nature of the functions involved as it does with the
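
    The introduction quoted above refers to "the general mathematical programming (optimization) problem"; one standard way to state that problem, reconstructed here for the reader's convenience rather than quoted from the report, is:

```latex
\min_{x \in \mathbb{R}^{n}} \; f(x)
\quad \text{subject to} \quad
g_{i}(x) \le 0, \; i = 1, \dots, m,
\qquad
h_{j}(x) = 0, \; j = 1, \dots, p.
```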

  16. Future climate change projections of the Okinawa baiu and large-scale feature using a 60-km-mesh global climate model

    NASA Astrophysics Data System (ADS)

    Okada, Y.; Takemi, T.; Ishikawa, H.

    2013-12-01

    Between May and July, East Asia experiences a rainy season that is known as the mei-yu in China and as the baiu in Japan. The precipitation of the baiu is indispensable to our life, but heavy rains often occur in this period and cause serious damage. As a starting point for further investigation, we focus on the rainy season around the Okinawa region. Okinawa is located southwest of mainland Japan on the eastern fringe of the East China Sea, and consists of numerous small islands. The baiu in Okinawa (hereafter, the Okinawa baiu) takes place earlier than that in the mainland of Japan. It differs from the baiu in other regions of Japan in terms of the stationarity of the baiu front and the atmospheric moisture flux. Okada and Yamazaki (2012), building on the proposal of Sampe and Xie (2010), showed that the precipitation in May is explained by a strong north-south temperature gradient at 500 hPa and southerly winds in the vicinity of Okinawa. We investigate the future changes in precipitation and in the relevant large-scale meteorological fields around the Okinawa region, with particular focus on the horizontal temperature advection at 500 hPa, using the present-day (1979-2003) and future (2075-2099) climate simulation data from the 60 km mesh Meteorological Research Institute atmospheric general circulation model (MRI-AGCM3.2H). The precipitation of the Okinawa baiu is projected to increase slightly during May and June taken together. However, when May and June are considered separately, the precipitation in May increases while there is little change in June. In the present-day climate, the May precipitation is associated with warm advection at 500 hPa, mainly due to the meridional temperature gradient and the prevailing southerly wind. This warm advection coincides with upward motion near Okinawa. Precipitation over Okinawa is reduced during late May when the baiu front shifts southeastward. A region of cold advection from the north shifts southward to cover the northern part of Okinawa
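
    The quantity at the centre of the analysis above is the 500 hPa horizontal temperature advection, -(u dT/dx + v dT/dy); the sketch below evaluates it on a small synthetic lat-lon grid. The grid spacing and the temperature and wind fields are placeholder assumptions, not MRI-AGCM3.2H output.

```python
import numpy as np

R_EARTH = 6.371e6                                   # Earth radius in metres
lat = np.deg2rad(np.linspace(20.0, 35.0, 16))       # latitudes spanning the Okinawa region
lon = np.deg2rad(np.linspace(120.0, 135.0, 16))
dlat, dlon = lat[1] - lat[0], lon[1] - lon[0]

T = 270.0 - 30.0 * np.sin(lat)[:, None] + 0.0 * lon[None, :]   # K, colder toward the north (toy field)
u = np.full((16, 16), 3.0)                          # m/s, westerly wind
v = np.full((16, 16), 5.0)                          # m/s, southerly wind

# Horizontal gradients on the sphere: dy = R*dlat, dx = R*cos(lat)*dlon.
dT_dy = np.gradient(T, dlat * R_EARTH, axis=0)
dT_dx = np.gradient(T, dlon, axis=1) / (R_EARTH * np.cos(lat)[:, None])

advection = -(u * dT_dx + v * dT_dy)                # K/s; positive values indicate warm advection
print(f"domain-mean advection: {advection.mean() * 86400.0:.2f} K/day")
```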

  17. An Observatory to Enhance the Preparation of Future California Teachers

    NASA Astrophysics Data System (ADS)

    Connolly, L.; Lederer, S.

    2004-12-01

    With a major grant from the W. M. Keck Foundation, California State University, San Bernardino is establishing a state-of-the-art teaching astronomical observatory. The Observatory will be fundamental to an innovative undergraduate physics and astronomy curriculum for Physics and Liberal Studies majors and will be integrated into our General Education program. The critical need for a research and educational observatory is linked to changes in California's Science Competencies for teacher certification. Development of the Observatory will also complement a new infusion of NASA funding and equipment support for our growing astronomy education programs and the University's established Strategic Plan for excellence in education and teacher preparation. The Observatory will consist of two domed towers. One tower will house a 20" Ritchey-Chretien telescope equipped with a CCD camera in conjunction with either UBVRI broadband filters or a spectrometer for evening laboratories and student research projects. The second tower will house the university's existing 12" Schmidt-Cassegrain optical telescope coupled with a CCD camera and an array of filters. A small-aperture solar telescope will be attached to the 12" for observing solar prominences, while a Mylar filter can be attached to the 12" for sunspot viewing. We have been very fortunate to receive a challenge grant of $600,000 from the W. M. Keck Foundation to equip the two domed towers; we continue to seek a further $800,000 to meet our construction needs. Funding is also provided by California State University, San Bernardino.

  18. The Renovation and Future Capabilities of the Thacher Observatory

    NASA Astrophysics Data System (ADS)

    O'Neill, Katie; Osuna, Natalie; Edwards, Nick; Klink, Douglas; Swift, Jonathan; Vyhnal, Chris; Meyer, Kurt

    2016-01-01

    The Thacher School is in the process of renovating the campus observatory with a new meter class telescope and full automation capabilities for the purpose of scientific research and education. New equipment on site has provided a preliminary site characterization including seeing and V-band sky brightness measurements. These data, along with commissioning data from the MINERVA project (which uses comparable hardware) are used to estimate the capabilities of the observatory once renovation is complete. Our V-band limiting magnitude is expected to be better than 21.3 for a one minute integration time, and we estimate that milli-magnitude precision photometry will be possible for a V=14.5 point source over approximately 5 min timescales. The quick response, autonomous operation, and multi-band photometric capabilities of the renovated observatory will make it a powerful follow-up science facility for exoplanets, eclipsing binaries, near-Earth objects, stellar variability, and supernovae.
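
    As a rough check on the milli-magnitude photometry claim above, the standard relation between magnitude error and signal-to-noise ratio implies a required SNR of order 1000; the short sketch below (illustrative only, not taken from the commissioning analysis) makes the arithmetic explicit.

```python
import math

# Photon-noise-limited photometry: sigma_m ~ (2.5 / ln 10) / SNR ~ 1.0857 / SNR.
def required_snr(sigma_mag):
    """Signal-to-noise ratio needed to reach a given magnitude precision."""
    return 2.5 / math.log(10) / sigma_mag

def required_photons(sigma_mag):
    """Detected photons needed if the noise is purely Poissonian (SNR = sqrt(N))."""
    return required_snr(sigma_mag) ** 2

if __name__ == "__main__":
    sigma = 1e-3  # one milli-magnitude
    print(f"SNR for {sigma * 1e3:.0f} mmag precision: {required_snr(sigma):.0f}")
    print(f"Photons needed at the photon-noise limit: {required_photons(sigma):.2e}")
```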

  19. Future Large-Aperture Ultraviolet/Optical/Infrared Space Observatory

    NASA Technical Reports Server (NTRS)

    Thronson, Harley; Mandell, Avi; Polidan, Ron; Tumlinson, Jason

    2016-01-01

    Since the beginning of modern astronomical science in the early 1900s, astronomers have yearned to escape the turbulence and absorption of Earth's atmosphere by placing observatories in space. One of the first papers to lay out the advantages of space astronomy was by Lyman Spitzer in 1946, "Astronomical Advantages of an Extra-Terrestrial Observatory," though later in life he minimized the influence of this work. Since that time, and especially gaining momentum in the 1960s after the launch of Sputnik, astronomers, technologists, and engineers continued to advance, organizing scientific conferences, advocating for necessary technologies, and assessing sophisticated designs for increasingly ambitious space observations at ultraviolet, visual, and infrared (UVOIR) wavelengths. These community-wide endeavors, combined with the explosion in technological capability enabled by the Apollo era, led to rapid advancement in space observatory performance that culminated in the spectacularly successful Hubble Space Telescope (HST), launched in 1990 and still returning surpassing scientific results.

  20. Highly Adjustable Systems: An Architecture for Future Space Observatories

    NASA Astrophysics Data System (ADS)

    Arenberg, Jonathan; Conti, Alberto; Redding, David; Lawrence, Charles R.; Hachkowski, Roman; Laskin, Robert; Steeves, John

    2017-06-01

    Mission costs for groundbreaking space astronomical observatories are increasing to the point of unsustainability. We are investigating the use of adjustable or correctable systems as a means to reduce development and therefore mission costs. The poster introduces the promise and possibility of realizing a “net zero CTE” system for the general problem of observatory design and introduces the basic systems architecture we are considering. This poster concludes with an overview of our planned study and demonstrations for proving the value and worth of highly adjustable telescopes and systems ahead of the upcoming decadal survey.

  1. The Calar Alto Observatory: current status and future instrumentation

    NASA Astrophysics Data System (ADS)

    Barrado, D.; Thiele, U.; Aceituno, J.; Pedraz, S.; Sánchez, S. F.; Aguirre, A.; Alises, M.; Bergond, G.; Galadí, D.; Guijarro, A.; Hoyo, F.; Mast, D.; Montoya, L.; Sengupta, Ch.; de Guindos, E.; Solano, E.

    2011-11-01

    The Calar Alto Observatory, located at an altitude of 2168 m in continental Europe, hosts a significant number of astronomical telescopes and experiments covering a large range of the electromagnetic domain, from gamma-ray to near-infrared. It is a very well characterized site with excellent logistics. Its main telescopes include a large suite of instruments. At the present time, new instruments, namely CAFE, PANIC and Carmenes, are under development. We are also planning a new operational scheme in order to optimize the observatory's resources.

  2. Large scale tracking algorithms

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied to detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
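
    To make the combinatorial pressure concrete, the sketch below shows the simplest alternative to a multi-hypothesis tracker: greedy nearest-neighbor association of detections to existing tracks, which stays cheap but can mis-assign closely spaced objects. It is an illustrative toy with assumed data structures, not the algorithms evaluated in the report.

```python
import math

def greedy_associate(tracks, detections, gate=5.0):
    """Greedily pair each track's predicted position with its nearest detection.

    tracks      -- dict of track_id -> (x, y) predicted position
    detections  -- list of (x, y) measurements from the current frame
    gate        -- maximum association distance; farther detections stay unassigned
    Returns (assignments, unassigned_detection_indices).
    """
    assignments = {}
    free = list(range(len(detections)))
    # Sort candidate pairs by distance so the closest pairs are matched first.
    pairs = sorted(
        (math.dist(pos, detections[j]), tid, j)
        for tid, pos in tracks.items() for j in free
    )
    used_tracks, used_dets = set(), set()
    for dist, tid, j in pairs:
        if dist > gate:
            break
        if tid in used_tracks or j in used_dets:
            continue
        assignments[tid] = j
        used_tracks.add(tid)
        used_dets.add(j)
    unassigned = [j for j in free if j not in used_dets]
    return assignments, unassigned

# Example: two tracks, three detections (the third could seed a new track).
tracks = {1: (0.0, 0.0), 2: (10.0, 10.0)}
detections = [(0.5, -0.2), (10.3, 9.8), (50.0, 50.0)]
print(greedy_associate(tracks, detections))
```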

  3. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L. |; Rickert, M. |

    1997-04-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual persons' behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed of much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches are dependent both on the specific questions and on the prospective user community. The approaches reach from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.
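
    As an illustration of the vehicle-based, highly parallelizable microsimulations described above, here is a minimal single-lane cellular-automaton update rule in the spirit of the Nagel-Schreckenberg model; the parameters and setup are illustrative and this is not the production code reviewed in the paper.

```python
import random

def step(road, v_max=5, p_slow=0.3):
    """One update of a single-lane CA traffic model on a circular road.

    road -- list where road[i] is a vehicle speed (int) or None for an empty cell.
    Applies acceleration, gap-limited braking, random slowdown, and movement.
    """
    n = len(road)
    new_road = [None] * n
    for i, v in enumerate(road):
        if v is None:
            continue
        # 1. Accelerate toward the speed limit.
        v = min(v + 1, v_max)
        # 2. Brake to avoid the next vehicle ahead (gap = empty cells in front).
        gap = 0
        while gap < v and road[(i + gap + 1) % n] is None:
            gap += 1
        v = min(v, gap)
        # 3. Random slowdown (driver imperfection).
        if v > 0 and random.random() < p_slow:
            v -= 1
        # 4. Move.
        new_road[(i + v) % n] = v
    return new_road

# Example: 28-cell ring with 7 vehicles, all initially stopped.
road = [0 if i % 4 == 0 else None for i in range(28)]
for _ in range(5):
    road = step(road)
print(road)
```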

  4. Precision medicine in the age of big data: The present and future role of large-scale unbiased sequencing in drug discovery and development.

    PubMed

    Vicini, P; Fields, O; Lai, E; Litwack, E D; Martin, A-M; Morgan, T M; Pacanowski, M A; Papaluca, M; Perez, O D; Ringel, M S; Robson, M; Sakul, H; Vockley, J; Zaks, T; Dolsten, M; Søgaard, M

    2016-02-01

    High throughput molecular and functional profiling of patients is a key driver of precision medicine. DNA and RNA characterization has been enabled at unprecedented cost and scale through rapid, disruptive progress in sequencing technology, but challenges persist in data management and interpretation. We analyze the state of the art of large-scale unbiased sequencing (LUS) in drug discovery and development, including technology, application, ethical, regulatory, policy and commercial considerations, and discuss issues of LUS implementation in clinical and regulatory practice.

  5. Innovative telescope architectures for future large space observatories

    NASA Astrophysics Data System (ADS)

    Polidan, Ronald S.; Breckinridge, James B.; Lillie, Charles F.; MacEwen, Howard A.; Flannery, Martin R.; Dailey, Dean R.

    2016-10-01

    Over the past few years, we have developed a concept for an evolvable space telescope (EST) that is assembled on orbit in three stages, growing from a 4×12-m telescope in Stage 1, to a 12-m filled aperture in Stage 2, and then to a 20-m filled aperture in Stage 3. Stage 1 is launched as a fully functional telescope and begins gathering science data immediately after checkout on orbit. This observatory is then periodically augmented in space with additional mirror segments, structures, and newer instruments to evolve the telescope over the years to a 20-m space telescope. We discuss the EST architecture, the motivation for this approach, and the benefits it provides over current approaches to building and maintaining large space observatories.

  6. The new Arecibo Observatory Remote Optical Facility (AO-ROF) in Culebra Island, Puerto Rico: Current Status and Future Projects

    NASA Astrophysics Data System (ADS)

    Santos, P. T.

    2015-12-01

    The idea of establishing the Arecibo Observatory Remote Optical Facility (AO-ROF) on the island of Culebra is a solution to mitigate the ever-increasing amount of cloud, fog, and rain that has hindered observations at the Arecibo Observatory (AO) during major optical campaigns and observations. Given Culebra Island's favorable geographical and climatological characteristics, such as its low elevation and geographic location, it appears to have steadier weather conditions than Arecibo and therefore provides more availability for optical observations. Placed on Culebra, optical instruments can observe the same thermospheric volume over AO sampled by the Incoherent Scatter Radar (ISR). This capability will become especially important when the High Frequency (HF) facility is in operation. Small and large scale irregularities created by the HF facility can be readily observed and tracked from the Culebra site, and simultaneous observations from AO of the same atmospheric volume will permit direct vector measurements of the dynamical evolution of the irregularities. This work presents a discussion of the current status of the AO-ROF facility, as well as future projects.

  7. Challenges for Large Scale Simulations

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias

    2010-03-01

    With computational approaches becoming ubiquitous the growing impact of large scale computing on research influences both theoretical and experimental work. I will review a few examples in condensed matter physics and quantum optics, including the impact of computer simulations in the search for supersolidity, thermometry in ultracold quantum gases, and the challenging search for novel phases in strongly correlated electron systems. While only a decade ago such simulations needed the fastest supercomputers, many simulations can now be performed on small workstation clusters or even a laptop: what was previously restricted to a few experts can now potentially be used by many. Only part of the gain in computational capabilities is due to Moore's law and improvement in hardware. Equally impressive is the performance gain due to new algorithms - as I will illustrate using some recently developed algorithms. At the same time modern peta-scale supercomputers offer unprecedented computational power and allow us to tackle new problems and address questions that were impossible to solve numerically only a few years ago. While there is a roadmap for future hardware developments to exascale and beyond, the main challenges are on the algorithmic and software infrastructure side. Among the problems that face the computational physicist are: the development of new algorithms that scale to thousands of cores and beyond, a software infrastructure that lifts code development to a higher level and speeds up the development of new simulation programs for large scale computing machines, tools to analyze the large volume of data obtained from such simulations, and as an emerging field provenance-aware software that aims for reproducibility of the complete computational workflow from model parameters to the final figures. Interdisciplinary collaborations and collective efforts will be required, in contrast to the cottage-industry culture currently present in many areas of computational

  8. THE RECENT REJUVENATION OF THE SUN’S LARGE-SCALE MAGNETIC FIELD: A CLUE FOR UNDERSTANDING PAST AND FUTURE SUNSPOT CYCLES

    SciTech Connect

    Sheeley, N. R. Jr.; Wang, Y.-M.

    2015-08-20

    The quiet nature of sunspot cycle 24 was disrupted during the second half of 2014 when the Sun’s large-scale field underwent a sudden rejuvenation: the solar mean field reached its highest value since 1991, the interplanetary field strength doubled, and galactic cosmic rays showed their strongest 27-day modulation since neutron-monitor observations began in 1957; in the outer corona, the large increase of field strength was reflected by unprecedentedly large numbers of coronal loops collapsing inward along the heliospheric current sheet. Here, we show that this rejuvenation was not caused by a significant increase in the level of solar activity as measured by the smoothed sunspot number and CME rate, but instead was caused by the systematic emergence of flux in active regions whose longitudinal distribution greatly increased the Sun’s dipole moment. A similar post-maximum increase in the dipole moment occurred during each of the previous three sunspot cycles, and marked the start of the declining phase of each cycle. We note that the north–south component of this peak dipole moment provides an early indicator of the amplitude of the next cycle, and conclude that the amplitude of cycle 25 may be comparable to that of cycle 24, and well above the amplitudes obtained during the Maunder Minimum.

  9. Mercury Export from the Yukon River Basin: a unique opportunity to assess global atmospheric sources at large scales and potential future response to climate change

    NASA Astrophysics Data System (ADS)

    Schuster, P. F.; Streigl, R.; Dornblaser, M.; Aiken, G.; Krabbenhoft, D. P.; Dewild, J.; Butler, K.

    2010-12-01

    Mercury (Hg) is a global pollutant, and Hg methylation is impacting aquatic resources and posing a serious potential threat to human health and aquatic biota. The Yukon River Basin (YRB) is the fourth largest drainage basin in North America (about twice the size of California), and the Yukon River has been defined as the largest free-flowing river in the world (Nilsson, et al., 2005). The basin's vast size, relatively pristine condition, naturally high sediment, dissolved and particulate organic carbon (OC) loads, and diverse sub-basin terrain provide a unique hydrologic setting to assess mercury (Hg) export from global atmospheric sources at large scales. Annually, the Yukon River exports about 4400 kg total Hg, 16 kg methylmercury, and 2.3 × 10^9 kg total OC. Hg yields calculated from four other major northern rivers ranged from 5 to 31 percent of the yields from the Yukon River. Concentrations of Hg and OC in the Yukon River and its major tributaries were highest during high flow, with total OC exceeding 30 mg/L and total Hg in excess of 40 ng/L (3 times higher than the EPA aquatic life Hg standard for adverse chronic effects to biota). Results from 5 consecutive years of Hg and OC measurements throughout the YRB show that total OC yields account for 87-92% of the variance of total Hg yields from the YRB, suggesting that OC yields can be used as a reliable predictor of Hg yields. As warming trends at high latitudes continue, permafrost is thawing (Jorgenson and others, 2006). Permafrost thaw will likely lead to changes in ground water flows (Walvoord and Striegl, 2007) and in the quantity and quality of OC in rivers, streams, and lakes (Striegl and others, 2005; 2007). A better understanding of Hg-OC interactions in large-scale northern ecosystems is important given the vast reservoirs of OC that exist in arctic regions. In northern regions, permafrost is a potential terrestrial landscape source of Hg to fluvial systems. Recent investigations measuring
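
    The reported result that OC yields explain 87-92% of the variance in total Hg yields amounts to a simple linear regression of Hg yield on OC yield; the sketch below shows that calculation on clearly synthetic, illustrative numbers, not the YRB measurements.

```python
import numpy as np

# Synthetic, illustrative yields (NOT Yukon River Basin data):
# total organic carbon yield (kg/km^2/yr) and total Hg yield (g/km^2/yr).
oc_yield = np.array([1200.0, 1850.0, 950.0, 2400.0, 1600.0, 2100.0])
hg_yield = np.array([2.1, 3.4, 1.8, 4.6, 2.9, 3.9])

# Least-squares fit hg = a * oc + b, and the fraction of variance explained (R^2).
a, b = np.polyfit(oc_yield, hg_yield, 1)
predicted = a * oc_yield + b
r_squared = 1.0 - np.sum((hg_yield - predicted) ** 2) / np.sum(
    (hg_yield - hg_yield.mean()) ** 2
)

print(f"slope = {a:.4f} g Hg per kg OC, intercept = {b:.2f}")
print(f"R^2 = {r_squared:.2f}  (the record reports 0.87-0.92 for the real data)")
```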

  10. Large Scale Magnetostrictive Valve Actuator

    NASA Technical Reports Server (NTRS)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

    Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware and testing is complete. This paper will discuss the potential applications of the technology; give an overview of the as-built actuator design; describe problems that were uncovered during development testing; review test data and evaluate weaknesses of the design; and discuss areas for improvement for future work. This actuator holds promise as a low-power, high-load, proportionally controlled actuator for valves requiring 440 to 1500 newtons of load.
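
    The statement that axial elongation is accompanied by radial constriction with no net volume change can be made quantitative; for small strains in a cylindrical rod, conservation of volume gives the standard relation below (a textbook result, not derived in the record).

```latex
% Cylinder of length L and radius r: V = \pi r^2 L.
% Requiring dV = 0 for small strains \varepsilon_z = dL/L and \varepsilon_r = dr/r:
\[
\frac{dV}{V} = 2\,\frac{dr}{r} + \frac{dL}{L} = 0
\quad\Longrightarrow\quad
\varepsilon_r = -\tfrac{1}{2}\,\varepsilon_z ,
\]
% i.e. an effective Poisson ratio of 0.5 for the magnetostrictive strain.
```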

  11. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large scale systems, and tools specific to this class of large systems are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification made of the different existing approaches for dealing with large scale systems. A very similar classification is used here, even though the papers surveyed are somewhat different from the ones reviewed in other papers. Special attention is given to the applicability of the existing methods to controlling large mechanical systems like large space structures. Some recent developments are added to this survey.

  12. A comparative study of large-scale atmospheric circulation in the context of a future scenario (RCP4.5) and past warmth (mid-Pliocene)

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Ramstein, G.; Contoux, C.; Zhou, T.

    2013-07-01

    The mid-Pliocene warm period (~ 3.3-3.0 Ma) is often considered as the last sustained warm period with close enough geographic configurations compared to the present one associated with atmospheric CO2 concentration (405 ± 50 ppm) higher than the modern level. For this reason, this period is often considered as a potential analogue for the future climate warming, with the important advantage that for mid-Pliocene many marine and continental data are available. To investigate this issue, we selected the RCP4.5 scenario, one of the current available future projections, to compare the pattern of tropical atmospheric response with the past warm mid-Pliocene climate. We use three Atmosphere-Ocean General Circulation Model (AOGCM) simulations (RCP4.5 scenario, mid-Pliocene and present-day simulation) carried out with the IPSL-CM5A model and investigate atmospheric tropical dynamics through Hadley and Walker cell responses to warmer conditions, considering that the analysis can provide some assessment of how these circulations will change in the future. Our results show that there is a damping of the Hadley cell intensity in the northern tropics and an increase in both subtropics. Moreover, northern and southern Hadley cells expand poleward. The response of the Hadley cells is stronger for the RCP4.5 scenario than for the mid-Pliocene, but in very good agreement with the fact that the atmospheric CO2 concentration is higher in the future scenario than in the mid-Pliocene (543 versus 405 ppm). Concerning the response of the Walker cell, we show that despite very large similarities, there are also some differences. Common features to both scenarios are: weakening of the ascending branch, leading to a suppression of the precipitation over the western tropical Pacific. The response of the Walker cell is stronger in the RCP4.5 scenario than in the mid-Pliocene but also depicts some major differences, as an eastward shift of its rising branch in the future scenario compared to

  13. Current and future facility instruments at the Gemini Observatory

    NASA Astrophysics Data System (ADS)

    Jensen, Joseph B.; Kleinman, Scot J.; Simons, Douglas A.; Lazo, Manuel; Rigaut, François; White, John K.

    2008-07-01

    At the present time, several new Gemini instruments are being delivered and commissioned. The Near-Infrared Coronagraph has been extensively tested and commissioned on the Gemini-South telescope, and will soon begin a large survey to discover extrasolar planets. The FLAMINGOS-2 near-IR multi-object spectrograph is nearing completion at the University of Florida, and is expected to be delivered to Gemini-South by the end of 2008. Gemini's Multi-Conjugate Adaptive Optics bench has been successfully integrated and tested in the lab, and now awaits integration with the laser system and the Gemini-South AO Imager on the telescope. We also describe our efforts to repair thermal damage to the Gemini Near-IR Spectrograph that occurred last year. Since the last update, progress has been made on several of Gemini's next generation of ambitious "Aspen" instruments. The Gemini Planet Imager is now in the final design phase, and construction is scheduled to begin shortly. Two competitive conceptual design studies for the Wide-Field Fiber Multi-Object Spectrometer have now started. The Mauna Kea ground layer monitoring campaign has collected data for well over a year in support of the planning process for a future Ground Layer Adaptive Optics system.

  14. A comparative study of large scale atmospheric circulation in the context of future scenario (RCP4.5) and past warmth (Mid Pliocene)

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Ramstein, G.; Contoux, C.; Zhou, T.

    2013-03-01

    The Pliocene climate (3.3-3.0 Ma) is often considered the last sustained warm period with geographic configurations close enough to the present ones and with an atmospheric CO2 concentration (405 ± 50 ppm) higher than the modern level. It has therefore been suggested that the warm Pliocene climate may provide a plausible scenario for future climate warming, with the important advantage that, for the mid-Pliocene, many marine and continental data are available. To investigate this issue, we selected the RCP4.5 scenario, one of the currently available future projections, to compare the pattern of tropical atmospheric response with the past warm mid-Pliocene climate. We performed three OAGCM simulations (RCP4.5 scenario, mid-Pliocene, and present-day) with the IPSL-CM5A model and investigated atmospheric tropical dynamics through Hadley and Walker cell responses to warmer conditions. Our results show that there is a damping of the Hadley cell intensity in the northern tropics and an increase in both subtropics. Moreover, the northern and southern Hadley cells expand poleward. The response of the Hadley cell is stronger for the RCP4.5 scenario than for the mid-Pliocene, but in very good agreement with the fact that the atmospheric CO2 concentration is higher in the future scenario than in the mid-Pliocene (543 versus 405 ppm). Concerning the response of the Walker cell, we showed that, despite very large similarities, there are also some differences. The features common to both scenarios are a weakening of the ascending branch, leading to a suppression of the precipitation over the western tropical Pacific. The response of the Walker cell is stronger in the RCP4.5 scenario than in the mid-Pliocene but also shows some major differences, such as an eastward shift of its rising branch in the future scenario compared to the mid-Pliocene. In this paper, we explain the dynamics of the Hadley and Walker cells, and show that despite minor discrepancies, the mid-Pliocene is certainly an interesting

  15. Searches for Large-Scale Anisotropy in the Arrival Directions of Cosmic Rays Detected above Energy of $10^{19}$ eV at the Pierre Auger Observatory and the Telescope Array

    SciTech Connect

    Aab, Alexander; et al,

    2014-10-07

    Spherical harmonic moments are well-suited for capturing anisotropy at any scale in the flux of cosmic rays. An unambiguous measurement of the full set of spherical harmonic coefficients requires full-sky coverage. This can be achieved by combining data from observatories located in both the northern and southern hemispheres. To this end, a joint analysis using data recorded at the Telescope Array and the Pierre Auger Observatory above 10^19 eV is presented in this work. The resulting multipolar expansion of the flux of cosmic rays allows us to perform a series of anisotropy searches, and in particular to report on the angular power spectrum of cosmic rays above 10^19 eV. No significant deviation from isotropic expectations is found throughout the analyses performed. Upper limits on the amplitudes of the dipole and quadrupole moments are derived as a function of the direction in the sky, varying between 7% and 13% for the dipole and between 7% and 10% for a symmetric quadrupole.
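
    For context, the angular power spectrum referred to above is built from the spherical-harmonic coefficients of the flux; with the usual convention (a standard definition, not quoted from the record):

```latex
% Expansion of the cosmic-ray flux over the sphere and the angular power spectrum.
\[
\Phi(\mathbf{n}) = \sum_{\ell \ge 0} \sum_{m=-\ell}^{\ell} a_{\ell m}\, Y_{\ell m}(\mathbf{n}),
\qquad
C_\ell = \frac{1}{2\ell + 1} \sum_{m=-\ell}^{\ell} \left| a_{\ell m} \right|^2 ,
\]
% Full-sky coverage is what allows every a_{lm} to be measured without
% assumptions about the unseen part of the sky.
```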

  16. Searches for large-scale anisotropy in the arrival directions of cosmic rays detected above energy of 10^19 eV at the Pierre Auger Observatory and the Telescope Array

    SciTech Connect

    Aab, A.; Abreu, P.; Andringa, S.; Aglietta, M.; Ahn, E. J.; Al Samarai, I.; Albuquerque, I. F. M.; Allekotte, I.; Asorey, H.; Allen, J.; Allison, P.; Almela, A.; Castillo, J. Alvarez; Alvarez-Muñiz, J.; Batista, R. Alves; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Arqueros, F.; Collaboration: Pierre Auger Collaboration; Telescope Array Collaboration; and others

    2014-10-20

    Spherical harmonic moments are well-suited for capturing anisotropy at any scale in the flux of cosmic rays. An unambiguous measurement of the full set of spherical harmonic coefficients requires full-sky coverage. This can be achieved by combining data from observatories located in both the northern and southern hemispheres. To this end, a joint analysis using data recorded at the Telescope Array and the Pierre Auger Observatory above 10^19 eV is presented in this work. The resulting multipolar expansion of the flux of cosmic rays allows us to perform a series of anisotropy searches, and in particular to report on the angular power spectrum of cosmic rays above 10^19 eV. No significant deviation from isotropic expectations is found throughout the analyses performed. Upper limits on the amplitudes of the dipole and quadrupole moments are derived as a function of the direction in the sky, varying between 7% and 13% for the dipole and between 7% and 10% for a symmetric quadrupole.

  17. Searches for Large-scale Anisotropy in the Arrival Directions of Cosmic Rays Detected above Energy of 10^19 eV at the Pierre Auger Observatory and the Telescope Array

    NASA Astrophysics Data System (ADS)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Barber, K. B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dorofeev, A.; Dorosti Hasankiadeh, Q.; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fox, B. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Fuji, T.; Gaior, R.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; González, N.; Gookin, B.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Islo, K.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; La Rosa, G.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Maccarone, M. C.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J. A. 
J.; Matthews, J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Messina, S.; Meyhandan, R.; Mićanović, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morello, C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, L.; Ochilo, L.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pękala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Peters, C.; Petrera, S.; Petrolini, A.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez Cabo, I.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanič, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. A.; Tapia, A.; Tartare, M.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Pierre Auger Collaboration; Abbasi, R. U.; Abe, M.; Abu-Zayyad, T.; Allen, M.; Anderson, R.; Azuma, R.; Barcikowski, E.; Belz, J. W.; Bergman, D. R.; Blake, S. A.; Cady, R.; Chae, M. J.; Cheon, B. G.; Chiba, J.; Chikawa, M.; Cho, W. 
R.; Fujii, T.; Fukushima, M.; Goto, T.; Hanlon, W.; Hayashi, Y.; Hayashida, N.; Hibino, K.; Honda, K.; Ikeda, D.; Inoue, N.; Ishii, T.; Ishimori, R.; Ito, H.; Ivanov, D.; Jui, C. C. H.; Kadota, K.; Kakimoto, F.; Kalashev, O.; Kasahara, K.; Kawai, H.; Kawakami, S.; Kawana, S.; Kawata, K.; Kido, E.; Kim, H. B.; Kim, J. H.; Kim, J. H.; Kitamura, S.; Kitamura, Y.; Kuzmin, V.; Kwon, Y. J.; Lan, J.; Lim, S. I.; Lundquist, J. P.; Machida, K.; Martens, K.; Matsuda, T.; Matsuyama, T.; Matthews, J. N.; Minamino, M.; Mukai, K.; Myers, I.; Nagasawa, K.; Nagataki, S.; Nakamura, T.; Nonaka, T.; Nozato, A.; Ogio, S.; Ogura, J.; Ohnishi, M.; Ohoka, H.; Oki, K.; Okuda, T.; Ono, M.; Oshima, A.; Ozawa, S.; Park, I. H.; Pshirkov, M. S.; Rodriguez, D. C.; Rubtsov, G.; Ryu, D.; Sagawa, H.; Sakurai, N.; Sampson, A. L.; Scott, L. M.; Shah, P. D.; Shibata, F.; Shibata, T.; Shimodaira, H.; Shin, B. K.; Smith, J. D.; Sokolsky, P.; Springer, R. W.; Stokes, B. T.; Stratton, S. R.; Stroman, T. A.; Suzawa, T.; Takamura, M.; Takeda, M.; Takeishi, R.; Taketa, A.; Takita, M.; Tameda, Y.; Tanaka, H.; Tanaka, K.; Tanaka, M.; Thomas, S. B.; Thomson, G. B.; Tinyakov, P.; Tkachev, I.; Tokuno, H.; Tomida, T.; Troitsky, S.; Tsunesada, Y.; Tsutsumi, K.; Uchihori, Y.; Udo, S.; Urban, F.; Vasiloff, G.; Wong, T.; Yamane, R.; Yamaoka, H.; Yamazaki, K.; Yang, J.; Yashiro, K.; Yoneda, Y.; Yoshida, S.; Yoshii, H.; Zollinger, R.; Zundel, Z.; Telescope Array Collaboration

    2014-10-01

    Spherical harmonic moments are well-suited for capturing anisotropy at any scale in the flux of cosmic rays. An unambiguous measurement of the full set of spherical harmonic coefficients requires full-sky coverage. This can be achieved by combining data from observatories located in both the northern and southern hemispheres. To this end, a joint analysis using data recorded at the Telescope Array and the Pierre Auger Observatory above 10^19 eV is presented in this work. The resulting multipolar expansion of the flux of cosmic rays allows us to perform a series of anisotropy searches, and in particular to report on the angular power spectrum of cosmic rays above 10^19 eV. No significant deviation from isotropic expectations is found throughout the analyses performed. Upper limits on the amplitudes of the dipole and quadrupole moments are derived as a function of the direction in the sky, varying between 7% and 13% for the dipole and between 7% and 10% for a symmetric quadrupole.

  18. The Nonmydriatic Fundus Camera in Diabetic Retinopathy Screening: A Cost-Effective Study with Evaluation for Future Large-Scale Application

    PubMed Central

    Scarpa, Giuseppe; Urban, Francesca; Tessarin, Michele; Gallo, Giovanni; Midena, Edoardo

    2016-01-01

    Aims. The study aimed to present the experience of a screening programme for early detection of diabetic retinopathy (DR) using a nonmydriatic fundus camera, evaluating the feasibility in terms of validity, resource absorption, and future advantages of a potential application in an Italian local health authority. Methods. Diabetic patients living in the town of Ponzano, Veneto Region (Northern Italy), were invited to enrol in the screening programme. The “no prevention” strategy, including an estimation of blindness-related costs, was compared with screening costs through a budget impact approach in order to evaluate a future extensive and feasible implementation of the procedure. Results. Of the 498 eligible diabetic patients, 80% were enrolled in the screening programme. 115 patients (34%) were referred to an ophthalmologist and 9 cases required prompt treatment for either proliferative DR or macular edema. Based on the pilot data, it emerged that an extensive use of the investigated screening programme within the Greater Treviso area could prevent 6 cases of blindness every year, resulting in a saving of €271,543.32 (−13.71%). Conclusions. Fundus images obtained with a nonmydriatic fundus camera could be considered an effective, cost-sparing, and feasible screening tool for the early detection of DR, preventing blindness as a result of diabetes. PMID:27885337

  19. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single design methodology rather than on trade-offs, and the incompatibility of large-scale optimization with single-program, single-computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique and can then embody the definitions of design variables, objective function, and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  20. Large-scale circuit simulation

    NASA Astrophysics Data System (ADS)

    Wei, Y. P.

    1982-12-01

    The simulation of VLSI (Very Large Scale Integration) circuits falls beyond the capabilities of conventional circuit simulators like SPICE. On the other hand, conventional logic simulators can only give the results of logic levels 1 and 0, with the attendant loss of detail in the waveforms. The aim of developing large-scale circuit simulation is to bridge the gap between conventional circuit simulation and logic simulation. This research investigates new approaches for fast and relatively accurate time-domain simulation of MOS (Metal Oxide Semiconductor), LSI (Large Scale Integration), and VLSI circuits. New techniques and new algorithms are studied in the following areas: (1) analysis sequencing, (2) nonlinear iteration, (3) the modified Gauss-Seidel method, and (4) latency criteria and timestep control schemes. The developed methods have been implemented in a simulation program, PREMOS, which can be used as a design verification tool for MOS circuits.
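
    Item (3) above builds on plain Gauss-Seidel relaxation; a minimal sketch of the unmodified scheme applied to a linear nodal system is shown below. The matrix, sources, and function names are illustrative, and the modifications used in PREMOS are not described in enough detail in the record to reproduce.

```python
import numpy as np

def gauss_seidel(G, i_src, v0=None, tol=1e-9, max_iter=200):
    """Solve the nodal equations G v = i by Gauss-Seidel relaxation.

    G     -- conductance matrix (n x n), assumed diagonally dominant
    i_src -- source current vector (length n)
    Each sweep updates one node voltage at a time, immediately reusing the
    freshest values of the other nodes (the key difference from Jacobi).
    """
    n = len(i_src)
    v = np.zeros(n) if v0 is None else np.array(v0, dtype=float)
    for _ in range(max_iter):
        max_change = 0.0
        for k in range(n):
            # Residual current at node k with the diagonal term removed.
            residual = i_src[k] - G[k] @ v + G[k, k] * v[k]
            new_vk = residual / G[k, k]
            max_change = max(max_change, abs(new_vk - v[k]))
            v[k] = new_vk
        if max_change < tol:
            break
    return v

# Example: a small 3-node resistive network.
G = np.array([[ 3.0, -1.0, -1.0],
              [-1.0,  4.0, -2.0],
              [-1.0, -2.0,  5.0]])
i_src = np.array([1.0, 0.0, 2.0])
print(gauss_seidel(G, i_src))
print(np.linalg.solve(G, i_src))  # direct solution, for comparison
```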

  1. Large Scale Dynamos in Stars

    NASA Astrophysics Data System (ADS)

    Vishniac, Ethan T.

    2015-01-01

    We show that a differentially rotating conducting fluid automatically creates a magnetic helicity flux with components along the rotation axis and in the direction of the local vorticity. This drives a rapid growth in the local density of current helicity, which in turn drives a large scale dynamo. The dynamo growth rate derived from this process is not constant, but depends inversely on the large scale magnetic field strength. This dynamo saturates when buoyant losses of magnetic flux compete with the large scale dynamo, providing a simple prediction for magnetic field strength as a function of Rossby number in stars. Increasing anisotropy in the turbulence produces a decreasing magnetic helicity flux, which explains the flattening of the B/Rossby number relation at low Rossby numbers. We also show that the kinetic helicity is always a subdominant effect. There is no kinematic dynamo in real stars.

  2. Galaxy clustering on large scales.

    PubMed Central

    Efstathiou, G

    1993-01-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h^-1 Mpc, where the Hubble constant H0 = 100h km s^-1 Mpc^-1; 1 pc = 3.09 × 10^16 m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400
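
    The two-point correlation function used throughout this record has the standard definition (not restated in the abstract): the excess probability, over random, of finding a pair of galaxies at separation r, with the power spectrum as its Fourier transform.

```latex
% Joint probability of finding galaxies in volume elements dV_1 and dV_2
% separated by r, for a population with mean number density \bar{n}:
\[
dP = \bar{n}^{2}\,\bigl[\,1 + \xi(r)\,\bigr]\, dV_1\, dV_2 ,
\qquad
P(k) = \int \xi(r)\, e^{-i\mathbf{k}\cdot\mathbf{r}}\, d^{3}r .
\]
```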

  3. Large scale structure of the sun's radio corona

    NASA Technical Reports Server (NTRS)

    Kundu, M. R.

    1986-01-01

    Results of studies of large scale structures of the corona at long radio wavelengths are presented, using data obtained with the multifrequency radioheliograph of the Clark Lake Radio Observatory. It is shown that features corresponding to coronal streamers and coronal holes are readily apparent in the Clark Lake maps.

  4. Property and instrumental heritage of the Bordeaux Astronomical Observatory; What future?

    NASA Astrophysics Data System (ADS)

    de La Noë, J.; Charlot, P.; Grousset, F.

    2009-11-01

    In the 1870s, the Government of the Third Republic decided to develop scientific and technical research. This effort contributed to supporting and creating universities and other institutes such as astronomical observatories. The dual wish of the Bordeaux council and of professors at the Faculté des Sciences de Bordeaux led to the foundation of the astronomical Observatory of Bordeaux. It was set up by Georges Rayet in the 1880s. The observatory owns a property of 12 hectares with a dozen buildings, five domes each housing an instrument, a Würzburg radiotelescope, a 2.5 meter radiotelescope, and a large collection of about 250 instruments, 4,500 photographic plates, drawings, slides for teaching astronomy, maps of the Carte du Ciel, and 200 files of archives. In addition, the library contains about a thousand books from the period 1600-1950. The future of the observatory is not clear at the present time, as the Laboratoire d'Astrophysique will move to the campus in a few years.

  5. A Future Large-Aperture UVOIR Space Observatory: Key Technologies and Capabilities

    NASA Technical Reports Server (NTRS)

    Bolcar, Matthew Ryan; Stahle, Carl M.; Balasubramaniam, Kunjithapatham; Clampin, Mark; Feinberg, Lee D.; Mosier, Gary E.; Quijada, Manuel A.; Rauscher, Bernard J.; Redding, David C.; Rioux, Norman M.; hide

    2015-01-01

    We present the key technologies and capabilities that will enable a future, large-aperture ultraviolet/optical/infrared (UVOIR) space observatory. These include starlight suppression systems, vibration isolation and control systems, lightweight mirror segments, detector systems, and mirror coatings. These capabilities will provide major advances over current and near-future observatories in sensitivity, angular resolution, and starlight suppression. The goals adopted in our study for the starlight suppression system are 10^-10 contrast with an inner working angle of 20 milliarcsec and a broad bandpass. We estimate that a vibration isolation and control system that achieves a total system vibration isolation of 140 dB for a vibration-isolated mass of 5000 kg is required to achieve the high wavefront error stability needed for exoplanet coronagraphy. Technology challenges for lightweight mirror segments include diffraction-limited optical quality and high wavefront error stability as well as low cost, low mass, and rapid fabrication. Key challenges for the detector systems include visible-blind, high quantum efficiency UV arrays; photon counting visible and NIR arrays for coronagraphic spectroscopy and starlight wavefront sensing and control; and detectors with deep full wells, low persistence, and radiation tolerance to enable transit imaging and spectroscopy at all wavelengths. Finally, mirror coatings with high reflectivity (>90%), high uniformity (<1%) and low polarization (<1%) that are scalable to large-diameter mirror substrates will be essential for ensuring that both high throughput UV observations and high contrast observations can be performed by the same observatory.
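
    A note on the 140 dB figure: if, as is conventional for transmissibility, the isolation is quoted as 20 log10 of an amplitude ratio (an assumption here, since the record does not state the convention), it corresponds to the suppression factor below.

```latex
% Amplitude attenuation implied by 140 dB of vibration isolation,
% assuming the 20*log10 (amplitude-ratio) convention:
\[
140\ \mathrm{dB} = 20 \log_{10}\!\left(\frac{A_{\mathrm{in}}}{A_{\mathrm{out}}}\right)
\quad\Longrightarrow\quad
\frac{A_{\mathrm{in}}}{A_{\mathrm{out}}} = 10^{7}.
\]
```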

  6. National Ecological Observatory Network's (NEON) future role in US carbon cycling and budgets

    NASA Astrophysics Data System (ADS)

    Loescher, H. W.

    2015-12-01

    The US National Ecological Observatory Network (NEON) is a National Science Foundation investment designed to observe the impacts of large-scale environmental changes on the nation's ecosystems for 30 years with rigorous consistency. NEON does this through the construction (and operation) of new physical infrastructure and data infrastructure distributed across the North American continent, including 47 terrestrial and 32 aquatic sites. Key to its design is its ability to provide ecosystem-scale measurements of carbon stores, fluxes, and processes, and the means to scale them from local to regional scales via remote-sensing aircraft. NEON will collect these carbon data as a facility and provide them openly. NEON will not perform any high-level synthesis; rather, the carbon data are an open resource for the research, private, and public communities alike. Overall, these data are also harmonized with other international carbon-based infrastructures to facilitate cross-continental understanding and global carbon syntheses. Products, engagement, and harmonization of data to facilitate syntheses will be discussed.

  7. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  8. Large-Scale Visual Data Analysis

    NASA Astrophysics Data System (ADS)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods, and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high performance visualization research challenges and opportunities.

  9. Cosmology with Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Ho, Shirley; Cuesta, A.; Ross, A.; Seo, H.; DePutter, R.; Padmanabhan, N.; White, M.; Myers, A.; Bovy, J.; Blanton, M.; Hernandez, C.; Mena, O.; Percival, W.; Prada, F.; Ross, N. P.; Saito, S.; Schneider, D.; Skibba, R.; Smith, K.; Slosar, A.; Strauss, M.; Verde, L.; Weinberg, D.; Bachall, N.; Brinkmann, J.; da Costa, L. A.

    2012-01-01

    The Sloan Digital Sky Survey I-III surveyed 14,000 square degrees and delivered over a trillion pixels of imaging data. I present cosmological results from this unprecedented data set, which contains over a million galaxies distributed between redshifts of 0.45 and 0.70. With such a large data volume, high precision cosmological constraints can be obtained given careful control and understanding of observational systematics. I present a novel treatment of observational systematics and its application to the clustering signals from the data set. I will present cosmological constraints on the dark components of the Universe and the tightest constraints to date on the non-Gaussianity of the early Universe, utilizing large-scale structure.

  10. Large scale biomimetic membrane arrays.

    PubMed

    Hansen, Jesper S; Perry, Mark; Vogel, Jörg; Groth, Jesper S; Vissing, Thomas; Larsen, Marianne S; Geschke, Oliver; Emneús, Jenny; Bohr, Henrik; Nielsen, Claus H

    2009-10-01

    To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro-structured 8 × 8 aperture partition arrays with average aperture diameters of 301 ± 5 µm. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnologically and physiologically relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 × 24 and hexagonal 24 × 27 aperture arrays, respectively. The results presented show that the design is suitable for further development of sensitive biosensor assays, and furthermore demonstrate that the design can conveniently be scaled up to support planar lipid bilayers in large square-centimeter partition arrays.

  11. Improving Recent Large-Scale Pulsar Surveys

    NASA Astrophysics Data System (ADS)

    Cardoso, Rogerio Fernando; Ransom, S.

    2011-01-01

    Pulsars are unique in that they act as celestial laboratories for precise tests of gravity and other extreme physics (Kramer 2004). There are approximately 2000 known pulsars today, which is less than ten percent of the pulsars in the Milky Way according to theoretical models (Lorimer 2004). Of these 2000 known pulsars, approximately ten percent are known millisecond pulsars, objects used for their period stability for detailed physics tests and searches for gravitational radiation (Lorimer 2008). As the field and instrumentation progress, pulsar astronomers attempt to overcome observational biases and detect new pulsars, consequently discovering new millisecond pulsars. We attempt to improve large scale pulsar surveys by examining three recent pulsar surveys. The first, the Green Bank Telescope 350 MHz Drift Scan, a low frequency isotropic survey of the northern sky, has yielded a large number of candidates that were visually inspected and identified, resulting in over 34,000 candidates viewed, dozens of detections of known pulsars, and the discovery of a new low-flux pulsar, PSRJ1911+22. The second, the PALFA survey, is a high frequency survey of the galactic plane with the Arecibo telescope. We created a processing pipeline for the PALFA survey at the National Radio Astronomy Observatory in Charlottesville, VA, in addition to making needed modifications upon advice from the PALFA consortium. The third survey examined is a new GBT 820 MHz survey devoted to finding new millisecond pulsars by observing the target-rich environment of unidentified sources in the FERMI LAT catalogue. By approaching these three pulsar surveys at different stages, we seek to improve the success rates of large scale surveys, and hence the possibility for ground-breaking work in both basic physics and astrophysics.

  12. Take a look at the ancient observatories in Iran and prospects for the future

    NASA Astrophysics Data System (ADS)

    Kayanikhoo, F.; Bahrani, F.

    2014-12-01

    In this article, we introduce the ancient observatories of Iran and study the applications of two of them in ancient times. We then introduce one of the robotic observatories of Iran, located at the University of Kashan, and describe the features of the Iranian National Observatory, a robotic observatory currently under construction.

  13. Large-scale PACS implementation.

    PubMed

    Carrino, J A; Unkel, P J; Miller, I D; Bowser, C L; Freckleton, M W; Johnson, T G

    1998-08-01

    The transition to filmless radiology is a much more formidable task than making the request for proposal to purchase a picture archiving and communications system (PACS). The Department of Defense and the Veterans Administration have been pioneers in the transformation of medical diagnostic imaging to the electronic environment. Many civilian sites are expected to implement large-scale PACS in the next five to ten years. This presentation will relate the empirical insights gleaned at our institution from a large-scale PACS implementation. Our PACS integration was introduced into a fully operational department (not a new hospital) in which work flow had to continue with minimal impact. Impediments to user acceptance will be addressed. The critical components of this enormous task will be discussed. The topics covered during this session will include issues such as phased implementation, DICOM (digital imaging and communications in medicine) standard-based interaction of devices, hospital information system (HIS)/radiology information system (RIS) interfaces, user approval, networking, workstation deployment, and backup procedures. The presentation will make specific suggestions regarding the implementation team, operating instructions, quality control (QC), training, and education. The concept of identifying key functional areas is relevant to transitioning the facility to be entirely on line. Special attention must be paid to specific functional areas such as the operating rooms and trauma rooms, where the clinical requirements may not match the PACS capabilities. The printing of films may be necessary in certain circumstances. The integration of teleradiology and remote clinics into a PACS is a salient topic with respect to the overall role of the radiologists providing rapid consultation. A Web-based server allows a clinician to review images and reports on a desktop (personal) computer and thus reduces the number of dedicated PACS review workstations.

  14. Large scale cluster computing workshop

    SciTech Connect

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors, to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  15. Large-Scale Sequence Comparison.

    PubMed

    Lal, Devi; Verma, Mansi

    2017-01-01

    There are millions of sequences deposited in genomic databases, and it is an important task to categorize them according to their structural and functional roles. Sequence comparison is a prerequisite for proper categorization of both DNA and protein sequences, and helps in assigning a putative or hypothetical structure and function to a given sequence. There are various methods available for comparing sequences, alignment being the first and foremost, both for sequences with a small number of base pairs and for large-scale genome comparison. Various tools are available for performing pairwise comparison of large sequences. The best known tools either perform global alignment or generate local alignments between the two sequences. In this chapter we first provide basic information regarding sequence comparison. This is followed by a description of the PAM and BLOSUM matrices that form the basis of sequence comparison. We also give a practical overview of currently available methods such as BLAST and FASTA, followed by a description and overview of tools available for genome comparison, including LAGAN, MUMmer, BLASTZ, and AVID.
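
    As a concrete illustration of alignment-based comparison (a toy global-alignment scorer with a simple match/mismatch/gap scheme, not the BLAST or FASTA heuristics, and without a PAM/BLOSUM matrix):

        def global_alignment_score(a, b, match=1, mismatch=-1, gap=-2):
            """Needleman-Wunsch global alignment score by dynamic programming."""
            n, m = len(a), len(b)
            score = [[0] * (m + 1) for _ in range(n + 1)]
            for i in range(1, n + 1):
                score[i][0] = i * gap
            for j in range(1, m + 1):
                score[0][j] = j * gap
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
            return score[n][m]

        print(global_alignment_score("GATTACA", "GCATGCT"))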

  16. Large Scale Homing in Honeybees

    PubMed Central

    Pahl, Mario; Zhu, Hong; Tautz, Jürgen; Zhang, Shaowu

    2011-01-01

    Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses and by integrating their outbound flight paths. In order to investigate the capacity of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large scale homing is facilitated by global landmarks acting as beacons, and possibly the entire skyline panorama. PMID:21602920

  17. An Engineering Design Reference Mission for a Future Large-Aperture UVOIR Space Observatory

    NASA Astrophysics Data System (ADS)

    Thronson, Harley A.; Bolcar, Matthew R.; Clampin, Mark; Crooke, Julie A.; Redding, David; Rioux, Norman; Stahl, H. Philip

    2016-01-01

    From the 2010 NRC Decadal Survey and the NASA Thirty-Year Roadmap, Enduring Quests, Daring Visions, to the recent AURA report, From Cosmic Birth to Living Earths, multiple community assessments have recommended development of a large-aperture UVOIR space observatory capable of achieving a broad range of compelling scientific goals. Of these priority science goals, the most technically challenging is the search for spectroscopic biomarkers in the atmospheres of exoplanets in the solar neighborhood. Here we present an engineering design reference mission (EDRM) for the Advanced Technology Large-Aperture Space Telescope (ATLAST), which was conceived from the start as capable of breakthrough science paired with an emphasis on cost control and cost effectiveness. An EDRM allows the engineering design trade space to be explored in depth to determine what are the most demanding requirements and where there are opportunities for margin against requirements. Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. The ATLAST observatory is designed to operate at a Sun-Earth L2 orbit, which provides a stable thermal environment and excellent field of regard. Our reference designs have emphasized a serviceable 36-segment 9.2 m aperture telescope that stows within a five-meter diameter launch vehicle fairing. As part of our cost-management effort, this particular reference mission builds upon the engineering design for JWST. Moreover, it is scalable to a variety of launch vehicle fairings. Performance needs developed under the study are traceable to a variety of additional reference designs, including options for a monolithic primary mirror.

  18. Methane emissions on large scales

    NASA Astrophysics Data System (ADS)

    Beswick, K. M.; Simpson, T. W.; Fowler, D.; Choularton, T. W.; Gallagher, M. W.; Hargreaves, K. J.; Sutton, M. A.; Kaye, A.

    with previous results from the area, indicating that this method of data analysis provided good estimates of large scale methane emissions.

  19. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.
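
    The parallel-plate actuation described above follows the textbook balance between an electrostatic attraction and a mechanical restoring force; the sketch below (with made-up plate area, gap, and stiffness, not the paper's device parameters) solves for the stable deflection and checks the classic pull-in limit at one third of the gap:

        import math

        EPS0 = 8.854e-12  # vacuum permittivity, F/m

        def electrostatic_deflection(voltage, area, gap, k_spring):
            """Solve k*x = eps0*A*V^2 / (2*(gap - x)^2) by bisection.
            Returns None when the voltage exceeds the pull-in voltage."""
            v_pull_in = math.sqrt(8.0 * k_spring * gap**3 / (27.0 * EPS0 * area))
            if voltage >= v_pull_in:
                return None
            lo, hi = 0.0, gap / 3.0           # the stable equilibrium lies below gap/3
            for _ in range(100):
                x = 0.5 * (lo + hi)
                f_electrostatic = EPS0 * area * voltage**2 / (2.0 * (gap - x) ** 2)
                if f_electrostatic > k_spring * x:
                    lo = x                     # electrostatic force wins: deflect further
                else:
                    hi = x
            return x

        # illustrative actuator: 1 mm^2 plate, 10 micron gap, 100 N/m effective stiffness
        x = electrostatic_deflection(voltage=50.0, area=1e-6, gap=10e-6, k_spring=100.0)
        print(f"deflection: {x * 1e6:.2f} microns")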

  20. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  1. The ISS as a Testbed for Future Large Astronomical Observatories: The OpTIIX Demonstration Program

    NASA Technical Reports Server (NTRS)

    Burdick, G.; Callen, P.; Ess, K.; Liu, F.; Postman, M.; Sparks, W.; Seery, B.; Thronson, H.

    2012-01-01

    Future large (diameters in excess of approx. 10 m) astronomical observatories in space will need to employ advanced technologies if they are to be affordable. Many of these technologies are ready to be validated on orbit and the International Space Station (ISS) provides a suitable platform for such demonstrations. These technologies include low-cost, low-density, highly deformable mirror segments, coupled with advanced sensing and control methods. In addition, the ISS offers available telerobotic assembly techniques to build an optical testbed that embodies this new cost-effective approach to assemble and achieve diffraction-limited optical performance for very large space telescopes. Given the importance that NASA attaches to the recommendations of the National Academy of Sciences "Decadal Survey" process, essential capabilities and technologies will be demonstrated well in advance of the next Survey, which commences in 2019. To achieve this objective, the Jet Propulsion Laboratory (JPL), NASA Johnson Space Center (JSC), NASA Goddard Space Flight Center (GSFC), and the Space Telescope Science Institute (STScI) are carrying out a Phase A/B study of the Optical Testbed and Integration on ISS eXperiment (OpTIIX). The overarching goal is to demonstrate well before the end of this decade key capabilities intended to enable very large optical systems in the decade of the 2020s. Such a demonstration will retire technical risk in the assembly, alignment, calibration, and operation of future space observatories. The OpTIIX system, as currently designed, is a six-hexagon element, segmented visual-wavelength telescope with an edge-to-edge aperture of 1.4 m, operating at its diffraction limit.
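
    For a sense of scale (a back-of-the-envelope estimate, not a number quoted by the OpTIIX study), the diffraction limit of a 1.4 m aperture at visual wavelengths follows from the Rayleigh criterion:

        import math

        wavelength = 550e-9     # m, mid-visual (assumed)
        aperture = 1.4          # m, OpTIIX edge-to-edge aperture
        theta_rad = 1.22 * wavelength / aperture
        theta_arcsec = math.degrees(theta_rad) * 3600.0
        print(f"diffraction limit ~ {theta_arcsec:.2f} arcsec")   # roughly 0.1 arcsec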

  2. Advanced situation awareness with localised environmental community observatories in the Future Internet

    NASA Astrophysics Data System (ADS)

    Sabeur, Z. A.; Denis, H.; Nativi, S.

    2012-04-01

    The phenomenal advances in information and communication technologies over the last decade have led to offering unprecedented connectivity, with real potential for "Smart living", to large segments of human populations around the world. In particular, Voluntary Groups (VGs) and individuals with an interest in monitoring the state of their local environment can be connected through the internet and collaboratively generate important localised environmental observations. These could be considered as the Community Observatories (COs) of the Future Internet (FI). However, a set of FI enablers needs to be deployed for these communities to become effective COs in the Future Internet. For example, these communities will require access to services for the intelligent processing of heterogeneous data and the capture of advanced situation awareness about the environment. This important enablement will unlock the communities' true potential for participating in localised monitoring of the environment, in addition to their contribution to the creation of business enterprise. Among the eight Usage Area (UA) projects of the FP7 FI-PPP programme, the ENVIROFI Integrated Project focuses on the specifications of the Future Internet enablers of the Environment UA. The specifications are developed across multiple environmental domains in the context of users' needs for the development of mash-up applications in the Future Internet. It will enable user access to real-time, on-demand fused information with advanced situation awareness about the environment at localised scales. The mash-up applications shall get access to rich spatio-temporal information from structured fusion services which aggregate CO information with existing environmental monitoring station data established by research organisations and private enterprise. These applications are being developed in ENVIROFI for the atmospheric, marine and biodiversity domains, together with the potential to be extended to other environmental domains.

  3. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  4. The SOFIA Airborne Infrared Observatory - first science highlights and future science potential

    NASA Astrophysics Data System (ADS)

    Zinnecker, H.

    2014-10-01

    SOFIA, short for Stratospheric Observatory for Infrared Astronomy, is a Boeing 747SP aircraft with a 2.7 m telescope flying as high as 45,000 ft in the stratosphere, above 99 percent of the precipitable water vapor. SOFIA normally operates from its base in Palmdale, California, and a typical observing flight lasts for 10 hours before returning to base. SOFIA started astronomical observations in December 2010 and completed some 30 early science flights in 2011, delivering a number of exciting results and discoveries, both in mid-infrared imaging (5-40 microns) and in far-infrared (THz) heterodyne high-resolution spectroscopy, which were published in mid-2012 in special issues of ApJ Letters and A & A, respectively. Meanwhile, in July 2013, as part of Cycle 1, SOFIA deployed to New Zealand for a total of 9 flights (all of them successful) and observed key targets in the southern hemisphere at THz frequencies, including star forming regions in the Large and Small Magellanic Clouds. In this talk, I will present a few highlights of SOFIA early science and its future potential, when the full suite of 7 instruments will be implemented by the time of full operations in 2015. As Herschel ran out of cryogens in April 2013, SOFIA will be the premier FIR-astronomical facility for many years to come. Synergies with ALMA and CCAT must be explored. SOFIA is a major bilateral project between NASA and the German Space Agency (DLR); however, as an international observatory it offers observing time to the whole astronomical community worldwide, not only to the US and German primary partners.

  5. Development of a TES-Based Anti-Coincidence Detector for Future X-ray Observatories

    NASA Technical Reports Server (NTRS)

    Bailey, Catherine

    2011-01-01

    Microcalorimeters onboard future x-ray observatories require an anti-coincidence detector to remove environmental backgrounds. In order to most effectively integrate this anticoincidence detector with the main microcalorimeter array, both instruments should use similar read-out technology. The detectors used in the Cryogenic Dark Matter Search (CDMS) use a phonon measurement technique that is well suited for an anti-coincidence detector with a microcalorimeter array using SQUID readout. This technique works by using a transition-edge sensor (TES) connected to superconducting collection fins to measure the athermal phonon signal produced when an event occurs in the substrate crystal. Energy from the event propagates through the crystal to the superconducting collection fins, creating quasiparticles, which are then trapped as they enter the TES where they produce a signal. We are currently developing a prototype anti-coincidence detector for future x-ray missions and have recently fabricated test devices with Mo/Au TESs and Al collection fins. We will present results from the first tests of these devices which indicate a proof of concept that quasiparticle trapping is occurring in these materials.

  6. Survey of decentralized control methods. [for large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  7. Seismic observations at the Sodankylä Geophysical Observatory: history, present, and the future

    NASA Astrophysics Data System (ADS)

    Kozlovskaya, Elena; Narkilahti, Janne; Nevalainen, Jouni; Hurskainen, Riitta; Silvennoinen, Hanna

    2016-08-01

    Instrumental seismic observations in northern Finland started in the 1950s. They were originally initiated by the Institute of Seismology of the University of Helsinki (ISUH), but the staff of Sodankylä Geophysical Observatory (SGO) and later geophysicists of the University of Oulu (UO) were involved in the development of seismological observations and research in northern Finland from the very beginning. This close cooperation between seismologists and the technical staff of ISUH, UO, and SGO continued in many significant international projects and enabled a high level of seismological research in Finland. In our paper, we present history and current status of seismic observations and seismological research in northern Finland at the UO and SGO. These include both seismic observations at permanent seismic stations and temporary seismic experiments with portable seismic equipment. We describe the present seismic instrumentation and major research topics of the seismic group at SGO and discuss plans for future development of permanent seismological observations and portable seismic instrumentation at SGO as part of the European Plate Observing System (EPOS) research infrastructure. We also present the research topics of the recently organized Laboratory of Applied Seismology, and show examples of seismic observations performed by new seismic equipment located at this laboratory and selected results of time-lapse seismic body wave travel-time tomography using the data of microseismic monitoring in the Pyhäsalmi Mine (northern Finland).

  8. MEO and LEO space debris optical observations at Crimean Observatory: first experience and future perspectives.

    NASA Astrophysics Data System (ADS)

    Rumyantsev, Vasilij; Biryukov, Vadim; Agapov, Vladimir; Molotov, Igor

    The near Earth space observation group of the Crimean Observatory is performing regular optical monitoring of space debris in the GEO region within the framework of the International Scientific Optical Network (ISON). In recent years we have also paid attention to objects on lower orbits, due to increasing interest in the LEO and MEO regions caused by several catastrophic events that happened in the recent past. Optical observations provide high quality information about the position and physical properties of space debris at LEO and MEO, so they can be considered another source of data complementary to traditional radar measurements. We will discuss our observations of fragments from the Briz-M upper stage (object 28944) and Block-DM ullage motor (25054) explosions. Results of observations of USA-193 debris will be presented. We will then focus on observations and some photometric properties of FengYun 1C debris as well as Cosmos 2251 and Iridium 33 fragments. Radar cross-section versus optical photometry will be compared. Moreover, estimates of orbital parameters as well as area-to-mass ratios for some observed objects will be given. Most of the observations discussed in this paper represent just the first attempt to investigate the capabilities of our optical system to observe MEO and LEO objects, but these results are very promising and show good prospects for the future. We will briefly describe future perspectives of our optical observations of space debris and other objects in the MEO and LEO regions once the new wide-field telescopes are put into operation.

  9. Ali Observatory in Tibet: a unique northern site for future CMB ground-based observations

    NASA Astrophysics Data System (ADS)

    Su, Meng

    2015-08-01

    Ground-based CMB observations have been performed at the South Pole and in the Atacama desert in Chile. However, a significant fraction of the sky cannot be observed from just these two sites. For full sky coverage from the ground in the future, a northern site for CMB observation, in particular CMB polarization, is required. Besides the long-considered site in Greenland, the high-altitude Tibetan plateau provides another opportunity. I will describe the Ali Observatory in Tibet, located at N32°19', E80°01', as a potential site for ground-based CMB observations. The new site sits on a mountain at an altitude of almost 5100 m near the town of Gar, an excellent location for both infrared and submillimeter observations. A study using the long-term database of ground weather stations and archival satellite data has been performed. The site has sufficient relative height on the plateau and is accessible by car; the town of Shiquanhe and a recently opened airport are each about 40 minutes away by car, and the site also has road access, electricity, and optical fiber with fast internet. Preliminary measurements indicate precipitable water vapor below 0.5 mm for roughly one quarter of the year, and long-term monitoring is under development. In addition, surrounding higher sites are also available and could be further developed if necessary. Ali provides unique northern sky coverage; together with the South Pole and the Atacama desert, future CMB observations will be able to cover the full sky from the ground.

  10. Large-Scale Reform Comes of Age

    ERIC Educational Resources Information Center

    Fullan, Michael

    2009-01-01

    This article reviews the history of large-scale education reform and makes the case that large-scale or whole system reform policies and strategies are becoming increasingly evident. The review briefly addresses the pre-1997 period, concluding that, while the pressure for reform was mounting, there were very few examples of deliberate or…

  11. Automating large-scale reactor systems

    SciTech Connect

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig.

  12. Large-Scale periodic solar velocities: An observational study

    NASA Technical Reports Server (NTRS)

    Dittmer, P. H.

    1977-01-01

    Observations of large-scale solar velocities were made using the mean field telescope and Babcock magnetograph of the Stanford Solar Observatory. Observations were made in the magnetically insensitive ion line at 5124 A, with light from the center (limb) of the disk right (left) circularly polarized, so that the magnetograph measures the difference in wavelength between center and limb. Computer calculations are made of the wavelength difference produced by global pulsations for spherical harmonics up to second order and of the signal produced by displacing the solar image relative to polarizing optics or diffraction grating.

  13. Current and Future Capabilities of the 74-inch Telescope of Kottamia Astronomical Observatory in Egypt

    NASA Astrophysics Data System (ADS)

    Azzam, Y. A.; Ali, G. B.; Ismail, H. A.; Haroon, A.; Selim, I.

    In this paper, we introduce the Kottamia Astronomical Observatory, KAO, to the astronomical community. The current status of the telescope, together with the available instrumentation, is described. An upgrade, including a new optical system and computer control of both the telescope and the dome, has been achieved. The specifications of a set of CCD cameras for direct imaging and spectroscopy are given. A grating spectrograph was recently donated to KAO by the Okayama Astrophysical Observatory, OAO, of the National Astronomical Observatories in Japan. This spectrograph has been successfully tested and installed at the F/18 Cassegrain focus of the KAO 74" telescope.

  14. Molecular clouds and the large-scale structure of the galaxy

    NASA Technical Reports Server (NTRS)

    Thaddeus, Patrick; Stacy, J. Gregory

    1990-01-01

    The application of molecular radio astronomy to the study of the large-scale structure of the Galaxy is reviewed and the distribution and characteristic properties of the Galactic population of Giant Molecular Clouds (GMCs), derived primarily from analysis of the Columbia CO survey, and their relation to tracers of Population 1 and major spiral features are described. The properties of the local molecular interstellar gas are summarized. The CO observing programs currently underway with the Center for Astrophysics 1.2 m radio telescope are described, with an emphasis on projects relevant to future comparison with high-energy gamma-ray observations. Several areas are discussed in which high-energy gamma-ray observations by the EGRET (Energetic Gamma-Ray Experiment Telescope) experiment aboard the Gamma Ray Observatory will directly complement radio studies of the Milky Way, with the prospect of significant progress on fundamental issues related to the structure and content of the Galaxy.

  15. Future space missions and ground observatory for measurements of coronal magnetic fields

    NASA Astrophysics Data System (ADS)

    Fineschi, Silvano; Gibson, Sarah; Bemporad, Alessandro; Zhukov, Andrei; Damé, Luc; Susino, Roberto; Larruquert, Juan

    2016-07-01

    This presentation gives an overview of near-future prospects for probing coronal magnetism from space missions (i.e., SCORE and ASPIICS) and a ground-based observatory (ESCAPE). Spectro-polarimetric imaging of coronal emission lines in the visible-light band provides an important diagnostic tool for coronal magnetism. The interpretation of the line polarization in forbidden emission lines in terms of the Hanle and Zeeman effects yields information on the direction and strength of the coronal magnetic field. As a study case, this presentation will describe the Torino Coronal Magnetograph (CorMag) for spectro-polarimetric observation of the FeXIV 530.3 nm forbidden emission line. CorMag consists of a Liquid Crystal (LC) Lyot filter and an LC linear polarimeter. The CorMag filter is part of the ESCAPE experiment to be based at the French-Italian Concordia base in Antarctica. The linear polarization by resonance scattering of coronal permitted line emission in the ultraviolet (UV) can be modified by magnetic fields through the Hanle effect. Space-based UV spectro-polarimeters would provide an additional tool for the diagnostics of coronal magnetism. As a case study of space-borne UV spectro-polarimeters, this presentation will describe the future upgrade of the Sounding-rocket Coronagraphic Experiment (SCORE) to include a new-generation, high-efficiency UV polarizer with the capability of imaging polarimetry of HI Lyman-α at 121.6 nm. SCORE is a multi-wavelength imager for the emission lines HeII 30.4 nm and HI 121.6 nm and for the visible-light broad-band emission of the polarized K-corona. SCORE flew successfully in 2009; the second launch is scheduled for 2016. Proba-3 is another future solar mission that would provide the opportunity to diagnose the coronal magnetic field. Proba-3 is the first precision formation-flying mission, to be launched in 2019. A pair of satellites will fly together maintaining a fixed configuration as a 'large rigid structure' in space.

  16. Numerical Modeling for Large Scale Hydrothermal System

    NASA Astrophysics Data System (ADS)

    Sohrabi, Reza; Jansen, Gunnar; Malvoisin, Benjamin; Mazzini, Adriano; Miller, Stephen A.

    2017-04-01

    Moderate-to-high enthalpy systems are driven by multiphase and multicomponent processes, fluid and rock mechanics, and heat transport processes, all of which present challenges in developing realistic numerical models of the underlying physics. The objective of this work is to present an approach, and some initial results, for modeling and understanding the dynamics of the birth of large scale hydrothermal systems. Numerical modeling of such complex systems must take into account a variety of coupled thermal, hydraulic, mechanical and chemical processes, which is numerically challenging. To provide first estimates of the behavior of these deep, complex systems, geological structures must be constrained, and the fluid dynamics, mechanics and heat transport need to be investigated in three dimensions. Modeling these processes numerically at adequate resolution and reasonable computation times requires a suite of tools that we are developing and/or utilizing to investigate such systems. Our long-term goal is to develop 3D numerical models, based on geological models, which couple mechanics with the hydraulic and thermal processes driving hydrothermal systems. Our first results from the Lusi hydrothermal system in East Java, Indonesia provide a basis for more sophisticated studies, eventually in 3D, and we introduce a workflow necessary to achieve these objectives. Future work focuses on parallelization suitable for High Performance Computing (HPC). Such developments are necessary to achieve high-resolution simulations and to more fully understand the complex dynamics of hydrothermal systems.
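
    As a toy illustration of the heat-transport ingredient of such models (a one-dimensional explicit conduction sketch with the usual stability constraint, far simpler than the coupled 3D multiphase codes discussed above):

        import numpy as np

        def heat_conduction_1d(T0, alpha, dx, dt, n_steps):
            """Explicit finite differences for dT/dt = alpha * d2T/dx2 with
            fixed-temperature boundaries; requires dt <= dx^2 / (2 * alpha)."""
            assert dt <= dx**2 / (2.0 * alpha), "time step violates the stability limit"
            T = T0.copy()
            r = alpha * dt / dx**2
            for _ in range(n_steps):
                T[1:-1] += r * (T[2:] - 2.0 * T[1:-1] + T[:-2])
            return T

        # toy 1 km column: 20 C at the surface, a 300 C source at the base (assumed values)
        nz, dx = 101, 10.0                      # 10 m grid spacing
        T0 = np.full(nz, 20.0)
        T0[-1] = 300.0
        alpha = 1e-6                            # m^2/s, typical rock thermal diffusivity
        dt = 0.4 * dx**2 / alpha                # seconds, inside the stability limit
        print(heat_conduction_1d(T0, alpha, dx, dt, n_steps=10000)[::20])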

  17. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and in addition to atlases of the human includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  18. The Future of the Plate Boundary Observatory in the GAGE Facility and beyond 2018

    NASA Astrophysics Data System (ADS)

    Mattioli, G. S.; Bendick, R. O.; Foster, J. H.; Freymueller, J. T.; La Femina, P. C.; Miller, M. M.; Rowan, L.

    2014-12-01

    The Geodesy Advancing Geosciences and Earthscope (GAGE) Facility, which operates the Plate Boundary Observatory (PBO), builds on UNAVCO's strong record of facilitating research and education in the geosciences and geodesy-related engineering fields. Precise positions and velocities for the PBO's ~1100 continuous GPS stations and other PBO data products are used to address a wide range of scientific and technical issues across North America. A large US and international community of scientists, surveyors, and civil engineers access PBO data streams, software, and other on-line resources daily. In a global society that is increasingly technology-dependent, consistently risk-averse, and often natural resource-limited, communities require geodetic research, education, and infrastructure to make informed decisions about living on a dynamic planet. The western U.S. and Alaska, where over 95% of the PBO sensor assets are located, have recorded significant geophysical events such as earthquakes, volcanic eruptions, and tsunami. UNAVCO community science provides first-order constraints on geophysical processes to support hazards mapping and zoning, and forms the basis for earthquake and tsunami early warning applications currently under development. The future of PBO was discussed at an NSF-sponsored three-day workshop held in September 2014 in Breckenridge, CO. Over 40 invited participants and community members, including representatives from interested stakeholder groups, UNAVCO staff, and members of the PBO Working Group and Geodetic Infrastructure Advisory Committee, participated in the workshop, which included retrospective and prospective plenary presentations and breakout sessions focusing on specific scientific themes. We will present some of the findings of that workshop in order to continue a dialogue about policies and resources for long-term earth observing networks. How PBO fits into the recently released U.S. National Plan for Civil Earth Observations will also be discussed.

  19. A virtual observatory in a real world: building capacity for an uncertain future

    NASA Astrophysics Data System (ADS)

    Blair, Gordon; Buytaert, Wouter; Emmett, Bridget; Freer, Jim; Gurney, Robert; Haygarth, Phil; McDonald, Adrian; Rees, Gwyn; Tetzlaff, Doerthe

    2010-05-01

    Environmental managers and policy makers face a challenging future trying to accommodate growing expectations of environmental well-being, while subject to maturing regulation, constrained budgets and a public scrutiny that expects easier and more meaningful access. To support such a challenge requires new tools and new approaches. The VO is a new initiative from the Natural Environment Research Council (NERC) designed to deliver proof of concept for these new tools and approaches. The VO is at an early stage and we first evaluate the role of existing ‘observatories' in the UK and elsewhere both to learn good practice (and just as valuable - errors) and to define boundaries. A series of exemplar ‘big catchment science questions' are posed - distinguishing between science and society positions - and the prospects for their solution are assessed. The VO vision of being driven by these questions is outlined as are the seven key ambitions namely: i. being driven by the need to contribute to the solution of major environmental issues that impinge on, or link to, catchment science ii. having the flexibility and adaptability to address future problems not yet defined or fully clarified iii. being able to communicate issues and solutions to a range of audiences iv. supporting easy access by a variety of users v. drawing meaningful information from data and models and identifying the constraints on application in terms of errors, uncertainties, etc vi. adding value and cost effectiveness to current investigations by supporting transfer and scale adjustment thus limiting the repetition of expensive field monitoring addressing essentially the same issues in varying locations vii. promoting effective interfacing of robust science with a variety of end users by using terminology or measures familiar to the user (or required by regulation), including financial and carbon accounting, whole life or fixed period costing, risk as probability or as disability adjusted life years

  20. Acoustic Studies of the Large Scale Ocean Circulation

    NASA Technical Reports Server (NTRS)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.
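
    The basic sensitivity of the method can be illustrated with a first-order estimate (all numbers below are assumed, order-of-magnitude values, not results from THETIS-2 or ATOC): a range-averaged warming raises the sound speed and shortens the acoustic travel time along the path.

        # first-order travel-time perturbation: dt ~ -L * dc / c0^2
        L = 3.0e6          # m, acoustic path length (~3000 km)
        c0 = 1500.0        # m/s, reference sound speed
        dc_dT = 4.0        # m/s per deg C, approximate sensitivity of sound speed to temperature
        dT = 0.01          # deg C, range-averaged temperature change

        dt = -L * (dc_dT * dT) / c0**2
        print(f"travel-time change ~ {dt * 1000:.0f} ms")   # tens of milliseconds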

  1. Working on binaries at the R. M. Aller Observatory. The last 30 years (and the near future)

    NASA Astrophysics Data System (ADS)

    Docobo, J. A.

    2011-07-01

    Exhaustive work has been performed during the past 30 years at the Astronomical Observatory of the University of Santiago de Compostela (named after Ramón María Aller since 1981). In this communication, I describe the efforts made to reach the current high level of research (particularly on double and multiple stars), teaching, and scientific dissemination. Projects for the near future are also mentioned.

  2. Large Scale Metal Additive Techniques Review

    SciTech Connect

    Nycz, Andrzej; Adediran, Adeola I; Noakes, Mark W; Love, Lonnie J

    2016-01-01

    In recent years additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large-scale metal additive manufacturing has not yet reached parity with large-scale polymer deposition. This paper is a review of metal additive techniques in the context of building large structures. Current commercial devices are capable of printing metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. In order to follow the polymer progress path, several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post processing, as well as potential applications. This paper focuses on the current state of the art of large-scale metal additive technology with a focus on expanding the geometric limits.

  3. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflationary era.

  4. The Large -scale Distribution of Galaxies

    NASA Astrophysics Data System (ADS)

    Flin, Piotr

    A review of the large-scale structure of the Universe is given. A connection is made with the titanic work of Johannes Kepler in many areas of astronomy and cosmology. Special attention is given to the spatial distribution of galaxies, voids, and walls (the cellular structure of the Universe). Finally, the author concludes that the large-scale structure of the Universe can be observed on a much greater scale than was thought twenty years ago.

  5. The Signature of Large Scale Structures on the Very High Energy Gamma-Ray Sky

    SciTech Connect

    Cuoco, A.; Hannestad, S.; Haugbolle, T.; Miele, G.; Serpico, P.D.; Tu, H.; /Aarhus U. /UC, Irvine

    2006-12-01

    If the diffuse extragalactic gamma ray emission traces the large scale structures of the universe, peculiar anisotropy patterns are expected in the gamma ray sky. In particular, because of the cutoff distance introduced by the absorption of 0.1-10 TeV photons on the infrared/optical background, prominent correlations with the local structures within a range of a few hundred Mpc should be present. We provide detailed predictions of the signal based on the PSCz map of the local universe. We also use mock N-body catalogues complemented with the halo model of structures to study some statistical features of the expected signatures. The results are largely independent of cosmological details, and depend mostly on the index of correlation (or bias) of the sources with respect to the large scale distribution of galaxies. For instance, the predicted signal in the case of a quadratic correlation (as may happen for a dark matter annihilation contribution to the diffuse gamma flux) differs substantially from the linear correlation case, providing a complementary tool to unveil the nature of the sources of the diffuse gamma ray emission. The chances of present and future space and ground based observatories to measure these features are discussed.

  6. Large-scale cortical networks and cognition.

    PubMed

    Bressler, S L

    1995-03-01

    The well-known parcellation of the mammalian cerebral cortex into a large number of functionally distinct cytoarchitectonic areas presents a problem for understanding the complex cortical integrative functions that underlie cognition. How do cortical areas having unique individual functional properties cooperate to accomplish these complex operations? Do neurons distributed throughout the cerebral cortex act together in large-scale functional assemblages? This review examines the substantial body of evidence supporting the view that complex integrative functions are carried out by large-scale networks of cortical areas. Pathway tracing studies in non-human primates have revealed widely distributed networks of interconnected cortical areas, providing an anatomical substrate for large-scale parallel processing of information in the cerebral cortex. Functional coactivation of multiple cortical areas has been demonstrated by neurophysiological studies in non-human primates and several different cognitive functions have been shown to depend on multiple distributed areas by human neuropsychological studies. Electrophysiological studies on interareal synchronization have provided evidence that active neurons in different cortical areas may become not only coactive, but also functionally interdependent. The computational advantages of synchronization between cortical areas in large-scale networks have been elucidated by studies using artificial neural network models. Recent observations of time-varying multi-areal cortical synchronization suggest that the functional topology of a large-scale cortical network is dynamically reorganized during visuomotor behavior.

  7. Large-scale nanophotonic phased array.

    PubMed

    Sun, Jie; Timurdogan, Erman; Yaacobi, Ami; Hosseini, Ehsan Shah; Watts, Michael R

    2013-01-10

    Electromagnetic phased arrays at radio frequencies are well known and have enabled applications ranging from communications to radar, broadcasting and astronomy. The ability to generate arbitrary radiation patterns with large-scale phased arrays has long been pursued. Although it is extremely expensive and cumbersome to deploy large-scale radiofrequency phased arrays, optical phased arrays have a unique advantage in that the much shorter optical wavelength holds promise for large-scale integration. However, the short optical wavelength also imposes stringent requirements on fabrication. As a consequence, although optical phased arrays have been studied with various platforms and recently with chip-scale nanophotonics, all of the demonstrations so far are restricted to one-dimensional or small-scale two-dimensional arrays. Here we report the demonstration of a large-scale two-dimensional nanophotonic phased array (NPA), in which 64 × 64 (4,096) optical nanoantennas are densely integrated on a silicon chip within a footprint of 576 μm × 576 μm with all of the nanoantennas precisely balanced in power and aligned in phase to generate a designed, sophisticated radiation pattern in the far field. We also show that active phase tunability can be realized in the proposed NPA by demonstrating dynamic beam steering and shaping with an 8 × 8 array. This work demonstrates that a robust design, together with state-of-the-art complementary metal-oxide-semiconductor technology, allows large-scale NPAs to be implemented on compact and inexpensive nanophotonic chips. In turn, this enables arbitrary radiation pattern generation using NPAs and therefore extends the functionalities of phased arrays beyond conventional beam focusing and steering, opening up possibilities for large-scale deployment in applications such as communication, laser detection and ranging, three-dimensional holography and biomedical sciences, to name just a few.
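
    The beam forming at the heart of such arrays can be sketched with a simple far-field array factor (illustrative parameters only: a 64-element linear cut with a sub-wavelength pitch assumed here to keep a single main lobe, which differs from the larger pitch of the fabricated device):

        import numpy as np

        def array_factor(n_elements, pitch, wavelength, steer_deg, theta_deg):
            """Normalized far-field array factor of a uniform linear array with a
            linear phase gradient that steers the main beam to steer_deg."""
            k = 2.0 * np.pi / wavelength
            theta = np.radians(np.asarray(theta_deg))
            phase_step = -k * pitch * np.sin(np.radians(steer_deg))   # phase added per element
            n = np.arange(n_elements)[:, None]
            field = np.exp(1j * n * (k * pitch * np.sin(theta) + phase_step))
            return np.abs(field.sum(axis=0)) / n_elements

        theta = np.linspace(-30.0, 30.0, 601)
        af = array_factor(n_elements=64, pitch=0.8e-6, wavelength=1.55e-6,
                          steer_deg=5.0, theta_deg=theta)
        print(f"main lobe near {theta[np.argmax(af)]:.1f} degrees")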

  8. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  9. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    SciTech Connect

    Nusser, Adi; Branchini, Enzo; Davis, Marc E-mail: branchin@fis.uniroma3.it

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
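
    The magnitudes involved follow from the standard conversion between proper motion and transverse velocity (a worked example with assumed numbers, not figures from the paper):

        # v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc]
        def transverse_velocity_kms(mu_microarcsec_per_yr, distance_mpc):
            mu_arcsec_per_yr = mu_microarcsec_per_yr * 1e-6
            distance_pc = distance_mpc * 1e6
            return 4.74 * mu_arcsec_per_yr * distance_pc

        # a galaxy at 10 Mpc with a proper motion of 10 micro-arcsec per year
        print(f"{transverse_velocity_kms(10.0, 10.0):.0f} km/s")   # ~474 km/s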

  10. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with these early procedures have disclosed many shortcomings, e.g., ghost images, occlusions, and shadows. Thus, providing the technical basis (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the forthcoming national large-scale digital orthophoto deployment and for the revision of the Standards for National Large-scale City Digital Orthophoto in the National Digital Orthophoto Program (NDOP). This paper will report our initial research results as follows: (1) High-precision 3D city DSM generation through LIDAR data processing, (2) Spatial objects/features extraction through surface material information and high-accuracy 3D DSM data, (3) 3D city model development, (4) Algorithm development for generation of DTM-based orthophotos and DBM-based orthophotos, (5) True orthophoto generation by merging DBM-based and DTM-based orthophotos, and (6) Automatic mosaicking by optimizing and combining imagery from many perspectives.

  11. Multimodel Design of Large Scale Systems with Multiple Decision Makers.

    DTIC Science & Technology

    1982-08-01

    The thesis concludes with Chapter 7, where the results obtained are summarized, the main contributions are outlined, and directions for future research are indicated.

  12. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high technology agency fared through a decade marked by a rapid expansion of funds and manpower in the first half and an almost equally rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  13. Large-scale multimedia modeling applications

    SciTech Connect

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications.

  14. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  16. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; hide

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  17. Large Scale Deformation of the Western U.S. Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2002-01-01

    Over the past couple of years, with support from NASA, we used a large collection of data from GPS, VLBI, SLR, and DORIS networks which span the Western U.S. Cordillera (WUSC) to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work was roughly divided into an analysis of these space geodetic observations to infer the deformation field across and within the entire plate boundary zone, and an investigation of the implications of this deformation field regarding plate boundary dynamics. Following the determination of the first generation WUSC velocity solution, we placed high priority on the dissemination of the velocity estimates. With in-kind support from the Smithsonian Astrophysical Observatory, we constructed a web-site which allows anyone to access the data, and to determine their own velocity reference frame.

  19. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities needed to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit, which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these was the substantial power savings, which resulted in strong broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  20. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  1. Large-scale Advanced Propfan (LAP) program

    NASA Technical Reports Server (NTRS)

    Sagerser, D. A.; Ludemann, S. G.

    1985-01-01

    The propfan is an advanced propeller concept which maintains the high efficiencies traditionally associated with conventional propellers at the higher aircraft cruise speeds associated with jet transports. The large-scale advanced propfan (LAP) program extends the research done on 2 ft diameter propfan models to a 9 ft diameter article. The program includes design, fabrication, and testing of both an eight bladed, 9 ft diameter propfan, designated SR-7L, and a 2 ft diameter aeroelastically scaled model, SR-7A. The LAP program is complemented by the propfan test assessment (PTA) program, which takes the large-scale propfan and mates it with a gas generator and gearbox to form a propfan propulsion system and then flight tests this system on the wing of a Gulfstream 2 testbed aircraft.

  2. Large-scale fibre-array multiplexing

    SciTech Connect

    Cheremiskin, I V; Chekhlova, T K

    2001-05-31

    The possibility of creating a fibre multiplexer/demultiplexer with large-scale multiplexing without any basic restrictions on the number of channels and the spectral spacing between them is shown. The operating capacity of a fibre multiplexer based on a four-fibre array ensuring a spectral spacing of 0.7 pm (~10 GHz) between channels is demonstrated. (laser applications and other topics in quantum electronics)

  3. Modeling Human Behavior at a Large Scale

    DTIC Science & Technology

    2012-01-01

    Discerning intentions in dynamic human action. Trends in Cognitive Sciences, 5(4):171-178, 2001. Shirli Bar-David, Israel Bar-David, Paul C. Cross, Sadie... Limits of predictability in human mobility. Science, 327(5968):1018, 2010. S.A. Stouffer. Intervening opportunities: a theory relating mobility and... Modeling Human Behavior at a Large Scale, by Adam Sadilek. Submitted in Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy.

  4. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2008-09-30

    aerosol species up to six days in advance anywhere on the globe. NAAPS and COAMPS are particularly useful for forecasts of dust storms in areas... impact cloud processes globally. With increasing dust storms due to climate change and land use changes in desert regions, the impact of the... bacteria in large-scale dust storms is expected to significantly impact warm ice cloud formation, human health, and ecosystems globally. In Niemi et al

  5. Large-scale instabilities of helical flows

    NASA Astrophysics Data System (ADS)

    Cameron, Alexandre; Alexakis, Alexandros; Brachet, Marc-Étienne

    2016-10-01

    Large-scale hydrodynamic instabilities of periodic helical flows of a given wave number K are investigated using three-dimensional Floquet numerical computations. In the Floquet formalism the unstable field is expanded in modes of different spatial periodicity. This allows us (i) to clearly distinguish large from small scale instabilities and (ii) to study modes of wave number q of arbitrarily large-scale separation q ≪ K. Different flows are examined, including flows that exhibit small-scale turbulence. The growth rate σ of the most unstable mode is measured as a function of the scale separation q/K ≪ 1 and the Reynolds number Re. It is shown that the growth rate follows the scaling σ ∝ q if an AKA effect [Frisch et al., Physica D: Nonlinear Phenomena 28, 382 (1987), 10.1016/0167-2789(87)90026-1] is present, or a negative eddy viscosity scaling σ ∝ q² in its absence. This holds both for the Re ≪ 1 regime, where previously derived asymptotic results are verified, and for Re = O(1), which is beyond their range of validity. Furthermore, for values of Re above a critical value ReSc beyond which small-scale instabilities are present, the growth rate becomes independent of q and the energy of the perturbation at large scales decreases with scale separation. The behavior of these large-scale instabilities is also examined in the nonlinear regime, where the largest scales of the system are found to be the most dominant energetically. These results are interpreted by low-order models.

  6. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  7. Large-scale controls on convective extreme precipitation

    NASA Astrophysics Data System (ADS)

    Loriaux, Jessica M.; Lenderink, Geert; Pier Siebesma, A.

    2017-04-01

    The influence of large-scale conditions on extreme precipitation is not yet well understood. We will present the results of Loriaux et al. (2017), in which we investigate the role of large-scale dynamics and environmental conditions on precipitation and on the precipitation response to climate change. To this end, we have set up a composite LES case for convective precipitation using strong large-scale forcing based on idealized profiles for the highest 10 percentiles of peak intensities over the Netherlands, as described by Loriaux et al. (2016). In this setting, we have performed sensitivity analyses for atmospheric stability, large-scale moisture convergence, and relative humidity, and compared present-day climate to a warmer future climate. The results suggest that amplification of the moisture convergence and destabilization of the atmosphere both lead to an increase in precipitation, but through different effects: atmospheric stability mainly influences the precipitation intensity, while the moisture convergence mainly controls the precipitation area fraction. Extreme precipitation intensities show qualitatively similar sensitivities to atmospheric stability and moisture convergence. Precipitation increases with RH due to an increase in area fraction, despite a decrease in intensity. The precipitation response to the climate perturbation shows a stronger response for the precipitation intensity than for the overall precipitation, with no clear dependence on changes in atmospheric stability, moisture convergence and relative humidity. The difference in response between the precipitation intensity and the overall precipitation is caused by a decrease in the precipitation area fraction from present-day to future climate. In other words, our climate perturbation indicates that with warming, it will rain more intensely but in fewer places. Loriaux, J.M., G. Lenderink, and A.P. Siebesma, 2016, doi: 10.1002/2015JD024274; Loriaux, J.M., G. Lenderink, and A.P. Siebesma

  8. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

    Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat.1-3 Multi-agent simulations in particular are now commonplace in many fields.4, 5 By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects where closed form solutions are difficult to solve or impossible to derive.6 To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact - for example, agent height is related to agent weight - they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application.7, 8 Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large scale social simulations.9-11 One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more
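
    As a loose illustration of the population-generation idea sketched above (and not the Hats Simulator itself), the snippet below draws per-agent attributes from simple distributions and wires the agents into a relationship graph; the attribute names, role proportions, and graph model are invented for the example.

        # Hedged sketch: generate a small multi-agent population with attributes drawn
        # from distributions and relationships modelled as a graph.
        import random
        import networkx as nx

        random.seed(0)
        N_AGENTS = 100

        agents = {
            i: {
                "height_cm": random.gauss(170, 10),      # independent attribute
                "morale": random.uniform(0.0, 1.0),      # independent attribute
                "role": random.choices(
                    ["benign", "known", "covert"], weights=[0.9, 0.05, 0.05])[0],
            }
            for i in range(N_AGENTS)
        }

        # Relationships: a small-world graph as a stand-in for the social structure.
        society = nx.watts_strogatz_graph(N_AGENTS, k=6, p=0.1, seed=0)
        nx.set_node_attributes(society, agents)

        print(society.number_of_edges(), "relationships among", N_AGENTS, "agents")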

  9. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  10. What is a large-scale dynamo?

    NASA Astrophysics Data System (ADS)

    Nigro, G.; Pongkitiwanichakul, P.; Cattaneo, F.; Tobias, S. M.

    2017-01-01

    We consider kinematic dynamo action in a sheared helical flow at moderate to high values of the magnetic Reynolds number (Rm). We find exponentially growing solutions which, for large enough shear, take the form of a coherent part embedded in incoherent fluctuations. We argue that at large Rm large-scale dynamo action should be identified by the presence of structures coherent in time, rather than those at large spatial scales. We further argue that although the growth rate is determined by small-scale processes, the period of the coherent structures is set by mean-field considerations.

  11. Large-scale brightenings associated with flares

    NASA Technical Reports Server (NTRS)

    Mandrini, Cristina H.; Machado, Marcos E.

    1992-01-01

    It is shown that large-scale brightenings (LSBs) associated with solar flares, similar to the 'giant arches' discovered by Svestka et al. (1982) in images obtained by the SMM HXIS hours after the onset of two-ribbon flares, can also occur in association with confined flares in complex active regions. For these events, a link between the LSB and the underlying flare is clearly evident from the active-region magnetic field topology. The implications of these findings are discussed within the framework of the interacting loops of flares and the giant arch phenomenology.

  12. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.

  13. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  14. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  15. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPU) are currently used as a cost-effective platform for computer simulations and big-data processing. Large scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve an excellent efficiency for applications in statistical mechanics, particle dynamics and networks analysis by using suitable memory access patterns and mechanisms like CUDA streams, profiling tools, etc. Similar concepts and techniques may also be applied to other problems, such as the solution of partial differential equations.

  16. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  17. Large-scale Heterogeneous Network Data Analysis

    DTIC Science & Technology

    2012-07-31

    “Data for Multi-Player Influence Maximization on Social Networks.” KDD 2012 (Demo). Po-Tzu Chang, Yen-Chieh Huang, Cheng-Lun Yang, Shou-De Lin, Pu-Jen Cheng. “Learning-Based Time-Sensitive Re-Ranking for Web Search.” SIGIR 2012 (poster). Hung-Che Lai, Cheng-Te Li, Yi-Chen Lo, and Shou-De Lin... “Exploiting and Evaluating MapReduce for Large-Scale Graph Mining.” ASONAM 2012 (Full, 16% acceptance ratio). Hsun-Ping Hsieh, Cheng-Te Li, and Shou

  18. Internationalization Measures in Large Scale Research Projects

    NASA Astrophysics Data System (ADS)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities are challenged with little experience on how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, those undertakings permanently have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to support universities in international recruitment. These include, e.g., institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups, and identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and keep the brightest heads for your project.

  19. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.

  20. Large-scale Intelligent Transporation Systems simulation

    SciTech Connect

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
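
    To make the idea of vehicles as autonomous route-planning agents concrete, here is a much-simplified sketch (single process, no parallelism, with an invented road network and link times); the prototype itself ran vehicles as independent processes on parallel hardware.

        # Hedged sketch: each vehicle plans its own route on a shared road graph and
        # independently re-plans when a link time changes (a stand-in for the
        # prototype's autonomous vehicle processes and TMC advisories).
        import networkx as nx

        road = nx.DiGraph()
        road.add_weighted_edges_from([
            ("A", "B", 5), ("B", "D", 5), ("A", "C", 4), ("C", "D", 8), ("B", "C", 1),
        ], weight="time")

        class Vehicle:
            def __init__(self, vid, origin, destination):
                self.vid, self.origin, self.destination = vid, origin, destination
                self.route = self.plan()

            def plan(self):
                return nx.shortest_path(road, self.origin, self.destination, weight="time")

        fleet = [Vehicle(i, "A", "D") for i in range(3)]
        print(fleet[0].route)               # ['A', 'B', 'D'] with the times above

        # TMC advisory: congestion on B->D; vehicles independently re-plan.
        road["B"]["D"]["time"] = 20
        for v in fleet:
            v.route = v.plan()
        print(fleet[0].route)               # now routes around the congestion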

  2. Large-scale Globally Propagating Coronal Waves.

    PubMed

    Warmuth, Alexander

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous space-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  3. Global food insecurity. Treatment of major food crops with elevated carbon dioxide or ozone under large-scale fully open-air conditions suggests recent models may have overestimated future yields

    PubMed Central

    Long, Stephen P; Ainsworth, Elizabeth A; Leakey, Andrew D.B; Morgan, Patrick B

    2005-01-01

    Predictions of yield for the globe's major grain and legume arable crops suggest that, with a moderate temperature increase, production may increase in the temperate zone, but decline in the tropics. In total, global food supply may show little change. This security comes from inclusion of the direct effect of rising carbon dioxide (CO2) concentration, [CO2], which significantly stimulates yield by decreasing photorespiration in C3 crops and transpiration in all crops. Evidence for a large response to [CO2] is largely based on studies made within chambers at small scales, which would be considered unacceptable for standard agronomic trials of new cultivars or agrochemicals. Yet, predictions of the globe's future food security are based on such inadequate information. Free-Air Concentration Enrichment (FACE) technology now allows investigation of the effects of rising [CO2] and ozone on field crops under fully open-air conditions at an agronomic scale. Experiments with rice, wheat, maize and soybean show smaller increases in yield than anticipated from studies in chambers. Experiments with increased ozone show large yield losses (20%), which are not accounted for in projections of global food security. These findings suggest that current projections of global food security are overoptimistic. The fertilization effect of CO2 is less than that used in many models, while rising ozone will cause large yield losses in the Northern Hemisphere. Unfortunately, FACE studies have been limited in geographical extent and interactive effects of CO2, ozone and temperature have yet to be studied. Without more extensive study of the effects of these changes at an agronomic scale in the open air, our ever-more sophisticated models will continue to have feet of clay. PMID:16433090

  4. Results from an Integrated Optical/Acoustic Communication System Installed at CORK 857D: Implications for Future Seafloor Observatories

    NASA Astrophysics Data System (ADS)

    Tivey, M.; Farr, N.; Ware, J.; Pontbriand, C.

    2011-12-01

    A CORK (Circulation Obviation Retrofit Kit) borehole represents all of the basic components required for a seafloor observatory: a stable environment for long-term continuous measurements of earth and ocean phenomena, access to a unique environment below the seafloor under controlled conditions (e.g. hydrologically sealed), and a standard interface for communication. Typically, however, due to power constraints and a limited frequency of data download opportunities, data sampling has been limited to rates on the order of several minutes. For full seismic wave sampling, at least 1 Hz or better is required. While some CORK systems are now being connected to an underwater cable to provide continuous power and real-time data (cf. Neptune network in the Northeast Pacific), there will be locations where cabled observatories are not viable. Another mode of communication is required to enable both high data rate communication and access for data download via more conventional vessels and not limited to those with ROV or submersibles. We here report on technology to enable high data rate download and transfer of data and information using underwater optical communications, which can be accomplished from a surface vessel of opportunity or, in the future, by autonomous underwater vehicle. In 2010, we successfully deployed and tested an underwater optical communication system that provides high data rate communications over a range of 100 meters from a deep sea CORK borehole observatory located in the northeast Pacific at IODP Hole 857D. The CORK is instrumented with a thermistor string and pressure sensors that record downhole formation pressures and temperatures within oceanic basement and is pressure sealed from the overlying water column. The seafloor Optical Telemetry System (OTS) was plugged into the CORK's existing underwater matable connector to provide an optical and acoustic communication interface and additional data storage and battery power for the CORK to sample

  5. Frequency domain multiplexing for large-scale bolometer arrays

    SciTech Connect

    Spieler, Helmuth

    2002-05-31

    The development of planar fabrication techniques for superconducting transition-edge sensors has brought large-scale arrays of 1000 pixels or more to the realm of practicality. This raises the problem of reading out a large number of sensors with a tractable number of connections. A possible solution is frequency-domain multiplexing. I summarize basic principles, present various circuit topologies, and discuss design trade-offs, noise performance, cross-talk and dynamic range. The design of a practical device and its readout system is described with a discussion of fabrication issues, practical limits and future prospects.
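
    A toy illustration of the frequency-domain multiplexing principle described above, not the readout hardware itself: each sensor signal modulates its own carrier, the carriers share one line, and each channel is recovered by mixing with its carrier and low-pass filtering. Sample rate, carrier frequencies and filter settings are arbitrary example values.

        # Hedged sketch of frequency-domain multiplexing/demultiplexing with numpy.
        import numpy as np

        fs = 100_000.0                     # sample rate, Hz (example value)
        t = np.arange(0, 0.1, 1.0 / fs)    # 100 ms of data
        carriers = [5_000.0, 7_000.0, 9_000.0]          # one carrier per sensor
        signals = [np.sin(2 * np.pi * f_sig * t)        # slow "sensor" signals
                   for f_sig in (10.0, 25.0, 40.0)]

        # Multiplex: each sensor modulates its carrier; everything shares one line.
        line = sum(s * np.cos(2 * np.pi * fc * t) for s, fc in zip(signals, carriers))

        def demodulate(line, fc, t, fs, cutoff=200.0):
            """Mix down with the channel's carrier, then a crude moving-average low-pass."""
            mixed = 2.0 * line * np.cos(2 * np.pi * fc * t)
            win = int(fs / cutoff)
            kernel = np.ones(win) / win
            return np.convolve(mixed, kernel, mode="same")

        recovered = [demodulate(line, fc, t, fs) for fc in carriers]
        # Each recovered[i] approximates signals[i] away from the array edges.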

  6. OceanSITES format and Ocean Observatory Output harmonisation: past, present and future

    NASA Astrophysics Data System (ADS)

    Pagnani, Maureen; Galbraith, Nan; Diggs, Stephen; Lankhorst, Matthias; Hidas, Marton; Lampitt, Richard

    2015-04-01

    The Global Ocean Observing System (GOOS) initiative was launched in 1991, and was the first step in creating a global view of ocean observations. In 1999 oceanographers at the OceanObs conference envisioned a 'global system of eulerian observatories' which evolved into the OceanSITES project. OceanSITES has been generously supported by individual oceanographic institutes and agencies across the globe, as well as by the WMO-IOC Joint Technical Commission for Oceanography and Marine Meteorology (under JCOMMOPS). The project is directed by the needs of research scientists, but has a strong data management component, with an international team developing content standards, metadata specifications, and NetCDF templates for many types of in situ oceanographic data. The OceanSITES NetCDF format specification is intended as a robust data exchange and archive format specifically for time-series observatory data from the deep ocean. First released in February 2006, it has evolved to build on and extend internationally recognised standards such as the Climate and Forecast (CF) standard, BODC vocabularies, ISO formats and vocabularies, and in version 1.3, released in 2014, ACDD (Attribute Convention for Dataset Discovery). The success of the OceanSITES format has inspired other observational groups, such as those operating autonomous vehicles and ships of opportunity, to also use the format, and today it is fulfilling the original concept of providing a coherent set of data from eulerian observatories. Data in the OceanSITES format is served by two Global Data Assembly Centres (GDACs), one at Coriolis, in France, at ftp://ftp.ifremer.fr/ifremer/oceansites/ and one at the US NDBC, at ftp://data.ndbc.noaa.gov/data/oceansites/. These two centres serve over 26,800 OceanSITES format data files from 93 moorings. The use of standardised and controlled features enables the files held at the OceanSITES GDACs to be electronically discoverable and ensures the widest access to the data. The Ocean
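
    As a rough sketch of what consuming an OceanSITES-style file can look like (the file name, variable name, and attributes below are placeholders, not a prescription of the format), a CF-aware reader such as xarray exposes the time series and its metadata directly.

        # Hedged sketch: open an OceanSITES-style NetCDF time-series file and inspect
        # its CF/ACDD metadata. The file path and variable name are placeholders.
        import xarray as xr

        ds = xr.open_dataset("OS_EXAMPLE-MOORING_TS.nc")   # hypothetical file name

        # Global attributes typically carry the convention and discovery metadata.
        print(ds.attrs.get("Conventions"))
        print(ds.attrs.get("site_code"), ds.attrs.get("data_mode"))

        # Variables are CF-described; here we assume a sea-water temperature variable.
        if "TEMP" in ds:
            temp = ds["TEMP"]
            print(temp.attrs.get("standard_name"), temp.attrs.get("units"))
            print(float(temp.mean()))

        ds.close()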

  7. Efficient, large scale separation of coal macerals

    SciTech Connect

    Dyrkacz, G.R.; Bloomquist, C.A.A.

    1988-01-01

    The authors believe that the separation of macerals by continuous flow centrifugation offers a simple technique for the large scale separation of macerals. With relatively little cost (approximately $10K), it provides an opportunity for obtaining quite pure maceral fractions. Although they have not completely worked out all the nuances of this separation system, they believe that the problems they have indicated can be minimized to pose only minor inconvenience. It cannot be said that this system completely bypasses the disagreeable tedium or time involved in separating macerals, nor will it by itself overcome the mental inertia required to make maceral separation an accepted necessary fact in fundamental coal science. However, they find their particular brand of continuous flow centrifugation is considerably faster than sink/float separation, can provide a good quality product with even one separation cycle, and permits the handling of more material than a conventional sink/float centrifuge separation.

  8. Primer design for large scale sequencing.

    PubMed Central

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-01-01

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, make this program useful for the large scale design of primers, especially in large sequencing projects. PMID:9611248
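
    PRIDE's scoring is fuzzy-logic based and considerably more sophisticated; purely to illustrate the idea of turning primer properties into a combined quality score, the sketch below scores length, GC content and a Wallace-rule melting temperature with simple fuzzy ramps whose thresholds are invented for the example.

        # Hedged sketch: a toy primer-quality score in the spirit of fuzzy scoring
        # (not PRIDE's algorithm). Thresholds and weights are illustrative only.
        def primer_quality(seq):
            seq = seq.upper()
            n = len(seq)
            gc = sum(seq.count(b) for b in "GC") / n
            # Wallace rule estimate of the melting temperature (degC).
            tm = 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

            def ramp(x, lo, hi):
                # Triangular fuzzy membership: 1 at the midpoint of [lo, hi], 0 outside.
                mid, half = (lo + hi) / 2.0, (hi - lo) / 2.0
                return max(0.0, 1.0 - abs(x - mid) / half)

            score_len = ramp(n, 17, 25)        # prefer roughly 21-mers
            score_gc = ramp(gc, 0.40, 0.60)    # prefer 40-60% GC
            score_tm = ramp(tm, 55, 70)        # prefer a Wallace Tm near ~62 degC
            return (score_len + score_gc + score_tm) / 3.0

        print(round(primer_quality("ATGCGTACCTGAAGTCCGATT"), 2))   # 21-mer example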

  9. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.
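
    The abstract mentions validation of the analytic sensitivities against the overall finite difference method; the sketch below shows that consistency check in miniature for a generic response function (the function and design variables are stand-ins, not an MSC/NASTRAN model).

        # Hedged sketch: compare an analytic design sensitivity with an overall
        # finite-difference estimate, the same kind of consistency check described above.
        import numpy as np

        def response(x):
            """Toy structural response as a function of two design variables."""
            return x[0] ** 2 + 3.0 * x[0] * x[1] + np.sin(x[1])

        def analytic_gradient(x):
            return np.array([2.0 * x[0] + 3.0 * x[1], 3.0 * x[0] + np.cos(x[1])])

        def finite_difference_gradient(f, x, h=1e-6):
            g = np.zeros_like(x)
            for j in range(len(x)):
                xp = x.copy(); xp[j] += h
                g[j] = (f(xp) - f(x)) / h
            return g

        x0 = np.array([1.2, 0.7])
        print(analytic_gradient(x0))
        print(finite_difference_gradient(response, x0))   # should agree to ~1e-5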

  10. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results give the experimentally-testable predictions for glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.

  11. Large-scale optimization of neuron arbors

    NASA Astrophysics Data System (ADS)

    Cherniak, Christopher; Changizi, Mark; Won Kang, Du

    1999-05-01

    At the global as well as local scales, some of the geometry of types of neuron arbors, both dendrites and axons, appears to be self-organizing: Their morphogenesis behaves like flowing water, that is, fluid dynamically; waterflow in branching networks in turn acts like a tree composed of cords under tension, that is, vector mechanically. Branch diameters and angles and junction sites conform significantly to this model. The result is that such neuron tree samples globally minimize their total volume, rather than, for example, surface area or branch length. In addition, the arbors perform well at generating the cheapest topology interconnecting their terminals: their large-scale layouts are among the best of all such possible connecting patterns, approaching 5% of optimum. This model also applies comparably to arterial and river networks.

  12. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has a great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of management of large scale systems are broadly covered here. Due to the focus and application of research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  13. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  14. Large scale preparation of pure phycobiliproteins.

    PubMed

    Padgett, M P; Krogmann, D W

    1987-01-01

    This paper describes simple procedures for the purification of large amounts of phycocyanin and allophycocyanin from the cyanobacterium Microcystis aeruginosa. A homogeneous natural bloom of this organism provided hundreds of kilograms of cells. Large samples of cells were broken by freezing and thawing. Repeated extraction of the broken cells with distilled water released phycocyanin first, then allophycocyanin, and provides supporting evidence for the current models of phycobilisome structure. The very low ionic strength of the aqueous extracts allowed allophycocyanin release in a particulate form so that this protein could be easily concentrated by centrifugation. Other proteins in the extract were enriched and concentrated by large scale membrane filtration. The biliproteins were purified to homogeneity by chromatography on DEAE cellulose. Purity was established by HPLC and by N-terminal amino acid sequence analysis. The proteins were examined for stability at various pHs and exposures to visible light.

  15. Primer design for large scale sequencing.

    PubMed

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-06-15

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, make this program useful for the large scale design of primers, especially in large sequencing projects.

  16. Large-scale synthesis of peptides.

    PubMed

    Andersson, L; Blomberg, L; Flegel, M; Lepsa, L; Nilsson, B; Verlander, M

    2000-01-01

    Recent advances in the areas of formulation and delivery have rekindled the interest of the pharmaceutical community in peptides as drug candidates, which, in turn, has provided a challenge to the peptide industry to develop efficient methods for the manufacture of relatively complex peptides on scales of up to metric tons per year. This article focuses on chemical synthesis approaches for peptides, and presents an overview of the methods available and in use currently, together with a discussion of scale-up strategies. Examples of the different methods are discussed, together with solutions to some specific problems encountered during scale-up development. Finally, an overview is presented of issues common to all manufacturing methods, i.e., methods used for the large-scale purification and isolation of final bulk products and regulatory considerations to be addressed during scale-up of processes to commercial levels. Copyright 2000 John Wiley & Sons, Inc. Biopolymers (Pept Sci) 55: 227-250, 2000

  17. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as "nuclear pasta" are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm⁻³, proton fractions 0.05

  18. Jovian large-scale stratospheric circulation

    NASA Technical Reports Server (NTRS)

    West, R. A.; Friedson, A. J.; Appleby, J. F.

    1992-01-01

    An attempt is made to diagnose the annual-average mean meridional residual Jovian large-scale stratospheric circulation from observations of the temperature and reflected sunlight that reveal the morphology of the aerosol heating. The annual mean solar heating, total radiative flux divergence, mass stream function, and Eliassen-Palm flux divergence are shown. The stratospheric radiative flux divergence is dominated at high latitudes by aerosol absorption. Between the 270 and 100 mbar pressure levels, where there is no aerosol heating in the model, the structure of the circulation at low- to midlatitudes is governed by the meridional variation of infrared cooling in association with the variation of zonal mean temperatures observed by IRIS. The principal features of the vertical velocity profile found by Gierasch et al. (1986) are recovered in the present calculation.

  19. Large-scale parametric survival analysis.

    PubMed

    Mittal, Sushil; Madigan, David; Cheng, Jerry Q; Burd, Randall S

    2013-10-15

    Survival analysis has been a topic of active statistical research in the past few decades with applications spread across several areas. Traditional applications usually consider data with only small numbers of predictors and a few hundreds or thousands of observations. Recent advances in data acquisition techniques and computation power have led to considerable interest in analyzing very-high-dimensional data where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that application of regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models.
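
    As a loose illustration of regularized parametric survival fitting (not the authors' cyclic coordinate descent solver), the sketch below maximizes an L2-penalized exponential-model log-likelihood on synthetic data with scipy; the penalty weight and data-generation choices are arbitrary example values.

        # Hedged sketch: L2-regularized exponential (parametric) survival regression.
        # hazard_i = exp(x_i . beta); delta_i = 1 for observed events, 0 for censored.
        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(0)
        n, p = 500, 5
        X = rng.normal(size=(n, p))
        beta_true = np.array([1.0, -0.5, 0.0, 0.0, 0.25])
        times = rng.exponential(1.0 / np.exp(X @ beta_true))
        censor = rng.exponential(2.0, size=n)
        delta = (times <= censor).astype(float)       # event indicator
        t_obs = np.minimum(times, censor)

        def neg_penalized_loglik(beta, lam=0.1):
            eta = X @ beta
            loglik = np.sum(delta * eta - t_obs * np.exp(eta))
            return -loglik + lam * np.sum(beta ** 2)

        fit = minimize(neg_penalized_loglik, np.zeros(p), method="L-BFGS-B")
        print(np.round(fit.x, 3))   # regularized estimates, shrunk toward zero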

  20. Large-Scale Parametric Survival Analysis†

    PubMed Central

    Mittal, Sushil; Madigan, David; Cheng, Jerry; Burd, Randall S.

    2013-01-01

    Survival analysis has been a topic of active statistical research in the past few decades with applications spread across several areas. Traditional applications usually consider data with only small numbers of predictors and a few hundreds or thousands of observations. Recent advances in data acquisition techniques and computation power have led to considerable interest in analyzing very high-dimensional data where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that application of regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models. PMID:23625862

  1. Large scale study of tooth enamel

    SciTech Connect

    Bodart, F.; Deconninck, G.; Martin, M.Th.

    1981-04-01

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces on a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analysed using PIXE, backscattering and nuclear reaction techniques. The results were analysed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population.
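
    A minimal sketch of the correlation and cluster-analysis step mentioned above, run on synthetic stand-in data (the actual study analysed measured concentrations of O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr in 180 tooth samples).

        # Hedged sketch: correlation matrix and hierarchical clustering of trace-element
        # concentrations across tooth samples; the data here are synthetic placeholders.
        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster

        elements = ["O", "F", "Na", "P", "Ca", "Mn", "Fe", "Cu", "Zn", "Pb", "Sr"]
        rng = np.random.default_rng(1)
        concentrations = rng.lognormal(mean=0.0, sigma=1.0, size=(180, len(elements)))

        # Element-element correlations across the 180 samples.
        corr = np.corrcoef(concentrations, rowvar=False)
        print(np.round(corr[:3, :3], 2))

        # Group the samples themselves by composition (Ward linkage on standardized data).
        z = (concentrations - concentrations.mean(axis=0)) / concentrations.std(axis=0)
        tree = linkage(z, method="ward")
        labels = fcluster(tree, t=4, criterion="maxclust")
        print(np.bincount(labels)[1:])      # sample counts per cluster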

  2. The challenge of large-scale structure

    NASA Astrophysics Data System (ADS)

    Gregory, S. A.

    1996-03-01

    The tasks that I have assumed for myself in this presentation include three separate parts. The first, appropriate to the particular setting of this meeting, is to review the basic work of the founding of this field; the appropriateness comes from the fact that W. G. Tifft made immense contributions that are not often realized by the astronomical community. The second task is to outline the general tone of the observational evidence for large scale structures. (Here, in particular, I cannot claim to be complete. I beg forgiveness from any workers who are left out by my oversight for lack of space and time.) The third task is to point out some of the major aspects of the field that may represent the clues by which some brilliant sleuth will ultimately figure out how galaxies formed.

  3. Modeling the Internet's large-scale topology

    PubMed Central

    Yook, Soon-Hyung; Jeong, Hawoong; Barabási, Albert-László

    2002-01-01

    Network generators that capture the Internet's large-scale topology are crucial for the development of efficient routing protocols and modeling Internet traffic. Our ability to design realistic generators is limited by the incomplete understanding of the fundamental driving forces that affect the Internet's evolution. By combining several independent databases capturing the time evolution, topology, and physical layout of the Internet, we identify the universal mechanisms that shape the Internet's router and autonomous system level topology. We find that the physical layout of nodes forms a fractal set, determined by population density patterns around the globe. The placement of links is driven by competition between preferential attachment and linear distance dependence, a marked departure from the currently used exponential laws. The universal parameters that we extract significantly restrict the class of potentially correct Internet models and indicate that the networks created by all available topology generators are fundamentally different from the current Internet. PMID:12368484
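
    The attachment rule described above, preferential attachment competing with a linear distance penalty, can be illustrated with a small, hedged generator sketch. Uniform node placement stands in for the population-weighted placement used in the paper, and all parameter values are assumptions.

```python
# Hedged sketch of the kind of generator the abstract describes: new nodes are
# placed in the plane and attach to existing nodes with probability proportional
# to degree / distance. Node placement is uniform here rather than population-weighted.
import numpy as np

rng = np.random.default_rng(1)

def grow_network(n_nodes=500, m_links=2):
    pos = rng.random((n_nodes, 2))                # node coordinates in the unit square
    degree = np.zeros(n_nodes)
    edges = [(0, 1)]
    degree[0] = degree[1] = 1
    for new in range(2, n_nodes):
        d = np.linalg.norm(pos[:new] - pos[new], axis=1) + 1e-9
        w = degree[:new] / d                      # k_i / d_i attachment kernel
        prob = w / w.sum()
        targets = rng.choice(new, size=min(m_links, new), replace=False, p=prob)
        for t in targets:
            edges.append((new, int(t)))
            degree[new] += 1
            degree[t] += 1
    return pos, edges, degree

pos, edges, degree = grow_network()
print("edges:", len(edges), "max degree:", int(degree.max()))
```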

  4. The TMT International Observatory: A quick overview of future opportunities for planetary science exploration

    NASA Astrophysics Data System (ADS)

    Dumas, Christophe; Dawson, Sandra; Otarola, Angel; Skidmore, Warren; Squires, Gordon; Travouillon, Tony; Greathouse, Thomas K.; Li, Jian-Yang; Lu, Junjun; Marchis, Frank; Meech, Karen J.; Wong, Michael H.

    2015-11-01

    The construction of the Thirty-Meter-Telescope International Observatory (TIO) is scheduled to take about eight years, with first light currently planned for 2023/24 and the start of science operations soon after. Its innovative design, the unequalled astronomical quality of its location, and the scientific capabilities that will be offered by its suite of instruments all contribute to positioning TIO as a major ground-based facility of the next decade. In this talk, we will review the expected observing performance of the facility, which will combine adaptive-optics-corrected wavefronts with powerful imaging and spectroscopic capabilities. TMT will enable ground-based exploration of our solar system - and planetary systems at large - at dramatically enhanced sensitivity and spatial resolution across the visible and near-/thermal-infrared regimes. This sharpened vision, spanning the study of planetary atmospheres, ring systems, (cryo-)volcanic activity, small body populations (asteroids, comets, trans-Neptunian objects), and exoplanets, will shed new light on the processes involved in the formation and evolution of our solar system, including the search for life outside the Earth, and will expand our understanding of the physical and chemical properties of extra-solar planets, complementing TIO's direct studies of planetary systems around other stars. TIO operations will meet a wide range of observing needs. Observing support associated with "classical" and "queue" modes will be offered (including some flavors of remote observing). The TIO schedule will integrate observing programs so as to optimize scientific output and take into account the stringent observing time constraints often encountered for observations of our solar system such as, for instance, the scheduling of target-of-opportunity observations, the implementation of short observing runs, or the support of long-term "key-science" programmes. Complementary information about TIO, and the

  5. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
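
    For readers unfamiliar with the framework the algorithm above extends, the sketch below shows the core SQP iteration for a small equality-constrained problem: each step solves the KKT system of a quadratic subproblem for the search direction. It uses a small dense exact Hessian purely for illustration, whereas the algorithm described above maintains a quasi-Newton approximation to the reduced Hessian and a reduced-gradient null-space basis; the toy problem and names are assumptions.

```python
# Minimal equality-constrained SQP sketch: at each iterate, solve the QP
# subproblem's KKT system  [H A^T; A 0] [p; lam] = [-g; -c]  for the step p.
import numpy as np

def sqp(x0, grad_f, hess_f, c, jac_c, iters=20, tol=1e-10):
    x = np.asarray(x0, dtype=float)
    lam = np.zeros(len(c(x)))
    for _ in range(iters):
        g, H, cv, A = grad_f(x), hess_f(x), c(x), jac_c(x)
        m = len(cv)
        K = np.block([[H, A.T], [A, np.zeros((m, m))]])
        rhs = np.concatenate([-g, -cv])
        sol = np.linalg.solve(K, rhs)
        p, lam = sol[:len(x)], sol[len(x):]
        x = x + p
        if np.linalg.norm(p) < tol:
            break
    return x, lam

# Toy example: minimize x1^2 + 2*x2^2 subject to x1 + x2 - 1 = 0.
grad_f = lambda x: np.array([2 * x[0], 4 * x[1]])
hess_f = lambda x: np.array([[2.0, 0.0], [0.0, 4.0]])
c = lambda x: np.array([x[0] + x[1] - 1.0])
jac_c = lambda x: np.array([[1.0, 1.0]])

x_star, lam_star = sqp([0.0, 0.0], grad_f, hess_f, c, jac_c)
print(x_star)   # expected roughly [2/3, 1/3]
```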

  6. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  7. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  8. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: (1) several commercial DBMS systems have demonstrated storage and ad-hoc query access to Terabyte data sets; (2) several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme; (3) several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data; (4) in some cases, performance is a moot issue. This is true in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  9. Voids in the Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    El-Ad, Hagai; Piran, Tsvi

    1997-12-01

    Voids are the most prominent feature of the large-scale structure of the universe. Still, their incorporation into quantitative analysis of it has been relatively recent, owing essentially to the lack of an objective tool to identify the voids and to quantify them. To overcome this, we present here the VOID FINDER algorithm, a novel tool for objectively quantifying voids in the galaxy distribution. The algorithm first classifies galaxies as either wall galaxies or field galaxies. Then, it identifies voids in the wall-galaxy distribution. Voids are defined as continuous volumes that do not contain any wall galaxies. The voids must be thicker than an adjustable limit, which is refined in successive iterations. In this way, we identify the same regions that would be recognized as voids by the eye. Small breaches in the walls are ignored, avoiding artificial connections between neighboring voids. We test the algorithm using Voronoi tessellations. By appropriate scaling of the parameters with the selection function, we apply it to two redshift surveys, the dense SSRS2 and the full-sky IRAS 1.2 Jy. Both surveys show similar properties: ~50% of the volume is filled by voids. The voids have a scale of at least 40 h^-1 Mpc and an average underdensity of -0.9. Faint galaxies do not fill the voids, but they do populate them more than bright ones. These results suggest that both optically and IRAS-selected galaxies delineate the same large-scale structure. Comparison with the recovered mass distribution further suggests that the observed voids in the galaxy distribution correspond well to underdense regions in the mass distribution. This confirms the gravitational origin of the voids.
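
    The wall/field split and the empty-region search can be illustrated with a heavily simplified sketch; it is not the published VOID FINDER code, and the mock catalogue, thresholds and grid resolution below are assumptions.

```python
# Greatly simplified void-search sketch: galaxies whose third-nearest-neighbour
# distance is large are treated as field galaxies, and coarse grid cells that
# contain no wall galaxies are flagged as (proto-)void cells.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
box = 100.0                                   # Mpc/h box (periodicity ignored here)
gal = rng.random((5000, 3)) * box             # mock galaxy positions

# 1. Wall/field classification from the distance to the 3rd nearest neighbour.
tree = cKDTree(gal)
dist, _ = tree.query(gal, k=4)                # column 0 is the galaxy itself
d3 = dist[:, 3]
wall = gal[d3 <= np.percentile(d3, 80)]       # densest 80% kept as wall galaxies

# 2. Flag grid cells containing no wall galaxies.
n_cells = 20
idx = np.clip(np.floor(wall / box * n_cells).astype(int), 0, n_cells - 1)
occupied = np.zeros((n_cells,) * 3, dtype=bool)
occupied[idx[:, 0], idx[:, 1], idx[:, 2]] = True
print(f"fraction of empty cells: {1.0 - occupied.mean():.2f}")
```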

  10. Future Large - Scale Projects and Programmes in Astronomy and Astrophysics

    NASA Astrophysics Data System (ADS)

    Corbett, I.

    2004-03-01

    This workshop was proposed by Germany, which invited ESO to act as host, and took place on December 1-3, at the Deutsches Museum (December 1) and at the Ludwig- Maximilians-Universität (December 2, 3). It was attended by government-appointed delegates from fifteen Global Science Forum Member countries and Observers, three non- OECD countries, representatives of ESO, the President of the International Astronomical Union, invited speakers, and the OECD secretariat, and was chaired by Ian Corbett of ESO.

  11. Industrial Large Scale Applications of Superconductivity -- Current and Future Trends

    NASA Astrophysics Data System (ADS)

    Amm, Kathleen

    2011-03-01

    Since the initial development of NbTi and Nb3Sn superconducting wires in the early 1960's, superconductivity has developed a broad range of industrial applications in research, medicine and energy. Superconductivity has been used extensively in NMR low field and high field spectrometers and MRI systems, and has been demonstrated in many power applications, including power cables, transformers, fault current limiters, and motors and generators. To date, the most commercially successful application for superconductivity has been the high field magnets required for magnetic resonance imaging (MRI), with a global market well in excess of $4 billion excluding the service industry. The unique ability of superconductors to carry large currents with no losses enabled high field MRI and its unique clinical capabilities in imaging soft tissue. High field MRI with superconducting magnets was adopted rapidly because superconductivity was the key enabler of high field magnets with high field uniformity and image quality. With over 30 years of developing MRI systems and applications, MRI has become a robust clinical tool that is ever expanding into new and developing markets. Continued innovation in system design is addressing these market needs. One of the key questions that innovators in industrial superconducting magnet design must consider today is what application of superconductivity may lead to a market on the scale of MRI? What are the key considerations for where superconductivity can provide a unique solution, as it did in the case of MRI? Many companies in the superconducting industry today are investigating possible technologies that may be the next large market like MRI.

  12. Koroaps - System for a Large Scale Monitoring and Variable Stars Searching

    NASA Astrophysics Data System (ADS)

    Parimucha, S.; Baludansky, D.; Vadila, M.

    We give introductory information about KOROAPS (the KOšice ROztoky Automatic Photometry System). It is a system for large-scale automatic multicolor monitoring and variable star searching. It is under development at Safarik University in Kosice in cooperation with Roztoky Observatory, and is now in test operation at Roztoky Observatory. The system consists of a Nikkor 2/200 photo lens equipped with an SBIG ST8 CCD camera and a set of standard VRI photometric filters, placed on a Celestron CG-5 Advanced mount. We describe the basic properties of the instrument, the data reduction pipeline and the operational modes of the instrument.

  13. A large-scale dataset of solar event reports from automated feature recognition modules

    NASA Astrophysics Data System (ADS)

    Schuh, Michael A.; Angryk, Rafal A.; Martens, Petrus C.

    2016-05-01

    The massive repository of images of the Sun captured by the Solar Dynamics Observatory (SDO) mission has ushered in the era of Big Data for Solar Physics. In this work, we investigate the entire public collection of events reported to the Heliophysics Event Knowledgebase (HEK) from automated solar feature recognition modules operated by the SDO Feature Finding Team (FFT). With the SDO mission recently surpassing five years of operations, and over 280,000 event reports for seven types of solar phenomena, we present the broadest and most comprehensive large-scale dataset of the SDO FFT modules to date. We also present numerous statistics on these modules, providing valuable contextual information for better understanding and validation of the individual event reports and the entire dataset as a whole. After extensive data cleaning through exploratory data analysis, we highlight several opportunities for knowledge discovery from data (KDD). Through these important prerequisite analyses presented here, the results of KDD from Solar Big Data will be overall more reliable and better understood. As the SDO mission remains operational over the coming years, these datasets will continue to grow in size and value. Future versions of this dataset will be analyzed in the general framework established in this work and maintained publicly online for easy access by the community.
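
    The kind of per-module bookkeeping and cleaning described above might look like the following pandas sketch; the module names, event-type codes and column layout are hypothetical stand-ins, not the actual HEK schema.

```python
# Hedged sketch of exploratory statistics over a flat table of event reports.
import pandas as pd

# Hypothetical stand-in for an export of HEK event reports.
reports = pd.DataFrame({
    "module": ["flare_module", "flare_module", "sigmoid_module", "flux_module"],
    "event_type": ["FL", "FL", "SG", "EF"],
    "start_time": pd.to_datetime(
        ["2011-02-15 01:44", "2011-02-15 01:44", "2011-03-07 19:43", "2011-06-02 07:22"]),
})

# Reports per feature-recognition module and per event type.
counts = reports.groupby(["module", "event_type"]).size().unstack(fill_value=0)

# Monthly report volume: a quick check for data gaps or module outages.
monthly = reports.set_index("start_time").resample("MS").size()

# Simple cleaning step: drop exact duplicate reports left by module re-runs.
deduped = reports.drop_duplicates()

print(counts)
print(monthly)
print(f"{len(reports) - len(deduped)} duplicate report(s) removed")
```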

  14. Statistical Measures of Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Vogeley, Michael; Geller, Margaret; Huchra, John; Park, Changbom; Gott, J. Richard

    1993-12-01

    To quantify clustering in the large-scale distribution of galaxies and to test theories for the formation of structure in the universe, we apply statistical measures to the CfA Redshift Survey. This survey is complete to m_B(0) = 15.5 over two contiguous regions which cover one-quarter of the sky and include ~11,000 galaxies. The salient features of these data are voids with diameter 30-50 h^-1 Mpc and coherent dense structures with a scale ~100 h^-1 Mpc. Comparison with N-body simulations rules out the "standard" CDM model (Omega = 1, b = 1.5, sigma_8 = 1) at the 99% confidence level because this model has insufficient power on scales lambda > 30 h^-1 Mpc. An unbiased open universe CDM model (Omega h = 0.2) and a biased CDM model with non-zero cosmological constant (Omega h = 0.24, lambda_0 = 0.6) match the observed power spectrum. The amplitude of the power spectrum depends on the luminosity of galaxies in the sample; bright (L > L*) galaxies are more strongly clustered than faint galaxies. The paucity of bright galaxies in low-density regions may explain this dependence. To measure the topology of large-scale structure, we compute the genus of isodensity surfaces of the smoothed density field. On scales in the "non-linear" regime, <= 10 h^-1 Mpc, the high- and low-density regions are multiply-connected over a broad range of density threshold, as in a filamentary net. On smoothing scales > 10 h^-1 Mpc, the topology is consistent with statistics of a Gaussian random field. Simulations of CDM models fail to produce the observed coherence of structure on non-linear scales (> 95% confidence level). The underdensity probability (the frequency of regions with density contrast delta rho / rho = -0.8) depends strongly on the luminosity of galaxies; underdense regions are significantly more common (> 2 sigma) in bright (L > L*) galaxy samples than in samples which include fainter galaxies.
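
    One of the statistics used above, the isotropically averaged power spectrum, can be estimated from a gridded density contrast field with a short FFT-based sketch. The Gaussian random field below is a stand-in for the smoothed survey density field, and the normalisation convention is one common choice, not necessarily the one used in the paper.

```python
# Hedged sketch: shell-averaged power spectrum P(k) of a gridded density field.
import numpy as np

n, box = 64, 200.0                                  # grid cells per side, box length in h^-1 Mpc
rng = np.random.default_rng(3)
delta = rng.standard_normal((n, n, n))              # toy density contrast field

# Convention: P(k) = |delta_k|^2 / V with delta_k = dx^3 * sum delta * exp(-i k.x)
dx = box / n
delta_k = np.fft.rfftn(delta) * dx**3
power = np.abs(delta_k) ** 2 / box**3

# |k| on the same (half-)grid, then average the power in spherical shells.
kx = 2 * np.pi * np.fft.fftfreq(n, d=dx)
kz = 2 * np.pi * np.fft.rfftfreq(n, d=dx)
KX, KY, KZ = np.meshgrid(kx, kx, kz, indexing="ij")
kmag = np.sqrt(KX**2 + KY**2 + KZ**2)

bins = np.linspace(kmag[kmag > 0].min(), kmag.max(), 21)
which = np.digitize(kmag.ravel(), bins)
pk = []
for i in range(1, len(bins)):
    sel = which == i
    pk.append(power.ravel()[sel].mean() if sel.any() else np.nan)
k_centres = 0.5 * (bins[1:] + bins[:-1])

# For unit-variance white noise the estimate should scatter around dx^3.
print(np.column_stack([k_centres, pk])[:5])
```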

  15. Long-lived space observatories for astronomy and astrophysics

    NASA Technical Reports Server (NTRS)

    Savage, Blair D.; Becklin, Eric E.; Beckwith, Steven V. W.; Cowie, Lennox L.; Dupree, Andrea K.; Elliot, James L.; Gallagher, John S.; Helfand, David J.; Jenkins, Edward F.; Johnston, Kenneth J.

    1987-01-01

    NASA's plan to build and launch a fleet of long-lived space observatories that include the Hubble Space Telescope (HST), the Gamma Ray Observatory (GRO), the Advanced X Ray Astrophysics Observatory (AXAF), and the Space Infrared Telescope Facility (SIRTF) are discussed. These facilities are expected to have a profound impact on the sciences of astronomy and astrophysics. The long-lived observatories will provide new insights about astronomical and astrophysical problems that range from the presence of planets orbiting nearby stars to the large-scale distribution and evolution of matter in the universe. An important concern to NASA and the scientific community is the operation and maintenance cost of the four observatories described above. The HST cost about $1.3 billion (1984 dollars) to build and is estimated to require $160 million (1986 dollars) a year to operate and maintain. If HST is operated for 20 years, the accumulated costs will be considerably more than those required for its construction. Therefore, it is essential to plan carefully for observatory operations and maintenance before a long-lived facility is constructed. The primary goal of this report is to help NASA develop guidelines for the operations and management of these future observatories so as to achieve the best possible scientific results for the resources available. Eight recommendations are given.
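
    As a rough check of that claim, using the figures quoted above and ignoring the difference between the 1984 and 1986 dollar baselines: 20 years × $160 million/year = $3.2 billion, well above the roughly $1.3 billion construction cost.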

  16. Management of large-scale multimedia conferencing

    NASA Astrophysics Data System (ADS)

    Cidon, Israel; Nachum, Youval

    1998-12-01

    The goal of this work is to explore management strategies and algorithms for large-scale multimedia conferencing over a communication network. Since the use of multimedia conferencing is still limited, the management of such systems has not yet been studied in depth. A well organized and human friendly multimedia conference management should utilize its limited resources efficiently and fairly, as well as take into account the requirements of the conference participants. The ability of the management to enforce fair policies and to quickly take into account the participants' preferences may even lead to a conference environment that is more pleasant and more effective than a similar face-to-face meeting. We suggest several principles for defining and solving resource sharing problems in this context. The conference resources which are addressed in this paper are the bandwidth (conference network capacity), time (participants' scheduling) and limitations of audio and visual equipment. The participants' requirements for these resources are defined and translated in terms of Quality of Service requirements and the fairness criteria.

  17. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  18. Large-scale tides in general relativity

    NASA Astrophysics Data System (ADS)

    Ip, Hiu Yan; Schmidt, Fabian

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the "separate universe" paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  19. Large scale mechanical metamaterials as seismic shields

    NASA Astrophysics Data System (ADS)

    Miniaci, Marco; Krushynska, Anastasiia; Bosia, Federico; Pugno, Nicola M.

    2016-08-01

    Earthquakes represent one of the most catastrophic natural events affecting mankind. At present, a universally accepted risk mitigation strategy for seismic events remains to be proposed. Most approaches are based on vibration isolation of structures rather than on the remote shielding of incoming waves. In this work, we propose a novel approach to the problem and discuss the feasibility of a passive isolation strategy for seismic waves based on large-scale mechanical metamaterials, including, for the first time, numerical analysis of both surface and guided waves, soil dissipation effects, and full 3D simulations. The study focuses on realistic structures that can be effective in frequency ranges of interest for seismic waves, and optimal design criteria are provided, exploring different metamaterial configurations, combining phononic crystals and locally resonant structures and different ranges of mechanical properties. Dispersion analysis and full-scale 3D transient wave transmission simulations are carried out on finite size systems to assess the seismic wave amplitude attenuation in realistic conditions. Results reveal that both surface and bulk seismic waves can be considerably attenuated, making this strategy viable for the protection of civil structures against seismic risk. The proposed remote shielding approach could open up new perspectives in the field of seismology and in related areas of low-frequency vibration damping or blast protection.

  20. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security for the local populations.

  1. Large scale structure of the sun's corona

    NASA Astrophysics Data System (ADS)

    Kundu, Mukul R.

    Results concerning the large-scale structure of the solar corona obtained by observations at meter-decameter wavelengths are reviewed. Coronal holes observed on the disk at multiple frequencies show the radial and azimuthal geometry of the hole. At the base of the hole there is good correspondence to the chromospheric signature in He I 10,830 A, but at greater heights the hole may show departures from symmetry. Two-dimensional imaging of weak type III bursts simultaneously with the HAO SMM coronagraph/polarimeter measurements indicates that these bursts occur along elongated features emanating from the quiet sun, corresponding in position angle to the bright coronal streamers. It is shown that the densest regions of streamers and the regions of maximum intensity of type II bursts coincide closely. Non-flare-associated type II/type IV bursts associated with coronal streamer disruption events are studied along with correlated type II burst emissions originating from distant centers on the sun.

  2. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers which was indicative of significant oxidation. Footprints of downwind dissemination of the fire released fibers were measured to 19.1 km from the fire.

  3. Large-scale clustering of cosmic voids

    NASA Astrophysics Data System (ADS)

    Chan, Kwan Chuen; Hamaus, Nico; Desjacques, Vincent

    2014-11-01

    We study the clustering of voids using N-body simulations and simple theoretical models. The excursion-set formalism describes fairly well the abundance of voids identified with the watershed algorithm, although the void formation threshold required is quite different from the spherical collapse value. The void cross bias b_c is measured and its large-scale value is found to be consistent with the peak-background split results. A simple fitting formula for b_c is found. We model the void auto-power spectrum taking into account the void biasing and exclusion effect. A good fit to the simulation data is obtained for voids with radii ≳ 30 Mpc h^-1, especially when the void biasing model is extended to 1-loop order. However, the best-fit bias parameters do not agree well with the peak-background results. Being able to fit the void auto-power spectrum is particularly important not only because it is the direct observable in galaxy surveys, but also because our method enables us to treat the bias parameters as nuisance parameters, which are sensitive to the techniques used to identify voids.

  4. Large-scale autostereoscopic outdoor display

    NASA Astrophysics Data System (ADS)

    Reitterer, Jörg; Fidler, Franz; Saint Julien-Wallsee, Ferdinand; Schmid, Gerhard; Gartner, Wolfgang; Leeb, Walter; Schmid, Ulrich

    2013-03-01

    State-of-the-art autostereoscopic displays are often limited in size, effective brightness, number of 3D viewing zones, and maximum 3D viewing distances, all of which are mandatory requirements for large-scale outdoor displays. Conventional autostereoscopic indoor concepts like lenticular lenses or parallax barriers cannot simply be adapted for these screens due to the inherent loss of effective resolution and brightness, which would reduce both image quality and sunlight readability. We have developed a modular autostereoscopic multi-view laser display concept with sunlight readable effective brightness, theoretically up to several thousand 3D viewing zones, and maximum 3D viewing distances of up to 60 meters. For proof-of-concept purposes a prototype display with two pixels was realized. Due to various manufacturing tolerances each individual pixel has slightly different optical properties, and hence the 3D image quality of the display has to be calculated stochastically. In this paper we present the corresponding stochastic model, we evaluate the simulation and measurement results of the prototype display, and we calculate the achievable autostereoscopic image quality to be expected for our concept.

  5. Large Scale EOF Analysis of Climate Data

    NASA Astrophysics Data System (ADS)

    Prabhat, M.; Gittens, A.; Kashinath, K.; Cavanaugh, N. R.; Mahoney, M.

    2016-12-01

    We present a distributed approach towards extracting EOFs from 3D climate data. We implement the method in Apache Spark, and process multi-TB sized datasets on O(1000-10,000) cores. We apply this method to latitude-weighted ocean temperature data from CSFR, a 2.2 terabyte-sized data set comprising ocean and subsurface reanalysis measurements collected at 41 levels in the ocean, at 6 hour intervals over 31 years. We extract the first 100 EOFs of this full data set and compare to the EOFs computed simply on the surface temperature field. Our analyses provide evidence of Kelvin and Rossby waves and components of large-scale modes of oscillation including the ENSO and PDO that are not visible in the usual SST EOFs. Further, they provide information on the most influential parts of the ocean, such as the thermocline, that exist below the surface. Work is ongoing to understand the factors determining the depth-varying spatial patterns observed in the EOFs. We will experiment with weighting schemes to appropriately account for the differing depths of the observations. We also plan to apply the same distributed approach to the analysis of 3D atmospheric climate data sets, including multiple variables. Because the atmosphere changes on a quicker time-scale than the ocean, we expect that the results will demonstrate an even greater advantage to computing 3D EOFs in lieu of 2D EOFs.
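
    On a single machine, the latitude-weighted EOF computation described above reduces to a truncated SVD of the space-by-time anomaly matrix. The sketch below uses a small synthetic array in place of the multi-terabyte reanalysis data and is not the distributed Spark implementation; all dimensions are illustrative assumptions.

```python
# Hedged, single-machine EOF sketch: weight anomalies by sqrt(cos(latitude)),
# flatten the 3D field to a space x time matrix, and take a truncated SVD.
import numpy as np

rng = np.random.default_rng(4)
n_time, n_lat, n_lon, n_depth = 120, 30, 60, 5
field = rng.standard_normal((n_time, n_lat, n_lon, n_depth))   # stand-in for ocean temperature

lats = np.linspace(-89, 89, n_lat)
w = np.sqrt(np.cos(np.deg2rad(lats)))[None, :, None, None]     # area weighting
anom = (field - field.mean(axis=0)) * w

X = anom.reshape(n_time, -1).T                 # space x time; EOFs are left singular vectors
U, s, Vt = np.linalg.svd(X, full_matrices=False)
n_eof = 10
eofs = U[:, :n_eof].T.reshape(n_eof, n_lat, n_lon, n_depth)
pcs = s[:n_eof, None] * Vt[:n_eof]             # principal-component time series
explained = s[:n_eof] ** 2 / (s ** 2).sum()
print("variance explained by first 10 EOFs:", np.round(explained, 3))
```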

  6. Space Active Optics: toward optimized correcting mirrors for future large spaceborne observatories

    NASA Astrophysics Data System (ADS)

    Laslandes, Marie; Hugot, Emmanuel; Ferrari, Marc; Lemaitre, Gérard; Liotard, Arnaud

    2011-10-01

    Wave-front correction in optical instruments is often needed, either to compensate Optical Path Differences, off-axis aberrations or mirror deformations. Active optics techniques are developed to allow efficient corrections with deformable mirrors. In this paper, we present the design of particular deformation systems which could be used in space telescopes and instruments in order to improve their performance while allowing specifications on the global system stability to be relaxed. A first section will be dedicated to the design and performance analysis of an active mirror specifically designed to compensate for aberrations that might appear in future 3m-class space telescopes, due to lightweight primary mirrors, thermal variations or weightless conditions. A second section will be dedicated to a brand new design of active mirror, able to compensate for given combinations of aberrations with a single actuator. If the aberrations to be corrected in an instrument and their evolution are known in advance, an optimal system geometry can be determined using elasticity theory and Finite Element Analysis.

  7. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_solar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_solar/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  8. Solving large scale structure in ten easy steps with COLA

    SciTech Connect

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J. E-mail: matiasz@ias.edu

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  9. LARGE-SCALE CO2 TRANSPORTATION AND DEEP OCEAN SEQUESTRATION

    SciTech Connect

    Hamid Sarv

    1999-03-01

    Technical and economical feasibility of large-scale CO2 transportation and ocean sequestration at depths of 3000 meters or greater was investigated. Two options were examined for transporting and disposing the captured CO2. In one case, CO2 was pumped from a land-based collection center through long pipelines laid on the ocean floor. Another case considered oceanic tanker transport of liquid carbon dioxide to an offshore floating structure for vertical injection to the ocean floor. In the latter case, a novel concept based on subsurface towing of a 3000-meter pipe, and attaching it to the offshore structure was considered. Budgetary cost estimates indicate that for distances greater than 400 km, tanker transportation and offshore injection through a 3000-meter vertical pipe provides the best method for delivering liquid CO2 to deep ocean floor depressions. For shorter distances, CO2 delivery by parallel-laid, subsea pipelines is more cost-effective. Estimated costs for 500-km transport and storage at a depth of 3000 meters by subsea pipelines and tankers were 1.5 and 1.4 dollars per ton of stored CO2, respectively. At these prices, economics of ocean disposal are highly favorable. Future work should focus on addressing technical issues that are critical to the deployment of a large-scale CO2 transportation and disposal system. Pipe corrosion, structural design of the transport pipe, and dispersion characteristics of sinking CO2 effluent plumes have been identified as areas that require further attention. Our planned activities in the next Phase include laboratory-scale corrosion testing, structural analysis of the pipeline, analytical and experimental simulations of CO2 discharge and dispersion, and the conceptual economic and engineering evaluation of large-scale implementation.

  10. Investigation of Coronal Large Scale Structures Utilizing Spartan 201 Data

    NASA Technical Reports Server (NTRS)

    Guhathakurta, Madhulika

    1998-01-01

    Two telescopes were flown aboard Spartan 201, a small satellite launched from Space Shuttles on April 8, 1993, September 8, 1994, September 7, 1995 and November 20, 1997. The main objective of the mission was to answer some of the most fundamental unanswered questions of solar physics: What accelerates the solar wind and what heats the corona? The two telescopes are 1) the Ultraviolet Coronal Spectrometer (UVCS), provided by the Smithsonian Astrophysical Observatory, which uses ultraviolet emissions from neutral hydrogen and ions in the corona to determine velocities of the coronal plasma within the solar wind source region, and the temperature and density distributions of protons, and 2) the White Light Coronagraph (WLC), provided by NASA's Goddard Space Flight Center, which measures visible light to determine the density distribution of coronal electrons within the same region. The PI has had the primary responsibility for the development and application of computer codes necessary for scientific data analysis activities and instrument calibration for the white-light coronagraph for the entire Spartan mission, and was responsible for the science output from the WLC instrument. The PI has also been involved in the investigation of coronal density distributions in large-scale structures by use of numerical models which are (mathematically) sufficient to reproduce the details of the observed brightness and polarized brightness distributions found in SPARTAN 201 data.

  11. Terminology of Large-Scale Waves in the Solar Atmosphere

    NASA Astrophysics Data System (ADS)

    Vršnak, Bojan

    2005-03-01

    This is the fourth in a series of essays on terms used in solar-terrestrial physics that are thought to be in need of clarification. Terms are identified and essays are commissioned by a committee chartered by Division II (Sun and Heliosphere) of the International Astronomical Union. Terminology Committee members include Ed Cliver (chair), Jean-Louis Bougeret, Hilary Cane, Takeo Kosugi, Sara Martin, Rainer Schwenn, and Lidia van Driel-Gestelyi. Authors are asked to review the origins of terms and their current usage/misusage. The goals are to inform the community and to open a discussion. The following article by Bojan Vršnak focuses on terms used to describe large-scale waves in the solar atmosphere, an area of research that has been given great impetus by the images of waves from the Extreme ultraviolet Imaging Telescope (EIT) on board the Solar and Heliospheric Observatory (SOHO). The committee welcomes suggestions for other terms to address in this forum.

  12. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity type analysis into existing code and, equally important, on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first
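
    The contrast between direct and adjoint sensitivities covered in that overview can be seen on a toy steady-state linear system A u = b(p) with objective J = c^T u: the direct method needs one linear solve per parameter, while the adjoint method needs a single transposed solve regardless of the number of parameters. The small random system below is an illustrative assumption.

```python
# Hedged toy sketch: direct vs. adjoint sensitivities for A u = b(p), J = c^T u.
import numpy as np

rng = np.random.default_rng(5)
n, n_param = 6, 3
A = rng.standard_normal((n, n)) + 5 * np.eye(n)    # well-conditioned state operator
B = rng.standard_normal((n, n_param))              # db/dp: source dependence on parameters
b0 = rng.standard_normal(n)
c = rng.standard_normal(n)

u = np.linalg.solve(A, b0)                         # state solve
J = c @ u

# Direct (forward) sensitivities: one solve per parameter column.
dudp = np.linalg.solve(A, B)                       # A du/dp = db/dp
dJdp_direct = c @ dudp

# Adjoint sensitivities: one solve with A^T, then inner products.
lam = np.linalg.solve(A.T, c)                      # A^T lambda = dJ/du
dJdp_adjoint = lam @ B

print(np.allclose(dJdp_direct, dJdp_adjoint))      # both routes agree
```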

  13. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in a chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in gaseous number of moles. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means for chamber overpressure mitigation for those tests producing the most total heat release and thus was determined to be a feasible mitigation

  14. UAV Data Processing for Large Scale Topographical Mapping

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2014-06-01

    data acquisition in the future in which it can support national large scale topographical mapping program up to the 1:1.000 map scale.

  15. Practical considerations for large-scale gut microbiome studies.

    PubMed

    Vandeputte, Doris; Tito, Raul Y; Vanleeuwen, Rianne; Falony, Gwen; Raes, Jeroen

    2017-08-01

    First insights on the human gut microbiome have been gained from medium-sized, cross-sectional studies. However, given the modest portion of explained variance of currently identified covariates and the small effect size of gut microbiota modulation strategies, upscaling seems essential for further discovery and characterisation of the multiple influencing factors and their relative contribution. In order to guide future research projects and standardisation efforts, we here review currently applied collection and preservation methods for gut microbiome research. We discuss aspects such as sample quality, applicable omics techniques, user experience and time and cost efficiency. In addition, we evaluate the protocols of a large-scale microbiome cohort initiative, the Flemish Gut Flora Project, to give an idea of perspectives, and pitfalls of large-scale faecal sampling studies. Although cryopreservation can be regarded as the gold standard, freezing protocols generally require more resources due to cold chain management. However, here we show that much can be gained from an optimised transport chain and sample aliquoting before freezing. Other protocols can be useful as long as they preserve the microbial signature of a sample such that relevant conclusions can be drawn regarding the research question, and the obtained data are stable and reproducible over time. © FEMS 2017.

  16. Simulator for Large-scale Planetary and Terrestrial Radar Sounding

    NASA Astrophysics Data System (ADS)

    Haynes, M.; Schroeder, D. M.; Duan, X.; Arumugam, D.; McMichael, J. G.; Hensley, S.; Cwik, T. A.

    2016-12-01

    We are developing a radar sounding simulation tool that can simulate radar scattering from large-scale, heterogeneous sub-surfaces for all existing, proposed, and future potential planetary and terrestrial sounder missions for Mars, Venus, Earth (e.g., atmosphere, ice sheets), Europa, Ganymede, Enceladus or other icy planetary bodies. This tool will be the first of its kind in planetary and terrestrial radar sounding simulation to support system engineering and to test scientific observables. No extant radar simulator is capable of producing echoes with realistic phase histories, heterogeneous media propagation effects, and processing gains at the spatial scales of planetary or terrestrial radar sounding (e.g., computational subsurface volumes of 10,000s of wavelengths in three dimensions at sounding frequencies of 5-100 MHz). Today's radar point target simulators are fast, but do not model transmission and propagation through heterogeneous dielectric media. We present progress on two simulation modules aimed at addressing different regimes of the sounding scattering problem: the Pseudo-Spectral Time-Domain (PSTD) method for scattering from shallow subsurface dielectric heterogeneities, and the Multi-layer Fast Multipole Method for scattering from deep, large-scale dielectric interfaces. We will show simulated radargrams and compare computation times for realistic radar sounding scenes. In addition, we solicit community input for this tool and outline the development path.
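
    The pseudo-spectral time-domain idea mentioned above (spatial derivatives taken in Fourier space, time advanced with an explicit scheme) is illustrated by the 1D toy below for the scalar wave equation on a periodic domain. Grid size, wave speed, pulse shape and step size are assumptions; a real sounding simulation would be 3D with heterogeneous dielectrics.

```python
# Hedged 1D PSTD toy: u_tt = c^2 u_xx with a spectral Laplacian and leapfrog time stepping.
import numpy as np

n, L, c = 512, 1000.0, 1.0                  # grid points, domain length, wave speed
x = np.linspace(0, L, n, endpoint=False)
k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)  # spectral wavenumbers

dt = 0.2 * (L / n) / c                      # small step, comfortably stable
u = np.exp(-((x - L / 2) ** 2) / 50.0)      # initial Gaussian pulse
u_prev = u.copy()                           # zero initial velocity

def laplacian(f):
    return np.real(np.fft.ifft(-(k ** 2) * np.fft.fft(f)))

for _ in range(2000):                       # leapfrog time stepping
    u_next = 2 * u - u_prev + (c * dt) ** 2 * laplacian(u)
    u_prev, u = u, u_next

print("pulse energy proxy:", float(np.sum(u ** 2)))
```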

  17. Some ecological guidelines for large-scale biomass plantations

    SciTech Connect

    Hoffman, W.; Cook, J.H.; Beyea, J.

    1993-12-31

    The National Audubon Society sees biomass as an appropriate and necessary source of energy to help replace fossil fuels in the near future, but is concerned that large-scale biomass plantations could displace significant natural vegetation and wildlife habitat, and reduce national and global biodiversity. We support the development of an industry large enough to provide significant portions of our energy budget, but we see a critical need to ensure that plantations are designed and sited in ways that minimize ecological disruption, or even provide environmental benefits. We have been studying the habitat value of intensively managed short-rotation tree plantations. Our results show that these plantations support large populations of some birds, but not all of the species using the surrounding landscape, and indicate that their value as habitat can be increased greatly by including small areas of mature trees within them. We believe short-rotation plantations can benefit regional biodiversity if they can be deployed as buffers for natural forests, or as corridors connecting forest tracts. To realize these benefits, and to avoid habitat degradation, regional biomass plantation complexes (e.g., the plantations supplying all the fuel for a powerplant) need to be planned, sited, and developed as large-scale units in the context of the regional landscape mosaic.

  18. Synchronization of coupled large-scale Boolean networks

    SciTech Connect

    Li, Fangfei

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm towards large-scale Boolean network is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
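
    As a hedged toy illustration of the synchronization question (the paper itself works in the algebraic state-space / aggregation formalism), the sketch below iterates a small drive Boolean network and a coupled response network from random initial states and checks whether their states eventually agree at every step. The update rules and coupling are illustrative assumptions.

```python
# Hedged toy: drive-response Boolean networks and a complete-synchronization check.
import numpy as np

rng = np.random.default_rng(6)

def drive_step(x):
    # x = (x1, x2, x3): a small autonomous Boolean network.
    return np.array([x[1] & x[2], x[0] | x[2], x[0] ^ x[1]])

def response_step(y, x):
    # Coupled response: its own state corrected by the drive signal.
    return np.array([x[1] & y[2], x[0] | y[2], x[0] ^ x[1]])

def synchronizes(n_steps=32):
    x = rng.integers(0, 2, 3)
    y = rng.integers(0, 2, 3)
    for t in range(n_steps):
        x, y = drive_step(x), response_step(y, x)
        if np.array_equal(x, y):
            return True, t + 1                 # synchronized after t+1 steps
    return False, n_steps

print(synchronizes())
```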

  19. The School Principal's Role in Large-Scale Assessment

    ERIC Educational Resources Information Center

    Newton, Paul; Tunison, Scott; Viczko, Melody

    2010-01-01

    This paper reports on an interpretive study in which 25 elementary principals were asked about their assessment knowledge, the use of large-scale assessments in their schools, and principals' perceptions on their roles with respect to large-scale assessments. Principals in this study suggested that the current context of large-scale assessment and…

  20. Synchronization of coupled large-scale Boolean networks

    NASA Astrophysics Data System (ADS)

    Li, Fangfei

    2014-03-01

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm towards large-scale Boolean network is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.

  1. Activities with Goto 45-CM Reflector at Bosscha Observatory, Lembang, Indonesia: Results and Aspects for Future Development

    NASA Astrophysics Data System (ADS)

    Malasan, H. L.

    In 1989 a 45-cm telescope of Cassegrain type was installed, tested, and commissioned at the Bosscha Observatory, Institut Teknologi Bandung. It was immediately put into use for UBV photometric observations of close binary systems. While the main function of the telescope was photometric observation, the versatile design inherent to a reflector made it possible to include a spectrograph whose spectral dispersion could match the MK spectral classification. Activities related both to education and research conducted with this reflector since its installation comprise scientific work (photometry, spectroscopy, imagery) and experiments in instrumentation (fiber-fed spectrograph, in-situ CCD camera testing). An important side result of the photometric observations is an atmospheric study based on long-term atmospheric extinction coefficients. A multidisciplinary approach, involving meteorologists and mathematicians, to the study of natural and anthropogenic pollution of the atmosphere over Lembang has recently been undertaken. At present, however, the telescope suffers from obsolete technology in its control functions. This has prevented it from being fully utilized, and therefore a plan to upgrade and extend the capability of the telescope has been made. The background, activities, and results, with emphasis on the collaborative work, will be presented. Aspects for future development of the telescope and its auxiliary instruments will be discussed.

  2. A virtual observatory in a real world: building capacity for an uncertain future - the UK pVO

    NASA Astrophysics Data System (ADS)

    Gurney, R. J.; Tetzlaff, D.; Freer, J. E.; Emmett, B.; McDonald, A.; Rees, G.; Buytaert, W.; Blair, G.; Haygarth, P.

    2010-12-01

    The scientific community, environmental managers and policy makers face a challenging future trying to accommodate growing expectations of environmental well-being, while subject to maturing regulation, constrained budgets and a public scrutiny that expects easier and more meaningful access to data and knowledge. The pilot Virtual Observatory (pVO) is a new initiative funded by the UK Natural Environment Research Council (NERC) designed to deliver proof of concept for novel tools and approaches. During an initial phase, the pVO aims to first evaluate the role of existing ‘observatories’ in the UK and elsewhere both to learn good practice and to define boundaries. The aim of the two year pilot project is to investigate novel methods of linking data and models and to demonstrate scenario analysis for both research and environmental management, using effective communication tools such as portals to provide cost effective answers to vital questions in the water resources / soils area. The project will exploit cloud computing to develop new applications for accessing, filtering and synthesising data to develop new knowledge and evaluation tools. As such the cloud enables the integration of a variety of information sources (including disparate data sets, sensor data and models) at different granularities and scales together with associated information services to provide both interoperability between such services and encourage the flow from data to knowledge to policy setting in the quest for answering big science questions. A wide range of possible management and environmental futures will be explored at a range of spatial and temporal scales. Novel visualisation tools will promote cross-disciplinary communication and illustrate the effects of alternative strategies and solutions.

  3. The Nitrate Inventory of Unsaturated Soils at the Barrow Environmental Observatory: Current Conditions and Potential Future Trajectories

    NASA Astrophysics Data System (ADS)

    Heikoop, J. M.; Newman, B. D.; Arendt, C. A.; Andresen, C. G.; Lara, M. J.; Wainwright, H. M.; Throckmorton, H.; Graham, D. E.; Wilson, C. J.; Wullschleger, S. D.; Romanovsky, V. E.; Bolton, W. R.; Wales, N. A.; Rowland, J. C.

    2016-12-01

    Studies conducted in the Barrow Environmental Observatory under the auspices of the United States Department of Energy Next Generation Ecosystem Experiment (NGEE) - Arctic have demonstrated measurable nitrate concentrations ranging from <1 to 17 mg/L in the unsaturated centers of high-centered polygons. Conversely, nitrate concentrations in saturated areas of polygonal terrain were generally below the limit of detection. Isotopic analysis of this nitrate demonstrates that it results from microbial nitrification. The study site currently comprises mostly saturated soils. Several factors, however, could lead to drying of soils on different time scales. These include 1) topographic inversion of polygonal terrain associated with ice-wedge degradation, 2) increased connectivity and drainage of polygon troughs, similarly related to the thawing and subsidence of ice-wedges, and 3) near-surface soil drainage associated with wide-spread permafrost thaw and active layer deepening. Using a GIS approach we will estimate the current inventory of nitrate in the NGEE intensive study site using soil moisture data and existing unsaturated zone nitrate concentration data and new concentration data collected in the summer of 2016 from high- and flat-centered polygons and the elevated rims of low-centered polygons. Using this baseline, we will present potential future inventories based on various scenarios of active layer thickening and landscape geomorphic reorganization associated with permafrost thaw. Predicted inventories will be based solely on active layer moisture changes, ignoring for now potential changes associated with mineralization and nitrification of previously frozen old organic matter and changes in vegetation communities. We wish to demonstrate that physical landscape changes alone could have a profound effect on future nitrate availability. Nitrate data from recent NGEE campaigns in the Seward Peninsula of Alaska will also be presented.

  4. A kilo-pixel imaging system for future space based far-infrared observatories using microwave kinetic inductance detectors

    NASA Astrophysics Data System (ADS)

    Baselmans, J. J. A.; Bueno, J.; Yates, S. J. C.; Yurduseven, O.; Llombart, N.; Karatsu, K.; Baryshev, A. M.; Ferrari, L.; Endo, A.; Thoen, D. J.; de Visser, P. J.; Janssen, R. M. J.; Murugesan, V.; Driessen, E. F. C.; Coiffard, G.; Martin-Pintado, J.; Hargrave, P.; Griffin, M.

    2017-05-01

    Aims: Future astrophysics and cosmic microwave background space missions operating in the far-infrared to millimetre part of the spectrum will require very large arrays of ultra-sensitive detectors in combination with high multiplexing factors and efficient low-noise and low-power readout systems. We have developed a demonstrator system suitable for such applications. Methods: The system combines a 961 pixel imaging array based upon Microwave Kinetic Inductance Detectors (MKIDs) with a readout system capable of reading out all pixels simultaneously with only one readout cable pair and a single cryogenic amplifier. We evaluate, in a representative environment, the system performance in terms of sensitivity, dynamic range, optical efficiency, cosmic ray rejection, pixel-pixel crosstalk and overall yield at an observation centre frequency of 850 GHz and 20% fractional bandwidth. Results: The overall system has an excellent sensitivity, with an average detector sensitivity ⟨NEP_det⟩ = 3×10^-19 W/√Hz measured using a thermal calibration source. At a loading power per pixel of 50 fW we demonstrate white, photon noise limited detector noise down to 300 mHz. The dynamic range would allow the detection of 1 Jy bright sources within the field of view without tuning the readout of the detectors. The expected dead time due to cosmic ray interactions, when operated in an L2 or a similar far-Earth orbit, is found to be <4%. Additionally, the achieved pixel yield is 83% and the crosstalk between the pixels is <-30 dB. Conclusions: This demonstrates that MKID technology can provide multiplexing ratios on the order of 1000 with state-of-the-art single pixel performance, and that the technology is now mature enough to be considered for future space based observatories and experiments.

  5. Large scale dynamics of protoplanetary discs

    NASA Astrophysics Data System (ADS)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the magnetic field and the gas is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of stratified disk embedded in a warm atmosphere. This model is the first to compute non-ideal effects from

  6. Large scale simulations of Brownian suspensions

    NASA Astrophysics Data System (ADS)

    Viera, Marc Nathaniel

    Particle suspensions occur in a wide variety of natural and engineering materials. Some examples are colloids, polymers, paints, and slurries. These materials exhibit complex behavior owing to the forces which act among the particles and are transmitted through the fluid medium. Depending on the application, particle sizes range from large macroscopic particles of 100 μm to smaller colloidal particles in the range of 10 nm to 1 μm. Particles of this size interact through interparticle forces such as electrostatic and van der Waals forces, as well as hydrodynamic forces transmitted through the fluid medium. Additionally, the particles are subjected to random thermal fluctuations in the fluid giving rise to Brownian motion. The central objective of our research is to develop efficient numerical algorithms for the large scale dynamic simulation of particle suspensions. While previous methods have incurred a computational cost of O(N^3), where N is the number of particles, we have developed a novel algorithm capable of solving this problem in O(N ln N) operations. This has allowed us to perform dynamic simulations with up to 64,000 particles and Monte Carlo realizations of up to 1 million particles. Our algorithm follows a Stokesian dynamics formulation by evaluating many-body hydrodynamic interactions using a far-field multipole expansion combined with a near-field lubrication correction. The breakthrough O(N ln N) scaling is obtained by employing a Particle-Mesh-Ewald (PME) approach whereby near-field interactions are evaluated directly and far-field interactions are evaluated using a grid based velocity computed with FFTs. This approach is readily extended to include the effects of Brownian motion. For interacting particles, the fluctuation-dissipation theorem requires that the individual Brownian forces satisfy a correlation based on the N body resistance tensor R. The accurate modeling of these forces requires the computation of a matrix square root R^(1/2) for matrices up

  7. Large Scale Coordination of Small Scale Structures

    NASA Astrophysics Data System (ADS)

    Kobelski, Adam; Tarr, Lucas A.; Jaeggli, Sarah A.; Savage, Sabrina

    2017-08-01

    Transient brightenings are ubiquitous features of the solar atmosphere across many length and energy scales, the most energetic of which manifest as large-class solar flares. Often, transient brightenings originate in regions of strong magnetic activity and create strong observable enhancements across wavelengths from X-ray to radio, with notable dynamics on timescales of seconds to hours. The coronal aspects of these brightenings have often been studied by way of EUV and X-ray imaging and spectra. These events are likely driven by photospheric activity (such as flux emergence) with the coronal brightenings originating largely from chromospheric ablation (evaporation). Until recently, chromospheric and transition region observations of these events have been limited. However, new observational capabilities have become available which significantly enhance our ability to understand the bi-directional flow of energy through the chromosphere between the photosphere and the corona. We have recently obtained a unique data set with which to study this flow of energy through the chromosphere via the Interface Region Imaging Spectrograph (IRIS), Hinode EUV Imaging Spectrometer (EIS), Hinode X-Ray Telescope (XRT), Hinode Solar Optical Telescope (SOT), Solar Dynamics Observatory (SDO) Atmospheric Imaging Assembly (AIA), SDO Helioseismic and Magnetic Imager (HMI), Nuclear Spectroscopic Telescope Array (NuStar), Atacama Large Millimeter Array (ALMA), and Interferometric BIdimensional Spectropolarimeter (IBIS) at the Dunn Solar Telescope (DST). This data set targets a small active area near disk center which was tracked simultaneously for approximately four hours. Within this region, many transient brightenings were detected through multiple layers of the solar atmosphere. In this study, we combine the imaging data and use the spectra from EIS and IRIS to track flows from the photosphere (HMI, SOT) through the chromosphere and transition region (AIA, IBIS, IRIS, ALMA) into the corona

  8. Galaxies and large scale structure at high redshifts

    PubMed Central

    Steidel, Charles C.

    1998-01-01

    It is now straightforward to assemble large samples of very high redshift (z ∼ 3) field galaxies selected by their pronounced spectral discontinuity at the rest frame Lyman limit of hydrogen (at 912 Å). This makes possible both statistical analyses of the properties of the galaxies and the first direct glimpse of the progression of the growth of their large-scale distribution at such an early epoch. Here I present a summary of the progress made in these areas to date and some preliminary results of and future plans for a targeted redshift survey at z = 2.7–3.4. Also discussed is how the same discovery method may be used to obtain a “census” of star formation in the high redshift Universe, and the current implications for the history of galaxy formation as a function of cosmic epoch. PMID:9419319

  9. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    SciTech Connect

    Alvarez, Marcello; Baldauf, T.; Bond, J. Richard; Dalal, N.; Putter, R. D.; Dore, O.; Green, Daniel; Hirata, Chris; Huang, Zhiqi; Huterer, Dragan; Jeong, Donghui; Johnson, Matthew C.; Krause, Elisabeth; Loverde, Marilena; Meyers, Joel; Meeburg, Daniel; Senatore, Leonardo; Shandera, Sarah; Silverstein, Eva; Slosar, Anze; Smith, Kendrick; Zaldarriaga, Matias; Assassi, Valentin; Braden, Jonathan; Hajian, Amir; Kobayashi, Takeshi; Stein, George; Engelen, Alexander van

    2014-12-15

    The statistics of primordial curvature fluctuations are our window into the period of inflation, where these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure, however, is where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude f_NL^loc (f_NL^eq), natural target levels of sensitivity are Δf_NL^loc, Δf_NL^eq ≃ 1. We highlight that such levels are within reach of future surveys by measuring 2-, 3- and 4-point statistics of the galaxy spatial distribution. This paper summarizes a workshop held at CITA (University of Toronto) on October 23-24, 2014.
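
    For reference, the "local" template mentioned above is conventionally defined as a quadratic correction to a Gaussian potential φ; the abstract does not spell this out, so the standard convention is assumed here:

      \Phi(\mathbf{x}) = \phi(\mathbf{x}) + f_{\mathrm{NL}}^{\mathrm{loc}}\left[\phi^{2}(\mathbf{x}) - \langle \phi^{2} \rangle\right].

    Since φ itself is of order 10^-5, a target of Δf_NL^loc ≃ 1 amounts to detecting a non-linear correction of roughly one part in 10^5 of the linear perturbation.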

  10. Hydrokinetic approach to large-scale cardiovascular blood flow

    NASA Astrophysics Data System (ADS)

    Melchionna, Simone; Bernaschi, Massimo; Succi, Sauro; Kaxiras, Efthimios; Rybicki, Frank J.; Mitsouras, Dimitris; Coskun, Ahmet U.; Feldman, Charles L.

    2010-03-01

    We present a computational method for clinical cardiovascular diagnosis on commodity hardware, based on accurate simulation of cardiovascular blood flow. Our approach leverages the flexibility of the Lattice Boltzmann method for implementation on high-performance commodity hardware, such as Graphical Processing Units. We developed the procedure for the analysis of real-life cardiovascular blood flow case studies, namely, anatomic data acquisition, geometry and mesh generation, flow simulation, and data analysis and visualization. We demonstrate the usefulness of our computational tool through a set of large-scale simulations of the flow patterns associated with the arterial tree of a patient, involving two hundred million computational cells. The simulations show evidence of a very rich and heterogeneous endothelial shear stress (ESS) pattern, a quantity of recognized key relevance to the localization and progression of major cardiovascular diseases, such as atherosclerosis, and set the stage for future studies involving pulsatile flows.
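
    The lattice Boltzmann approach referred to above evolves particle distribution functions f_i on a regular grid. The single-relaxation-time (BGK) form of the update, quoted here as the textbook version rather than the exact scheme of this work, reads

      f_i(\mathbf{x} + \mathbf{c}_i\,\Delta t,\; t + \Delta t) = f_i(\mathbf{x}, t) - \frac{1}{\tau}\left[f_i(\mathbf{x}, t) - f_i^{\mathrm{eq}}(\rho, \mathbf{u})\right],
      \qquad \rho = \sum_i f_i, \qquad \rho\,\mathbf{u} = \sum_i \mathbf{c}_i\, f_i,

    where the c_i are the discrete lattice velocities and τ sets the fluid viscosity. Because each cell needs only its immediate neighbours, the method maps naturally onto GPUs, which is what the abstract exploits.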

  11. A large-scale computer facility for computational aerodynamics

    SciTech Connect

    Bailey, F.R.; Balhaus, W.F.

    1985-02-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans.

  12. Towards large-scale plasma-assisted synthesis of nanowires

    NASA Astrophysics Data System (ADS)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents a three-dimensional problem of large-scale synthesis connected with the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  13. Successful Physician Training Program for Large Scale EMR Implementation

    PubMed Central

    Stevens, L.A.; Mailes, E.S.; Goad, B.A.; Longhurst, C.A.

    2015-01-01

    Summary End-user training is an essential element of electronic medical record (EMR) implementation and frequently suffers from minimal institutional investment. In addition, discussion of successful EMR training programs for physicians is limited in the literature. The authors describe a successful physician-training program at Stanford Children’s Health as part of a large scale EMR implementation. Evaluations of classroom training, obtained at the conclusion of each class, revealed high physician satisfaction with the program. Free-text comments from learners focused on duration and timing of training, the learning environment, quality of the instructors, and specificity of training to their role or department. Based upon participant feedback and institutional experience, best practice recommendations, including physician engagement, curricular design, and assessment of proficiency and recognition, are suggested for future provider EMR training programs. The authors strongly recommend the creation of coursework to group providers by common workflow. PMID:25848415

  14. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    NASA Astrophysics Data System (ADS)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  15. Planning under uncertainty solving large-scale stochastic linear programs

    SciTech Connect

    Infanger, G.

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed to be intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results of large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer, and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
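
    The two-stage stochastic linear program with recourse referred to above has the standard form (the notation below follows the usual textbook convention and is not taken from this report):

      \min_{x \ge 0} \; c^{\top} x + \mathbb{E}_{\omega}\!\left[Q(x, \omega)\right]
      \quad \text{s.t.} \quad A x = b,
      \qquad
      Q(x, \omega) = \min_{y \ge 0} \; q(\omega)^{\top} y
      \quad \text{s.t.} \quad W y = h(\omega) - T(\omega)\, x .

    Decomposition of this kind (e.g., Benders' L-shaped method) cuts the problem along the first-stage/second-stage boundary, and Monte Carlo importance sampling replaces the expectation over ω by a weighted sample average, which is what makes problems with very many stochastic parameters tractable.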

  16. Climatological context for large-scale coral bleaching

    NASA Astrophysics Data System (ADS)

    Barton, A. D.; Casey, K. S.

    2005-12-01

    Large-scale coral bleaching was first observed in 1979 and has occurred throughout virtually all of the tropics since that time. Severe bleaching may result in the loss of live coral and in a decline of the integrity of the impacted coral reef ecosystem. Despite the extensive scientific research and increased public awareness of coral bleaching, uncertainties remain about the past and future of large-scale coral bleaching. In order to reduce these uncertainties and place large-scale coral bleaching in the longer-term climatological context, specific criteria and methods for using historical sea surface temperature (SST) data to examine coral bleaching-related thermal conditions are proposed by analyzing three, 132 year SST reconstructions: ERSST, HadISST1, and GISST2.3b. These methodologies are applied to case studies at Discovery Bay, Jamaica (77.27°W, 18.45°N), Sombrero Reef, Florida, USA (81.11°W, 24.63°N), Academy Bay, Galápagos, Ecuador (90.31°W, 0.74°S), Pearl and Hermes Reef, Northwest Hawaiian Islands, USA (175.83°W, 27.83°N), Midway Island, Northwest Hawaiian Islands, USA (177.37°W, 28.25°N), Davies Reef, Australia (147.68°E, 18.83°S), and North Male Atoll, Maldives (73.35°E, 4.70°N). The results of this study show that (1) The historical SST data provide a useful long-term record of thermal conditions in reef ecosystems, giving important insight into the thermal history of coral reefs and (2) While coral bleaching and anomalously warm SSTs have occurred over much of the world in recent decades, case studies in the Caribbean, Northwest Hawaiian Islands, and parts of other regions such as the Great Barrier Reef exhibited SST conditions and cumulative thermal stress prior to 1979 that were comparable to those conditions observed during the strong, frequent coral bleaching events since 1979. This climatological context and knowledge of past environmental conditions in reef ecosystems may foster a better understanding of how coral reefs will
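
    One widely used way to quantify the "cumulative thermal stress" mentioned above is the degree-heating-week metric, which accumulates weekly SST excesses above the local climatological maximum over a trailing 12-week window. The sketch below implements that common definition in Python; the paper derives its own criteria from the three SST reconstructions, which may differ from this convention, and the data here are synthetic.

      import numpy as np

      def degree_heating_weeks(sst_weekly, mmm):
          """Trailing 12-week sum of weekly SST excesses of at least 1 deg C above
          the maximum monthly mean climatology `mmm` (all values in deg C)."""
          hotspot = np.clip(sst_weekly - mmm, 0.0, None)
          hotspot[hotspot < 1.0] = 0.0                      # only count excesses >= 1 deg C
          return np.convolve(hotspot, np.ones(12))[: hotspot.size]

      # toy usage with synthetic data
      sst = 27.0 + np.random.normal(0.0, 0.7, size=520)     # ten years of weekly SST
      dhw = degree_heating_weeks(sst, mmm=27.5)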

  17. Large-scale Fractal Motion of Clouds

    NASA Image and Video Library

    2017-09-27

    waters surrounding the island.) The “swallowed” gulps of clear island air get carried along within the vortices, but these are soon mixed into the surrounding clouds. Landsat is unique in its ability both to image the small-scale eddies that mix clear and cloudy air, down to its 30-meter pixel size, and to provide a wide enough field of view, 180 km, to reveal the connection of the turbulence to large-scale flows such as the subtropical oceanic gyres. Landsat 7, with its new onboard digital recorder, has extended this capability away from the few Landsat ground stations to remote areas such as Alejandro Island, and thus is gradually providing a global dynamic picture of evolving human-scale phenomena. For more details on von Karman vortices, refer to climate.gsfc.nasa.gov/~cahalan. Image and caption courtesy Bob Cahalan, NASA GSFC. Instrument: Landsat 7 - ETM+. Credit: NASA/GSFC/Landsat.

  18. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention of large-area and low-cost reflective color displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the
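
    The structural color and the index-matching transparency described above both follow from diffraction off the ordered air cavities. A commonly quoted form of the modified Bragg-Snell relation for such films is given below purely as an illustration of the mechanism, with the effective index approximated by a volume-weighted average (the study itself may use a different convention):

      \lambda_{\max} = 2\, d_{111}\, \sqrt{n_{\mathrm{eff}}^{2} - \sin^{2}\theta},
      \qquad n_{\mathrm{eff}}^{2} \approx f\, n_{\mathrm{polymer}}^{2} + (1 - f)\, n_{\mathrm{cavity}}^{2},

    where d_111 is the (111) interplanar spacing, θ the angle of incidence, and f the polymer volume fraction. Tuning the cavity size shifts d_111 and hence the reflected color, while filling the cavities with an index-matched solvent removes the refractive-index contrast so the diffraction, and with it the color, disappears.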

  19. The influence of large-scale motion on turbulent transport for confined coaxial jets

    NASA Technical Reports Server (NTRS)

    Brondum, D. C.; Bennett, J. C.

    1984-01-01

    The existence of large-scale coherent structures in turbulent shear flows has been well documented in the literature. The importance of these structures in flow entrainment, momentum transport, and mass transport in the shear layer has been suggested by several researchers. Comparisons between existing models and experimental data for shear flow in confined coaxial jets reinforce the necessity of further investigation of the large-scale structures. These comparisons show the greatest discrepancy between prediction and actual results in the developing flow region where the large scales exist. It was also observed that the agreement in momentum transport rate was particularly poor. Finally, Schetz has reviewed mixing flows and concluded that large-scale structures were essential aspects of future modeling efforts.

  20. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations become quickly unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, for example, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being
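
    Several of the operations listed above (catalog cross-matching, AllNN, friends-of-friends) reduce to nearest-neighbour queries over spatial trees. The snippet below shows the single-tree version of an all-nearest-neighbours query with SciPy's kd-tree as a point of reference; the dual- and multi-tree algorithms discussed in the chapter generalize this idea and scale far better, and the data here are synthetic.

      import numpy as np
      from scipy.spatial import cKDTree

      rng = np.random.default_rng(0)
      points = rng.random((100_000, 3))        # toy catalogue of 3-D positions
      tree = cKDTree(points)
      dists, idx = tree.query(points, k=2)     # k=2: each point itself plus its nearest other point
      nearest = idx[:, 1]                      # index of each point's nearest neighbour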

  1. Very Large-Scale Multiuser Detection (VLSMUD)

    DTIC Science & Technology

    2006-09-01

    the decoder can operate either simultaneously [25] or successively [7] [13] [23]. An example of the former strategy applied to ML sequence detection... algorithm and the corresponding estimates are used for the detection of future symbols. In the latter strategy, the decisions are fed back only when...

  2. Modelling large-scale halo bias using the bispectrum

    NASA Astrophysics Data System (ADS)

    Pollack, Jennifer E.; Smith, Robert E.; Porciani, Cristiano

    2012-03-01

    We study the relation between the density distribution of tracers for large-scale structure and the underlying matter distribution - commonly termed bias - in the Λ cold dark matter framework. In particular, we examine the validity of the local model of biasing at quadratic order in the matter density. This model is characterized by parameters b1 and b2. Using an ensemble of N-body simulations, we apply several statistical methods to estimate the parameters. We measure halo and matter fluctuations smoothed on various scales. We find that, whilst the fits are reasonably good, the parameters vary with smoothing scale. We argue that, for real-space measurements, owing to the mixing of wavemodes, no smoothing scale can be found for which the parameters are independent of smoothing. However, this is not the case in Fourier space. We measure halo and halo-mass power spectra and from these construct estimates of the effective large-scale bias as a guide for b1. We measure the configuration dependence of the halo bispectra Bhhh and reduced bispectra Qhhh for very large-scale k-space triangles. From these data, we constrain b1 and b2, taking into account the full bispectrum covariance matrix. Using the lowest order perturbation theory, we find that for Bhhh the best-fitting parameters are in reasonable agreement with one another as the triangle scale is varied, although the fits become poor as smaller scales are included. The same is true for Qhhh. The best-fitting values were found to depend on the discreteness correction. This led us to consider halo-mass cross-bispectra. The results from these statistics supported our earlier findings. We then developed a test to explore whether the inconsistency in the recovered bias parameters could be attributed to missing higher order corrections in the models. We prove that low-order expansions are not sufficiently accurate to model the data, even on scales k1 ∼ 0.04 h Mpc⁻¹. If robust inferences concerning bias are to be drawn
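
    The "local model of biasing at quadratic order" characterized by b1 and b2 is conventionally written as follows; this is the standard convention, assumed here rather than quoted from the paper:

      \delta_h(\mathbf{x}) = b_1\, \delta(\mathbf{x}) + \frac{b_2}{2}\left[\delta^{2}(\mathbf{x}) - \langle \delta^{2} \rangle\right],

    where δ is the smoothed matter overdensity and δ_h the halo overdensity. At leading order the halo power spectrum then scales as b_1² times the matter power spectrum, which is why the effective large-scale bias measured from the power spectra serves as a guide for b1.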

  3. Svetloe Radio Astronomical Observatory

    NASA Technical Reports Server (NTRS)

    Smolentsev, Sergey; Rahimov, Ismail

    2013-01-01

    This report summarizes information about the Svetloe Radio Astronomical Observatory activities in 2012. Last year, a number of changes took place in the observatory to improve some technical characteristics and to upgrade some units to their required status. The report provides an overview of current geodetic VLBI activities and gives an outlook for the future.

  4. Zelenchukskaya Radio Astronomical Observatory

    NASA Technical Reports Server (NTRS)

    Smolentsev, Sergey; Dyakov, Andrei

    2013-01-01

    This report summarizes information about Zelenchukskaya Radio Astronomical Observatory activities in 2012. Last year a number of changes took place in the observatory to improve some technical characteristics and to upgrade some units to the required status. The report provides an overview of current geodetic VLBI activities and gives an outlook for the future.

  5. Simulating the large-scale structure of HI intensity maps

    SciTech Connect

    Seehars, Sebastian; Paranjape, Aseem; Witzemann, Amadeus; Refregier, Alexandre; Amara, Adam; Akeret, Joel

    2016-03-01

    Intensity mapping of neutral hydrogen (HI) is a promising observational probe of cosmology and large-scale structure. We present wide field simulations of HI intensity maps based on N-body simulations of a 2.6 Gpc/h box with 2048^3 particles (particle mass 1.6 × 10^11 M_⊙/h). Using a conditional mass function to populate the simulated dark matter density field with halos below the mass resolution of the simulation (10^8 M_⊙/h < M_halo < 10^13 M_⊙/h), we assign HI to those halos according to a phenomenological halo to HI mass relation. The simulations span a redshift range of 0.35 ≲ z ≲ 0.9 in redshift bins of width Δz ≈ 0.05 and cover a quarter of the sky at an angular resolution of about 7'. We use the simulated intensity maps to study the impact of non-linear effects and redshift space distortions on the angular clustering of HI. Focusing on the autocorrelations of the maps, we apply and compare several estimators for the angular power spectrum and its covariance. We verify that these estimators agree with analytic predictions on large scales and study the validity of approximations based on Gaussian random fields, particularly in the context of the covariance. We discuss how our results and the simulated maps can be useful for planning and interpreting future HI intensity mapping surveys.
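
    As a point of reference for the angular power spectrum estimation discussed above, the snippet below computes the auto-spectrum of a toy full-sky HEALPix map with healpy; the estimators compared in the paper additionally handle partial sky coverage, band-averaging, and covariance estimation, none of which is shown here, and the map is random rather than an actual HI simulation.

      import numpy as np
      import healpy as hp

      nside = 256
      toy_map = np.random.normal(size=hp.nside2npix(nside))   # stand-in for an HI brightness map
      cl = hp.anafast(toy_map, lmax=3 * nside - 1)             # angular auto-power spectrum C_l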

  6. Large Scale CW ECRH Systems: Some considerations

    NASA Astrophysics Data System (ADS)

    Erckmann, V.; Kasparek, W.; Plaum, B.; Lechte, C.; Petelin, M. I.; Braune, H.; Gantenbein, G.; Laqua, H. P.; Lubiako, L.; Marushchenko, N. B.; Michel, G.; Turkin, Y.; Weissgerber, M.

    2012-09-01

    Electron Cyclotron Resonance Heating (ECRH) is a key component in the heating arsenal for next-step fusion devices like W7-X and ITER. These devices are equipped with superconducting coils and are designed to operate steady state. ECRH must thus operate in CW mode with a large flexibility to comply with various physics demands such as plasma start-up, heating and current drive, as well as configuration and MHD control. The request for many different sophisticated applications results in a growing complexity, which is in conflict with the request for high availability, reliability, and maintainability. `Advanced' ECRH systems must, therefore, comply with both the complex physics demands and operational robustness and reliability. The W7-X ECRH system is the first CW facility of an ITER-relevant size and is used as a test bed for advanced components. Proposals for future developments are presented together with improvements of gyrotrons, transmission components, and launchers.

  7. Large-Scale Pattern Discovery in Music

    NASA Astrophysics Data System (ADS)

    Bertin-Mahieux, Thierry

    This work focuses on extracting patterns in musical data from very large collections. The problem is split in two parts. First, we build such a large collection, the Million Song Dataset, to provide researchers access to commercial-size datasets. Second, we use this collection to study cover song recognition which involves finding harmonic patterns from audio features. Regarding the Million Song Dataset, we detail how we built the original collection from an online API, and how we encouraged other organizations to participate in the project. The result is the largest research dataset with heterogeneous sources of data available to music technology researchers. We demonstrate some of its potential and discuss the impact it already has on the field. On cover song recognition, we must revisit the existing literature since there are no publicly available results on a dataset of more than a few thousand entries. We present two solutions to tackle the problem, one using a hashing method, and one using a higher-level feature computed from the chromagram (dubbed the 2DFTM). We further investigate the 2DFTM since it has potential to be a relevant representation for any task involving audio harmonic content. Finally, we discuss the future of the dataset and the hope of seeing more work making use of the different sources of data that are linked in the Million Song Dataset. Regarding cover songs, we explain how this might be a first step towards defining a harmonic manifold of music, a space where harmonic similarities between songs would be more apparent.
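
    The higher-level feature mentioned above, the 2DFTM, is built from the two-dimensional Fourier transform magnitude of beat-synchronous chroma patches, which makes it insensitive to key transposition (a circular shift along the pitch axis) and to time offsets. The short Python illustration below demonstrates that shift-invariance property; the patch size and data are arbitrary placeholders, not the thesis' exact settings.

      import numpy as np

      def two_dftm(chroma_patch):
          """Magnitude of the 2-D FFT of a (12 x n_beats) chroma patch;
          discarding the phase removes circular shifts in pitch and time."""
          return np.abs(np.fft.fft2(chroma_patch))

      patch = np.random.rand(12, 75)
      transposed = np.roll(patch, 3, axis=0)       # same excerpt, shifted up three semitones
      assert np.allclose(two_dftm(patch), two_dftm(transposed))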

  8. Optimal Wind Energy Integration in Large-Scale Electric Grids

    NASA Astrophysics Data System (ADS)

    Albaijat, Mohammad H.

    profit for investors for renting their transmission capacity, and cheaper electricity for end users. We propose a hybrid method based on a heuristic and deterministic method to attain new transmission line additions and increase transmission capacity. Renewable energy resources (RES) have zero operating cost, which makes them very attractive for generation companies and market participants. In addition, RES have zero carbon emission, which helps relieve concerns about the environmental impact of carbon emissions from electric generation resources. RES include wind, solar, hydro, biomass, and geothermal. By 2030, the expectation is that more than 30% of electricity in the U.S. will come from RES. One major contributor to RES generation will be wind energy resources (WES). Furthermore, WES will be an important component of the future generation portfolio. However, the nature of WES is that it experiences high intermittency and volatility. Because of the great expectation of high WES penetration and the nature of such resources, researchers focus on studying the effects of such resources on electric grid operation and its adequacy from different aspects. Additionally, current market operations of electric grids add another complication to consider while integrating RES (specifically WES). Mandates by market rules and long-term analysis of renewable penetration in large-scale electric grids have also been a focus of researchers in recent years. We advocate a method for studying high wind-resource penetration in large-scale electric grid operations. A PMU (phasor measurement unit) is a global positioning system (GPS) based device, which provides immediate and precise measurements of voltage angle in a high-voltage transmission system. PMUs can update the status of a transmission line and related measurements (e.g., voltage magnitude and voltage phase angle) more frequently. Every second, a PMU can provide 30 samples of measurements compared to traditional systems (e.g., supervisory control and

  9. An Ice-Tethered Profiler: Initial results and role in a future Arctic network of Ice-Based Observatories

    NASA Astrophysics Data System (ADS)

    Toole, J. M.; Doherty, K. W.; Frye, D. E.; Hammar, T. R.; Kemp, J. N.; Krishfield, R. A.; Packard, G. E.; Peters, D. B.; Proshutinsky, A.; von der Heydt, K.

    2004-12-01

    Winter Bering Strait Waters and winter shelf waters emanating from Barrow and possibly Herald Canyons, and the temperature maximum around 350 m depth characterizing the Atlantic Water. Additionally, the 1 Hz CTD data resolve well the thermohaline staircase stratification above the Atlantic Layer thought to be caused by double diffusion and the "nested" intrusive structures that incise the Atlantic Water. In addition to results from this prototype instrument, a concept for future deployments of ITP's within a network of Ice-Based Observatories will also be presented.

  10. Large-scale solar magnetic fields and H-alpha patterns

    NASA Technical Reports Server (NTRS)

    Mcintosh, P. S.

    1972-01-01

    Coronal and interplanetary magnetic fields computed from measurements of large-scale photospheric magnetic fields suffer from interruptions in day-to-day observations and the limitation of using only measurements made near the solar central meridian. Procedures were devised for inferring the lines of polarity reversal from H-alpha solar patrol photographs that map the same large-scale features found on Mt. Wilson magnetograms. These features may be monitored without interruption by combining observations from the global network of observatories associated with NOAA's Space Environment Services Center. The patterns of inferred magnetic fields may be followed accurately as far as 60 deg from central meridian. Such patterns will be used to improve predictions of coronal features during the next solar eclipse.

  11. Using Web-Based Testing for Large-Scale Assessment.

    ERIC Educational Resources Information Center

    Hamilton, Laura S.; Klein, Stephen P.; Lorie, William

    This paper describes an approach to large-scale assessment that uses tests that are delivered to students over the Internet and that are tailored (adapted) to each student's own level of proficiency. A brief background on large-scale assessment is followed by a description of this new technology and an example. Issues that need to be investigated…

  12. Large Scale Turbulent Structures in Supersonic Jets

    NASA Technical Reports Server (NTRS)

    Rao, Ram Mohan; Lundgren, Thomas S.

    1997-01-01

    Jet noise is a major concern in the design of commercial aircraft. Studies by various researchers suggest that aerodynamic noise is a major contributor to jet noise. Some of these studies indicate that most of the aerodynamic jet noise due to turbulent mixing occurs when there is a rapid variation in turbulent structure, i.e. rapidly growing or decaying vortices. The objective of this research was to simulate a compressible round jet to study the non-linear evolution of vortices and the resulting acoustic radiations. In particular, to understand the effect of turbulence structure on the noise. An ideal technique to study this problem is Direct Numerical Simulation (DNS), because it provides precise control on the initial and boundary conditions that lead to the turbulent structures studied. It also provides complete 3-dimensional time dependent data. Since the dynamics of a temporally evolving jet are not greatly different from those of a spatially evolving jet, a temporal jet problem was solved, using periodicity in the direction of the jet axis. This enables the application of Fourier spectral methods in the streamwise direction. Physically this means that turbulent structures in the jet are repeated in successive downstream cells instead of being gradually modified downstream into a jet plume. The DNS jet simulation helps us understand the various turbulent scales and mechanisms of turbulence generation in the evolution of a compressible round jet. These accurate flow solutions will be used in future research to estimate near-field acoustic radiation by computing the total outward flux across a surface and determine how it is related to the evolution of the turbulent solutions. Furthermore, these simulations allow us to investigate the sensitivity of acoustic radiations to inlet/boundary conditions, with possible application to active noise suppression. In addition, the data generated can be used to compute various turbulence quantities such as mean velocities

  13. Large Scale Turbulent Structures in Supersonic Jets

    NASA Technical Reports Server (NTRS)

    Rao, Ram Mohan; Lundgren, Thomas S.

    1997-01-01

    Jet noise is a major concern in the design of commercial aircraft. Studies by various researchers suggest that aerodynamic noise is a major contributor to jet noise. Some of these studies indicate that most of the aerodynamic jet noise due to turbulent mixing occurs when there is a rapid variation in turbulent structure, i.e. rapidly growing or decaying vortices. The objective of this research was to simulate a compressible round jet to study the non-linear evolution of vortices and the resulting acoustic radiations. In particular, to understand the effect of turbulence structure on the noise. An ideal technique to study this problem is Direct Numerical Simulation (DNS), because it provides precise control on the initial and boundary conditions that lead to the turbulent structures studied. It also provides complete 3-dimensional time dependent data. Since the dynamics of a temporally evolving jet are not greatly different from those of a spatially evolving jet, a temporal jet problem was solved, using periodicity in the direction of the jet axis. This enables the application of Fourier spectral methods in the streamwise direction. Physically this means that turbulent structures in the jet are repeated in successive downstream cells instead of being gradually modified downstream into a jet plume. The DNS jet simulation helps us understand the various turbulent scales and mechanisms of turbulence generation in the evolution of a compressible round jet. These accurate flow solutions will be used in future research to estimate near-field acoustic radiation by computing the total outward flux across a surface and determine how it is related to the evolution of the turbulent solutions. Furthermore, these simulations allow us to investigate the sensitivity of acoustic radiations to inlet/boundary conditions, with possible application to active noise suppression. In addition, the data generated can be used to compute various turbulence quantities such as mean

  14. The Virtual Solar Observatory - Status and Plans

    NASA Astrophysics Data System (ADS)

    Hill, F.

    2001-05-01

    The Virtual Solar Observatory (VSO) is a software environment for searching, obtaining and analyzing data from archives of solar data that are distributed at many different observatories around the world. This "observatory" is virtual since it exists only on the Internet, not as a physical structure. As a research tool, the VSO would enable a new field of correlative statistical solar physics in which large-scale comparative studies spanning many dimensions and data sources could be carried out. Several groups with solar archives have indicated their willingness to participate as a VSO component. These include NSO (KPVT, GONG, and SOLIS); NASA/GSFC SDAC; SOHO; Stanford (SOI/MDI, TON, WSO); Lockheed (TRACE); MSU (Yohkoh); UCLA (Mt. Wilson 150-ft Tower); USC (Mt. Wilson 60-ft Tower); BBSO/NJIT; Arcetri (ARTHEMIS); Meudon; HAO; and CSUN/SFO. The VSO will be implemented so that additional systems can be easily incorporated. The VSO technical concept includes the federation of distributed solar archives, an adaptive metadata thesaurus, a single unified intuitive GUI, context-based searches, and distributed computing. The underlying structure would most likely be constructed using platform-independent tools such as XML and JavaScript. There are several technical challenges facing the VSO development. Issues of security, bandwidth, metadata, and load balancing must be faced. While the VSO is currently in the concept phase, a number of funding opportunities are being pursued. The status of these proposals and plans for the future will be updated at the meeting.

  15. Collaborative Large-scale Engineering Analysis Network for Environmental Research (CLEANER)Science Planning

    NASA Astrophysics Data System (ADS)

    Schnoor, J. L.; Minsker, B. S.; Haas, C. N.

    2005-12-01

    The Project Office of the Collaborative Large-scale Engineering Analysis Network for Environmental Research (CLEANER) was awarded a cooperative agreement from the National Science Foundation (NSF) and began operations on August 1, 2005. Since that time we have organized six standing committees and an executive committee with an advisory board. The first all-hands meeting of CLEANER took place at NSF and the National Center for Supercomputing Applications (NCSA) Access facility in Arlington, Virginia, in September. Among the initial tasks of CLEANER is to join with the Consortium of Universities for the Advancement of Hydrological Sciences Incorporated (CUAHSI) in developing a joint science plan for a national observatory for environmental research utilizing NSF Major Research Equipment and Facilities Construction (MREFC) funds slated for 2011. This presentation describes our initial thinking on the science plan and our vision for the national environmental observatory and cyberinfrastructure.

  16. Large-Scale Events: New Ways of Working Across the Organization.

    ERIC Educational Resources Information Center

    Brigham, Steven E.

    1996-01-01

    Eight approaches to organizational change and problem solving that use large-scale events and involve a broad range of stakeholders are described, and their applications to college administration are discussed. They include future searches; open space technology; interactive design method; home-grown events such as retreats; great teaching…

  17. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) sources of small-scale failures, and 3) reactivation. Only a few scientific publications have been published concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters, and logistic regression, an equation of large-scale landslide distribution is also derived. The equation is validated by applying it to another area: the area under the receiver operating characteristic (ROC) curve of the landslide distribution probability for the new area is 0.699, and the distribution probability could explain >65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
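
    The validation workflow described above (fit a logistic model of landslide presence on geomorphological predictors, then score a new area by the area under the ROC curve) can be sketched as follows; the predictor names and data here are hypothetical placeholders, not the parameters used in the study.

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(1)
      # hypothetical predictors per terrain cell: slope, local relief, distance to fault
      X_train = rng.random((5000, 3))
      y_train = rng.integers(0, 2, size=5000)          # 1 = cell contains a mapped large-scale landslide
      X_new, y_new = rng.random((2000, 3)), rng.integers(0, 2, size=2000)

      model = LogisticRegression().fit(X_train, y_train)
      p_new = model.predict_proba(X_new)[:, 1]         # landslide distribution probability in the new area
      auc = roc_auc_score(y_new, p_new)                # area under the ROC curve for the new area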

  18. Assessing large-scale wildlife responses to human infrastructure development

    PubMed Central

    Torres, Aurora; Jaeger, Jochen A. G.; Alonso, Juan Carlos

    2016-01-01

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future. PMID:27402749

  19. How large-scale subsidence affects stratocumulus transitions

    NASA Astrophysics Data System (ADS)

    van der Dussen, J. J.; de Roode, S. R.; Siebesma, A. P.

    2016-01-01

    Some climate modeling results suggest that the Hadley circulation might weaken in a future climate, causing a subsequent reduction in the large-scale subsidence velocity in the subtropics. In this study we analyze the cloud liquid water path (LWP) budget from large-eddy simulation (LES) results of three idealized stratocumulus transition cases, each with a different subsidence rate. As shown in previous studies, a reduced subsidence rate leads to a deeper stratocumulus-topped boundary layer, an enhanced cloud-top entrainment rate and a delay in the transition of stratocumulus clouds into shallow cumulus clouds during their equatorward advection by the prevailing trade winds. The effect of a reduction of the subsidence rate can be summarized as follows. The initial deepening of the stratocumulus layer is partly counteracted by an enhanced absorption of solar radiation. After some hours the deepening of the boundary layer is accelerated by an enhancement of the entrainment rate. Because this is accompanied by a change in the cloud-base turbulent fluxes of moisture and heat, the net change in the LWP due to changes in the turbulent flux profiles is negligibly small.
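
    The basic deepening effect of weaker subsidence can be seen in a toy mixed-layer sketch (not the LES liquid-water-path budget of the study); the entrainment rate and divergence values below are illustrative assumptions:

```python
# Minimal mixed-layer sketch of how weaker large-scale subsidence deepens the
# boundary layer: dh/dt = w_e - D*h, with entrainment rate w_e held constant.
# A toy illustration, not the LES liquid-water-path budget of the study.
def integrate_height(divergence, w_e=4e-3, h0=800.0, days=3, dt=600.0):
    """Integrate boundary-layer height h (m) with subsidence w_s = -divergence*h."""
    n = int(days * 86400 / dt)
    h = h0
    for _ in range(n):
        h += dt * (w_e - divergence * h)
    return h

for D in (6e-6, 4e-6, 2e-6):   # large-scale divergence (s^-1); smaller D = weaker subsidence
    print(f"D = {D:.0e} s^-1 -> h after 3 days = {integrate_height(D):.0f} m")
```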

  20. Planck intermediate results. XLII. Large-scale Galactic magnetic fields

    NASA Astrophysics Data System (ADS)

    Planck Collaboration; Adam, R.; Ade, P. A. R.; Alves, M. I. R.; Ashdown, M.; Aumont, J.; Baccigalupi, C.; Banday, A. J.; Barreiro, R. B.; Bartolo, N.; Battaner, E.; Benabed, K.; Benoit-Lévy, A.; Bernard, J.-P.; Bersanelli, M.; Bielewicz, P.; Bonavera, L.; Bond, J. R.; Borrill, J.; Bouchet, F. R.; Boulanger, F.; Bucher, M.; Burigana, C.; Butler, R. C.; Calabrese, E.; Cardoso, J.-F.; Catalano, A.; Chiang, H. C.; Christensen, P. R.; Colombo, L. P. L.; Combet, C.; Couchot, F.; Crill, B. P.; Curto, A.; Cuttaia, F.; Danese, L.; Davis, R. J.; de Bernardis, P.; de Rosa, A.; de Zotti, G.; Delabrouille, J.; Dickinson, C.; Diego, J. M.; Dolag, K.; Doré, O.; Ducout, A.; Dupac, X.; Elsner, F.; Enßlin, T. A.; Eriksen, H. K.; Ferrière, K.; Finelli, F.; Forni, O.; Frailis, M.; Fraisse, A. A.; Franceschi, E.; Galeotta, S.; Ganga, K.; Ghosh, T.; Giard, M.; Gjerløw, E.; González-Nuevo, J.; Górski, K. M.; Gregorio, A.; Gruppuso, A.; Gudmundsson, J. E.; Hansen, F. K.; Harrison, D. L.; Hernández-Monteagudo, C.; Herranz, D.; Hildebrandt, S. R.; Hobson, M.; Hornstrup, A.; Hurier, G.; Jaffe, A. H.; Jaffe, T. R.; Jones, W. C.; Juvela, M.; Keihänen, E.; Keskitalo, R.; Kisner, T. S.; Knoche, J.; Kunz, M.; Kurki-Suonio, H.; Lamarre, J.-M.; Lasenby, A.; Lattanzi, M.; Lawrence, C. R.; Leahy, J. P.; Leonardi, R.; Levrier, F.; Liguori, M.; Lilje, P. B.; Linden-Vørnle, M.; López-Caniego, M.; Lubin, P. M.; Macías-Pérez, J. F.; Maggio, G.; Maino, D.; Mandolesi, N.; Mangilli, A.; Maris, M.; Martin, P. G.; Martínez-González, E.; Masi, S.; Matarrese, S.; Melchiorri, A.; Mennella, A.; Migliaccio, M.; Miville-Deschênes, M.-A.; Moneti, A.; Montier, L.; Morgante, G.; Munshi, D.; Murphy, J. A.; Naselsky, P.; Nati, F.; Natoli, P.; Nørgaard-Nielsen, H. U.; Oppermann, N.; Orlando, E.; Pagano, L.; Pajot, F.; Paladini, R.; Paoletti, D.; Pasian, F.; Perotto, L.; Pettorino, V.; Piacentini, F.; Piat, M.; Pierpaoli, E.; Plaszczynski, S.; Pointecouteau, E.; Polenta, G.; Ponthieu, N.; Pratt, G. W.; Prunet, S.; Puget, J.-L.; Rachen, J. P.; Reinecke, M.; Remazeilles, M.; Renault, C.; Renzi, A.; Ristorcelli, I.; Rocha, G.; Rossetti, M.; Roudier, G.; Rubiño-Martín, J. A.; Rusholme, B.; Sandri, M.; Santos, D.; Savelainen, M.; Scott, D.; Spencer, L. D.; Stolyarov, V.; Stompor, R.; Strong, A. W.; Sudiwala, R.; Sunyaev, R.; Suur-Uski, A.-S.; Sygnet, J.-F.; Tauber, J. A.; Terenzi, L.; Toffolatti, L.; Tomasi, M.; Tristram, M.; Tucci, M.; Valenziano, L.; Valiviita, J.; Van Tent, F.; Vielva, P.; Villa, F.; Wade, L. A.; Wandelt, B. D.; Wehus, I. K.; Yvon, D.; Zacchei, A.; Zonca, A.

    2016-12-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured by the Planck satellite. We first update these models to match the Planck synchrotron products using a common model for the cosmic-ray leptons. We discuss the impact on this analysis of the ongoing problems of component separation in the Planck microwave bands and of the uncertain cosmic-ray spectrum. In particular, the inferred degree of ordering in the magnetic fields is sensitive to these systematic uncertainties, and we further show the importance of considering the expected variations in the observables in addition to their mean morphology. We then compare the resulting simulated emission to the observed dust polarization and find that the dust predictions do not match the morphology in the Planck data but underpredict the dust polarization away from the plane. We modify one of the models to roughly match both observables at high latitudes by increasing the field ordering in the thin disc near the observer. Though this specific analysis is dependent on the component separation issues, we present the improved model as a proof of concept for how these studies can be advanced in future using complementary information from ongoing and planned observational projects.

  1. Scalable pattern recognition for large-scale scientific data mining

    SciTech Connect

    Kamath, C.; Musick, R.

    1998-03-23

    Our ability to generate data far outstrips our ability to explore and understand it. The true value of this data lies not in its final size or complexity, but rather in our ability to exploit the data to achieve scientific goals. The data generated by programs such as ASCI have such a large scale that it is impractical to manually analyze, explore, and understand it. As a result, useful information is overlooked, and the potential benefits of increased computational and data gathering capabilities are only partially realized. The difficulties that will be faced by ASCI applications in the near future are foreshadowed by the challenges currently facing astrophysicists in making full use of the data they have collected over the years. For example, among other difficulties, astrophysicists have expressed concern that the sheer size of their data restricts them to looking at very small, narrow portions at any one time. This narrow focus has resulted in the loss of "serendipitous" discoveries which have been so vital to progress in the area in the past. To solve this problem, a new generation of computational tools and techniques is needed to help automate the exploration and management of large scientific data. This whitepaper proposes applying and extending ideas from the area of data mining, in particular pattern recognition, to improve the way in which scientists interact with large, multi-dimensional, time-varying data.

  2. Episodic memory in aspects of large-scale brain networks

    PubMed Central

    Jeong, Woorim; Chung, Chun Kee; Kim, June Sic

    2015-01-01

    Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as mostly relying on medial temporal lobe (MTL) structures. However, recent studies have suggested involvement of more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuro-modulations of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved by not only the MTL system but also a distributed network, particularly the default-mode network (DMN). Furthermore, researchers have begun to investigate memory networks throughout the entire brain not restricted to the specific resting-state network (RSN). Altered patterns of functional connectivity (FC) among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment. PMID:26321939

  3. Generating intrinsic dipole anisotropy in the large scale structures

    NASA Astrophysics Data System (ADS)

    Ghosh, Shamik

    2014-03-01

    There have been recent reports of unexpectedly large velocity dipole in the NRAO VLA Sky Survey (NVSS) data. We investigate whether the excess in the NVSS dipole reported can be of cosmological origin. We assume a long wavelength inhomogeneous scalar perturbation of the form α sin(κz) and study its effects on the matter density contrasts. Assuming an ideal fluid model, we calculate, in the linear regime, the contribution of the inhomogeneous mode to the density contrast. We calculate the expected dipole in the large scale structure (LSS) for two cases, first assuming that the mode is still superhorizon everywhere, and second assuming the mode is subhorizon but has crossed the horizon deep in matter domination and is subhorizon everywhere in the region of the survey (NVSS). In both cases, we find that such an inhomogeneous scalar perturbation is sufficient to generate the reported values of dipole anisotropy in LSS. For the superhorizon modes, we find values which are consistent with both cosmic microwave background and NVSS results. We also predict signatures for the model which can be tested by future observations.

  4. Simulation and experiment for large scale space structure

    NASA Astrophysics Data System (ADS)

    Sun, Hongbo; Zhou, Jian; Zha, Zuoliang

    2013-04-01

    Future space structures are relatively large, flimsy, and lightweight. As a result, they are more easily affected or distorted by the space environment than other space structures. This study examines the structural integrity of a large-scale space structure. A new transient temperature field analysis method for a deployable reflector in the on-orbit environment is presented, which simulates the physical characteristics of a deployable antenna reflector with high precision. The analyses show that different materials exhibit different thermoelastic characteristics. The three-dimensional multi-physics coupled transient thermal distortion equations for the antenna are derived based on the Galerkin method. For a reflector in geosynchronous orbit, the transient temperature field results from this method are compared with those from NASA; the comparison shows that the precision of the method is high. An experimental system is established to verify the control mechanism with IEBIS and thermal sensor techniques. Shape control experiments are performed by measuring and analyzing a deployable tube. Results reveal that the temperature of the deployable antenna reflector varies greatly over the orbital period, by about ±120°, when solar flux, Earth-radiated flux and albedo-scattered flux are considered.

  5. Analyzing large-scale proteomics projects with latent semantic indexing.

    PubMed

    Klie, Sebastian; Martens, Lennart; Vizcaíno, Juan Antonio; Côté, Richard; Jones, Phil; Apweiler, Rolf; Hinneburg, Alexander; Hermjakob, Henning

    2008-01-01

    Since the advent of public data repositories for proteomics data, readily accessible results from high-throughput experiments have been accumulating steadily. Several large-scale projects in particular have contributed substantially to the amount of identifications available to the community. Despite the considerable body of information amassed, very few successful analyses have been performed and published on this data, leveling off the ultimate value of these projects far below their potential. A prominent reason published proteomics data is seldom reanalyzed lies in the heterogeneous nature of the original sample collection and the subsequent data recording and processing. To illustrate that at least part of this heterogeneity can be compensated for, we here apply a latent semantic analysis to the data contributed by the Human Proteome Organization's Plasma Proteome Project (HUPO PPP). Interestingly, despite the broad spectrum of instruments and methodologies applied in the HUPO PPP, our analysis reveals several obvious patterns that can be used to formulate concrete recommendations for optimizing proteomics project planning as well as the choice of technologies used in future experiments. It is clear from these results that the analysis of large bodies of publicly available proteomics data by noise-tolerant algorithms such as the latent semantic analysis holds great promise and is currently underexploited.
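
    A minimal sketch of the latent-semantic-analysis step, assuming scikit-learn's truncated SVD and random stand-in counts rather than the actual HUPO PPP identification matrix, looks like this:

```python
# Sketch of latent semantic analysis on a protein-identification matrix
# (experiments x proteins), assuming scikit-learn; the matrix here is random
# stand-in data, not the HUPO PPP identifications.
import numpy as np
from sklearn.decomposition import TruncatedSVD

rng = np.random.default_rng(1)

# Rows = contributing laboratories/experiments, columns = identified proteins,
# entries = identification counts (a "document-term" style matrix).
counts = rng.poisson(0.3, size=(40, 3000)).astype(float)

svd = TruncatedSVD(n_components=10, random_state=1)
experiment_factors = svd.fit_transform(counts)   # each experiment in latent space

# Experiments that cluster together in latent space share identification
# patterns, e.g. because they used similar instruments or protocols.
print(experiment_factors.shape, svd.explained_variance_ratio_[:3])
```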

  6. Assessing large-scale wildlife responses to human infrastructure development.

    PubMed

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future.

  7. Organised convection embedded in a large-scale flow

    NASA Astrophysics Data System (ADS)

    Naumann, Ann Kristin; Stevens, Bjorn; Hohenegger, Cathy

    2017-04-01

    In idealised simulations of radiative convective equilibrium, convection aggregates spontaneously from randomly distributed convective cells into organized mesoscale convection despite homogeneous boundary conditions. Although these simulations apply very idealised setups, the process of self-aggregation is thought to be relevant for the development of tropical convective systems. One feature that idealised simulations usually neglect is the occurrence of a large-scale background flow. In the tropics, organised convection is embedded in a large-scale circulation system, which advects convection in the along-wind direction and alters near-surface convergence in the convective areas. A large-scale flow also modifies the surface fluxes, which are expected to be enhanced upwind of the convective area if a large-scale flow is applied. Convective clusters that are embedded in a large-scale flow therefore experience an asymmetric component of the surface fluxes, which influences the development and the pathway of a convective cluster. In this study, we use numerical simulations with explicit convection and add a large-scale flow to the established setup of radiative convective equilibrium. We then analyse how aggregated convection evolves when being exposed to wind forcing. The simulations suggest that convective line structures are more prevalent if a large-scale flow is present and that convective clusters move considerably slower than advection by the large-scale flow would suggest. We also study the asymmetric component of convective aggregation due to enhanced surface fluxes, and discuss the pathway and speed of convective clusters as a function of the large-scale wind speed.

  8. Food security through large scale investments in agriculture

    NASA Astrophysics Data System (ADS)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  9. Large-scale dynamo growth rates from numerical simulations and implications for mean-field theories.

    PubMed

    Park, Kiwan; Blackman, Eric G; Subramanian, Kandaswamy

    2013-05-01

    Understanding large-scale magnetic field growth in turbulent plasmas in the magnetohydrodynamic limit is a goal of magnetic dynamo theory. In particular, assessing how well large-scale helical field growth and saturation in simulations match those predicted by existing theories is important for progress. Using numerical simulations of isotropically forced turbulence without large-scale shear, we focus on several additional aspects of this comparison and their implications: (1) Leading mean-field dynamo theories which break the field into large and small scales predict that large-scale helical field growth rates are determined by the difference between kinetic helicity and current helicity with no dependence on the nonhelical energy in small-scale magnetic fields. Our simulations show that the growth rate of the large-scale field from fully helical forcing is indeed unaffected by the presence or absence of small-scale magnetic fields amplified in a precursor nonhelical dynamo. However, because the precursor nonhelical dynamo in our simulations produced fields that were strongly subequipartition with respect to the kinetic energy, we cannot yet rule out the potential influence of stronger nonhelical small-scale fields. (2) We have identified two features in our simulations which cannot be explained by the most minimalist versions of two-scale mean-field theory: (i) fully helical small-scale forcing produces significant nonhelical large-scale magnetic energy and (ii) the saturation of the large-scale field growth is time delayed with respect to what minimalist theory predicts. We comment on desirable generalizations to the theory in this context and future desired work.

  10. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  11. The Internet As a Large-Scale Complex System

    NASA Astrophysics Data System (ADS)

    Park, Kihong; Willinger, Walter

    2005-06-01

    The Internet may be viewed as a "complex system" with diverse features and many components that can give rise to unexpected emergent phenomena, revealing much about its own engineering. This book brings together chapter contributions from a workshop held at the Santa Fe Institute in March 2001. This volume captures a snapshot of some features of the Internet that may be fruitfully approached using a complex systems perspective, meaning using interdisciplinary tools and methods to tackle the subject area. The Internet penetrates the socioeconomic fabric of everyday life; a broader and deeper grasp of the Internet may be needed to meet the challenges facing the future. The resulting empirical data have already proven to be invaluable for gaining novel insights into the network's spatio-temporal dynamics, and can be expected to become even more important when trying to explain the Internet's complex and emergent behavior in terms of elementary networking-based mechanisms. The discoveries of fractal or self-similar network traffic traces, power-law behavior in network topology and World Wide Web connectivity are instances of unsuspected, emergent system traits. Another important factor at the heart of fair, efficient, and stable sharing of network resources is user behavior. Network systems, when inhabited by selfish or greedy users, take on the traits of a noncooperative multi-party game, and their stability and efficiency are integral to understanding the overall system and its dynamics. Lastly, fault-tolerance and robustness of large-scale network systems can exhibit spatial and temporal correlations whose effective analysis and management may benefit from rescaling techniques applied in certain physical and biological systems. The present book will bring together several of the leading workers involved in the analysis of complex systems with the future development of the Internet.

  12. Pediatric response to a large-scale child protection intervention.

    PubMed

    Lukefahr, James L; Kellogg, Nancy D; Anderst, James D; Gavril, Amy R; Wehner, Karl K

    2011-08-01

    In a rural area of the US state of Texas, in April 2008, the Texas Department of Family and Protective Services (DFPS) responded to evidence of widespread child abuse in an isolated religious compound by removing 463 individuals into state custody. This mass child protection intervention is the largest such action that has ever occurred in the United States. The objective of this paper is to characterize the burdens placed on the area's community resources, healthcare providers, and legal system, the limitations encountered by the forensic and public health professionals, and how these might be minimized in future large-scale child protection interventions. Drawing on publicly available information, this article describes the child abuse investigation, legal outcomes, experiences of pediatric healthcare providers directly affected by the mass removal, and the roles of regional child abuse pediatric specialists. Because the compound's residents refused to cooperate with the investigation and the population of the compound was eight times higher than expected, law enforcement and child protection resources were insufficient to conduct standard child abuse investigations. Local medical and public health resources were also quickly overwhelmed. Consulting child abuse pediatricians were asked to recommend laboratory and radiologic studies that could assist in identifying signs of child abuse, but the lack of cooperation from patients and parents, inadequate medical histories, and limited physical examinations precluded full implementation of the recommendations. Although most children in danger of abuse were removed from the high-risk environment for several months and some suspected abusers were found guilty in criminal trials, the overall success of the child protection intervention was reduced by the limitations imposed by insufficient resources and lack of cooperation from the compound's residents. Recommendations for community and child abuse pediatricians who may

  13. Influence of large-scale atmospheric circulation on marine air intrusion toward the East Antarctic coast

    NASA Astrophysics Data System (ADS)

    Kurita, Naoyuki; Hirasawa, Naohiko; Koga, Seizi; Matsushita, Junji; Steen-Larsen, Hans Christian; Masson-Delmotte, Valérie; Fujiyoshi, Yasushi

    2016-09-01

    Marine air intrusions into Antarctica play a key role in high-precipitation events. Here we use shipboard observations of water vapor isotopologues between Australia and Syowa on the East Antarctic coast to elucidate the mechanism by which large-scale circulation influences marine air intrusions. The temporal isotopic variations at Syowa reflect the meridional movement of a marine air front. They are also associated with atmospheric circulation anomalies that enhance the southward movement of cyclones over the Southern Ocean. The relationship between large-scale circulation and the movement of the front is explained by northerly winds which, in association with cyclones, move toward the Antarctic coast and push marine air with isotopically enriched moisture into the interior, which is covered by glacial air with depleted isotopic values. Future changes in large-scale circulation may have a significant impact on the frequency and intensity of marine air intrusion into Antarctica.

  14. Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.

    SciTech Connect

    Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken; Gentile, Ann C.; Lloyd, Levi; Minnich, Ronald G.; Vanderveen, Keith; Van Randwyk, Jamie A; Rudish, Don W.

    2008-10-01

    We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet." We describe the creation of a research platform that emulates many thousands of machines to be used for the study of large-scale internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.

  15. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    SciTech Connect

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made.

  16. Modified gravity and large scale flows, a review

    NASA Astrophysics Data System (ADS)

    Mould, Jeremy

    2017-02-01

    Large scale flows have been a challenging feature of cosmography ever since galaxy scaling relations came on the scene 40 years ago. The next generation of surveys will offer a serious test of the standard cosmology.

  17. Learning networks for sustainable, large-scale improvement.

    PubMed

    McCannon, C Joseph; Perla, Rocco J

    2009-05-01

    Large-scale improvement efforts known as improvement networks offer structured opportunities for exchange of information and insights into the adaptation of clinical protocols to a variety of settings.

  18. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Kumar, Rohit; Verma, Mahendra K.

    2017-09-01

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of the large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of one-tenth of the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.
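
    A common diagnostic for such growth is the shell-averaged magnetic energy spectrum; the sketch below computes one for a periodic box using random noise as a stand-in for simulation output (the actual study analyzed energy fluxes and shell-to-shell transfers, which require more machinery):

```python
# Sketch of a shell-averaged magnetic energy spectrum E_M(k) for a periodic
# box, the kind of diagnostic used to see large-scale (k < k_forcing) growth.
# The field here is random noise, standing in for MHD simulation output.
import numpy as np

n = 64
B = np.random.default_rng(2).normal(size=(3, n, n, n))   # Bx, By, Bz on the grid

# Fourier transform each component and form the spectral energy density.
Bk = np.fft.fftn(B, axes=(1, 2, 3)) / n**3
energy_density = 0.5 * np.sum(np.abs(Bk) ** 2, axis=0)

# Bin energy into integer-|k| shells.
k = np.fft.fftfreq(n, d=1.0 / n)
kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
k_mag = np.rint(np.sqrt(kx**2 + ky**2 + kz**2)).astype(int)
spectrum = np.bincount(k_mag.ravel(), weights=energy_density.ravel())

# With forcing at one-tenth the box size (k_f = 10), growth of spectrum[k < 10]
# over time would indicate large-scale field amplification.
print(spectrum[:12])
```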

  19. Mechanisation of large-scale agricultural fields in developing countries - a review.

    PubMed

    Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila

    2016-09-01

    Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate the technology adoption, cost of technology purchase, government policies, adequacy of technology and appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high cost of production, among others. Therefore this paper attempts to evaluate the application of present-day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present-day technology, its management and application to large-scale agricultural fields. This review also emphasizes the future outlook that will enable a gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural farm mechanisation for sustainable food production in Africa and Asia must be anchored on a coherent strategy based on the actual needs and priorities of the large-scale farmers. © 2016 Society of Chemical Industry.

  20. An Adaptive Multiscale Finite Element Method for Large Scale Simulations

    DTIC Science & Technology

    2015-09-28

    AFRL-AFOSR-VA-TR-2015-0305: An Adaptive Multiscale Generalized Finite Element Method for Large Scale Simulations; Carlos Duarte, University of Illinois, Champaign; 14-07-2015. Only fragments of the report documentation page are preserved in this record.

  1. Large-scale studies of marked birds in North America

    USGS Publications Warehouse

    Tautin, J.; Metras, L.; Smith, G.

    1999-01-01

    The first large-scale, co-operative, studies of marked birds in North America were attempted in the 1950s. Operation Recovery, which linked numerous ringing stations along the east coast in a study of autumn migration of passerines, and the Preseason Duck Ringing Programme in prairie states and provinces, conclusively demonstrated the feasibility of large-scale projects. The subsequent development of powerful analytical models and computing capabilities expanded the quantitative potential for further large-scale projects. Monitoring Avian Productivity and Survivorship, and Adaptive Harvest Management are current examples of truly large-scale programmes. Their exemplary success and the availability of versatile analytical tools are driving changes in the North American bird ringing programme. Both the US and Canadian ringing offices are modifying operations to collect more and better data to facilitate large-scale studies and promote a more project-oriented ringing programme. New large-scale programmes such as the Cornell Nest Box Network are on the horizon.

  2. A study of MLFMA for large-scale scattering problems

    NASA Astrophysics Data System (ADS)

    Hastriter, Michael Larkin

    This research is centered in computational electromagnetics with a focus on solving large-scale problems accurately in a timely fashion using first-principles physics. Error control of the translation operator in 3-D is shown. A parallel implementation of the multilevel fast multipole algorithm (MLFMA) was studied in terms of parallel efficiency and scaling. The large-scale scattering program (LSSP), based on the ScaleME library, was used to solve ultra-large-scale problems including a 200λ sphere with 20 million unknowns. As these large-scale problems were solved, techniques were developed to accurately estimate the memory requirements. Careful memory management is needed in order to solve these massive problems. The study of MLFMA in large-scale problems revealed significant errors that stemmed from inconsistencies in constants used by different parts of the algorithm. These were fixed to produce the most accurate data possible for large-scale surface scattering problems. Data was calculated on a missile-like target using both high frequency methods and MLFMA. This data was compared and analyzed to determine possible strategies to increase data acquisition speed and accuracy through multiple computation method hybridization.

  3. Ionospheric plasma disturbances generated by naturally occurring large-scale anomalous heat sources

    NASA Astrophysics Data System (ADS)

    Pradipta, Rezy; Lee, Min-Chang; Coster, Anthea J.; Tepley, Craig A.; Sulzer, Michael P.; Gonzalez, Sixto A.

    2017-04-01

    We report the findings from our investigation on the possibility of large-scale anomalous thermal gradients to generate acoustic-gravity waves (AGWs) and traveling ionospheric disturbances (TIDs). In particular, here we consider the case of summer 2006 North American heat wave event as a concrete example of such large-scale natural thermal gradients. This special scenario of AGW/TID generation was formulated based on the results of our experiments at the Arecibo Observatory in July 2006, followed by a systematic monitoring/surveillance of total electron content (TEC) fluctuations over North America in 2005-2007 using the MIT Haystack Observatory's Madrigal database. The data from our Arecibo experiments indicate a continual occurrence of intense AGW/TID over the Caribbean on 21-24 July 2006, and the Madrigal TEC data analysis shows that the overall level of TID activity over North America had increased by ∼0.2 TECU during the summer 2006 heat wave event. Our proposed scenario is in agreement with these empirical observations, and is generally consistent with a number of past ionospheric HF heating experiments related to AGW/TID generation.

  4. A future wide field-of-view TeV gamma-ray observatory in the Southern Hemisphere

    NASA Astrophysics Data System (ADS)

    Mostafa, Miguel; HAWC Collaboration

    2017-01-01

    High-energy gamma-ray observations are an essential probe of cosmic-ray acceleration. Detection of the highest energies and the shortest timescales of variability are key motivations when designing the next generation of gamma-ray experiments. The Milagro experiment was the first-generation of gamma-ray detectors based on the water-Cherenkov technique, and demonstrated that it is possible to continuously monitor a large fraction of the TeV sky. The second-generation water-Cherenkov experiment, the High Altitude Water Cherenkov observatory, consists of an array of 300 water-Cherenkov detectors covering an area of 22,000 m² at 4,100 m a.s.l. The larger effective area, the higher altitude, and the optical isolation of the detectors led to a 15-fold increase in sensitivity relative to Milagro. Instruments with a wide field of view and large duty cycle are capable of surveying the TeV sky, mapping the diffuse emission, detecting emission from extended regions, and observing transient events such as gamma ray bursts. They also have the potential for discovering electromagnetic counterparts to gravitational waves and astrophysical neutrinos. I will present the preliminary design of a third-generation water-Cherenkov observatory located at very high altitude in South America.

  5. Advancing the Gemini Observatory

    NASA Astrophysics Data System (ADS)

    Hammel, Heidi B.; Levenson, Nancy A.

    2012-11-01

    Gemini Science and User Meeting; San Francisco, California, 17-20 July 2012 More than 100 astronomers gathered in San Francisco to discuss results from the Gemini Observatory and to plan for its future. The Gemini Observatory consists of twin 8.1 meter diameter optical/infrared telescopes located on mountaintops in Hawai'i and Chile. Gemini was built and is operated by an international partnership that currently includes the United States, the United Kingdom, Canada, Chile, Australia, Brazil, and Argentina.

  6. Large-scale CO2 storage — Is it feasible?

    NASA Astrophysics Data System (ADS)

    Johansen, H.

    2013-06-01

    CCS is generally estimated to have to account for about 20% of the reduction of CO2 emissions to the atmosphere. This paper focuses on the technical aspects of CO2 storage, even if the CCS challenge is equally dependent upon finding viable international solutions to a wide range of economic, political and cultural issues. It has already been demonstrated that it is technically possible to store adequate amounts of CO2 in the subsurface (Sleipner, InSalah, Snøhvit). The large-scale storage challenge (several gigatons of CO2 per year) is more an issue of minimizing cost without compromising safety, and of making international regulations. The storage challenge may be split into four main parts: 1) finding reservoirs with adequate storage capacity, 2) making sure that the sealing capacity above the reservoir is sufficient, 3) building the infrastructure for transport, drilling and injection, and 4) setting up and performing the necessary monitoring activities. More than 150 years of worldwide experience from the production of oil and gas is an important source of competence for CO2 storage. The storage challenge is however different in three important aspects: 1) the storage activity results in pressure increase in the subsurface, 2) there is no production of fluids that give important feedback on reservoir performance, and 3) the monitoring requirement will have to extend for a much longer time into the future than what is needed during oil and gas production. An important property of CO2 is that its behaviour in the subsurface is significantly different from that of oil and gas. CO2 in contact with water is reactive and corrosive, and may impose great damage on both man-made and natural materials if proper precautions are not taken. On the other hand, the long-term effect of most of these reactions is that a large amount of CO2 will become immobilized and permanently stored as solid carbonate minerals. The reduced opportunity for direct monitoring of fluid samples close to the

  7. Large Scale Relationship between Aquatic Insect Traits and Climate.

    PubMed

    Bhowmik, Avit Kumar; Schäfer, Ralf B

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution pattern under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of temperature preference grouping feature that are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution pattern. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperature showed the highest response, and the insects preferring cold and trichopterans preferring moderate temperature showed the highest potential for changing distribution. We showed that the seasonal radiation and moisture are the most influential bioclimatic aspects, and thus changes in these aspects may affect the most responsive traits and orders and drive a change in their spatial distribution pattern. Our findings support the development of trait-based metrics to predict and detect climate

  8. Large Scale Relationship between Aquatic Insect Traits and Climate

    PubMed Central

    Bhowmik, Avit Kumar; Schäfer, Ralf B.

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution pattern under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of temperature preference grouping feature that are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution pattern. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperature showed the highest response, and the insects preferring cold and trichopterans preferring moderate temperature showed the highest potential for changing distribution. We showed that the seasonal radiation and moisture are the most influential bioclimatic aspects, and thus changes in these aspects may affect the most responsive traits and orders and drive a change in their spatial distribution pattern. Our findings support the development of trait-based metrics to predict and detect climate

  9. Wireless gigabit data telemetry for large-scale neural recording.

    PubMed

    Kuan, Yen-Cheng; Lo, Yi-Kai; Kim, Yanghyo; Chang, Mau-Chung Frank; Liu, Wentai

    2015-05-01

    Implantable wireless neural recording from a large ensemble of simultaneously acting neurons is a critical component to thoroughly investigate neural interactions and brain dynamics in freely moving animals. Recent research has shown the feasibility of simultaneously recording from hundreds of neurons and suggested that the ability to record a larger number of neurons results in better signal quality. This massive recording inevitably demands a large amount of data transfer. For example, recording 2000 neurons while keeping the signal fidelity (>12 bit, >40 kS/s per neuron) needs approximately a 1 Gb/s data link. Designing a wireless data telemetry system to support such (or higher) data rates while aiming to lower the power consumption of an implantable device imposes a grand challenge on the neuroscience community. In this paper, we present a wireless gigabit data telemetry for a future large-scale neural recording interface. This telemetry comprises a low-power gigabit transmitter and receiver pair operating at 60 GHz, and establishes a short-distance wireless link to transfer the massive amount of neural signals outward from the implanted device. The transmission distance of the received neural signal can be further extended by an external rendezvous wireless transceiver, which is less power- and heat-constrained since it is not in the immediate proximity of the cortex and its radiated signal is not seriously attenuated by the lossy tissue. The gigabit data link has been demonstrated to achieve a high data rate of 6 Gb/s with a bit-error rate of 10⁻¹² at a transmission distance of 6 mm, an applicable separation between transmitter and receiver. This high data rate is able to support thousands of recording channels while ensuring a low energy cost per bit of 2.08 pJ/b.
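
    The data-rate and power figures quoted above can be checked with simple arithmetic; the short sketch below uses only the numbers stated in the abstract:

```python
# Back-of-the-envelope check of the data-rate and energy figures quoted above.
# Channel count, resolution, and sampling rate follow the abstract; everything
# here is simple arithmetic, not a model of the actual transceiver.
channels = 2000          # simultaneously recorded neurons
bits_per_sample = 12     # >12-bit resolution
sample_rate = 40e3       # >40 kS/s per neuron

data_rate = channels * bits_per_sample * sample_rate          # bits per second
print(f"required link rate ~ {data_rate / 1e9:.2f} Gb/s")     # ~0.96 Gb/s, i.e. ~1 Gb/s link

energy_per_bit = 2.08e-12                                     # J/b, reported for the 60 GHz link
print(f"power at 6 Gb/s ~ {6e9 * energy_per_bit * 1e3:.1f} mW")
```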

  10. Large-scale multielectrode recording and stimulation of neural activity

    NASA Astrophysics Data System (ADS)

    Sher, A.; Chichilnisky, E. J.; Dabrowski, W.; Grillo, A. A.; Grivich, M.; Gunning, D.; Hottowy, P.; Kachiguine, S.; Litke, A. M.; Mathieson, K.; Petrusca, D.

    2007-09-01

    Large circuits of neurons are employed by the brain to encode and process information. How this encoding and processing is carried out is one of the central questions in neuroscience. Since individual neurons communicate with each other through electrical signals (action potentials), the recording of neural activity with arrays of extracellular electrodes is uniquely suited for the investigation of this question. Such recordings provide the combination of the best spatial (individual neurons) and temporal (individual action-potentials) resolutions compared to other large-scale imaging methods. Electrical stimulation of neural activity in turn has two very important applications: it enhances our understanding of neural circuits by allowing active interactions with them, and it is a basis for a large variety of neural prosthetic devices. Until recently, the state-of-the-art in neural activity recording systems consisted of several dozen electrodes with inter-electrode spacing ranging from tens to hundreds of microns. Using silicon microstrip detector expertise acquired in the field of high-energy physics, we created a unique neural activity readout and stimulation framework that consists of high-density electrode arrays, multi-channel custom-designed integrated circuits, a data acquisition system, and data-processing software. Using this framework we developed a number of neural readout and stimulation systems: (1) a 512-electrode system for recording the simultaneous activity of as many as hundreds of neurons, (2) a 61-electrode system for electrical stimulation and readout of neural activity in retinas and brain-tissue slices, and (3) a system with telemetry capabilities for recording neural activity in the intact brain of awake, naturally behaving animals. We will report on these systems, their various applications to the field of neurobiology, and novel scientific results obtained with some of them. We will also outline future directions.

  11. Carnegie Observatories

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    The Carnegie Observatories were founded in 1902 by George Ellery Hale. Their first facility was the MOUNT WILSON OBSERVATORY, located in the San Gabriel Mountains above Pasadena, California. Originally a solar observatory, it moved into stellar, galactic and extragalactic research with the construction of the 60 in (1.5 m), and 100 in (2.5 m) telescopes, each of which was the largest in the world...

  12. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

    We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models in the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density that we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.

  13. Recursive architecture for large-scale adaptive system

    NASA Astrophysics Data System (ADS)

    Hanahara, Kazuyuki; Sugiyama, Yoshihiko

    1994-09-01

    'Large scale' is one of the major trends in recent engineering research and development, especially in the field of aerospace structural systems. The term expresses the large size of an artifact in general, but it usually also implies the large number of components that make up the artifact. A large-scale system intended for use in remote space or the deep sea should be adaptive as well as robust by itself, because its control and maintenance by human operators are difficult owing to the remoteness. One approach to realizing such a large-scale, adaptive and robust system is to build it as an assemblage of components that are each adaptive by themselves. In this case, the robustness of the system can be achieved by using a large number of such components together with suitable adaptation and maintenance strategies. Such systems have attracted considerable research interest, and studies of decentralized motion control, configuration algorithms and the characteristics of structural elements have been reported. In this article, a recursive architecture concept is developed and discussed with a view to realizing a large-scale system that consists of a number of uniform adaptive components. We propose an adaptation strategy based on the architecture and its implementation by means of hierarchically connected processing units. The robustness of the processing units and their recovery from degradation are also discussed. Two- and three-dimensional adaptive truss structures are conceptually designed based on the recursive architecture.
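
    A purely illustrative sketch of the recursive idea, with hypothetical unit names and a trivial adaptation rule, is given below; it shows only how the same interface can apply to a single leaf component and to a hierarchy of components:

```python
# Generic sketch of the recursive architecture idea: a unit either controls one
# component directly or delegates to child units, so the same adaptation
# interface applies at every level of the hierarchy. Purely illustrative.
from __future__ import annotations
from dataclasses import dataclass, field
from typing import List

@dataclass
class AdaptiveUnit:
    name: str
    children: List["AdaptiveUnit"] = field(default_factory=list)

    def adapt(self, disturbance: float) -> float:
        if not self.children:
            # Leaf unit: adjust its own component and report the residual error.
            return disturbance * 0.1
        # Internal unit: split the disturbance among children and aggregate.
        share = disturbance / len(self.children)
        return sum(child.adapt(share) for child in self.children)

truss = AdaptiveUnit("truss", [
    AdaptiveUnit(f"bay{i}", [AdaptiveUnit(f"bay{i}-member{j}") for j in range(3)])
    for i in range(4)
])
print("residual after adaptation:", truss.adapt(1.0))
```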

  14. The Influence of Large-scale Environments on Galaxy Properties

    NASA Astrophysics Data System (ADS)

    Wei, Yu-qing; Wang, Lei; Dai, Cai-ping

    2017-07-01

    The star formation properties of galaxies and their dependence on environment play an important role in understanding the formation and evolution of galaxies. Using the galaxy sample of the Sloan Digital Sky Survey (SDSS), different research groups have studied the physical properties of galaxies and their large-scale environments. Here, using the filament catalog from Tempel et al. and the galaxy catalog of large-scale structure classification from Wang et al., and taking into account the influence of galaxy morphology, high/low local density environment, and central (satellite) status, we find that the properties of galaxies are correlated with the large-scale environments in which they reside: the SSFR (specific star formation rate) and SFR (star formation rate) depend strongly on the large-scale environment for spiral galaxies and satellite galaxies, but this dependence is very weak for elliptical galaxies and central galaxies, and galaxies in low-density regions are more sensitive to their large-scale environment than those in high-density regions. The above conclusions remain valid even for galaxies of the same mass. In addition, the SSFR distributions derived from the catalogs of Tempel et al. and Wang et al. are not entirely consistent.

  15. On the Prospects of Measuring the Cosmic History of Element Synthesis with Future Far-IR/Submillimeter Observatories

    NASA Technical Reports Server (NTRS)

    Leisawitz, David

    2003-01-01

    To understand the cosmic history of element synthesis it will be important to obtain extinction-free measures of the heavy element contents of high-redshift objects and to chart two monumental events: the collapse of the first metal-free clouds to form stars, and the initial seeding of the universe with dust. The information needed to achieve these objectives is uniquely available in the far-infrared/submillimeter (FIR/SMM) spectral region. Following the Decadal Report and anticipating the development of the Single Aperture Far-IR (SAFIR) telescope, we describe the capabilities of a large-aperture, background-limited FIR/SMM observatory and an interferometer on a boom, and discuss how such instruments could be used to measure the element synthesis history of the universe.

  16. Toward Improved Support for Loosely Coupled Large Scale Simulation Workflows

    SciTech Connect

    Boehm, Swen; Elwasif, Wael R; Naughton, III, Thomas J; Vallee, Geoffroy R

    2014-01-01

    High-performance computing (HPC) workloads are increasingly leveraging loosely coupled large scale simulations. Unfortunately, most large-scale HPC platforms, including Cray/ALPS environments, are designed for the execution of long-running jobs based on coarse-grained launch capabilities (e.g., one MPI rank per core on all allocated compute nodes). This assumption limits capability-class workload campaigns that require large numbers of discrete or loosely coupled simulations, and where time-to-solution is an untenable pacing issue. This paper describes the challenges related to the support of fine-grained launch capabilities that are necessary for the execution of loosely coupled large scale simulations on Cray/ALPS platforms. More precisely, we present the details of an enhanced runtime system to support this use case, and report on initial results from early testing on systems at Oak Ridge National Laboratory.
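
    The loosely coupled pattern itself can be sketched with a simple static task farm (illustrative only; this is not the enhanced runtime system described in the paper, and the simulation command is a placeholder):

        from mpi4py import MPI
        import subprocess

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # A campaign of many small, independent simulations.
        tasks = [f"./simulate --case {i}" for i in range(1000)]

        # Static round-robin assignment: rank r runs tasks r, r+size, r+2*size, ...
        for cmd in tasks[rank::size]:
            subprocess.run(cmd, shell=True, check=False)

        comm.Barrier()
        if rank == 0:
            print("campaign complete")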

  17. Seismic safety in conducting large-scale blasts

    NASA Astrophysics Data System (ADS)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter, with recording on a laptop. Registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocities of the Earth's surface vibrations are determined. The seismic safety evaluation was carried out against the permissible value of vibration velocity, and for cases exceeding the permissible values, recommendations were developed to reduce the level of seismic impact.
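
    The core of such a safety check can be sketched as follows (illustrative only; the permissible value and the input file are placeholders, since real limits depend on the applicable standard and the type of protected structure):

        import numpy as np

        def peak_particle_velocity(velocity_trace_mm_s):
            """Peak absolute ground velocity of one recorded blast, in mm/s."""
            return np.max(np.abs(velocity_trace_mm_s))

        PERMISSIBLE_MM_S = 20.0                       # illustrative threshold only

        trace = np.loadtxt("blast_record.txt")        # hypothetical digitized record
        ppv = peak_particle_velocity(trace)
        print(f"PPV = {ppv:.1f} mm/s,",
              "OK" if ppv <= PERMISSIBLE_MM_S else "exceeds permissible value")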

  18. PKI security in large-scale healthcare networks.

    PubMed

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many Public Key Infrastructures (PKIs) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, particularly when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network comprising multiple PKI domains.
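
    As a conceptual illustration of the multi-domain trust problem (not the scheme proposed in the paper; the domain names are hypothetical), cross-certification between domain CAs can be modeled as a graph and a chain of trust found by search:

        from collections import deque

        # Hypothetical cross-certification edges between domain CAs (directed: A trusts B).
        cross_certs = {
            "hospital-A": {"regional-bridge"},
            "clinic-B": {"regional-bridge"},
            "regional-bridge": {"hospital-A", "clinic-B", "national-root"},
        }

        def trust_path_exists(relying_domain, issuer_domain):
            """Breadth-first search over cross-certificates for a chain of trust."""
            seen, queue = {relying_domain}, deque([relying_domain])
            while queue:
                domain = queue.popleft()
                if domain == issuer_domain:
                    return True
                for nxt in cross_certs.get(domain, ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(nxt)
            return False

        print(trust_path_exists("hospital-A", "clinic-B"))   # True, via the bridge CA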

  19. Large-scale velocity structures in turbulent thermal convection.

    PubMed

    Qiu, X L; Tong, P

    2001-09-01

    A systematic study of large-scale velocity structures in turbulent thermal convection is carried out in three different aspect-ratio cells filled with water. Laser Doppler velocimetry is used to measure the velocity profiles and statistics over varying Rayleigh numbers Ra and at various spatial positions across the whole convection cell. Large velocity fluctuations are found both in the central region and near the cell boundary. Despite the large velocity fluctuations, the flow field still maintains a large-scale quasi-two-dimensional structure, which rotates in a coherent manner. This coherent single-roll structure scales with Ra and can be divided into three regions in the rotation plane: (1) a thin viscous boundary layer, (2) a fully mixed central core region with a constant mean velocity gradient, and (3) an intermediate plume-dominated buffer region. The experiment reveals a unique driving mechanism for the large-scale coherent rotation in turbulent convection.

  20. Large-scale simulations of complex physical systems

    NASA Astrophysics Data System (ADS)

    Belić, A.

    2007-04-01

    Scientific computing has become a tool as vital as experimentation and theory for dealing with scientific challenges of the twenty-first century. Large scale simulations and modelling serve as heuristic tools in a broad problem-solving process. High-performance computing facilities make possible the first step in this process - a view of new and previously inaccessible domains in science and the building up of intuition regarding the new phenomenology. The final goal of this process is to translate this newly found intuition into better algorithms and new analytical results. In this presentation we give an outline of the research themes pursued at the Scientific Computing Laboratory of the Institute of Physics in Belgrade regarding large-scale simulations of complex classical and quantum physical systems, and present recent results obtained in the large-scale simulations of granular materials and path integrals.

  1. Large-scale simulations of complex physical systems

    SciTech Connect

    Belic, A.

    2007-04-23

    Scientific computing has become a tool as vital as experimentation and theory for dealing with scientific challenges of the twenty-first century. Large scale simulations and modelling serve as heuristic tools in a broad problem-solving process. High-performance computing facilities make possible the first step in this process - a view of new and previously inaccessible domains in science and the building up of intuition regarding the new phenomenology. The final goal of this process is to translate this newly found intuition into better algorithms and new analytical results. In this presentation we give an outline of the research themes pursued at the Scientific Computing Laboratory of the Institute of Physics in Belgrade regarding large-scale simulations of complex classical and quantum physical systems, and present recent results obtained in the large-scale simulations of granular materials and path integrals.

  2. A relativistic signature in large-scale structure

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

    In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales, even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so that there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether the observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  3. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    NASA Astrophysics Data System (ADS)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation of and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.

  4. Large Scale Processes and Extreme Floods in Brazil

    NASA Astrophysics Data System (ADS)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g., atmospheric rivers/moisture transport) over local processes (e.g., local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in these regions is explored by means of observed streamflow, reanalysis data, and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained with machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. For individual sites, we investigate the probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g., local vs. large-scale).
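
    The dimension-reduction and clustering step can be sketched as follows (illustrative only; scikit-learn provides unsupervised kernel PCA, whereas the study uses a supervised variant, and the input file is hypothetical):

        import numpy as np
        from sklearn.decomposition import KernelPCA
        from sklearn.cluster import KMeans

        # Hypothetical array: one row per flood event, columns = vertically integrated
        # moisture flux divergence on a regional grid in the days before the event.
        X = np.load("pre_flood_moisture_flux.npy")     # shape (n_events, n_grid_cells)

        # Reduce to a low-dimensional space with kernel PCA.
        embedding = KernelPCA(n_components=3, kernel="rbf").fit_transform(X)

        # Cluster events to separate locally driven floods from remotely forced ones.
        labels = KMeans(n_clusters=3, n_init=10).fit_predict(embedding)
        print(np.bincount(labels))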

  5. Astronomical observatories

    NASA Technical Reports Server (NTRS)

    Ponomarev, D. N.

    1983-01-01

    The layout and equipment of astronomical observatories, the oldest scientific institutions of human society, are discussed. The example of the leading observatories of the USSR allows the reader to become familiar with their modern counterparts as well as with the goals and problems on which astronomers are presently working.

  6. Observatories: History

    NASA Astrophysics Data System (ADS)

    Krisciunas, K.; Murdin, P.

    2000-11-01

    An astronomical OBSERVATORY is a building, installation or institution dedicated to the systematic and regular observation of celestial objects for the purpose of understanding their physical nature, or for purposes of time reckoning and keeping the calendar. At a bona fide observatory such work constitutes a main activity, not just an incidental one. While the ancient Egyptians, Babylonians, Chi...

  7. Amateur Observatories

    NASA Astrophysics Data System (ADS)

    Gavin, M.

    1997-08-01

    A roundup of amateur observatories in this country and abroad, with construction and location details, concluding with a detailed description and architect's drawing of the author's own observatory at Worcester Park, Surrey. The text of the 1996 Presidential Address to the British Astronomical Association.

  8. The Virtual Observatory: I

    NASA Astrophysics Data System (ADS)

    Hanisch, R. J.

    2014-11-01

    The concept of the Virtual Observatory arose more-or-less simultaneously in the United States and Europe circa 2000. Ten pages of Astronomy and Astrophysics in the New Millennium: Panel Reports (National Academy Press, Washington, 2001), that is, the detailed recommendations of the Panel on Theory, Computation, and Data Exploration of the 2000 Decadal Survey in Astronomy, are dedicated to describing the motivation for, scientific value of, and major components required in implementing the National Virtual Observatory. European initiatives included the Astrophysical Virtual Observatory at the European Southern Observatory, the AstroGrid project in the United Kingdom, and the Euro-VO (sponsored by the European Union). Organizational/conceptual meetings were held in the US at the California Institute of Technology (Virtual Observatories of the Future, June 13-16, 2000) and at ESO Headquarters in Garching, Germany (Mining the Sky, July 31-August 4, 2000; Toward an International Virtual Observatory, June 10-14, 2002). The nascent US, UK, and European VO projects formed the International Virtual Observatory Alliance (IVOA) at the June 2002 meeting in Garching, with yours truly as the first chair. The IVOA has grown to a membership of twenty-one national projects and programs on six continents, and has developed a broad suite of data access protocols and standards that have been widely implemented. Astronomers can now discover, access, and compare data from hundreds of telescopes and facilities, hosted at hundreds of organizations worldwide, stored in thousands of databases, all with a single query.

  9. The Little Thompson Observatory

    NASA Astrophysics Data System (ADS)

    Schweitzer, A. E.; VanLew, K.; Melsheimer, T.; Sackett, C.

    1999-12-01

    The Little Thompson Observatory is the second member of the Telescopes in Education (TIE) project. Construction of the dome and the remote control system has been completed, and the telescope is now on-line and operational over the Internet. The observatory is located on the grounds of Berthoud High School in northern Colorado. Local schools and youth organizations have prioritized access to the telescope, and there are monthly opportunities for public viewing. In the future, the telescope will be open after midnight to world-wide use by schools following the model of the first TIE observatory, the 24" telescope on Mt. Wilson. Students remotely connect to the observatory over the Internet, and then receive the images on their local computers. The observatory grew out of grassroots support from the local community surrounding Berthoud, Colorado, a town of 3,500 residents. TIE has provided the observatory with a Tinsley 18" Cassegrain telescope on a 10-year loan. The facility has been built with tremendous support from volunteers and the local school district. With funding from an IDEAS grant, we have begun teacher training workshops which will allow K-12 schools in northern Colorado to make use of the Little Thompson Observatory, including remote observing from classrooms.

  10. [Issues of large scale tissue culture of medicinal plant].

    PubMed

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

    In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzes the status of, problems with, and countermeasures for large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of producing medicinal plants, it still faces problems such as the stability of the material, the safety of transgenic medicinal plants, and the optimization of culture conditions. Establishing a sound evaluation system suited to the characteristics of each medicinal plant is the key measure to ensure the sustainable development of large-scale tissue culture of medicinal plants.

  11. The CLASSgal code for relativistic cosmological large scale structure

    NASA Astrophysics Data System (ADS)

    Di Dio, Enea; Montanari, Francesco; Lesgourgues, Julien; Durrer, Ruth

    2013-11-01

    We present accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum Cl(z1,z2) and the corresponding correlation function ξ(θ,z1,z2) of the matter density and the galaxy number fluctuations in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
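
    A minimal sketch of driving a CLASS-family code from Python (assuming the standard classy wrapper is installed; the CLASSgal number-count spectra C_l(z1,z2) require additional output settings beyond this example, which only computes the linear matter power spectrum):

        from classy import Class

        cosmo = Class()
        cosmo.set({
            "output": "mPk",
            "P_k_max_1/Mpc": 1.0,
            "z_max_pk": 2.0,
            "h": 0.67, "omega_b": 0.022, "omega_cdm": 0.12,
        })
        cosmo.compute()

        k = 0.1                       # wavenumber in 1/Mpc
        print(cosmo.pk(k, 0.5))       # linear P(k) at z = 0.5, in Mpc^3
        cosmo.struct_cleanup()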

  12. Corridors Increase Plant Species Richness at Large Scales

    SciTech Connect

    Damschen, Ellen I.; Haddad, Nick M.; Orrock,John L.; Tewksbury, Joshua J.; Levey, Douglas J.

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  13. Large-Scale Graph Processing Analysis using Supercomputer Cluster

    NASA Astrophysics Data System (ADS)

    Vildario, Alfrido; Fitriyani; Nugraha Nurkahfi, Galih

    2017-01-01

    Graphs are widely used in sectors such as automotive, traffic, image processing, and many more. These applications produce large-scale graphs whose processing requires long computation times and high-specification resources. This research addresses the analysis of large-scale graph processing on a supercomputer cluster. We implemented graph processing with the Breadth-First Search (BFS) algorithm for the single-destination shortest path problem. The parallel BFS implementation with the Message Passing Interface (MPI) used the supercomputer cluster at the High Performance Computing Laboratory of Computational Science, Telkom University, and graphs from the Stanford Large Network Dataset Collection. The results show an average speedup of more than 30 times and an efficiency of almost 90%.
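
    A serial reference version of the kernel, together with the speedup and efficiency definitions used in such studies, can be sketched as follows (illustrative only; the parallel MPI implementation and the reported timings are not reproduced here):

        from collections import deque

        def bfs_distance(adj, source, target):
            """Unweighted shortest-path length from source to target; None if unreachable."""
            dist, frontier = {source: 0}, deque([source])
            while frontier:
                u = frontier.popleft()
                if u == target:
                    return dist[u]
                for v in adj.get(u, ()):
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        frontier.append(v)
            return None

        adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
        print(bfs_distance(adj, 0, 3))                   # 2

        # Speedup and parallel efficiency (illustrative timings).
        t_serial, t_parallel, n_procs = 120.0, 3.8, 36
        speedup = t_serial / t_parallel
        print(speedup, speedup / n_procs)                # speedup, efficiency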

  14. Clearing and Labeling Techniques for Large-Scale Biological Tissues

    PubMed Central

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-01-01

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  15. The Evolution of Baryons in Cosmic Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Snedden, Ali; Arielle Phillips, Lara; Mathews, Grant James; Coughlin, Jared; Suh, In-Saeng; Bhattacharya, Aparna

    2015-01-01

    The environments of galaxies play a critical role in their formation and evolution. We study these environments using cosmological simulations with star formation and supernova feedback included. From these simulations, we parse the large scale structure into clusters, filaments and voids using a segmentation algorithm adapted from medical imaging. We trace the star formation history, gas phase and metal evolution of the baryons in the intergalactic medium as a function of structure. We find that our algorithm reproduces the baryon fraction in the intracluster medium and that the majority of star formation occurs in cold, dense filaments. We present the consequences that this large scale environment has for galactic halos and galaxy evolution.

  16. Large scale purification of RNA nanoparticles by preparative ultracentrifugation.

    PubMed

    Jasinski, Daniel L; Schwartz, Chad T; Haque, Farzin; Guo, Peixuan

    2015-01-01

    Purification of large quantities of supramolecular RNA complexes is of paramount importance due to the large quantities of RNA needed and the purity requirements for in vitro and in vivo assays. Purification is generally carried out by liquid chromatography (HPLC), polyacrylamide gel electrophoresis (PAGE), or agarose gel electrophoresis (AGE). Here, we describe an efficient method for the large-scale purification of RNA prepared by in vitro transcription using T7 RNA polymerase by cesium chloride (CsCl) equilibrium density gradient ultracentrifugation and the large-scale purification of RNA nanoparticles by sucrose gradient rate-zonal ultracentrifugation or cushioned sucrose gradient rate-zonal ultracentrifugation.

  17. The Observatorio Astrofisico de Javalambre. A planned facility for large scale surveys

    NASA Astrophysics Data System (ADS)

    Moles, M.; Cenarro, A. J.; Cristóbal-Hornillos, D.; Gruel, N.; Marín Franch, A.; Valdivielso, L.; Viironen, K.

    2011-11-01

    All-sky surveys play a fundamental role in the development of Astrophysics. The need for large-scale surveys comes from two basic motivations: one is to make an inventory of sources as complete as possible and allow for their classification in families. The other is to attack some problems demanding the sampling of large volumes to give a detectable signal. New challenges, in particular in the domain of Cosmology, are giving impulse to a new kind of large-scale surveys, combining area coverage, depth and accurate enough spectral information to recover the redshift and spectral energy distribution (SED) of the detected objects. New instruments are needed to satisfy the requirements of those large-scale surveys, in particular large Etendue telescopes. The Observatorio Astrofisico de Javalambre, OAJ, project includes a telescope of 2.5 m aperture, with a wide field of view, 3 degrees in diameter, and excellent image quality in the whole field. Taking into account that it is going to be fully devoted to carrying out surveys, it will be the highest effective-Etendue telescope to date. The project is completed with a smaller, wide field auxiliary telescope. The Observatory is being built at Pico del Buitre, Sierra de Javalambre, Teruel, a site with excellent seeing and low sky surface brightness. The institution in charge of the Observatory is the Centro de Estudios de Fisica del Cosmos de Aragon, CEFCA, a new center created in Teruel for the operation and scientific exploitation of the Javalambre Observatory. CEFCA will also be in charge of the data management and archiving. The data will be made accessible to the community. The first planned scientific project is a multi-narrow-band photometric survey covering 8,000 square degrees, designed to produce precise SEDs, and photometric redshifts accurate at the 0.3 % level. A total of 42 band-pass filters, each 100-120 Å wide and covering most of the optical spectral range, will be used. In this sense it is the development, at a much

  18. Disentangling the dynamic core: a research program for a neurodynamics at the large-scale.

    PubMed

    Le Van Quyen, Michel

    2003-01-01

    My purpose in this paper is to sketch a research direction based on Francisco Varela's pioneering work in neurodynamics (see also Rudrauf et al. 2003, in this issue). Very early on he argued that the internal coherence of every mental-cognitive state lies in the global self-organization of the brain activities at the large-scale, constituting a fundamental pole of integration called here a "dynamic core". Recent neuroimaging evidence appears to broadly support this hypothesis and suggests that a global brain dynamics emerges at the large scale level from the cooperative interactions among widely distributed neuronal populations. Despite a growing body of evidence supporting this view, our understanding of these large-scale brain processes remains hampered by the lack of a theoretical language for expressing these complex behaviors in dynamical terms. In this paper, I propose a rough cartography of a comprehensive approach that offers a conceptual and mathematical framework to analyze spatio-temporal large-scale brain phenomena. I emphasize how these nonlinear methods can be applied, what property might be inferred from neuronal signals, and where one might productively proceed for the future. This paper is dedicated, with respect and affection, to the memory of Francisco Varela.

  19. Temperature dependence of large-scale water retention curves: A case study

    SciTech Connect

    Liu, Hui-Hai; Bodvarsson, G.S.; Dane, J.H.

    2001-10-26

    A local-scale model for temperature-dependence of water-retention curves may be applicable to large scales. Consideration of this temperature dependence is important for modeling unsaturated flow and transport in the subsurface in numerous cases. Although significant progress has been made in understanding and modeling this temperature effect, almost all the previous studies have been limited to small scales (on the order of several centimeters). Numerical experiments were used to investigate the possibility of extending a local-scale model for the temperature-dependence of water retention curves to large scales (on the order of meters). Temperature effects on large-scale hydraulic properties are of interest in many practical applications. Numerical experiment results indicate that the local-scale model can indeed be applicable to large-scale problems for special porous media with high air entry values. A typical porous medium of this kind is the porous tuff matrix in the unsaturated zone of Yucca Mountain, Nevada, the proposed geologic disposal site for national high-level nuclear wastes. Whether this finding can approximately hold for general cases needs to be investigated in future studies.
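
    One common local-scale representation of this temperature effect (not necessarily the model used in the study; parameter values below are illustrative) combines a van Genuchten retention curve with a surface-tension-based scaling of the pressure head:

        import numpy as np

        def surface_tension(T_celsius):
            """Approximate surface tension of water in N/m (linear fit near room temperature)."""
            return 0.0728 - 1.5e-4 * (T_celsius - 20.0)

        def theta(h, T, theta_r=0.05, theta_s=0.40, alpha=0.5, n=1.8, T_ref=20.0):
            """Water content at pressure head h (m, negative) and temperature T (deg C).
            Temperature enters by rescaling h with the surface-tension ratio."""
            h_ref = h * surface_tension(T_ref) / surface_tension(T)
            m = 1.0 - 1.0 / n
            return theta_r + (theta_s - theta_r) / (1.0 + (alpha * np.abs(h_ref)) ** n) ** m

        for T in (20.0, 60.0):
            print(T, theta(h=-2.0, T=T))   # retained water content decreases slightly as T rises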

  20. Weak lensing of large scale structure in the presence of screening

    SciTech Connect

    Tessore, Nicolas; Metcalf, R. Benton; Giocoli, Carlo

    2015-10-01

    A number of alternatives to general relativity exhibit gravitational screening in the non-linear regime of structure formation. We describe a set of algorithms that can produce weak lensing maps of large scale structure in such theories and can be used to generate mock surveys for cosmological analysis. By analysing a few basic statistics we indicate how these alternatives can be distinguished from general relativity with future weak lensing surveys.

  1. Large Scale Integrated Photonics for Twenty-First Century Information Technologies

    NASA Astrophysics Data System (ADS)

    Beausoleil, Raymond G.

    2014-08-01

    In this paper, we will review research done by the Large-Scale Integrated Photonics group at HP Laboratories, and in particular we will discuss applications of optical resonances in dielectric microstructures and nanostructures to future classical and quantum information technologies. Our goal is to scale photonic technologies over the next decade in much the same way as electronics over the past five, thereby establishing a Moore's Law for optics.

  2. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  3. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  4. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  5. Firebrands and spotting ignition in large-scale fires

    Treesearch

    Eunmo Koo; Patrick J. Pagni; David R. Weise; John P. Woycheese

    2010-01-01

    Spotting ignition by lofted firebrands is a significant mechanism of fire spread, as observed in many large-scale fires. The role of firebrands in fire propagation and the important parameters involved in spot fire development are studied. Historical large-scale fires, including wind-driven urban and wildland conflagrations and post-earthquake fires, are given as...

  6. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  7. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  8. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  9. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  10. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  11. Developing and Understanding Methods for Large-Scale Nonlinear Optimization

    DTIC Science & Technology

    2006-07-24

    algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with many thousands... Published in peer-reviewed journals: E. Eskow, B. Bader, R. Byrd, S. Crivelli, T. Head-Gordon, V. Lamberti and R. Schnabel, "An optimization approach to the

  12. Probabilistic Cuing in Large-Scale Environmental Search

    ERIC Educational Resources Information Center

    Smith, Alastair D.; Hood, Bruce M.; Gilchrist, Iain D.

    2010-01-01

    Finding an object in our environment is an important human ability that also represents a critical component of human foraging behavior. One type of information that aids efficient large-scale search is the likelihood of the object being in one location over another. In this study we investigated the conditions under which individuals respond to…

  13. Feasibility of large-scale aquatic microcosms. Final report

    SciTech Connect

    Pease, T.; Wyman, R.L.; Logan, D.T.; Logan, C.M.; Lispi, D.R.

    1982-02-01

    Microcosms have been used to study a number of fundamental ecological principles and more recently to investigate the effects of man-made perturbations on ecosystems. In this report the feasibility of using large-scale microcosms to assess aquatic impacts of power generating facilities is evaluated. Aquatic problems of concern to utilities are outlined, and various research approaches, including large and small microcosms, bioassays, and other laboratory experiments, are discussed. An extensive critical review and synthesis of the literature on recent microcosm research, which includes a comparison of the factors influencing physical, chemical, and biological processes in small vs large microcosms and in microcosms vs nature, led the authors to conclude that large-scale microcosms offer several advantages over other study techniques for particular types of problems. A hypothetical large-scale facility simulating a lake ecosystem is presented to illustrate the size, cost, and complexity of such facilities. The rationale for designing a lake-simulating large-scale microcosm is presented.

  14. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  15. Improving the Utility of Large-Scale Assessments in Canada

    ERIC Educational Resources Information Center

    Rogers, W. Todd

    2014-01-01

    Principals and teachers do not use large-scale assessment results because the lack of distinct and reliable subtests prevents identifying strengths and weaknesses of students and instruction, the results arrive too late to be used, and principals and teachers need assistance to use the results to improve instruction so as to improve student…

  16. Research directions in large scale systems and decentralized control

    NASA Technical Reports Server (NTRS)

    Tenney, R. R.

    1980-01-01

    Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.

  17. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  18. Ecosystem resilience despite large-scale altered hydro climatic conditions

    USDA-ARS?s Scientific Manuscript database

    Climate change is predicted to increase both drought frequency and duration, and when coupled with substantial warming, will establish a new hydroclimatological paradigm for many regions. Large-scale, warm droughts have recently impacted North America, Africa, Europe, Amazonia, and Australia result...

  19. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  20. Large-Scale Assessments and Educational Policies in Italy

    ERIC Educational Resources Information Center

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  1. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  2. Current Scientific Issues in Large Scale Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Miller, T. L. (Compiler)

    1986-01-01

    Topics in large scale atmospheric dynamics are discussed. Aspects of atmospheric blocking, the influence of transient baroclinic eddies on planetary-scale waves, cyclogenesis, the effects of orography on planetary scale flow, small scale frontal structure, and simulations of gravity waves in frontal zones are discussed.

  3. Large-Scale Assessments and Educational Policies in Italy

    ERIC Educational Resources Information Center

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  4. Large scale fire whirls: Can their formation be predicted?

    Treesearch

    J. Forthofer; Bret Butler

    2010-01-01

    Large scale fire whirls have not traditionally been recognized as a frequent phenomenon on wildland fires. However, there are anecdotal data suggesting that they can and do occur with some regularity. This paper presents a brief summary of this information and an analysis of the causal factors leading to their formation.

  5. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  6. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  7. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  8. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  9. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  10. Individual Skill Differences and Large-Scale Environmental Learning

    ERIC Educational Resources Information Center

    Fields, Alexa W.; Shelton, Amy L.

    2006-01-01

    Spatial skills are known to vary widely among normal individuals. This project was designed to address whether these individual differences are differentially related to large-scale environmental learning from route (ground-level) and survey (aerial) perspectives. Participants learned two virtual environments (route and survey) with limited…

  11. Newton Methods for Large Scale Problems in Machine Learning

    ERIC Educational Resources Information Center

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  12. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  13. Global smoothing and continuation for large-scale molecular optimization

    SciTech Connect

    More, J.J.; Wu, Zhijun

    1995-10-01

    We discuss the formulation of optimization problems that arise in the study of distance geometry, ionic systems, and molecular clusters. We show that continuation techniques based on global smoothing are applicable to these molecular optimization problems, and we outline the issues that must be resolved in the solution of large-scale molecular optimization problems.
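
    The continuation idea can be sketched in one dimension (illustrative only; this is not the authors' algorithm): smooth a rugged objective by Gaussian averaging, minimize the heavily smoothed version, then track the minimizer as the smoothing parameter is reduced to zero.

        import numpy as np
        from scipy.optimize import minimize

        def f(x):
            """Rugged 1-D test objective with many local minima."""
            return x**2 + 2.0 * np.sin(8.0 * x)

        def f_smoothed(x, sigma, n_samples=2000):
            """Monte Carlo estimate of the Gaussian-smoothed objective E[f(x + sigma*u)]."""
            u = np.random.default_rng(0).standard_normal(n_samples)
            return np.mean(f(x + sigma * u))

        x = 3.0                                    # deliberately poor starting point
        for sigma in (2.0, 1.0, 0.5, 0.1, 0.0):    # continuation: gradually reduce smoothing
            obj = (lambda z, s=sigma: f_smoothed(z[0], s) if s > 0 else f(z[0]))
            x = minimize(obj, x0=[x], method="Nelder-Mead").x[0]
        print(x, f(x))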

  14. Large-scale Eucalyptus energy farms and power cogeneration

    Treesearch

    Robert C. Noroña

    1983-01-01

    A thorough evaluation of all factors possibly affecting a large-scale planting of eucalyptus is foremost in determining the cost effectiveness of the planned operation. Seven basic areas of concern must be analyzed: 1. species selection; 2. site preparation; 3. planting; 4. weed control; 5....

  15. Probabilistic Cuing in Large-Scale Environmental Search

    ERIC Educational Resources Information Center

    Smith, Alastair D.; Hood, Bruce M.; Gilchrist, Iain D.

    2010-01-01

    Finding an object in our environment is an important human ability that also represents a critical component of human foraging behavior. One type of information that aids efficient large-scale search is the likelihood of the object being in one location over another. In this study we investigated the conditions under which individuals respond to…

  16. Lessons from Large-Scale Renewable Energy Integration Studies: Preprint

    SciTech Connect

    Bird, L.; Milligan, M.

    2012-06-01

    In general, large-scale integration studies in Europe and the United States find that high penetrations of renewable generation are technically feasible with operational changes and increased access to transmission. This paper describes other key findings such as the need for fast markets, large balancing areas, system flexibility, and the use of advanced forecasting.

  17. The large scale microwave background anisotropy in decaying particle cosmology

    SciTech Connect

    Panek, M.

    1987-06-01

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay z_d < 3-5. 12 refs., 2 figs.

  18. Large-scale search for dark-matter axions

    SciTech Connect

    Kinion, D; van Bibber, K

    2000-08-30

    We review the status of two ongoing large-scale searches for axions which may constitute the dark matter of our Milky Way halo. The experiments are based on the microwave cavity technique proposed by Sikivie, and mark a "second generation" relative to the original experiments performed by the Rochester-Brookhaven-Fermilab collaboration and the University of Florida group.

  19. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  20. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  1. The Role of Plausible Values in Large-Scale Surveys

    ERIC Educational Resources Information Center

    Wu, Margaret

    2005-01-01

    In large-scale assessment programs such as NAEP, TIMSS and PISA, students' achievement data sets provided for secondary analysts contain so-called "plausible values." Plausible values are multiple imputations of the unobservable latent achievement for each student. In this article it has been shown how plausible values are used to: (1)…
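
    The standard way such plausible values are analyzed can be sketched with Rubin's combining rules for multiple imputations (a hedged illustration; the numbers below are made up, and operational analyses also involve survey weights and replication variance estimation):

        import numpy as np

        def combine_plausible_values(estimates, sampling_variances):
            """Pool M per-imputation estimates and their sampling variances (Rubin's rules)."""
            est = np.asarray(estimates, dtype=float)
            var = np.asarray(sampling_variances, dtype=float)
            m = len(est)
            pooled = est.mean()
            within = var.mean()                         # average sampling variance
            between = est.var(ddof=1)                   # variance across plausible values
            total = within + (1.0 + 1.0 / m) * between  # total error variance
            return pooled, np.sqrt(total)

        # Hypothetical: a mean achievement computed from each of 5 sets of plausible values.
        means = [502.1, 498.7, 500.9, 503.4, 499.2]
        squared_ses = [4.0, 4.2, 3.9, 4.1, 4.0]
        print(combine_plausible_values(means, squared_ses))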

  2. Large-scale silicon optical switches for optical interconnection

    NASA Astrophysics Data System (ADS)

    Qiao, Lei; Tang, Weijie; Chu, Tao

    2016-11-01

    Large-scale optical switches are in great demand for building optical interconnections in data centers and high-performance computers (HPCs). Silicon optical switches have the advantages of being compact and CMOS-process compatible, and they can easily be monolithically integrated. However, constructing silicon optical switches with large port counts is difficult. One difficulty is the non-uniformity of the switch units in large-scale silicon optical switches, which arises from fabrication error and makes it hard to find each unit's optimum operating point. In this paper, we propose a method to detect the optimum operating point in a large-scale switch with a limited number of built-in power monitors. We also propose methods for improving the unbalanced crosstalk of the cross/bar states in silicon electro-optical MZI switches and for reducing insertion losses. Our recent progress in large-scale silicon optical switches, including 64 × 64 thermo-optical and 32 × 32 electro-optical switches, will be introduced. To the best of our knowledge, both are the largest-scale silicon optical switches of their respective types. The switches were fabricated on 340-nm SOI substrates with CMOS 180-nm processes. The crosstalk of the 32 × 32 electro-optic switch was -19.2 dB to -25.1 dB, while that of the 64 × 64 thermo-optic switch was -30 dB to -48.3 dB.
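
    The crosstalk figures quoted above are simple power ratios, as the following sketch shows (illustrative numbers only):

        import math

        def crosstalk_db(p_leak_mw, p_signal_mw):
            """Crosstalk as a power ratio in dB (more negative means better isolation)."""
            return 10.0 * math.log10(p_leak_mw / p_signal_mw)

        # Powers measured at the unintended and intended output ports of one switch state.
        print(round(crosstalk_db(0.012, 1.0), 1))   # about -19.2 dB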

  3. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  4. Computational Complexity, Efficiency and Accountability in Large Scale Teleprocessing Systems.

    DTIC Science & Technology

    1980-12-01

    Complexity, Efficiency and Accountability in Large Scale Teleprocessing Systems; contract DAAG29-78-C-0036; Stanford University; John T. Gill, Martin E. Bellman. ...solve but easy to check. We have also suggested how such random tapes can be simulated by deterministically generating "pseudorandom" numbers by a

  5. Large-Scale Assessment and English Language Learners with Disabilities

    ERIC Educational Resources Information Center

    Liu, Kristin K.; Ward, Jenna M.; Thurlow, Martha L.; Christensen, Laurene L.

    2017-01-01

    This article highlights a set of principles and guidelines, developed by a diverse group of specialists in the field, for appropriately including English language learners (ELLs) with disabilities in large-scale assessments. ELLs with disabilities make up roughly 9% of the rapidly increasing ELL population nationwide. In spite of the small overall…

  6. Large-scale silviculture experiments of western Oregon and Washington.

    Treesearch

    Nathan J. Poage; Paul D. Anderson

    2007-01-01

    We review 12 large-scale silviculture experiments (LSSEs) in western Washington and Oregon with which the Pacific Northwest Research Station of the USDA Forest Service is substantially involved. We compiled and arrayed information about the LSSEs as a series of matrices in a relational database, which is included on the compact disc published with this report and...

  7. Newton Methods for Large Scale Problems in Machine Learning

    ERIC Educational Resources Information Center

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  8. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  9. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  10. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are expected. We have studied the concept of the Moon as an Earth-observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and it has the following advantages: a large observation range, variable view angles, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability, and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, large-scale ocean change, large-scale land-surface dynamic change, and solid-Earth dynamic change. For the purpose of establishing a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic Earth-science phenomena; optimization of sensor parameters and methods for Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth observation platform itself; and a fundamental scientific framework for Moon-based Earth observation.

  11. Large-scale societal changes and intentionality - an uneasy marriage.

    PubMed

    Bodor, Péter; Fokas, Nikos

    2014-08-01

    Our commentary focuses on juxtaposing the proposed science of intentional change with facts and concepts pertaining to the level of large populations or changes on a worldwide scale. Although we find a unified evolutionary theory promising, we think that long-term and large-scale, scientifically guided - that is, intentional - social change is not only impossible, but also undesirable.

  12. Large-scale screening by the automated Wassermann reaction

    PubMed Central

    Wagstaff, W.; Firth, R.; Booth, J. R.; Bowley, C. C.

    1969-01-01

    In view of the drawbacks in the use of the Kahn test for large-scale screening of blood donors, mainly those of human error through work overload and fatiguability, an attempt was made to adapt an existing automated complement-fixation technique for this purpose. This paper reports the successful results of that adaptation. PMID:5776559

  13. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  14. Cosmic strings and the large-scale structure

    NASA Technical Reports Server (NTRS)

    Stebbins, Albert

    1988-01-01

    A possible problem for cosmic string models of galaxy formation is presented. If very large voids are common and if loop fragmentation is not much more efficient than presently believed, then it may be impossible for string scenarios to produce the observed large-scale structure with Omega sub 0 = 1 and without strong environmental biasing.

  15. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  16. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  17. Large scale nonlinear programming for the optimization of spacecraft trajectories

    NASA Astrophysics Data System (ADS)

    Arrieta-Camacho, Juan Jose

    . Future research directions are identified, involving the automatic scheduling and optimization of trajectory correction maneuvers. The sensitivity information provided by the methodology is expected to be invaluable in such research pursuit. The collocation scheme and nonlinear programming algorithm presented in this work, complement other existing methodologies by providing reliable and efficient numerical methods able to handle large scale, nonlinear dynamic models.

  18. Large Scale Computing and Storage Requirements for High Energy Physics

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years. The report includes

  19. FEASIBILITY OF LARGE-SCALE OCEAN CO2 SEQUESTRATION

    SciTech Connect

    Dr. Peter Brewer; Dr. James Barry

    2002-09-30

    We have continued to carry out creative small-scale experiments in the deep ocean to investigate the science underlying questions of possible future large-scale deep-ocean CO₂ sequestration as a means of ameliorating greenhouse gas growth rates in the atmosphere. This project is closely linked to additional research funded by the DoE Office of Science, and to support from the Monterey Bay Aquarium Research Institute. The listing of project achievements here over the past year reflects these combined resources. Within the last project year we have: (1) Published a significant workshop report (58 pages) entitled "Direct Ocean Sequestration Expert's Workshop", based upon a meeting held at MBARI in 2001. The report is available both in hard copy, and on the NETL web site. (2) Carried out three major, deep ocean (3600 m) cruises to examine the physical chemistry, and biological consequences, of several liter quantities released on the ocean floor. (3) Carried out two successful short cruises in collaboration with Dr. Izuo Aya and colleagues (NMRI, Osaka, Japan) to examine the fate of cold (-55 °C) CO₂ released at relatively shallow ocean depth. (4) Carried out two short cruises in collaboration with Dr. Costas Tsouris, ORNL, to field test an injection nozzle designed to transform liquid CO₂ into a hydrate slurry at ~1000 m depth. (5) In collaboration with Prof. Jill Pasteris (Washington University) we have successfully accomplished the first field test of a deep ocean laser Raman spectrometer for probing in situ the physical chemistry of the CO₂ system. (6) Submitted the first major paper on biological impacts as determined from our field studies. (7) Submitted a paper on our measurements of the fate of a rising stream of liquid CO₂ droplets to Environmental Science & Technology. (8) Have had accepted for publication in Eos the first brief account of the laser Raman spectrometer success. (9) Have had two papers submitted for the

  20. Probabilistic voltage security for large scale power systems

    NASA Astrophysics Data System (ADS)

    Poshtan, Majid

    2000-10-01

    Stability is one of the most important problems in power system operation and control. Voltage instability is one type of power system instability that occurs when the system operates close to its limits. Progressive voltage instability, which is also referred to as Voltage Collapse, results in loss of voltage at certain nodes (buses) in the system. Voltage collapse, a slowly occurring phenomenon leading to loss of voltage at specific parts of an electric utility, has been observed in the USA, Europe, Japan, Canada, and other places in the world during the past decade. Voltage collapse typically occurs on power systems which are heavily loaded, faulted and/or have reactive power shortages. Several power system parameter changes are known to contribute to voltage collapse. The most important contributors to voltage instability are: increasing load, generators or SVC reaching reactive power limits, action of tap-changing transformers, line tripping, and generator outages. The difference between voltage collapse and classical transient instability is that in voltage collapse we focus on loads and voltage magnitudes, whereas in classical transient stability the focus is on generator dynamics and voltage angles. Also, voltage collapse often involves longer time-scale dynamics and includes the effects of continuous changes such as load increases in addition to discrete events such as line outages. Two conventional methods to analyze voltage collapse are P-V and V-Q curves, and modal analysis. Both methods are deterministic and do not account for the probability of the contingencies causing voltage collapse. The purpose of this investigation is to identify probabilistic indices to assess steady-state voltage stability by considering random failures and their dependency in a large-scale power system. The research mainly continues the previous research completed at Tulane University by Dr. J. Bian and Professor P. Rastgoufard and will complement it by
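
    As a concrete illustration of the deterministic P-V (nose) curve analysis mentioned above, the sketch below traces the high-voltage solution of a simple two-bus system until the nose point is reached. It is a generic textbook example, not the probabilistic method developed in the thesis, and all parameter values are illustrative.

        import numpy as np

        def bus_voltage(p, q, e=1.0, x=0.1):
            """High-voltage root of the two-bus power-flow relation
            V^4 + (2*q*x - e^2)*V^2 + x^2*(p^2 + q^2) = 0 (all in per unit).
            Returns None past the nose point, where no real solution exists."""
            b = 2.0 * q * x - e**2
            c = x**2 * (p**2 + q**2)
            disc = b**2 - 4.0 * c
            if disc < 0.0:
                return None          # voltage collapse: no equilibrium
            return np.sqrt((-b + np.sqrt(disc)) / 2.0)

        # Trace a P-V curve at constant power factor until the nose point.
        for p in np.arange(0.0, 6.0, 0.25):
            v = bus_voltage(p, q=0.2 * p)
            if v is None:
                print(f"nose point reached near P = {p:.2f} pu")
                break
            print(f"P = {p:.2f} pu  V = {v:.3f} pu")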

  1. Taosi Observatory

    NASA Astrophysics Data System (ADS)

    Sun, Xiaochun

    Taosi observatory is the remains of a structure discovered at the later Neolithic Taosi site located in Xiangfen County, Shanxi Province, in north-central China. The structure is a walled enclosure on a raised platform. Only rammed-earth foundations of the structure remained. Archaeoastronomical studies suggest that this structure functioned as an astronomical observatory. Historical circumstantial evidence suggests that it was probably related to the legendary kingdom of Yao from the twenty-first century BC.

  2. Wise Observatory

    NASA Astrophysics Data System (ADS)

    Murdin, P.

    2000-11-01

    Wise Observatory, in Mitzpe Ramon, Israel, is owned and operated by Tel Aviv University, and has a well-equipped 1 m telescope. Since construction in 1971, the large percentage of clear nights at its desert site and its unique longitude have made the observatory particularly useful for long-term monitoring projects (e.g. reverberation mapping of quasars and active galaxies), and as a part of glo...

  3. Inference of 3-dimensional structure underlying large-scale coronal events observed by Yohkoh and Ulysses

    NASA Technical Reports Server (NTRS)

    Slater, G. L.; Freeland, S. L.; Hoeksema, T.; Zhao, X.; Hudson, H. S.

    1995-01-01

    The Yohkoh/SXT images provide full-disk coverage of the solar corona, usually extending before and after one of the large-scale eruptive events that occur in the polar crown. These events produce large arcades of X-ray loops, often with a cusp-shaped coronal extension, and are known to be associated with coronal mass ejections. The Yohkoh prototype of such events occurred 12 Nov. 1991. This allows us to determine heights from the apparent rotation rates of these structures. In comparison with magnetic-field extrapolations from the Wilcox Solar Observatory, we use this tool to infer the three-dimensional structure of the corona in particular cases: 24 Jan. 1992, 24 Feb. 1993, 14 Apr. 1994, and 13 Nov. 1994. The last event is a long-duration flare event.

  4. First hints of large scale structures in the ultrahigh energy sky?

    SciTech Connect

    Cuoco, A.; Miele, G.; Serpico, Pasquale D.; /Fermilab

    2006-10-01

    The recent report [1] of a broad maximum around 25 degrees in the two-point autocorrelation function of ultra-high energy cosmic ray arrival directions has been intriguingly interpreted as the first imprint of the large scale structures (LSS) of baryonic matter in the nearby universe. We analyze this suggestion in light of the clustering properties expected from the PSCz astronomical catalogue of LSS. The chance probability of the signal is consistent within 2σ with the predictions based on the catalogue. No evidence is found for a significant cross-correlation of the observed events with known overdensities in the LSS, which may be due to the role of galactic and extragalactic magnetic fields and is in any case consistent with the limited statistics. The larger statistics to be collected by the Pierre Auger Observatory are needed to definitively answer the question.

  5. Structure and evolution of the large scale solar and heliospheric magnetic fields. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Hoeksema, J. T.

    1984-01-01

    Structure and evolution of large scale photospheric and coronal magnetic fields in the interval 1976-1983 were studied using observations from the Stanford Solar Observatory and a potential field model. The solar wind in the heliosphere is organized into large regions in which the magnetic field has a component either toward or away from the sun. The model predicts the location of the current sheet separating these regions. Near solar minimum, in 1976, the current sheet lay within a few degrees of the solar equator having two extensions north and south of the equator. Soon after minimum the latitudinal extent began to increase. The sheet reached to at least 50 deg from 1978 through 1983. The complex structure near maximum occasionally included multiple current sheets. Large scale structures persist for up to two years during the entire interval. To minimize errors in determining the structure of the heliospheric field particular attention was paid to decreasing the distorting effects of rapid field evolution, finding the optimum source surface radius, determining the correction to the sun's polar field, and handling missing data. The predicted structure agrees with direct interplanetary field measurements taken near the ecliptic and with coronameter and interplanetary scintillation measurements which infer the three dimensional interplanetary magnetic structure. During most of the solar cycle the heliospheric field cannot be adequately described as a dipole.

  6. Electron drift in a large scale solid xenon

    SciTech Connect

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phase of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. Furthermore, the electron drift speed in large-scale solid xenon is demonstrated to be a factor of two faster than that in the liquid phase.

  7. Electron drift in a large scale solid xenon

    DOE PAGES

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phase of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. Furthermore, the electron drift speed in large-scale solid xenon is demonstrated to be a factor of two faster than that in the liquid phase.
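
    As a quick arithmetic check of the figures reported above, the time for an electron to cross the 8.0 cm uniform-field region follows directly from distance divided by drift speed (an illustrative calculation only, not part of the original analysis):

        # Drift times over the 8.0 cm uniform-field region at the reported speeds.
        drift_length_cm = 8.0
        speeds_cm_per_us = {"liquid Xe (163 K)": 0.193, "solid Xe (157 K)": 0.397}
        for phase, v in speeds_cm_per_us.items():
            print(f"{phase}: {drift_length_cm / v:.1f} us to drift {drift_length_cm} cm")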

  8. Large scale meteorological influence during the Geysers 1979 field experiment

    SciTech Connect

    Barr, S.

    1980-01-01

    A series of meteorological field measurements conducted during July 1979 near Cobb Mountain in Northern California reveals evidence of several scales of atmospheric circulation consistent with the climatic pattern of the area. The scales of influence are reflected in the structure of wind and temperature in vertically stratified layers at a given observation site. Large scale synoptic gradient flow dominates the wind field above about twice the height of the topographic ridge. Below that there is a mixture of effects with evidence of a diurnal sea breeze influence and a sublayer of katabatic winds. The July observations demonstrate that weak migratory circulations in the large scale synoptic meteorological pattern have a significant influence on the day-to-day gradient winds and must be accounted for in planning meteorological programs including tracer experiments.

  9. The Large Scale Synthesis of Aligned Plate Nanostructures

    PubMed Central

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-01-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ′ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential. PMID:27439672

  10. Lagrangian space consistency relation for large scale structure

    SciTech Connect

    Horn, Bart; Hui, Lam; Xiao, Xiao E-mail: lh399@columbia.edu

    2015-09-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.

  11. The workshop on iterative methods for large scale nonlinear problems

    SciTech Connect

    Walker, H.F.; Pernice, M.

    1995-12-01

    The aim of the workshop was to bring together researchers working on large scale applications with numerical specialists of various kinds. Applications that were addressed included reactive flows (combustion and other chemically reacting flows, tokamak modeling), porous media flows, cardiac modeling, chemical vapor deposition, image restoration, macromolecular modeling, and population dynamics. Numerical areas included Newton iterative (truncated Newton) methods, Krylov subspace methods, domain decomposition and other preconditioning methods, large scale optimization and optimal control, and parallel implementations and software. This report offers a brief summary of workshop activities and information about the participants. Interested readers are encouraged to look into the online proceedings available at http://www.usi.utah.edu/logan.proceedings. There, the material offered here is augmented with hypertext abstracts that include links to locations such as speakers' home pages, PostScript copies of talks and papers, cross-references to related talks, and other information about topics addressed at the workshop.

  12. Large Scale Deformation of the Western US Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2001-01-01

    Destructive earthquakes occur throughout the western US Cordillera (WUSC), not just within the San Andreas fault zone. But because we do not understand the present-day large-scale deformations of the crust throughout the WUSC, our ability to assess the potential for seismic hazards in this region remains severely limited. To address this problem, we are using a large collection of Global Positioning System (GPS) networks which spans the WUSC to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work can roughly be divided into an analysis of the GPS observations to infer the deformation field across and within the entire plate boundary zone and an investigation of the implications of this deformation field regarding plate boundary dynamics.

  13. Large-scale linear nonparallel support vector machine solver.

    PubMed

    Tian, Yingjie; Ping, Yuan

    2014-02-01

    Twin support vector machines (TWSVMs), as the representative nonparallel hyperplane classifiers, have shown their effectiveness over standard SVMs in some respects. However, they still have serious defects restricting their further study and real applications: (1) they have to compute and store the inverse matrices before training, which is intractable for many applications where data appear with a huge number of instances as well as features; (2) TWSVMs lose sparseness by using a quadratic loss function to make the proximal hyperplane close enough to the class itself. This paper proposes a Sparse Linear Nonparallel Support Vector Machine, termed L1-NPSVM, to deal with large-scale data, based on an efficient solver, the dual coordinate descent (DCD) method. Both theoretical analysis and experiments indicate that our method is not only suitable for large scale problems, but also performs as well as TWSVMs and SVMs.
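
    The dual coordinate descent (DCD) solver mentioned above can be sketched for an ordinary L1-loss (hinge) linear SVM; this is a minimal illustration of the DCD idea rather than the L1-NPSVM algorithm itself, and all variable names are ours.

        import numpy as np

        def dcd_linear_svm(X, y, C=1.0, epochs=20):
            """Dual coordinate descent for a linear SVM with hinge (L1) loss.
            X: (n, d) data, y: labels in {-1, +1}. Returns the weight vector w."""
            n, d = X.shape
            alpha = np.zeros(n)
            w = np.zeros(d)
            qii = np.einsum("ij,ij->i", X, X)      # diagonal of the Gram matrix
            for _ in range(epochs):
                for i in np.random.permutation(n):
                    if qii[i] == 0.0:
                        continue
                    g = y[i] * w.dot(X[i]) - 1.0   # partial gradient of the dual
                    a_new = min(max(alpha[i] - g / qii[i], 0.0), C)
                    w += (a_new - alpha[i]) * y[i] * X[i]
                    alpha[i] = a_new
            return w

        # Toy usage on two Gaussian blobs.
        rng = np.random.default_rng(0)
        X = np.vstack([rng.normal(-1, 1, (200, 5)), rng.normal(+1, 1, (200, 5))])
        y = np.hstack([-np.ones(200), np.ones(200)])
        w = dcd_linear_svm(X, y)
        print("training accuracy:", np.mean(np.sign(X @ w) == y))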

  14. Instrumentation Development for Large Scale Hypersonic Inflatable Aerodynamic Decelerator Characterization

    NASA Technical Reports Server (NTRS)

    Swanson, Gregory T.; Cassell, Alan M.

    2011-01-01

    Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with a sub-orbital flight of a 3.0 m diameter vehicle. To further this technology, large scale HIADs (6.0-8.5 m) must be developed and tested. To characterize the performance of large scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts that are under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.

  15. Long gradient mode and large-scale structure observables

    NASA Astrophysics Data System (ADS)

    Allahyari, Alireza; Firouzjaee, Javad T.

    2017-03-01

    We extend the study of long-mode perturbations to other large-scale observables such as cosmic rulers, galaxy-number counts, and halo bias. The long mode is a pure gradient mode that is still outside an observer's horizon. We insist that gradient-mode effects on observables vanish. It is also crucial that the expressions for observables are relativistic. This allows us to show that the effects of a gradient mode on the large-scale observables vanish identically in a relativistic framework. To study the potential modulation effect of the gradient mode on halo bias, we derive a consistency condition to the first order in gradient expansion. We find that the matter variance at a fixed physical scale is not modulated by the long gradient mode perturbations when the consistency condition holds. This shows that the contribution of long gradient modes to bias vanishes in this framework.

  16. LARGE SCALE PURIFICATION OF PROTEINASES FROM CLOSTRIDIUM HISTOLYTICUM FILTRATES

    PubMed Central

    Conklin, David A.; Webster, Marion E.; Altieri, Patricia L.; Berman, Sanford; Lowenthal, Joseph P.; Gochenour, Raymond B.

    1961-01-01

    Conklin, David A. (Walter Reed Army Institute of Research, Washington, D. C.), Marion E. Webster, Patricia L. Altieri, Sanford Berman, Joseph P. Lowenthal, and Raymond B. Gochenour. Large scale purification of proteinases from Clostridium histolyticum filtrates. J. Bacteriol. 82:589–594. 1961.—A method for the large scale preparation and partial purification of Clostridium histolyticum proteinases by fractional precipitation with ammonium sulfate is described. Conditions for adequate separation and purification of the δ-proteinase and the gelatinase were obtained. Collagenase, on the other hand, was found distributed in four to five fractions and little increase in purity was achieved as compared to the crude ammonium sulfate precipitates. PMID:13880849

  17. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    SciTech Connect

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
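
    To illustrate the prototype-based low-rank kernel approximation the abstract refers to, the sketch below uses a Nystrom-style approximation built from m randomly chosen landmark ("prototype") points. It is a generic illustration of that idea under our own assumptions, not the PVM algorithm itself.

        import numpy as np

        def rbf_kernel(A, B, gamma=0.5):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        def nystrom_approx(X, m=50, gamma=0.5, seed=0):
            """Low-rank approximation K ~ K_nm K_mm^+ K_mn from m prototypes."""
            rng = np.random.default_rng(seed)
            prototypes = X[rng.choice(len(X), size=m, replace=False)]
            K_nm = rbf_kernel(X, prototypes, gamma)
            K_mm = rbf_kernel(prototypes, prototypes, gamma)
            return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

        X = np.random.default_rng(1).normal(size=(500, 10))
        K = rbf_kernel(X, X)
        K_hat = nystrom_approx(X, m=50)
        print("relative Frobenius error:", np.linalg.norm(K - K_hat) / np.linalg.norm(K))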

  18. The Large Scale Synthesis of Aligned Plate Nanostructures

    NASA Astrophysics Data System (ADS)

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-07-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ‧ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential.

  19. Comparative study of large-scale nonlinear optimization methods

    SciTech Connect

    Alemzadeh, S.A.

    1987-01-01

    Solving large-scale nonlinear optimization problems has been one of the active research areas for the last twenty years. Several heuristic algorithms with codes have been developed and implemented since 1966. This study explores the motivation and basic mathematical ideas leading to the development of the MINOS-1.0, GRG-2, and MINOS-5.0 algorithms and their codes. The reliability, accuracy, and complexity of the algorithms and software depend upon their use of the gradient, the Jacobian, and the Hessian. MINOS-1.0 and GRG-2 incorporate all of the input and output features, but MINOS-1.0 is not able to handle nonlinearly constrained problems and GRG-2 is not able to handle large-scale problems; MINOS-5.0 is robust and efficient software that incorporates all input and output features.

  20. LARGE-SCALE MOTIONS IN THE PERSEUS GALAXY CLUSTER

    SciTech Connect

    Simionescu, A.; Werner, N.; Urban, O.; Allen, S. W.; Fabian, A. C.; Sanders, J. S.; Mantz, A.; Nulsen, P. E. J.; Takei, Y.

    2012-10-01

    By combining large-scale mosaics of ROSAT PSPC, XMM-Newton, and Suzaku X-ray observations, we present evidence for large-scale motions in the intracluster medium of the nearby, X-ray bright Perseus Cluster. These motions are suggested by several alternating and interleaved X-ray bright, low-temperature, low-entropy arcs located along the east-west axis, at radii ranging from ~10 kpc to over a Mpc. Thermodynamic features qualitatively similar to these have previously been observed in the centers of cool-core clusters, and were successfully modeled as a consequence of the gas sloshing/swirling motions induced by minor mergers. Our observations indicate that such sloshing/swirling can extend out to larger radii than previously thought, on scales approaching the virial radius.

  1. The CLASSgal code for relativistic cosmological large scale structure

    SciTech Connect

    Dio, Enea Di; Montanari, Francesco; Durrer, Ruth; Lesgourgues, Julien E-mail: Francesco.Montanari@unige.ch E-mail: Ruth.Durrer@unige.ch

    2013-11-01

    We present accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum C_ℓ(z_1, z_2) and the corresponding correlation function ξ(θ, z_1, z_2) of the matter density and the galaxy number fluctuations in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
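
    The correlation function quoted above follows from the angular power spectrum through the standard Legendre sum ξ(θ, z_1, z_2) = Σ_ℓ (2ℓ+1)/(4π) C_ℓ(z_1, z_2) P_ℓ(cos θ). The sketch below performs that conversion for an illustrative, made-up spectrum; it does not call the CLASSgal code itself.

        import numpy as np
        from scipy.special import eval_legendre

        def xi_from_cl(cl, theta_rad):
            """xi(theta) = sum_l (2l+1)/(4*pi) * C_l * P_l(cos theta)."""
            x = np.cos(theta_rad)
            return sum((2 * l + 1) / (4 * np.pi) * cl[l] * eval_legendre(l, x)
                       for l in range(len(cl)))

        # Illustrative spectrum (not CLASSgal output).
        ells = np.arange(2, 500)
        cl = np.zeros(500)
        cl[2:] = 1e-5 / (ells * (ells + 1.0))
        for theta_deg in (0.5, 1.0, 5.0):
            print(theta_deg, "deg ->", xi_from_cl(cl, np.radians(theta_deg)))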

  2. Transcriptome characterization and SSR discovery in large-scale loach Paramisgurnus dabryanus (Cobitidae, Cypriniformes).

    PubMed

    Li, Caijuan; Ling, Qufei; Ge, Chen; Ye, Zhuqing; Han, Xiaofei

    2015-02-25

    The large-scale loach (Paramisgurnus dabryanus, Cypriniformes) is a bottom-dwelling freshwater species of fish found mainly in eastern Asia. The natural germplasm resources of this important aquaculture species have recently been threatened due to overfishing and artificial propagation. The objective of this study is to obtain the first functional genomic resource and candidate molecular markers for future conservation and breeding research. Illumina paired-end sequencing generated over one hundred million reads that resulted in 71,887 assembled transcripts, with an average length of 1,465 bp. 42,093 (58.56%) protein-coding sequences were predicted, and 43,837 transcripts had significant matches to the NCBI nonredundant protein (Nr) database. 29,389 and 14,419 transcripts were assigned into gene ontology (GO) categories and Eukaryotic Orthologous Groups (KOG), respectively. 22,102 (31.14%) transcripts were mapped to 302 KEGG pathways. In addition, 15,106 candidate SSR markers were identified, with 11,037 pairs of PCR primers designed. Of 400 randomly selected SSR primer pairs that were validated, 364 (91%) produced PCR products. A further test with 41 loci and 20 large-scale loach specimens collected from the four largest lakes in China showed that 36 (87.8%) loci were polymorphic. The transcriptomic profile and SSR repertoire obtained in this study will facilitate population genetic studies and selective breeding of large-scale loach in the future.
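
    Candidate SSR (microsatellite) loci of the kind counted above can be located with a simple regular-expression scan for short tandem repeats; the snippet below is a generic sketch, not the pipeline used in the study.

        import re

        def find_ssrs(seq, min_repeats=5):
            """Return (start, motif, repeat_count) for perfect tandem repeats
            of 2-6 bp motifs repeated at least min_repeats times."""
            pattern = re.compile(r"(([ACGT]{2,6}?)\2{%d,})" % (min_repeats - 1))
            return [(m.start(), m.group(2), len(m.group(1)) // len(m.group(2)))
                    for m in pattern.finditer(seq.upper())]

        print(find_ssrs("GGACACACACACACGGTTAGTAGTAGTAGTAGTAC"))
        # e.g. [(2, 'AC', 6), (17, 'TAG', 5)]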

  3. Large-Scale Optimization for Bayesian Inference in Complex Systems

    SciTech Connect

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their

  4. Turbulent amplification of large-scale magnetic fields

    NASA Technical Reports Server (NTRS)

    Montgomery, D.; Chen, H.

    1984-01-01

    Previously-introduced methods for analytically estimating the effects of small-scale turbulent fluctuations on large-scale dynamics are extended to fully three-dimensional magnetohydrodynamics. The problem becomes algebraically tractable in the presence of sufficiently large spectral gaps. The calculation generalizes 'alpha dynamo' calculations, except that the velocity fluctuations and magnetic fluctuations are treated on an independent and equal footing. Earlier expressions for the 'alpha coefficients' of turbulent magnetic field amplification are recovered as a special case.

  5. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07


  6. Host Immunity via Mutable Virtualized Large-Scale Network Containers

    DTIC Science & Technology

    2016-07-25

    migrate to different IP addresses multiple times. We implement a virtual machine based system prototype and evaluate it using state-of-the-art scanning... entire IPv4 address space within 45 minutes from a single machine. Second, when... that the attacker will be trapped into one decoy instead of the real server. We implement a virtual machine (VM)-based prototype that integrates

  7. Developing and Understanding Methods for Large Scale Nonlinear Optimization

    DTIC Science & Technology

    2001-12-01

    development of new algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with... "analysis of tensor and SQP methods for singular constrained optimization", to appear in SIAM Journal on Optimization.

  8. Wiggly cosmic strings, neutrinos and large-scale structure

    NASA Astrophysics Data System (ADS)

    Vachaspati, Tanmay

    1993-04-01

    We discuss the cosmic string scenario of large-scale structure formation in light of the result that the strings are not smooth but instead have a lot of sub-structure or wiggles on them. It appears from the results of Albrecht and Stebbins that the scenario works best if the universe is dominated by massive neutrinos or some other form of hot dark matter. Some unique features of the scenario, such as the generation of primordial magnetic fields, are also described.

  9. Analysis plan for 1985 large-scale tests. Technical report

    SciTech Connect

    McMullan, F.W.

    1983-01-01

    The purpose of this effort is to assist DNA in planning for large-scale (upwards of 5000 tons) detonations of conventional explosives in the 1985 and beyond time frame. Primary research objectives were to investigate potential means to increase blast duration and peak pressures. This report identifies and analyzes several candidate explosives. It examines several charge designs and identifies advantages and disadvantages of each. Other factors including terrain and multiburst techniques are addressed as are test site considerations.

  10. Critical Problems in Very Large Scale Computer Systems

    DTIC Science & Technology

    1990-03-31

    Semiannual Technical Report for the Period October 1, 1989 to... suitability for supporting popular models of parallel computation. During the reporting period they have developed an interface definition. A simulator has... queries in computational geometry. Range queries are a fundamental problem in computational geometry with applications to computer graphics and

  11. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  12. Large Scale Airflow Perturbations and Resultant Dune Dynamics

    NASA Astrophysics Data System (ADS)

    Smith, Alexander B.; Jackson, Derek W. T.; Cooper, J. Andrew G.; Beyers, Meiring

    2017-04-01

    Large-scale atmospheric turbulence can have a large impact on the regional wind regime affecting dune environments. Depending on the incident angle of mesoscale airflow, local topographic steering can also alter wind conditions and subsequent aeolian dynamics. This research analyses the influence of large-scale airflow perturbations occurring at the Maspalomas dunefield located on the southern coast of Gran Canaria, Spain. These perturbations in turn significantly influence the morphometry and migration rates of barchan dunes, monitored at the study site through time. The main meteorological station on Gran Canaria records highly uni-modal NNE wind conditions; however, simultaneously measured winds are highly variable around the island, showing a high degree of steering. Large Eddy Simulations (LES) were used to identify large-scale airflow perturbations around the island of Gran Canaria during NNE, N, and NNW incident flow directions. Results indicate that approaching surface airflow bifurcates around the island's coastline before converging at the lee coast. Winds in areas located around the island's lateral coast are controlled by these diverging flow patterns, whereas lee-side areas are influenced primarily by the island's upwind canyon topography leading to highly turbulent flow. Characteristic turbulent eddies show a complex wind environment at Maspalomas with winds diverging-converging up to 180° between the eastern and western sections of the dunefield. Multi-directional flow conditions lead to highly altered dune dynamics including the production of temporary slip faces on the stoss slopes, rapid reduction in crest height and slope length, and development of bi-crested dunes. This indicates a distinct bi-modality of airflow conditions that control the geomorphic evolution of the dunefield. Variability in wind conditions is not evident in the long-term meteorological records on the island, indicating the significance of large scale atmospheric steering on

  13. A Holistic Management Architecture for Large-Scale Adaptive Networks

    DTIC Science & Technology

    2007-09-01


  14. A Cloud Computing Platform for Large-Scale Forensic Computing

    NASA Astrophysics Data System (ADS)

    Roussev, Vassil; Wang, Liqiang; Richard, Golden; Marziale, Lodovico

    The timely processing of massive digital forensic collections demands the use of large-scale distributed computing resources and the flexibility to customize the processing performed on the collections. This paper describes MPI MapReduce (MMR), an open implementation of the MapReduce processing model that outperforms traditional forensic computing techniques. MMR provides linear scaling for CPU-intensive processing and super-linear scaling for indexing-related workloads.
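
    For readers unfamiliar with the MapReduce processing model that MMR implements, the toy single-process sketch below shows the map and reduce phases for a word count; it is a generic illustration, not the MMR code.

        from collections import defaultdict

        def map_phase(doc):
            # Emit (key, value) pairs for each word in a document.
            return [(word, 1) for word in doc.split()]

        def reduce_phase(pairs):
            # Combine all values emitted for the same key.
            counts = defaultdict(int)
            for key, value in pairs:
                counts[key] += value
            return dict(counts)

        docs = ["large scale forensic data", "large scale cloud computing"]
        pairs = [kv for d in docs for kv in map_phase(d)]
        print(reduce_phase(pairs))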

  15. Large-Scale Weather Disturbances in Mars’ Southern Extratropics

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2015-11-01

    Between late autumn and early spring, Mars’ middle and high latitudes within its atmosphere support strong mean thermal gradients between the tropics and poles. Observations from both the Mars Global Surveyor (MGS) and Mars Reconnaissance Orbiter (MRO) indicate that this strong baroclinicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). These extratropical weather disturbances are key components of the global circulation. Such wave-like disturbances act as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of large-scale, traveling extratropical synoptic-period disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to their northern-hemisphere counterparts, southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are examined. Simulations that adapt Mars’ full topography compared to simulations that utilize synthetic topographies emulating key large-scale features of the southern middle latitudes indicate that Mars’ transient barotropic/baroclinic eddies are highly influenced by the great impact basins of this hemisphere (e.g., Argyre and Hellas). The occurrence of a southern storm zone in late winter and early spring appears to be anchored to the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre

  16. The large-scale anisotropy with the PAMELA calorimeter

    NASA Astrophysics Data System (ADS)

    Karelin, A.; Adriani, O.; Barbarino, G.; Bazilevskaya, G.; Bellotti, R.; Boezio, M.; Bogomolov, E.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bruno, A.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; De Donato, C.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A.; Koldashov, S.; Koldobskiy, S.; Krut'kov, S.; Kvashnin, A.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A.; Menn, W.; Mergé, M.; Mikhailov, V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S.; Sarkar, R.; Simon, M.; Scotti, V.; Sparvoli, R.; Spillantini, P.; Stozhkov, Y.; Vacchi, A.; Vannuccini, E.; Vasilyev, G.; Voronov, S.; Yurkin, Y.; Zampa, G.; Zampa, N.

    2015-10-01

    The large-scale anisotropy (or the so-called star-diurnal wave) has been studied using the calorimeter of the space-borne experiment PAMELA. The cosmic ray anisotropy has been obtained for the Southern and Northern hemispheres simultaneously in the equatorial coordinate system for the time period 2006-2014. The dipole amplitude and phase have been measured for energies 1-20 TeV n⁻¹.
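
    A first-harmonic (Rayleigh) analysis in right ascension is the standard way to extract a dipole amplitude and phase from a set of arrival directions; the sketch below is a generic illustration of that formalism, not the PAMELA analysis chain.

        import numpy as np

        def first_harmonic(ra_rad):
            """Rayleigh first-harmonic amplitude and phase in right ascension."""
            n = len(ra_rad)
            a = (2.0 / n) * np.sum(np.cos(ra_rad))
            b = (2.0 / n) * np.sum(np.sin(ra_rad))
            return np.hypot(a, b), np.degrees(np.arctan2(b, a) % (2 * np.pi))

        # Isotropic toy sample: expected amplitude ~ sqrt(pi/N), random phase.
        ra = np.random.default_rng(2).uniform(0, 2 * np.pi, 100000)
        print(first_harmonic(ra))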

  17. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  18. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    DTIC Science & Technology

    2014-09-30

    ...estimating blue and fin whale density that is effective over large spatial scales and is designed to cope with spatial variation in animal density utilizing... a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse hydrophone arrays in the

  19. Relic vector field and CMB large scale anomalies

    SciTech Connect

    Chen, Xingang; Wang, Yi E-mail: yw366@cam.ac.uk

    2014-10-01

    We study the most general effects of relic vector fields on the inflationary background and density perturbations. Such effects are observable if the number of inflationary e-folds is close to the minimum requirement to solve the horizon problem. We show that this can potentially explain two CMB large scale anomalies: the quadrupole-octopole alignment and the quadrupole power suppression. We discuss its effect on the parity anomaly. We also provide analytical template for more detailed data comparison.

  20. On a Game of Large-Scale Projects Competition

    NASA Astrophysics Data System (ADS)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision making situations arising in realization of large-scale projects, such as designing and putting into operations the new gas or oil pipelines. A non-cooperative two player game is considered with payoff functions of special type for which standard existence theorems and algorithms for searching Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].

  1. Measuring large scale space perception in literary texts

    NASA Astrophysics Data System (ADS)

    Rossi, Paolo

    2007-07-01

    A center and radius of “perception” (in the sense of environmental cognition) can be formally associated with a written text and operationally defined. Simple algorithms for their computation are presented, and indicators for anisotropy in large scale space perception are introduced. The relevance of these notions for the analysis of literary and historical records is briefly discussed and illustrated with an example taken from medieval historiography.
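
    The abstract does not spell the algorithms out, but a plausible minimal version, assuming the places mentioned in a text have already been geocoded to planar coordinates, takes the centroid as the center of perception, an RMS distance as its radius, and an anisotropy indicator from the coordinate covariance; the definitions below are our own illustration and may differ from the paper's.

        import numpy as np

        def perception_stats(points):
            """Center, RMS radius and anisotropy ratio for (x, y) place coordinates."""
            pts = np.asarray(points, dtype=float)
            center = pts.mean(axis=0)
            radius = np.sqrt(((pts - center) ** 2).sum(axis=1).mean())
            eigvals = np.linalg.eigvalsh(np.cov((pts - center).T))
            anisotropy = np.sqrt(eigvals.max() / eigvals.min())   # 1 = isotropic
            return center, radius, anisotropy

        places = [(12.5, 41.9), (11.3, 43.8), (9.2, 45.5), (14.3, 40.8), (10.3, 43.7)]
        print(perception_stats(places))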

  2. Semantic Concept Discovery for Large Scale Zero Shot Event Detection

    DTIC Science & Technology

    2015-07-25


  3. Large-scale Alfvén vortices

    SciTech Connect

    Onishchenko, O. G.; Horton, W.; Scullion, E.; Fedun, V.

    2015-12-15

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structure of the toroidal and radial velocity, the fluid and magnetic field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field are discussed.

  4. The Phoenix series large scale LNG pool fire experiments.

    SciTech Connect

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about potential hazards to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards from a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.
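
    The flame-height-to-diameter scaling mentioned above is commonly expressed through a nondimensional heat release rate; the sketch below uses standard textbook relations (the Q* definition and Heskestad's flame-height correlation), not the Sandia test data, and the assumed heat release rate is purely illustrative.

        import numpy as np

        def q_star(Q_kw, D_m, rho=1.2, cp=1.0, T=293.0, g=9.81):
            """Nondimensional heat release rate Q* = Q / (rho*cp*T*sqrt(g*D)*D^2),
            with Q in kW and cp in kJ/(kg K)."""
            return Q_kw / (rho * cp * T * np.sqrt(g * D_m) * D_m**2)

        def heskestad_flame_height(Q_kw, D_m):
            """Heskestad correlation: L = 0.235*Q^(2/5) - 1.02*D (L, D in m; Q in kW)."""
            return 0.235 * Q_kw**0.4 - 1.02 * D_m

        D = 21.0      # pool diameter in m, matching the smaller test above
        Q = 2.0e6     # assumed total heat release rate in kW (illustrative only)
        print("Q* =", q_star(Q, D), " L/D =", heskestad_flame_height(Q, D) / D)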

  5. Large-scale quantization from local correlations in space plasmas

    NASA Astrophysics Data System (ADS)

    Livadiotis, George; McComas, David J.

    2014-05-01

    This study examines the large-scale quantization that can characterize the phase space of certain physical systems. Plasmas are such systems where large-scale quantization, ħ*, is caused by Debye shielding that structures correlations between particles. The value of ħ* is constant—some 12 orders of magnitude larger than the Planck constant—across a wide range of space plasmas, from the solar wind in the inner heliosphere to the distant plasma in the inner heliosheath and the local interstellar medium. This paper develops the foundation and advances the understanding of the concept of plasma quantization; in particular, we (i) show the analogy of plasma to Planck quantization, (ii) show the key points of plasma quantization, (iii) construct some basic quantum mechanical concepts for the large-scale plasma quantization, (iv) investigate the correlation between plasma parameters that implies plasma quantization, when it is approximated by a relation between the magnetosonic energy and the plasma frequency, (v) analyze typical space plasmas throughout the heliosphere and show the constancy of plasma quantization over many orders of magnitude in plasma parameters, (vi) analyze Advanced Composition Explorer (ACE) solar wind measurements to develop another measurement of the value of ħ*, and (vii) apply plasma quantization to derive unknown plasma parameters when some key observable is missing.

  6. Large-scale investigation of genomic markers for severe periodontitis.

    PubMed

    Suzuki, Asami; Ji, Guijin; Numabe, Yukihiro; Ishii, Keisuke; Muramatsu, Masaaki; Kamoi, Kyuichi

    2004-09-01

    The purpose of the present study was to investigate the genomic markers for periodontitis, using large-scale single-nucleotide polymorphism (SNP) association studies comparing healthy volunteers and patients with periodontitis. Genomic DNA was obtained from 19 healthy volunteers and 22 patients with severe periodontitis, all of whom were Japanese. The subjects were genotyped at 637 SNPs in 244 genes on a large scale, using the TaqMan polymerase chain reaction (PCR) system. Statistically significant differences in allele and genotype frequencies were analyzed with Fisher's exact test. We found statistically significant differences (P < 0.01) between the healthy volunteers and patients with severe periodontitis in the following genes; gonadotropin-releasing hormone 1 (GNRH1), phosphatidylinositol 3-kinase regulatory 1 (PIK3R1), dipeptidylpeptidase 4 (DPP4), fibrinogen-like 2 (FGL2), and calcitonin receptor (CALCR). These results suggest that SNPs in the GNRH1, PIK3R1, DPP4, FGL2, and CALCR genes are genomic markers for severe periodontitis. Our findings indicate the necessity of analyzing SNPs in genes on a large scale (i.e., genome-wide approach), to identify genomic markers for periodontitis.
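
    The per-SNP comparison described above amounts to a Fisher's exact test on a 2x2 count table; a minimal sketch with made-up allele counts (not the study's data) is:

        from scipy.stats import fisher_exact

        # Hypothetical 2x2 allele-count table for one SNP:
        # rows = patients / controls, columns = allele A / allele B.
        table = [[30, 14],   # severe periodontitis (22 subjects -> 44 alleles)
                 [15, 23]]   # healthy volunteers   (19 subjects -> 38 alleles)
        odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.4f}")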

  7. Geospatial Optimization of Siting Large-Scale Solar Projects

    SciTech Connect

    Macknick, J.; Quinby, T.; Caulfield, E.; Gerritsen, M.; Diffendorfer, J.; Haines, S.

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
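
    A simple way to combine several user-weighted criteria on a raster grid, in the spirit of the tool described above, is a weighted overlay of normalized criterion layers with an exclusion mask; the sketch below is our own generic illustration, not the tool's implementation.

        import numpy as np

        def weighted_overlay(layers, weights, mask=None):
            """Combine normalized criterion rasters (0-1, higher = better) into one
            suitability score; cells excluded by the mask are scored 0."""
            weights = np.asarray(weights, dtype=float)
            weights /= weights.sum()
            score = sum(w * layer for w, layer in zip(weights, layers))
            return np.where(mask, score, 0.0) if mask is not None else score

        rng = np.random.default_rng(3)
        solar_resource = rng.random((100, 100))        # illustrative layers
        slope_suitability = rng.random((100, 100))
        grid_proximity = rng.random((100, 100))
        developable = rng.random((100, 100)) > 0.1     # True where allowed
        score = weighted_overlay([solar_resource, slope_suitability, grid_proximity],
                                 weights=[0.5, 0.2, 0.3], mask=developable)
        print("best cell:", np.unravel_index(score.argmax(), score.shape))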

  8. Large-scale data mining pilot project in human genome

    SciTech Connect

    Musick, R.; Fidelis, R.; Slezak, T.

    1997-05-01

    This whitepaper briefly describes a new, aggressive effort in large-scale data mining at Livermore National Labs. The implications of 'large-scale' will be clarified in a later section. In the short term, this effort will focus on several mission-critical questions of the Genome project. We will adapt current data mining techniques to the Genome domain, quantify the accuracy of inference results, and lay the groundwork for a more extensive effort in large-scale data mining. A major aspect of the approach is a fully-staffed data warehousing effort in the human Genome area. The long-term goal is a strong applications-oriented research program in large-scale data mining. The tools and skill set gained will be directly applicable to a wide spectrum of tasks involving large spatial and multidimensional data. This includes applications in ensuring non-proliferation, stockpile stewardship, enabling Global Ecology (Materials Database Industrial Ecology), advancing the Biosciences (Human Genome Project), and supporting data for others (Battlefield Management, Health Care).

  9. A model of plasma heating by large-scale flow

    NASA Astrophysics Data System (ADS)

    Pongkitiwanichakul, P.; Cattaneo, F.; Boldyrev, S.; Mason, J.; Perez, J. C.

    2015-12-01

    In this work, we study the process of energy dissipation triggered by a slow large-scale motion of a magnetized conducting fluid. Our consideration is motivated by the problem of heating the solar corona, which is believed to be governed by fast reconnection events set off by the slow motion of magnetic field lines anchored in the photospheric plasma. To elucidate the physics governing the disruption of the imposed laminar motion and the energy transfer to small scales, we propose a simplified model where the large-scale motion of magnetic field lines is prescribed not at the footpoints but rather imposed volumetrically. As a result, the problem can be treated numerically with an efficient, highly accurate spectral method, allowing us to use a resolution and statistical ensemble exceeding those of the previous work. We find that, even though the large-scale deformations are slow, they eventually lead to reconnection events that drive a turbulent state at smaller scales. The small-scale turbulence displays many of the universal features of field-guided magnetohydrodynamic turbulence like a well-developed inertial range spectrum. Based on these observations, we construct a phenomenological model that gives the scalings of the amplitude of the fluctuations and the energy-dissipation rate as functions of the input parameters. We find good agreement between the numerical results and the predictions of the model.

  10. Large-scale biodiversity patterns in freshwater phytoplankton.

    PubMed

    Stomp, Maayke; Huisman, Jef; Mittelbach, Gary G; Litchman, Elena; Klausmeier, Christopher A

    2011-11-01

    Our planet shows striking gradients in the species richness of plants and animals, from high biodiversity in the tropics to low biodiversity in polar and high-mountain regions. Recently, similar patterns have been described for some groups of microorganisms, but the large-scale biogeographical distribution of freshwater phytoplankton diversity is still largely unknown. We examined the species diversity of freshwater phytoplankton sampled from 540 lakes and reservoirs distributed across the continental United States and found strong latitudinal, longitudinal, and altitudinal gradients in phytoplankton biodiversity, demonstrating that microorganisms can show substantial geographic variation in biodiversity. Detailed analysis using structural equation models indicated that these large-scale biodiversity gradients in freshwater phytoplankton diversity were mainly driven by local environmental factors, although there were residual direct effects of latitude, longitude, and altitude as well. Specifically, we found that phytoplankton species richness was an increasing saturating function of lake chlorophyll a concentration, increased with lake surface area and possibly increased with water temperature, resembling effects of productivity, habitat area, and temperature on diversity patterns commonly observed for macroorganisms. In turn, these local environmental factors varied along latitudinal, longitudinal, and altitudinal gradients. These results imply that changes in land use or climate that affect these local environmental factors are likely to have major impacts on large-scale biodiversity patterns of freshwater phytoplankton.

  11. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.

  12. Channel capacity of next generation large scale MIMO systems

    NASA Astrophysics Data System (ADS)

    Alshammari, A.; Albdran, S.; Matin, M.

    2016-09-01

    The information rate that can be transferred over a given bandwidth is limited by information theory. Capacity depends on many factors such as the signal-to-noise ratio (SNR), channel state information (CSI), and the spatial correlation in the propagation environment. It is very important to increase spectral efficiency in order to meet the growing demand for wireless services. Thus, multiple-input multiple-output (MIMO) technology has been developed and applied in most wireless standards, and it has been very successful in increasing capacity and reliability. As demand continues to increase, attention is now shifting towards large-scale MIMO, which has the potential of bringing orders-of-magnitude improvements in spectral and energy efficiency. It has been shown that users' channels decorrelate as the number of antennas increases. As a result, inter-user interference can be avoided since energy can be focused in precise directions. This paper investigates the limits of channel capacity for large-scale MIMO. We study the relation between spectral efficiency and the number of antennas N. We use a time division duplex (TDD) system in order to obtain CSI using training sequences in the uplink. The same CSI is used for the downlink because the channel is reciprocal. Spectral efficiency is measured for a channel model that accounts for small-scale fading while ignoring the effect of large-scale fading. It is shown that the spectral efficiency can be improved significantly compared to single-antenna systems under ideal circumstances.
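    A small numerical sketch of the capacity scaling discussed above: it evaluates the ergodic spectral efficiency C = E[log2 det(I + (SNR/Nt) H H^H)] for an i.i.d. Rayleigh channel with equal power allocation. The antenna counts, SNR, and trial count are illustrative assumptions; CSI estimation and the TDD details of the paper are not modeled.

        # Ergodic spectral efficiency of an i.i.d. Rayleigh MIMO channel versus antenna count.
        import numpy as np

        def ergodic_capacity(nt, nr, snr_db, trials=500, rng=np.random.default_rng(1)):
            snr = 10 ** (snr_db / 10)
            caps = []
            for _ in range(trials):
                h = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
                m = np.eye(nr) + (snr / nt) * h @ h.conj().T
                _, logdet = np.linalg.slogdet(m)      # log det, numerically stable
                caps.append(logdet / np.log(2))       # convert to bits
            return float(np.mean(caps))

        for n in (1, 4, 16, 64):
            print(f"{n}x{n} antennas: ~{ergodic_capacity(n, n, snr_db=10):.1f} bit/s/Hz")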

  13. Sparse approximation through boosting for learning large scale kernel machines.

    PubMed

    Sun, Ping; Yao, Xin

    2010-06-01

    Recently, sparse approximation has become a preferred method for learning large scale kernel machines. This technique attempts to represent the solution with only a subset of original data points also known as basis vectors, which are usually chosen one by one with a forward selection procedure based on some selection criteria. The computational complexity of several resultant algorithms scales as O(NM(2)) in time and O(NM) in memory, where N is the number of training points and M is the number of basis vectors as well as the steps of forward selection. For some large scale data sets, to obtain a better solution, we are sometimes required to include more basis vectors, which means that M is not trivial in this situation. However, the limited computational resource (e.g., memory) prevents us from including too many vectors. To handle this dilemma, we propose to add an ensemble of basis vectors instead of only one at each forward step. The proposed method, closely related to gradient boosting, could decrease the required number M of forward steps significantly and thus a large fraction of computational cost is saved. Numerical experiments on three large scale regression tasks and a classification problem demonstrate the effectiveness of the proposed approach.
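    In the spirit of the forward-selection procedure described above, here is a toy sketch that selects an ensemble of basis vectors per step (rather than one) and refits the weights on the selected set. The RBF kernel, ridge term, and correlation-with-residual criterion are simplifying assumptions, not the authors' exact algorithm.

        # Toy forward selection of basis vectors for kernel ridge regression.
        import numpy as np

        def rbf_kernel(a, b, gamma=0.5):
            d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
            return np.exp(-gamma * d2)

        def forward_select(x, y, m_max=20, per_step=4, ridge=1e-3):
            k_full = rbf_kernel(x, x)                 # N x N kernel matrix
            selected, residual = [], y.copy()
            while len(selected) < m_max:
                scores = np.abs(k_full.T @ residual)  # correlation of each column with residual
                scores[selected] = -np.inf            # never pick a vector twice
                new = np.argsort(scores)[-per_step:]  # add an ensemble of basis vectors
                selected.extend(new.tolist())
                k_sel = k_full[:, selected]
                alpha = np.linalg.solve(k_sel.T @ k_sel + ridge * np.eye(len(selected)),
                                        k_sel.T @ y)  # refit weights on the selected basis
                residual = y - k_sel @ alpha
            return selected, alpha

        rng = np.random.default_rng(0)
        x = rng.uniform(-3, 3, (300, 1))
        y = np.sin(x[:, 0]) + 0.1 * rng.standard_normal(300)
        basis, alpha = forward_select(x, y)
        rms = np.sqrt(np.mean((y - rbf_kernel(x, x)[:, basis] @ alpha) ** 2))
        print(f"{len(basis)} basis vectors, residual RMS = {rms:.3f}")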

  14. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances.

    PubMed

    Parker, V Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host.

  15. Line segment extraction for large scale unorganized point clouds

    NASA Astrophysics Data System (ADS)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.
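    As a minimal sketch of the basic geometric step behind plane-intersection line segments: given two fitted plane equations, the line direction is the cross product of the normals and a point on the line can be found by least squares. This is only the generic plane-plane step, not the paper's full LSHP pipeline; the example planes are hypothetical.

        # The line where two fitted planes intersect.
        import numpy as np

        def plane_intersection(n1, d1, n2, d2):
            """Planes n.x + d = 0; returns (point_on_line, unit_direction) or None if parallel."""
            n1, n2 = np.asarray(n1, float), np.asarray(n2, float)
            direction = np.cross(n1, n2)
            if np.linalg.norm(direction) < 1e-9:
                return None                              # planes are (nearly) parallel
            # A point satisfying both plane equations, found by least squares.
            a = np.vstack([n1, n2])
            b = -np.array([d1, d2])
            point, *_ = np.linalg.lstsq(a, b, rcond=None)
            return point, direction / np.linalg.norm(direction)

        # Two hypothetical planes, e.g. a wall (x = 2) and an inclined roof-like plane.
        p, u = plane_intersection([1, 0, 0], -2.0, [0, 0.6, 0.8], -4.0)
        print("point on line:", np.round(p, 3), "direction:", np.round(u, 3))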

  16. [A large-scale accident in Alpine terrain].

    PubMed

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, time pressure is enormous and is compounded by adverse weather conditions or darkness. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For evacuation of casualties, the use of a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  17. Large scale structure in universes dominated by cold dark matter

    NASA Technical Reports Server (NTRS)

    Bond, J. Richard

    1986-01-01

    The theory of Gaussian random density field peaks is applied to a numerical study of the large-scale structure developing from adiabatic fluctuations in models of biased galaxy formation in universes with Omega = 1, h = 0.5 dominated by cold dark matter (CDM). The angular anisotropy of the cross-correlation function demonstrates that the far-field regions of cluster-scale peaks are asymmetric, as recent observations indicate. These regions will generate pancakes or filaments upon collapse. One-dimensional singularities in the large-scale bulk flow should arise in these CDM models, appearing as pancakes in position space. They are too rare to explain the CfA bubble walls, but pancakes that are just turning around now are sufficiently abundant and would appear to be thin walls normal to the line of sight in redshift space. Large scale streaming velocities are significantly smaller than recent observations indicate. To explain the reported 700 km/s coherent motions, mass must be significantly more clustered than galaxies with a biasing factor of less than 0.4 and a nonlinear redshift at cluster scales greater than one for both massive neutrino and cold models.

  18. Learning Short Binary Codes for Large-scale Image Retrieval.

    PubMed

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms have proven to be effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually code lengths shorter than 100 bits) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of the data, MCR can generate a one-bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only top-ranked bits with minimum cost-values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve comparable performance to state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.
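    A toy sketch of the general recipe "one candidate bit per dimension, then keep only the top-ranked bits": the ranking criterion used here (bit balance weighted by the dimension's variance) is only a placeholder, not MCR's actual cost function, and the data are random.

        # One candidate bit per dimension, then keep the top-ranked bits.
        import numpy as np

        rng = np.random.default_rng(3)
        x = rng.standard_normal((1000, 256))              # 1000 descriptors, 256 dims
        code_length = 32

        thresholds = np.median(x, axis=0)
        bits = (x > thresholds).astype(np.uint8)          # one candidate bit per dimension

        balance = 1.0 - np.abs(bits.mean(axis=0) - 0.5) * 2.0   # 1 = perfectly balanced bit
        score = balance * x.var(axis=0)                   # placeholder "usefulness" score
        keep = np.argsort(score)[-code_length:]           # top-ranked dimensions

        short_codes = bits[:, keep]                       # final 32-bit binary codes
        print("code shape:", short_codes.shape, "mean bit value:", round(float(short_codes.mean()), 3))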

  19. Robust regression for large-scale neuroimaging studies.

    PubMed

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. Copyright © 2015 Elsevier Inc. All rights reserved.
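    To make the core contrast concrete, here is a toy comparison of ordinary least squares against a Huber-loss robust fit on synthetic data with gross outliers. scikit-learn's HuberRegressor stands in for the robust estimator; this is not the paper's analysis pipeline (analytic tests, RPBI), and the data are simulated.

        # OLS versus a Huber-loss robust fit on data with a few gross outliers.
        import numpy as np
        from sklearn.linear_model import LinearRegression, HuberRegressor

        rng = np.random.default_rng(42)
        n = 200
        x = rng.standard_normal((n, 1))                    # e.g. a genetic/behavioral covariate
        y = 0.5 * x[:, 0] + 0.1 * rng.standard_normal(n)   # true slope = 0.5
        idx = np.argsort(x[:, 0])[-10:]                    # corrupt the samples with the largest x
        y[idx] += 8.0                                      # artifact-like outliers

        ols = LinearRegression().fit(x, y)
        rob = HuberRegressor().fit(x, y)
        print(f"OLS slope   = {ols.coef_[0]:+.3f}")        # inflated by the outliers
        print(f"Huber slope = {rob.coef_[0]:+.3f}")        # much closer to the true effect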

  20. Reliability assessment for components of large scale photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have significantly shifted from independent power generation systems to large-scale grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of the various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is analyzed across various sequential and parallel fault combinations in order to find all realistic ways in which the top (undesired) event can occur. Additionally, the method can identify areas on which planned maintenance should focus. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system, but also to optimize the maintenance costs. The latter is achieved by informing the operators about the status of the system components. This approach can be used to ensure secure operation of the system owing to its flexibility in monitoring applications. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate additional system maintenance plans and diagnostic strategies.
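    A short sketch of the reliability arithmetic this kind of fault-tree analysis builds on: exponential component reliability R(t) = exp(-λt), combined for series (OR-gate) and redundant (AND-gate) structures. The failure rates are made-up placeholders, not values from the study.

        # Fault-tree style reliability arithmetic with exponentially distributed failures.
        import numpy as np

        def reliability(rate_per_hour, hours):
            """R(t) = exp(-lambda * t) for a component with constant failure rate."""
            return np.exp(-rate_per_hour * hours)

        t = 8760.0                                  # one year of operation, in hours
        r_inverter = reliability(3e-5, t)           # hypothetical inverter failure rate
        r_string   = reliability(5e-6, t)           # hypothetical PV-string failure rate

        # Series (an OR gate in the fault tree): the subsystem fails if either part fails.
        r_series = r_inverter * r_string
        # Parallel redundancy (an AND gate): fails only if both redundant parts fail.
        r_parallel = 1.0 - (1.0 - r_inverter) ** 2

        print(f"R_inverter(1 year)        = {r_inverter:.4f}")
        print(f"string + inverter (series) = {r_series:.4f}")
        print(f"two redundant inverters    = {r_parallel:.4f}")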

  1. Impact of Large-scale Geological Architectures On Recharge

    NASA Astrophysics Data System (ADS)

    Troldborg, L.; Refsgaard, J. C.; Engesgaard, P.; Jensen, K. H.

    Geological and hydrogeological data constitute the basis for assessment of groundwater flow patterns and recharge zones. The accessibility and applicability of hard geological data is often a major obstacle in deriving plausible conceptual models. Nevertheless, focus is often on parameter uncertainty caused by the effect of geological heterogeneity due to lack of hard geological data, thus neglecting the possibility of alternative conceptualizations of the large-scale geological architecture. For a catchment in the eastern part of Denmark we have constructed different geological models based on different conceptualizations of the major geological trends and facies architecture. The geological models are equally plausible in a conceptual sense and they are all calibrated to well head and river flow measurements. Comparison of differences in recharge zones, and subsequently well protection zones, emphasizes the importance of assessing large-scale geological architecture in hydrological modeling on the regional scale in a non-deterministic way. Geostatistical modeling carried out in a transition-probability framework shows the possibility of assessing multiple realizations of large-scale geological architecture from a combination of soft and hard geological information.

  2. Alteration of Large-Scale Chromatin Structure by Estrogen Receptor

    PubMed Central

    Nye, Anne C.; Rajendran, Ramji R.; Stenoien, David L.; Mancini, Michael A.; Katzenellenbogen, Benita S.; Belmont, Andrew S.

    2002-01-01

    The estrogen receptor (ER), a member of the nuclear hormone receptor superfamily important in human physiology and disease, recruits coactivators which modify local chromatin structure. Here we describe effects of ER on large-scale chromatin structure as visualized in live cells. We targeted ER to gene-amplified chromosome arms containing large numbers of lac operator sites either directly, through a lac repressor-ER fusion protein (lac rep-ER), or indirectly, by fusing lac repressor with the ER interaction domain of the coactivator steroid receptor coactivator 1. Significant decondensation of large-scale chromatin structure, comparable to that produced by the ∼150-fold-stronger viral protein 16 (VP16) transcriptional activator, was produced by ER in the absence of estradiol using both approaches. Addition of estradiol induced a partial reversal of this unfolding by green fluorescent protein-lac rep-ER but not by wild-type ER recruited by a lac repressor-SRC570-780 fusion protein. The chromatin decondensation activity did not require transcriptional activation by ER nor did it require ligand-induced coactivator interactions, and unfolding did not correlate with histone hyperacetylation. Ligand-induced coactivator interactions with helix 12 of ER were necessary for the partial refolding of chromatin in response to estradiol using the lac rep-ER tethering system. This work demonstrates that when tethered or recruited to DNA, ER possesses a novel large-scale chromatin unfolding activity. PMID:11971975

  3. Multiresolution comparison of precipitation datasets for large-scale models

    NASA Astrophysics Data System (ADS)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, along with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  4. Equivalent common path method in large-scale laser comparator

    NASA Astrophysics Data System (ADS)

    He, Mingzhao; Li, Jianshuang; Miao, Dongjing

    2015-02-01

    The large-scale laser comparator is the main standard device providing accurate, reliable and traceable measurements for high-precision large-scale line and 3D measurement instruments. It is mainly composed of a guide rail, a motion control system, an environmental parameter monitoring system and a displacement measurement system. In the laser comparator, the main error sources are the temperature distribution, the straightness of the guide rail, and the pitch and yaw of the measuring carriage. To minimize the measurement uncertainty, an equivalent common optical path scheme is proposed and implemented. Three laser interferometers are adjusted to be parallel with the guide rail. The displacement along an arbitrary virtual optical path is calculated from the three displacements without knowledge of the carriage orientations at the start and end positions. The orientation of the air-floating carriage is calculated from the displacements of the three optical paths and the positions of the three retroreflectors, which are precisely measured by a Laser Tracker. A fourth laser interferometer is used in the virtual optical path as a reference to verify this compensation method. This paper analyzes the effect of rail straightness on the displacement measurement. The proposed method, through experimental verification, can reduce the measurement uncertainty of the large-scale laser comparator.
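    One way to read the "virtual optical path" idea: if the three interferometer beams are parallel to the rail and their lateral (y, z) positions are known, the three measured displacements determine an affine displacement field over the carriage cross-section, which can then be evaluated at any virtual beam position. The sketch below assumes exactly that affine model; the positions and readings are made up and the instrument's actual equations may differ.

        # Affine displacement field d(y, z) = a + b*y + c*z from three parallel interferometers.
        import numpy as np

        # Lateral (y, z) positions of the three retroreflectors, in metres (hypothetical).
        yz = np.array([[0.00, 0.00],
                       [0.30, 0.00],
                       [0.00, 0.20]])
        d_measured = np.array([10.000012, 10.000030, 10.000004])   # metres, hypothetical readings

        # Solve [1 y z] @ [a b c]^T = d for the affine coefficients.
        design = np.hstack([np.ones((3, 1)), yz])
        a, b, c = np.linalg.solve(design, d_measured)

        def virtual_displacement(y, z):
            return a + b * y + c * z

        print(f"displacement on a virtual path at (0.15, 0.10) m: {virtual_displacement(0.15, 0.10):.6f} m")
        # The carriage pitch/yaw contributions show up as the gradients b and c of this field.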

  5. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances

    PubMed Central

    Parker, V. Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Peromyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host. PMID:26151560

  6. Large-scale flow generation by inhomogeneous helicity.

    PubMed

    Yokoi, N; Brandenburg, A

    2016-03-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with nonuniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of homogeneous turbulent helicity. This result confirms the validity of the inhomogeneous helicity effect in large-scale flow generation and suggests that a vortex dynamo is possible even in incompressible turbulence where there is no baroclinicity effect.

  7. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  8. Optimization of large-scale heterogeneous system-of-systems models.

    SciTech Connect

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane; Lee, Herbert K. H.; Hart, William Eugene; Gray, Genetha Anne; Woodruff, David L.

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  9. Neutrino Physics from the Cosmic Microwave Background and Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Abazajian, Kevork N.; Kaplinghat, Manoj

    2016-10-01

    Cosmology and neutrino physics have converged into a recent discovery era. The success of the standard model of cosmology in explaining the cosmic microwave background and cosmological large-scale structure data allows for the possibility of measuring the absolute neutrino mass and providing exquisite constraints on the number of light degrees of freedom, including neutrinos. This sensitivity to neutrino physics requires the validity of some of the assumptions, including general relativity, inflationary cosmology, and standard thermal history, many of which can be tested with cosmological data. This sensitivity is also predicated on the robust handling of systematic uncertainties associated with different cosmological observables. We review several past, current, and future measurements of the cosmic microwave background and cosmological large-scale structure that allow us to do fundamental neutrino physics with cosmology.

  10. Automating large-scale power plant systems: a perspective and philosophy

    SciTech Connect

    Kisner, R A; Raju, G V.S.

    1984-12-01

    This report is intended to convey a philosophy for the design of large-scale control systems that will guide control engineers and managers in the development of integrated, intelligent, flexible control systems. A liquid-metal reactor, the large-scale prototype breeder, is the focus of the examples and analyses in the report. A structure for the discontinuous and continuous control aspects is presented in sufficient detail to form the foundation for future expanded development. The system diagramming techniques used are especially useful because they are both an aid to control design and a specification for software design. This report develops a continuous-system supervisory controller that adds the capability for optimal coordination and control to existing supervisory control designs. This development enables global minimization of variations in key system parameters during transients.

  11. Modelling the large-scale redshift-space 3-point correlation function of galaxies

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.

    2017-08-01

    We present a configuration-space model of the large-scale galaxy 3-point correlation function (3PCF) based on leading-order perturbation theory and including redshift-space distortions (RSD). This model should be useful in extracting distance-scale information from the 3PCF via the baryon acoustic oscillation method. We include the first redshift-space treatment of biasing by the baryon-dark matter relative velocity. Overall, on large scales the effect of RSD is primarily a renormalization of the 3PCF that is roughly independent of both physical scale and triangle opening angle; for our adopted Ωm and bias values, the rescaling is a factor of ∼1.8. We also present an efficient scheme for computing 3PCF predictions from our model, important for allowing fast exploration of the space of cosmological parameters in future analyses.

  12. A feasibility study of large-scale photobiological hydrogen production utilizing mariculture-raised cyanobacteria.

    PubMed

    Sakurai, Hidehiro; Masukawa, Hajime; Kitashima, Masaharu; Inoue, Kazuhito

    2010-01-01

    In order to decrease CO2 emissions from the burning of fossil fuels, the development of new renewable energy sources sufficiently large in quantity is essential. To meet this need, we propose large-scale H2 production on the sea surface utilizing cyanobacteria. Although many of the relevant technologies are in an early stage of development, this chapter briefly examines the feasibility of such H2 production, in order to illustrate that under certain conditions large-scale photobiological H2 production can be viable. Assuming that solar energy is converted to H2 at 1.2% efficiency, the future cost of H2 is estimated to be about 11 cents per kWh with pipeline delivery and 26.4 cents per kWh with compression and marine transportation.
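    A back-of-envelope yield calculation consistent with the 1.2% solar-to-H2 efficiency quoted above; the assumed annual-mean insolation of 200 W/m^2 is our placeholder, not a number from the paper.

        # Rough annual H2 energy yield per square metre of culture area.
        efficiency = 0.012                   # solar-to-H2 conversion efficiency (from the abstract)
        insolation_w_per_m2 = 200.0          # assumed annual-mean solar flux at the sea surface
        hours_per_year = 8760.0

        h2_kwh_per_m2_year = efficiency * insolation_w_per_m2 * hours_per_year / 1000.0
        area_km2_for_1_twh = 1e9 / h2_kwh_per_m2_year / 1e6
        print(f"~{h2_kwh_per_m2_year:.0f} kWh of H2 per m^2 of sea surface per year")
        print(f"~{area_km2_for_1_twh:.0f} km^2 of culture area per TWh of H2 per year")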

  13. Ground test of a large scale 'D' vented thrust deflecting nozzle

    NASA Technical Reports Server (NTRS)

    Rosenberg, E. W.; Christiansen, R. S.

    1981-01-01

    Future V/STOL aircraft will require efficient techniques for changing the thrust vector from the vertical direction for VTOL operation to the horizontal direction for conventional flight. Most V/STOL concepts utilize thrust vectoring nozzles to provide this variation in the thrust vector direction. An experimental test program was initiated to demonstrate the capabilities of a large scale 'D' vented thrust deflecting system coupled with a high bypass ratio turbofan engine. Data were obtained for a 'D' vented nozzle mounted behind a YTF-34-F5 turbofan engine. Preliminary data are presented for a variety of test conditions. Attention is given to aspects of 'D' vented nozzle design, the test apparatus, engine-nozzle compatibility, exit area variation, longitudinal vectoring performance, nozzle temperature distribution, and large scale - small scale comparisons.

  14. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    PubMed

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both the power to detect differences and the demographic diversity to generalize clearly and broadly. Thus, in this chapter we will discuss the value of using existing large-scale data sets to test the complex questions of child development and how to develop future large-scale data sets that are both representative and can answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.

  15. Foundational perspectives on causality in large-scale brain networks.

    PubMed

    Mannino, Michael; Bressler, Steven L

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  16. Robust large-scale parallel nonlinear solvers for simulations.

    SciTech Connect

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple to write
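    For readers unfamiliar with the Broyden update evaluated above, here is a minimal dense sketch on a small two-equation system: the Jacobian approximation is initialized by finite differences and then updated with rank-one corrections. The report's limited-memory, large-scale variant is more elaborate; this is only the textbook building block, and the test system is made up.

        # Dense "good Broyden" iteration on a small nonlinear system F(x) = 0.
        import numpy as np

        def f(x):
            return np.array([x[0] + x[1] - 3.0,            # a line
                             x[0] ** 2 + x[1] ** 2 - 9.0])  # a circle of radius 3

        def broyden(x, max_iter=50, tol=1e-10, h=1e-7):
            n = len(x)
            fx = f(x)
            # Initialize B with a one-sided finite-difference Jacobian at the start point.
            b = np.empty((n, n))
            for j in range(n):
                e = np.zeros(n); e[j] = h
                b[:, j] = (f(x + e) - fx) / h
            for _ in range(max_iter):
                step = np.linalg.solve(b, -fx)       # quasi-Newton step: B s = -F(x)
                x = x + step
                fx_new = f(x)
                if np.linalg.norm(fx_new) < tol:
                    return x
                # Broyden rank-one update: B <- B + ((dF - B s) s^T) / (s^T s)
                b += np.outer(fx_new - fx - b @ step, step) / (step @ step)
                fx = fx_new
            return x

        root = broyden(np.array([1.0, 4.0]))
        print("root:", np.round(root, 6), "|F(root)| =", float(np.linalg.norm(f(root))))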

  17. New probes of Cosmic Microwave Background large-scale anomalies

    NASA Astrophysics Data System (ADS)

    Aiola, Simone

    Fifty years of Cosmic Microwave Background (CMB) data played a crucial role in constraining the parameters of the LambdaCDM model, where Dark Energy, Dark Matter, and Inflation are the three most important pillars not yet understood. Inflation prescribes an isotropic universe on large scales, and it generates spatially-correlated density fluctuations over the whole Hubble volume. CMB temperature fluctuations on scales bigger than a degree in the sky, affected by modes on super-horizon scale at the time of recombination, are a clean snapshot of the universe after inflation. In addition, the accelerated expansion of the universe, driven by Dark Energy, leaves a hardly detectable imprint in the large-scale temperature sky at late times. Such fundamental predictions have been tested with current CMB data and found to be in tension with what we expect from our simple LambdaCDM model. Is this tension just a random fluke or a fundamental issue with the present model? In this thesis, we present a new framework to probe the lack of large-scale correlations in the temperature sky using CMB polarization data. Our analysis shows that if a suppression in the CMB polarization correlations is detected, it will provide compelling evidence for new physics on super-horizon scale. To further analyze the statistical properties of the CMB temperature sky, we constrain the degree of statistical anisotropy of the CMB in the context of the observed large-scale dipole power asymmetry. We find evidence for a scale-dependent dipolar modulation at 2.5sigma. To isolate late-time signals from the primordial ones, we test the anomalously high Integrated Sachs-Wolfe effect signal generated by superstructures in the universe. We find that the detected signal is in tension with the expectations from LambdaCDM at the 2.5sigma level, which is somewhat smaller than what has been previously argued. To conclude, we describe the current status of CMB observations on small scales, highlighting the

  18. Foundational perspectives on causality in large-scale brain networks

    NASA Astrophysics Data System (ADS)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  19. Herschel view of the large-scale structure in the Chamaeleon dark clouds

    NASA Astrophysics Data System (ADS)

    Alves de Oliveira, C.; Schneider, N.; Merín, B.; Prusti, T.; Ribas, Á.; Cox, N. L. J.; Vavrek, R.; Könyves, V.; Arzoumanian, D.; Puga, E.; Pilbratt, G. L.; Kóspál, Á.; André, Ph.; Didelon, P.; Men'shchikov, A.; Royer, P.; Waelkens, C.; Bontemps, S.; Winston, E.; Spezzi, L.

    2014-08-01

    Context. The Chamaeleon molecular cloud complex is one of the nearest star-forming sites and encompasses three molecular clouds (Cha I, II, and III) that have a different star-formation history, from quiescent (Cha III) to actively forming stars (Cha II), and one that reaches the end of star-formation (Cha I). Aims: We aim at characterising the large-scale structure of the three sub-regions of the Chamaeleon molecular cloud complex by analysing new far-infrared images taken with the Herschel Space Observatory. Methods: We derived column density and temperature maps using PACS and SPIRE observations from the Herschel Gould Belt Survey and applied several tools, such as filament tracing, power-spectra, Δ-variance, and probability distribution functions (PDFs) of the column density, to derive the physical properties. Results: The column density maps reveal a different morphological appearance for each of the three clouds, with a ridge-like structure for Cha I, a clump-dominated regime for Cha II, and an intricate filamentary network for Cha III. The filament width is measured to be about 0.12 ± 0.04 pc in the three clouds, and the filaments are found to be gravitationally unstable in Cha I and II, but mostly subcritical in Cha III. Faint filaments (striations) are prominent in Cha I and are mostly aligned with the large-scale magnetic field. The PDFs of all regions show a lognormal distribution at low column densities. For higher densities, the PDF of Cha I shows a turnover indicative of an extended higher density component and culminates in a power-law tail. Cha II shows a power-law tail with a slope characteristic of gravity. The PDF of Cha III can be best fit by a single lognormal. Conclusions: The turbulence properties of the three regions are found to be similar, pointing towards a scenario where the clouds are impacted by large-scale processes. The magnetic field might possibly play an important role for the star formation efficiency in the Chamaeleon clouds

  20. Large-scale ground motion simulation using GPGPU

    NASA Astrophysics Data System (ADS)

    Aoi, S.; Maeda, T.; Nishizawa, N.; Aoki, T.

    2012-12-01

    Huge computational resources are required to perform large-scale ground motion simulations using the 3-D finite difference method (FDM) for realistic and complex models with high accuracy. Furthermore, thousands of simulations are necessary to evaluate the variability of the assessment caused by uncertainty in the assumed source models for future earthquakes. To overcome the problem of restricted computational resources, we introduced GPGPU (general-purpose computing on graphics processing units), the technique of using a GPU as an accelerator for computation that has traditionally been carried out by the CPU. We employed the CPU version of GMS (Ground motion Simulator; Aoi et al., 2004) as the original code and implemented the functions for GPU calculation using CUDA (Compute Unified Device Architecture). GMS is a total system for seismic wave propagation simulation based on a 3-D FDM scheme using discontinuous grids (Aoi & Fujiwara, 1999), which includes the solver as well as preprocessor tools (parameter generation tool) and postprocessor tools (filter tool, visualization tool, and so on). The computational model is decomposed in the two horizontal directions and each decomposed model is allocated to a different GPU. We evaluated the performance of our newly developed GPU version of GMS on TSUBAME2.0, one of the fastest Japanese supercomputers, operated by the Tokyo Institute of Technology. First, we performed a strong-scaling test using a model with about 22 million grid points and achieved speed-ups of 3.2 and 7.3 times using 4 and 16 GPUs, respectively. Next, we examined a weak-scaling test in which the model sizes (numbers of grid points) are increased in proportion to the degree of parallelism (number of GPUs). The result showed almost perfect linearity up to the simulation with 22 billion grid points using 1024 GPUs, where the calculation speed reached 79.7 TFlops, about 34 times faster than the CPU calculation using the same number