Science.gov

Sample records for future large-scale observatories

  1. The Saskatchewan River Basin - a large scale observatory for transdisciplinary science

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2012-12-01

    Water resources are under pressure world-wide and face unprecedented challenges - from population growth, economic development, pollution and environmental change. Further, effective water management is becoming increasingly complex, requiring deep understanding of aquatic and terrestrial environments, their vulnerabilities to environmental change, and water management and protection challenges. Important science challenges arise in understanding and managing environmental change. However, with increasing pressures on the environment, it is necessary to recognise the effects of human interventions; flows in many major rivers are strongly affected by operational water management, and large-scale agricultural land management change can affect hydrology, land-atmosphere feedbacks, water quality and habitats. There is a need to represent effects on river flows and groundwater of management decisions, and more generally to understand impacts of policy, governance and societal values on water futures. This research agenda poses important challenges to the science community. Observational data are necessary, across multiple scales, to understand environmental change. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and multiple jurisdictions. The 340,000 km2 Saskatchewan River Basin (SRB) is being developed as a large scale observatory to support a new level of integration of interdisciplinary science. In one of the most extreme and variable climates in the world, we are developing state-of-the-art hydro-ecological experimental sites in the

  2. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    NASA Astrophysics Data System (ADS)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

    With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey mode of observation, rather than a hypothesis-driven one, to generate comprehensive, high-quality data on a large scale for exploration and discovery; making data freely and openly available to any investigator from the very onset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. Finally, community

  3. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    NASA Astrophysics Data System (ADS)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and

  4. Interloper bias in future large-scale structure surveys

    NASA Astrophysics Data System (ADS)

    Pullen, Anthony R.; Hirata, Christopher M.; Doré, Olivier; Raccanelli, Alvise

    2016-02-01

    Next-generation spectroscopic surveys will map the large-scale structure of the observable universe, using emission line galaxies as tracers. While each survey will map the sky with a specific emission line, interloping emission lines can masquerade as the survey's intended emission line at different redshifts. Interloping lines from galaxies that are not removed can contaminate the power spectrum measurement, mixing correlations from various redshifts and diluting the true signal. We assess the potential for power spectrum contamination, finding that an interloper fraction worse than 0.2% could bias power spectrum measurements for future surveys by more than 10% of statistical errors, while also biasing power spectrum inferences. We also construct a formalism for predicting cosmological parameter measurement bias, demonstrating that a 0.15%-0.3% interloper fraction could bias the growth rate by more than 10% of the error, which can affect constraints on gravity from upcoming surveys. We use the COSMOS Mock Catalog (CMC), with the emission lines rescaled to better reproduce recent data, to predict potential interloper fractions for the Prime Focus Spectrograph (PFS) and the Wide-Field InfraRed Survey Telescope (WFIRST). We find that secondary line identification, or confirming galaxy redshifts by finding correlated emission lines, can remove interlopers for PFS. For WFIRST, we use the CMC to predict that the 0.2% target can be reached for the WFIRST Hα survey, but sensitive optical and near-infrared photometry will be required. For the WFIRST [O III] survey, the predicted interloper fractions reach several percent and their effects will have to be estimated and removed statistically (e.g., with deep training samples). These results are optimistic as the CMC does not capture the full set of correlations of galaxy properties in the real Universe, and they do not include blending effects. Mitigating interloper contamination will be crucial to the next generation of
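    The contamination mechanism described in this abstract can be illustrated with a toy auto-power model. Everything below is an assumption for illustration (the power amplitudes, the interloper projection factor `A`, and the Gaussian mode-counting error), not the paper's actual calculation:

    ```python
    # Toy model of interloper contamination in a galaxy power spectrum.
    import math

    def observed_power(p_true, p_int, f):
        """Auto-power of a sample with interloper fraction f.

        Neglects the (small) cross term between the two populations:
        P_obs = (1 - f)^2 * P_true + f^2 * A * P_int, where A absorbs
        the volume rescaling of the interloper clustering.
        """
        A = 10.0  # assumed projection factor for the interloper power
        return (1 - f) ** 2 * p_true + f ** 2 * A * p_int

    p_true = 1.0e4   # illustrative clustering amplitude
    p_int = 1.0e4
    n_modes = 1.0e5  # modes in a k-bin for a large survey
    sigma_p = p_true * math.sqrt(2.0 / n_modes)  # Gaussian sample variance

    for f in (0.001, 0.002, 0.005):
        bias = observed_power(p_true, p_int, f) - p_true
        print(f"f = {f:.3f}: bias = {bias / sigma_p:+.2f} sigma")
    ```

    Even in this crude sketch, an interloper fraction of a few tenths of a percent shifts the measured power by a non-negligible fraction of the statistical error, which is the scale of effect the abstract quantifies.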

  5. Large-Scale Data Challenges in Future Power Grids

    SciTech Connect

    Yin, Jian; Sharma, Poorva; Gorton, Ian; Akyol, Bora A.

    2013-03-25

    This paper describes technical challenges in supporting large-scale real-time data analysis for future power grid systems and discusses various design options to address these challenges. Even though the existing U.S. power grid has served the nation remarkably well over the last 120 years, big changes are on the horizon. The widespread deployment of renewable generation, smart grid controls, energy storage, plug-in hybrids, and new conducting materials will require fundamental changes in the operational concepts and principal components. The whole system becomes highly dynamic and needs constant adjustments based on real time data. Even though millions of sensors such as phasor measurement units (PMUs) and smart meters are being widely deployed, a data layer that can support this amount of data in real time is needed. Unlike the data fabric in cloud services, the data layer for smart grids must address some unique challenges. This layer must be scalable to support millions of sensors and a large number of diverse applications and still provide real time guarantees. Moreover, the system needs to be highly reliable and highly secure because the power grid is a critical piece of infrastructure. No existing systems can satisfy all the requirements at the same time. We examine various design options. In particular, we explore the special characteristics of power grid data to meet both scalability and quality of service requirements. Our initial prototype can improve performance by orders of magnitude over existing general-purpose systems. The prototype was demonstrated with several use cases from PNNL’s FPGI and was shown to be able to integrate huge amounts of data from a large number of sensors and a diverse set of applications.
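    One of the design options alluded to, a low-latency per-sensor data layer, can be sketched minimally. The `SensorDataLayer` class, the partitioning-by-sensor scheme, and the window size are illustrative assumptions, not the PNNL prototype's design:

    ```python
    # Minimal sketch of a per-sensor ring-buffer layer for real-time grid data.
    from collections import defaultdict, deque

    class SensorDataLayer:
        """Keeps the most recent samples per sensor for low-latency queries."""

        def __init__(self, window=100):
            self.window = window
            self.buffers = defaultdict(lambda: deque(maxlen=window))

        def ingest(self, sensor_id, timestamp, value):
            # O(1) append; samples older than the window are evicted.
            self.buffers[sensor_id].append((timestamp, value))

        def latest(self, sensor_id):
            buf = self.buffers[sensor_id]
            return buf[-1] if buf else None

        def window_mean(self, sensor_id):
            buf = self.buffers[sensor_id]
            return sum(v for _, v in buf) / len(buf) if buf else None

    layer = SensorDataLayer(window=3)
    for t, v in enumerate([59.98, 60.01, 60.02, 60.00]):
        layer.ingest("pmu-17", t, v)
    print(layer.latest("pmu-17"))                  # most recent sample
    print(round(layer.window_mean("pmu-17"), 3))   # mean over last 3 samples
    ```

    A real deployment would shard these buffers across nodes and persist them asynchronously; the sketch only shows the bounded-memory, constant-time access pattern that real-time guarantees require.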

  6. Future large scale accelerator projects for particle physics

    NASA Astrophysics Data System (ADS)

    Aleksan, R.

    2013-12-01

    The discovery of a new particle, the properties of which are compatible with the expected Brout-Englert-Higgs scalar field in the Standard Model (SM), is the starting point of an intense program for studying its couplings. With this particle, all the components of the SM have now been unraveled. Yet, the existence of dark matter, the baryon asymmetry of the Universe and neutrino mass call for new physics at an energy scale that is so far undetermined. Therefore, new large scale accelerators are needed to investigate these mysteries through ultra-high precision measurements and/or the exploration of higher energy frontiers. In the following, we discuss the various accelerator projects aimed at the achievement of the above objectives. The physics reach of these facilities will be briefly described as well as their main technical features and related challenges, highlighting the importance of accelerator R&D not only for the benefit of particle physics but also for other fields of research, and more generally for society.

  7. Large-Scale Sequencing: The Future of Genomic Sciences Colloquium

    SciTech Connect

    Margaret Riley; Merry Buckley

    2009-01-01

    Genetic sequencing and the various molecular techniques it has enabled have revolutionized the field of microbiology. Examining and comparing the genetic sequences borne by microbes - including bacteria, archaea, viruses, and microbial eukaryotes - provides researchers insights into the processes microbes carry out, their pathogenic traits, and new ways to use microorganisms in medicine and manufacturing. Until recently, sequencing entire microbial genomes has been laborious and expensive, and the decision to sequence the genome of an organism was made on a case-by-case basis by individual researchers and funding agencies. Now, thanks to new technologies, the cost and effort of sequencing is within reach for even the smallest facilities, and the ability to sequence the genomes of a significant fraction of microbial life may be possible. The availability of numerous microbial genomes will enable unprecedented insights into microbial evolution, function, and physiology. However, the current ad hoc approach to gathering sequence data has resulted in an unbalanced and highly biased sampling of microbial diversity. A well-coordinated, large-scale effort to target the breadth and depth of microbial diversity would result in the greatest impact. The American Academy of Microbiology convened a colloquium to discuss the scientific benefits of engaging in a large-scale, taxonomically-based sequencing project. A group of individuals with expertise in microbiology, genomics, informatics, ecology, and evolution deliberated on the issues inherent in such an effort and generated a set of specific recommendations for how best to proceed. The vast majority of microbes are presently uncultured and, thus, pose significant challenges to such a taxonomically-based approach to sampling genome diversity. However, we have yet to even scratch the surface of the genomic diversity among cultured microbes. A coordinated sequencing effort of cultured organisms is an appropriate place to begin

  8. Testing model independent modified gravity with future large scale surveys

    SciTech Connect

    Thomas, Daniel B.; Contaldi, Carlo R.

    2011-12-01

    Model-independent parametrisations of modified gravity have attracted a lot of attention over the past few years and numerous combinations of experiments and observables have been suggested to constrain the parameters used in these models. Galaxy clusters have been mentioned, but not looked at as extensively in the literature as some other probes. Here we look at adding galaxy clusters into the mix of observables and examine how they could improve the constraints on the modified gravity parameters. In particular, we forecast the constraints from combining Planck satellite Cosmic Microwave Background (CMB) measurements and its Sunyaev-Zel'dovich (SZ) cluster catalogue with a DES-like Weak Lensing (WL) survey. We find that cluster counts significantly improve the constraints over those derived using CMB and WL alone. We then look at surveys further into the future, to see how much further the constraints could feasibly be improved.
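    Forecasts of this kind typically combine probes by adding their Fisher information matrices and reading marginalized errors off the diagonal of the inverse. The 2x2 matrices below are invented numbers for two hypothetical modified-gravity parameters, not the paper's forecasts; the sketch only shows mechanically why adding cluster counts tightens constraints:

    ```python
    # Fisher-matrix combination of probes: information adds, errors shrink.
    import numpy as np

    F_cmb = np.array([[40.0, 10.0], [10.0, 5.0]])
    F_wl = np.array([[20.0, -5.0], [-5.0, 8.0]])
    F_clusters = np.array([[15.0, 2.0], [2.0, 12.0]])

    def marginalized_errors(*fishers):
        """1-sigma marginalized errors from summed Fisher matrices."""
        F = sum(fishers)
        cov = np.linalg.inv(F)
        return np.sqrt(np.diag(cov))

    print(marginalized_errors(F_cmb, F_wl))              # CMB + WL only
    print(marginalized_errors(F_cmb, F_wl, F_clusters))  # adding clusters
    ```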

  9. Important aspects of Eastern Mediterranean large-scale variability revealed from data of three fixed observatories

    NASA Astrophysics Data System (ADS)

    Bensi, Manuel; Velaoras, Dimitris; Cardin, Vanessa; Perivoliotis, Leonidas; Pethiakis, George

    2015-04-01

    Long-term variations of temperature and salinity observed in the Adriatic and Aegean Seas seem to be regulated by larger-scale circulation modes of the Eastern Mediterranean (EMed) Sea, such as the recently discovered feedback mechanisms, namely the BiOS (Bimodal Oscillating System) and the internal thermohaline pump theories. These theories are the results of interpretation of many years' observations, highlighting possible interactions between two key regions of the EMed. Although repeated oceanographic cruises carried out in the past or planned for the future are a very useful tool for understanding the interaction between the two basins (e.g. alternating dense water formation, salt ingressions), recent long time-series of high frequency (up to 1h) sampling have added valuable information to the interpretation of internal mechanisms for both areas (i.e. mesoscale eddies, evolution of fast internal processes, etc.). During the last 10 years, three deep observatories were deployed and maintained in the Adriatic, Ionian, and Aegean Seas: they are, respectively, the E2-M3A, the Pylos, and the E1-M3A. All are part of the largest European network of Fixed Point Open Ocean Observatories (FixO3, http://www.fixo3.eu/). Herein, from the analysis of temperature, salinity, and potential density time series collected at the three sites from the surface down to the intermediate and deep layers, we will discuss the almost perfect anti-correlated behavior between the Adriatic and the Aegean Seas. Our data, collected almost continuously since 2006, reveal that these observatories represent well the thermohaline variability of their respective areas. Interestingly, temperature and salinity in the intermediate layer suddenly increased in the South Adriatic from the end of 2011, exactly when they started decreasing in the Aegean Sea. Moreover, Pylos data used together with additional ones (e.g. absolute dynamic topography, temperature and salinity data from other platforms) collected

  10. Heliophysics/Geospace System Observatory: System level science by large-scale space-ground coordination

    NASA Astrophysics Data System (ADS)

    Nishimura, T.; Angelopoulos, V.; Moore, T. E.; Samara, M.

    2015-12-01

    Recent multi-satellite and ground-based network measurements have revealed the importance of cross-scale and cross-regional coupling processes for understanding key issues in geospace such as magnetic reconnection, substorms and particle acceleration. In particular, localized, fast plasma transport on a global scale has been recognized to play a fundamental role in regulating evolution of the magnetosphere-ionosphere-thermosphere coupling. Those results call for coordinated measurements by multiple missions and facilities on a global scale to understand coupling processes at the system level. In fact, the National Research Council recommends using NASA's existing heliophysics flight missions and NSF's ground-based facilities by forming a network of observing platforms that operate simultaneously to investigate the solar system. This array can be thought of as a single observatory, the Heliophysics/Geospace System Observatory (H/GSO). Motivated by the successful launch of MMS and the healthy status of THEMIS, Van Allen Probes and other missions, we plan a strategic use of existing and upcoming assets in space and on the ground over the next two years. In the 2015-2016 and 2016-2017 northern winter seasons, MMS will be in the dayside over northern Europe, and THEMIS will be in the nightside over North America. In the 2016 and 2017 southern winter seasons, THEMIS will be in the dayside over the South Pole, and MMS will be in the nightside in the Australian sector. These are favorable configurations for simultaneous day-night coupling measurements of magnetic reconnection and related plasma transport both in space and on the ground, and also provide excellent opportunities for cross-scale coupling, global effects of dayside transients, tail-inner magnetosphere coupling, and other global processes. This presentation will give the current status and plan of the H/GSO and these science targets.

  11. Assessment of present and future large-scale semiconductor detector systems

    SciTech Connect

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future.

  12. The Landscape Evolution Observatory: A large-scale controllable infrastructure to study coupled Earth-surface processes

    NASA Astrophysics Data System (ADS)

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferre, T. P. A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-09-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m2 surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  13. The Landscape Evolution Observatory: a large-scale controllable infrastructure to study coupled Earth-surface processes

    USGS Publications Warehouse

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferre, T. P. A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-01-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m2 surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  14. Prototyping a large-scale distributed system for the Great Observatories era: NASA Astrophysics Data System (ADS)

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1990-01-01

    The NASA Astrophysics Data System (ADS) is a distributed information system intended to support research in the Great Observatories era, to simplify access to data, and to enable simultaneous analyses of multispectral data sets. Here, the user agent and interface, its functions, and system components are examined, and the system architecture and infrastructure are addressed. The present status of the system and related future activities are also discussed.

  15. Extra-large crystal emulsion detectors for future large-scale experiments

    NASA Astrophysics Data System (ADS)

    Ariga, T.; Ariga, A.; Kuwabara, K.; Morishima, K.; Moto, M.; Nishio, A.; Scampoli, P.; Vladymyrov, M.

    2016-03-01

    Photographic emulsion is a particle tracking device which features the best spatial resolution among particle detectors. For certain applications, for example muon radiography, large-scale detectors are required. Therefore, a huge surface has to be analyzed by means of automated optical microscopes. Improving the readout speed is therefore crucial to making these applications possible, and the availability of a new type of photographic emulsion featuring larger crystals is one way to pursue this program. Larger crystals allow a lower microscope magnification and consequently a larger field of view, resulting in faster data analysis. In this framework, we developed new kinds of emulsion detectors with a crystal size of 600-1000 nm, namely 3-5 times larger than conventional ones, allowing a 25 times faster data readout. The new photographic emulsions have shown sufficient sensitivity and a good signal-to-noise ratio. The proposed development opens the way to future large-scale applications of the technology, e.g. 3D imaging of glacier bedrocks or future neutrino experiments.
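    The quoted 25-times speedup is consistent with a simple scaling argument: if microscope magnification scales inversely with crystal size, the usable field-of-view area, and hence readout speed, scales with its square. The 200 nm baseline crystal size below is an assumed reference value for illustration:

    ```python
    # Back-of-the-envelope scaling behind the quoted readout speedup.
    def readout_speedup(new_size_nm, baseline_size_nm=200.0):
        """Relative readout speed, assuming speed ~ (crystal size)^2."""
        return (new_size_nm / baseline_size_nm) ** 2

    for size in (600, 1000):
        print(f"{size} nm crystals: ~{readout_speedup(size):.0f}x faster")
    ```

    Under these assumptions, crystals 5 times larger than the baseline yield the 25-fold speedup the abstract cites.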

  16. Large-scale cortical network properties predict future sound-to-word learning success.

    PubMed

    Sheppard, John Patrick; Wang, Ji-Ping; Wong, Patrick C M

    2012-05-01

    The human brain possesses a remarkable capacity to interpret and recall novel sounds as spoken language. These linguistic abilities arise from complex processing spanning a widely distributed cortical network and are characterized by marked individual variation. Recently, graph theoretical analysis has facilitated the exploration of how such aspects of large-scale brain functional organization may underlie cognitive performance. Brain functional networks are known to possess small-world topologies characterized by efficient global and local information transfer, but whether these properties relate to language learning abilities remains unknown. Here we applied graph theory to construct large-scale cortical functional networks from cerebral hemodynamic (fMRI) responses acquired during an auditory pitch discrimination task and found that such network properties were associated with participants' future success in learning words of an artificial spoken language. Successful learners possessed networks with reduced local efficiency but increased global efficiency relative to less successful learners and had a more cost-efficient network organization. Regionally, successful and less successful learners exhibited differences in these network properties spanning bilateral prefrontal, parietal, and right temporal cortex, overlapping a core network of auditory language areas. These results suggest that efficient cortical network organization is associated with sound-to-word learning abilities among healthy, younger adults. PMID:22360625
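    The two network measures named here, global and local efficiency, can be computed from scratch for an unweighted, undirected graph. The adjacency-dict representation and the tiny four-node example graph below are illustrative, not the fMRI networks of the study:

    ```python
    # Global and local efficiency of an unweighted graph (adjacency dict).
    from collections import deque

    def shortest_paths(graph, source):
        """BFS distances from source; unreachable nodes are omitted."""
        dist = {source: 0}
        queue = deque([source])
        while queue:
            u = queue.popleft()
            for v in graph[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    def global_efficiency(graph):
        """Average inverse shortest-path length over ordered node pairs."""
        nodes = list(graph)
        n = len(nodes)
        total = 0.0
        for u in nodes:
            dist = shortest_paths(graph, u)
            total += sum(1.0 / d for v, d in dist.items() if v != u)
        return total / (n * (n - 1))

    def local_efficiency(graph):
        """Mean global efficiency of each node's neighborhood subgraph."""
        effs = []
        for u in graph:
            nbrs = set(graph[u])
            if len(nbrs) < 2:
                effs.append(0.0)
                continue
            sub = {v: [w for w in graph[v] if w in nbrs] for v in nbrs}
            effs.append(global_efficiency(sub))
        return sum(effs) / len(graph)

    G = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}
    print(round(global_efficiency(G), 3))
    print(round(local_efficiency(G), 3))
    ```

    Small-world networks of the kind the abstract describes score high on both measures; the trade-off the authors report (lower local, higher global efficiency in successful learners) is a shift along exactly these two axes.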

  17. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe.

    PubMed

    Blaas, Harry; Kroeze, Carolien

    2014-10-15

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed. To this end, scenarios for the year 2050 are analysed, assuming that in the 27 countries of the European Union fossil diesel will be replaced by biodiesel from algae. Estimates are made for the required fertiliser inputs to algae parks, and how this may increase concentrations of nitrogen and phosphorus in coastal waters, potentially leading to eutrophication. The Global NEWS (Nutrient Export from WaterSheds) model has been used to estimate the transport of nitrogen and phosphorus to the European coastal waters. The results indicate that the amount of nitrogen and phosphorus in the coastal waters may increase considerably in the future as a result of large-scale production of algae for the production of biodiesel, even in scenarios assuming effective waste water treatment and recycling of waste water in algae production. To ensure sustainable production of biodiesel from micro-algae, it is important to develop cultivation systems with low nutrient losses to the environment. PMID:25058933
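    The scenario logic, fertiliser input reduced by losses and in-river retention, can be caricatured as a one-line nitrogen mass balance. All quantities below are invented for illustration; the study itself uses the Global NEWS watershed model rather than this arithmetic:

    ```python
    # Toy nitrogen mass balance for an algae-cultivation scenario.
    def coastal_n_load(fertilizer_n_kt, loss_fraction, retention_fraction):
        """N export (kt/yr) after cultivation losses and river retention."""
        exported = fertilizer_n_kt * loss_fraction
        return exported * (1.0 - retention_fraction)

    # e.g. 500 kt N/yr applied, 20% lost from ponds, 30% retained in rivers
    print(coastal_n_load(500.0, 0.20, 0.30))
    ```

    The sketch makes the paper's qualitative point visible: the coastal load scales linearly with the loss fraction, so cultivation systems with low nutrient losses are the main lever for sustainability.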

  18. The Uncertain Future of Arecibo Observatory

    NASA Astrophysics Data System (ADS)

    Altschuler, D. R.

    2009-05-01

    After forty years of existence, Arecibo Observatory has an uncertain future. On November 3rd, 2006, the ``Senior Review'' (SR), an advisory panel, recommended to the astronomy division of the NSF that the annual budget allocated to astronomy at the Observatory be reduced from US$10.5 million to US$8 million over the first 3 years. The SR also indicated that the Observatory would have to be closed in 2011 if an external source of funding is not found. The SR panel had been asked to find nearly US$30 million in savings (approximately 25% of the total budget of the five national observatories, including Arecibo) to redirect toward the operation of new future projects.

  19. Constraining scale-dependent non-Gaussianity with future large-scale structure and the CMB

    SciTech Connect

    Becker, Adam; Huterer, Dragan; Kadota, Kenji

    2012-12-01

    We forecast combined future constraints from the cosmic microwave background and large-scale structure on the models of primordial non-Gaussianity. We study the generalized local model of non-Gaussianity, where the parameter f_NL is promoted to a function of scale, and present the principal component analysis applicable to an arbitrary form of f_NL(k). We emphasize the complementarity between the CMB and LSS by using Planck, DES and BigBOSS surveys as examples, forecast constraints on the power-law f_NL(k) model, and introduce the figure of merit for measurements of scale-dependent non-Gaussianity.
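    The principal component analysis mentioned here amounts to eigendecomposing a Fisher matrix for binned f_NL(k): eigenvectors give the scale-weightings of the components, eigenvalues their information content. The 3x3 matrix below is an invented positive-definite example, not a survey forecast:

    ```python
    # PCA of a binned-parameter Fisher matrix.
    import numpy as np

    # Fisher matrix for three f_NL(k) bins (illustrative numbers)
    F = np.array([[9.0, 3.0, 1.0],
                  [3.0, 6.0, 2.0],
                  [1.0, 2.0, 4.0]])

    eigvals, eigvecs = np.linalg.eigh(F)
    order = np.argsort(eigvals)[::-1]  # best-constrained components first
    for rank, i in enumerate(order, start=1):
        sigma = 1.0 / np.sqrt(eigvals[i])  # 1-sigma error on the component
        print(f"PC{rank}: sigma = {sigma:.3f}, weights = {eigvecs[:, i]}")
    ```

    In a real forecast the number of well-constrained components (eigenvalues well above zero) tells you how many independent features of f_NL(k) the survey combination can actually measure.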

  20. CONSTRAINTS ON THE ORIGIN OF COSMIC RAYS ABOVE 10^18 eV FROM LARGE-SCALE ANISOTROPY SEARCHES IN DATA OF THE PIERRE AUGER OBSERVATORY

    SciTech Connect

    Abreu, P.; Andringa, S.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Castillo, J. Alvarez; Alvarez-Muniz, J.; Alves Batista, R.; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Antici'c, T.; Arganda, E.; Collaboration: Pierre Auger Collaboration; and others

    2013-01-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10^18 eV at the Pierre Auger Observatory is reported. For the first time, these large-scale anisotropy searches are performed as a function of both the right ascension and the declination and expressed in terms of dipole and quadrupole moments. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Upper limits on dipole and quadrupole amplitudes are derived under the hypothesis that any cosmic ray anisotropy is dominated by such moments in this energy range. These upper limits provide constraints on the production of cosmic rays above 10^18 eV, since they allow us to challenge an origin from stationary galactic sources densely distributed in the galactic disk and emitting predominantly light particles in all directions.
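
    A standard building block of such large-scale anisotropy searches is the first-harmonic (Rayleigh) analysis in right ascension. The sketch below is a generic illustration of that technique, not the Auger collaboration's actual pipeline; the sample sizes and the injected 5% dipole amplitude are invented for demonstration.

```python
import math
import random

def first_harmonic_amplitude(alphas):
    """Rayleigh first-harmonic analysis in right ascension.
    alphas: arrival right ascensions in radians.
    Returns (r, phi): amplitude and phase of the modulation
    r*cos(alpha - phi) fitted to the arrival-direction distribution."""
    n = len(alphas)
    a = 2.0 / n * sum(math.cos(x) for x in alphas)
    b = 2.0 / n * sum(math.sin(x) for x in alphas)
    return math.hypot(a, b), math.atan2(b, a)

random.seed(1)

# Isotropic sky: the recovered amplitude should be small.
iso = [random.uniform(0.0, 2.0 * math.pi) for _ in range(20000)]
r_iso, _ = first_harmonic_amplitude(iso)

# Sky with an injected 5% dipolar modulation in right ascension,
# drawn by rejection sampling from pdf(x) proportional to 1 + 0.05*cos(x).
dipole = []
while len(dipole) < 20000:
    x = random.uniform(0.0, 2.0 * math.pi)
    if random.uniform(0.0, 1.05) < 1.0 + 0.05 * math.cos(x):
        dipole.append(x)
r_dip, _ = first_harmonic_amplitude(dipole)

print(r_iso)  # small, consistent with isotropy
print(r_dip)  # near the injected 0.05
```

    The published searches go further, fitting dipole and quadrupole moments in both right ascension and declination and correcting for detector exposure, but the harmonic fit above is the core statistical idea.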

  1. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    PubMed

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate significantly increasing the scale of image retrieval systems, to the point at which interactive systems become effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity, and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. PMID:27503077

  2. The Plate Boundary Observatory Cascadia Network: Development and Installation of a Large Scale Real-time GPS Network

    NASA Astrophysics Data System (ADS)

    Austin, K. E.; Blume, F.; Berglund, H. T.; Feaux, K.; Gallaher, W. W.; Hodgkinson, K. M.; Mattioli, G. S.; Mencin, D.

    2014-12-01

    The EarthScope Plate Boundary Observatory (PBO), through an NSF-ARRA supplement, has enhanced the geophysical infrastructure in the Pacific Northwest by upgrading a total of 282 Plate Boundary Observatory GPS stations to allow the collection and distribution of high-rate (1 Hz), low-latency (<1 s) data streams (RT-GPS). These upgraded stations supplemented the original 100 RT-GPS stations in the PBO GPS network. The addition of the new RT-GPS sites in Cascadia should spur new volcano and earthquake research opportunities in an area of great scientific interest and high geophysical hazard. Streaming RT-GPS data will enable researchers to detect and investigate strong ground motion during large geophysical events, including a possible plate-interface earthquake, which has implications for earthquake hazard mitigation. A Mw 6.9 earthquake occurred on March 10, 2014, off the coast of northern California. In response, UNAVCO downloaded high-rate GPS data from Plate Boundary Observatory stations within 500 km of the epicenter of the event, providing a good test of network performance. In addition to the 282 stations upgraded to real-time, 22 new meteorological instruments were added to existing PBO stations. Extensive testing of BGAN satellite communications systems has been conducted to support the Cascadia RT-GPS upgrades, and the installation of three BGAN satellite failover systems along the Cascadia margin will allow for the continuation of data flow in the event of a loss of primary communications during a large geophysical event or other interruptions in commercial cellular networks. In summary, with these additional upgrades in the Cascadia region, the PBO RT-GPS network will increase to 420 stations. Upgrades to the UNAVCO data infrastructure included evaluation and purchase of the Trimble Pivot Platform, servers, and additional hardware for archiving the high-rate data, as well as testing and implementation of GLONASS and Trimble RTX positioning on the

  3. The Plate Boundary Observatory Cascadia Network: Development and Installation of a Large Scale Real-time GPS Network

    NASA Astrophysics Data System (ADS)

    Austin, K. E.; Blume, F.; Berglund, H. T.; Dittman, T.; Feaux, K.; Gallaher, W. W.; Mattioli, G. S.; Mencin, D.; Walls, C. P.

    2013-12-01

    The EarthScope Plate Boundary Observatory (PBO), through an NSF-ARRA supplement, has enhanced the geophysical infrastructure in the Pacific Northwest by upgrading 232 Plate Boundary Observatory GPS stations to allow the collection and distribution of high-rate (1 Hz), low-latency (<1 s) data streams (RT-GPS). These upgraded stations supplemented the original 100 RT-GPS stations in the PBO GPS network. The addition of the new RT-GPS sites in the Pacific Northwest should spur new volcano and earthquake research opportunities in an area of great scientific interest and high geophysical hazard. Streaming RT-GPS data will enable researchers to detect and investigate strong ground motion during large geophysical events, including a possible plate-interface earthquake, which has implications for earthquake hazard mitigation. A total of 282 PBO stations were upgraded and added to the UNAVCO real-time GPS system, along with the addition of 22 new meteorological instruments to existing PBO stations. Extensive testing of BGAN satellite communications systems has been conducted to support the Cascadia RT-GPS upgrades, and the installation of three BGAN satellite failover systems along the Cascadia margin will allow for the continuation of data flow in the event of a loss of primary communications during a large geophysical event or other interruptions in commercial cellular networks. In summary, with these additional upgrades in the Cascadia region, the PBO RT-GPS network will increase to 420 stations. Upgrades to UNAVCO's data infrastructure included evaluation and purchase of the Trimble Pivot Platform, servers, and additional hardware for archiving the high-rate data. UNAVCO staff are working closely with the UNAVCO community to develop data standards, protocols, and a science plan for the use of RT-GPS data.

  4. Vulnerability of the large-scale future smart electric power grid

    NASA Astrophysics Data System (ADS)

    Nasiruzzaman, A. B. M.; Pota, H. R.; Akter, Most. Nahida

    2014-11-01

    The changing power flow pattern of the power system, with the inclusion of large-scale renewable energy sources on the distribution side of the network, has been modeled with a bidirectional graph in a complex network framework. The bidirectional graph accommodates the power flowing back from the distribution side to the grid as a reverse edge connecting two nodes. The capacity of the reverse edge is equal to the capacity of the existing edge between the nodes in the forward-directional nominal graph. Additional paths in the combined model, introduced to improve grid reliability and efficiency, may become bottlenecks in practice when a certain percentage of nodes or edges is removed. The effect of removing critical elements has been analyzed in terms of increased path length, connectivity loss, load loss, and number of overloaded lines.
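
    The connectivity-loss metric mentioned above can be illustrated with a toy grid. The topology, node names, and the simple reachability criterion below are hypothetical, meant only to show how removing a critical element is scored in a complex network analysis.

```python
from collections import deque

def reachable(adj, start, removed):
    """Breadth-first search over an adjacency dict, skipping removed nodes."""
    if start in removed:
        return set()
    seen = {start}
    queue = deque([start])
    while queue:
        u = queue.popleft()
        for v in adj.get(u, ()):
            if v not in removed and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

def connectivity_loss(adj, generators, loads, removed=frozenset()):
    """Fraction of load nodes no longer reachable from any generator,
    a common vulnerability metric in complex network studies of grids."""
    supplied = set()
    for g in generators:
        supplied |= reachable(adj, g, removed)
    served = [l for l in loads if l in supplied and l not in removed]
    return 1.0 - len(served) / len(loads)

# Hypothetical small grid: G1, G2 are generators; L1..L4 are loads;
# T1, T2 are transmission hubs.  Every edge is stored in both
# directions, mirroring the reverse edges of the bidirectional model.
edges = [("G1", "T1"), ("G2", "T2"), ("T1", "T2"),
         ("T1", "L1"), ("T1", "L2"), ("T2", "L3"), ("T2", "L4")]
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

gens, loads = ["G1", "G2"], ["L1", "L2", "L3", "L4"]
print(connectivity_loss(adj, gens, loads))                  # intact grid: 0.0
print(connectivity_loss(adj, gens, loads, removed={"T1"}))  # hub removed: 0.5
```

    A fuller study would also track path-length increase and line overloads after each removal, as the abstract describes, but the scoring pattern is the same: remove elements, re-run reachability, and compare against the intact network.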

  5. Inclusive constraints on unified dark matter models from future large-scale surveys

    SciTech Connect

    Camera, Stefano; Carbone, Carmelita; Moscardini, Lauro E-mail: carmelita.carbone@unibo.it

    2012-03-01

    In recent years, cosmological models where the properties of the dark components of the Universe — dark matter and dark energy — are accounted for by a single ''dark fluid'' have drawn increasing attention and interest. Amongst many proposals, Unified Dark Matter (UDM) cosmologies are promising candidates as effective theories. In these models, a scalar field with a non-canonical kinetic term in its Lagrangian mimics both the accelerated expansion of the Universe at late times and the clustering properties of the large-scale structure of the cosmos. However, UDM models also present peculiar behaviours, the most interesting being that the perturbations in the dark-matter component of the scalar field have a non-negligible speed of sound. This gives rise to an effective Jeans scale for the Newtonian potential, below which the dark fluid no longer clusters. This implies a growth of structures fairly different from that of the concordance ΛCDM model. In this paper, we demonstrate that forthcoming large-scale surveys will be able to discriminate between viable UDM models and ΛCDM to a good degree of accuracy. To this purpose, the planned Euclid satellite will be a powerful tool, since it will provide very accurate data on galaxy clustering and the weak lensing effect of cosmic shear. Finally, we also exploit the constraining power of the ongoing Planck CMB experiment. Although our approach is conservative, with the inclusion of only well-understood, linear dynamics, in the end we also show what could be done if some amount of non-linear information were included.

  6. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    NASA Astrophysics Data System (ADS)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half-century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damage to ecosystems (e.g. river environment and species) and the socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and the absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. Then, it discusses some major challenges in future water planning

  7. LARGE-SCALE DISTRIBUTION OF ARRIVAL DIRECTIONS OF COSMIC RAYS DETECTED ABOVE 10^18 eV AT THE PIERRE AUGER OBSERVATORY

    SciTech Connect

    Abreu, P.; Andringa, S.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Alves Batista, R.; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Antici'c, T.; Arganda, E.; Collaboration: Pierre Auger Collaboration; and others

    2012-12-15

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10^18 eV at the Pierre Auger Observatory is presented. This search is performed as a function of both declination and right ascension in several energy ranges above 10^18 eV, and reported in terms of dipolar and quadrupolar coefficients. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Assuming that any cosmic-ray anisotropy is dominated by dipole and quadrupole moments in this energy range, upper limits on their amplitudes are derived. These upper limits allow us to test the origin of cosmic rays above 10^18 eV from stationary Galactic sources densely distributed in the Galactic disk and predominantly emitting light particles in all directions.

  8. The effect of the geomagnetic field on cosmic ray energy estimates and large scale anisotropy searches on data from the Pierre Auger Observatory

    SciTech Connect

    Abreu, P.; Aglietta, M.; Ahn, E.J.; Albuquerque, I.F.M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Ambrosio, M.; /Naples U. /INFN, Naples /Nijmegen U., IMAPP

    2011-11-01

    We present a comprehensive study of the influence of the geomagnetic field on the energy estimation of extensive air showers with zenith angles smaller than 60°, detected at the Pierre Auger Observatory. The geomagnetic field induces an azimuthal modulation of the estimated energy of cosmic rays of up to ~2% at large zenith angles. We present a method to account for this modulation of the reconstructed energy. We analyse the effect of the modulation on large-scale anisotropy searches in the arrival direction distributions of cosmic rays. At a given energy, the geomagnetic effect is shown to induce a pseudo-dipolar pattern at the percent level in the declination distribution that needs to be accounted for. In this work, we have identified and quantified a systematic uncertainty affecting the energy determination of cosmic rays detected by the surface detector array of the Pierre Auger Observatory. This systematic uncertainty, induced by the influence of the geomagnetic field on the shower development, has a strength which depends on both the zenith and azimuthal angles. Consequently, we have shown that it induces distortions of the estimated cosmic ray event rate at a given energy at the percent level in both the azimuthal and the declination distributions, the latter of which mimics an almost dipolar pattern. We have also shown that the induced distortions are already at the level of the statistical uncertainties for a number of events N ≈ 32,000 (we note that the full Auger surface detector array collects about 6500 events per year with energies above 3 EeV). Accounting for these effects is thus essential for the correct interpretation of large-scale anisotropy measurements that explicitly exploit the declination distribution.
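
    The azimuthal energy modulation and its correction can be sketched with a toy model of the form ΔE/E = G(θ)·cos(φ − φ0). The amplitude function, the 2% scale, and the phase below are illustrative assumptions, not the parameterization derived in the paper.

```python
import math

def modulation(zenith_deg, azimuth_deg, g_scale=0.02, phi0_deg=90.0):
    """Toy azimuthal modulation of the reconstructed energy.
    The amplitude grows with zenith angle, reaching g_scale (~2%)
    at 60 degrees; the functional form and constants are
    illustrative, not the Auger parameterization."""
    g = g_scale * (math.sin(math.radians(zenith_deg)) /
                   math.sin(math.radians(60.0))) ** 2
    return g * math.cos(math.radians(azimuth_deg - phi0_deg))

def correct_energy(e_reco, zenith_deg, azimuth_deg):
    """Remove the modeled geomagnetic modulation from a reconstructed energy."""
    return e_reco / (1.0 + modulation(zenith_deg, azimuth_deg))

# Two showers with the same true energy (in EeV) arriving at the same
# zenith angle but opposite azimuths are reconstructed differently;
# after the correction their energies agree again.
e_true = 5.0
for az in (90.0, 270.0):
    e_reco = e_true * (1.0 + modulation(55.0, az))
    print(az, e_reco, correct_energy(e_reco, 55.0, az))
```

    Left uncorrected, a modulation like this biases the event rate above a fixed energy threshold as a function of azimuth, which is how the pseudo-dipolar pattern in declination described in the abstract arises.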

  9. Effects of unstable dark matter on large-scale structure and constraints from future surveys

    NASA Astrophysics Data System (ADS)

    Wang, Mei-Yu; Zentner, Andrew R.

    2012-02-01

    In this paper we explore the effect of decaying dark matter (DDM) on large-scale structure and possible constraints from galaxy imaging surveys. DDM models have been studied, in part, as a way to address apparent discrepancies between the predictions of standard cold dark matter models and observations of galactic structure. Our study is aimed at developing independent constraints on these models. In such models, DDM decays into a less massive, stable dark matter (SDM) particle and a significantly lighter particle. The small mass splitting between the parent DDM and the daughter SDM provides the SDM with a recoil or "kick" velocity v_k, inducing a free-streaming suppression of matter fluctuations. This suppression can be probed via weak lensing power spectra measured by a number of forthcoming imaging surveys that aim primarily to constrain dark energy. Using scales on which linear perturbation theory alone is valid (multipoles ℓ < 300), surveys like Euclid or the Large Synoptic Survey Telescope can be sensitive to v_k ≳ 90 km/s for lifetimes τ ~ 1-5 Gyr. To estimate more aggressive constraints, we model nonlinear corrections to lensing power using a simple halo evolution model that is in good agreement with numerical simulations. In our most ambitious forecasts, using multipoles ℓ < 3000, we find that imaging surveys can be sensitive to v_k ~ 10 km/s for lifetimes τ ≲ 10 Gyr. Lensing will provide a particularly interesting complement to existing constraints in that it will probe the long-lifetime regime (τ ≫ H_0^-1) far better than contemporary techniques. A caveat to these ambitious forecasts is that the evolution of perturbations on nonlinear scales will need to be well calibrated by numerical simulations before they can be realized. This work motivates the pursuit of such a numerical simulation campaign to constrain dark matter with cosmological weak lensing.

  10. High-energy physics strategies and future large-scale projects

    NASA Astrophysics Data System (ADS)

    Zimmermann, F.

    2015-07-01

    We sketch the current European and international strategies and possible future facilities. In the near term the High Energy Physics (HEP) community will fully exploit the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). Post-LHC options include a linear e+e- collider in Japan (ILC) or at CERN (CLIC), as well as circular lepton or hadron colliders in China (CepC/SppC) and Europe (FCC). We conclude with linear and circular acceleration approaches based on crystals, and some perspectives for the far future of accelerator-based particle physics.

  11. Observing trans-Planckian ripples in the primordial power spectrum with future large scale structure probes

    SciTech Connect

    Hamann, Jan; Hannestad, Steen; Sloth, Martin S; Wong, Yvonne Y Y E-mail: sth@phys.au.dk E-mail: ywong@mppmu.mpg.de

    2008-09-15

    We revisit the issue of ripples in the primordial power spectra caused by trans-Planckian physics, and the potential for their detection by future cosmological probes. We find that for reasonably large values of the first slow-roll parameter ε (≳ 0.001), a positive detection of trans-Planckian ripples can be made even if the amplitude is as low as 10^-4. Data from the Large Synoptic Survey Telescope (LSST) and the proposed future 21 cm survey with the Fast Fourier Transform Telescope (FFTT) will be particularly useful in this regard. If the scale of inflation is close to its present upper bound, a scale of new physics as high as ~0.2 M_P could lead to observable signatures.

  12. Efficiency and economics of large scale hydrogen liquefaction. [for future generation aircraft requirements

    NASA Technical Reports Server (NTRS)

    Baker, C. R.

    1975-01-01

    Liquid hydrogen is being considered as a substitute for conventional hydrocarbon-based fuels for future generations of commercial jet aircraft. Its acceptance will depend, in part, upon the technology and cost of liquefaction. The process and economic requirements for providing a sufficient quantity of liquid hydrogen to service a major airport are described. The design is supported by thermodynamic studies which determine the effect of process arrangement and operating parameters on the process efficiency and work of liquefaction.

  13. Health-2000: an integrated large-scale expert system for the hospital of the future.

    PubMed

    Boyom, S F; Kwankam, S Y; Asoh, D A; Asaah, C; Kengne, F

    1997-02-01

    Decision making and management are problems which plague health systems in developing countries, particularly in Sub-Saharan Africa where there is significant waste of resources. The need goes beyond national health management information systems, to tools required in daily micro-management of various components of the health system. This paper describes an integrated expert system, Health-2000, an information-oriented tool for acquiring, processing and disseminating medical knowledge, data and decisions in the hospital of the future. It integrates six essential features of the medical care environment: personnel management, patient management, medical diagnosis, laboratory management, propharmacy, and equipment management. Disease conditions covered are the major tropical diseases. An intelligent tutoring feature completes the package. Emphasis is placed on the graphical user interface to facilitate interactions between the user and the system, which is developed for PCs using Pascal, C, Clipper and Prolog. PMID:9242002

  14. The Design of Large-Scale Complex Engineered Systems: Present Challenges and Future Promise

    NASA Technical Reports Server (NTRS)

    Bloebaum, Christina L.; McGowan, Anna-Maria Rivas

    2012-01-01

    Model-Based Systems Engineering (MBSE) techniques are used in the SE community to address the need for managing the development of complex systems. A key feature of the MBSE approach is the use of a model to capture the requirements, architecture, behavior, operating environment and other key aspects of the system. The focus on the model differentiates MBSE from traditional SE techniques that may have a document-centric approach. In an effort to assess the benefit of utilizing MBSE on its flight projects, NASA Langley has implemented a pilot program to apply MBSE techniques during the early phase of the Materials International Space Station Experiment-X (MISSE-X). MISSE-X is a Technology Demonstration Mission being developed by the NASA Office of the Chief Technologist. Designed to be installed on the exterior of the International Space Station (ISS), MISSE-X will host experiments that advance the technology readiness of materials and devices needed for future space exploration. As a follow-on to the highly successful series of previous MISSE experiments on ISS, MISSE-X benefits from a significant interest by the

  15. Methods, caveats and the future of large-scale microelectrode recordings in the non-human primate

    PubMed Central

    Dotson, Nicholas M.; Goodell, Baldwin; Salazar, Rodrigo F.; Hoffman, Steven J.; Gray, Charles M.

    2015-01-01

    Cognitive processes play out on massive brain-wide networks, which produce widely distributed patterns of activity. Capturing these activity patterns requires tools that are able to simultaneously measure activity from many distributed sites with high spatiotemporal resolution. Unfortunately, current techniques with adequate coverage do not provide the requisite spatiotemporal resolution. Large-scale microelectrode recording devices, with dozens to hundreds of microelectrodes capable of simultaneously recording from nearly as many cortical and subcortical areas, provide a potential way to minimize these tradeoffs. However, placing hundreds of microelectrodes into a behaving animal is a highly risky and technically challenging endeavor that has only been pursued by a few groups. Recording activity from multiple electrodes simultaneously also introduces several statistical and conceptual dilemmas, such as the multiple comparisons problem and the uncontrolled stimulus response problem. In this perspective article, we discuss some of the techniques that we, and others, have developed for collecting and analyzing large-scale data sets, and address the future of this emerging field. PMID:26578906

  16. Large-scale atmospheric circulation and local particulate matter concentrations in Bavaria - from current observations to future projections

    NASA Astrophysics Data System (ADS)

    Beck, Christoph; Weitnauer, Claudia; Brosy, Caroline; Hald, Cornelius; Lochbihler, Kai; Siegmund, Stefan; Jacobeit, Jucundus

    2016-04-01

    Particulate matter with an aerodynamic diameter of 10 μm or less (PM10) may have distinct adverse effects on human health. Spatial and temporal variations in PM10 concentrations reflect local emission rates, but are also influenced by the local and synoptic-scale atmospheric conditions. Against this background, it can furthermore be argued that potential future climate change and associated variations in large-scale atmospheric circulation and local meteorological parameters will probably provoke corresponding changes in future PM10 concentration levels. The DFG-funded research project "Particulate matter and climate change in Bavaria" aimed at establishing quantitative relationships between daily and monthly PM10 indices at different Bavarian urban stations and the corresponding large-scale atmospheric circulation as well as local meteorological conditions. To this end, several statistical downscaling approaches have been developed for the period 1980 to 2011. PM10 data from 19 stations of the air quality monitoring network (LÜB) of the Bavarian Environmental Agency (LfU) have been utilized as predictands. Large-scale atmospheric gridded data from the NCEP/NCAR reanalysis data base and local meteorological observational data provided by the German Meteorological Service (DWD) served as predictors. The downscaling approaches encompass the synoptic downscaling of daily PM10 concentrations and several multivariate statistical models for the estimation of daily and monthly PM10 indices, i.e. the monthly mean and the number of days exceeding a certain PM10 concentration threshold. Both techniques utilize objective circulation type classifications, which have been optimized with respect to their synoptic skill for the target variable PM10. All downscaling approaches have been evaluated via cross validation, using varying subintervals of the 1980-2011 period as calibration and validation periods respectively. The most suitable - in terms of model skill determined from cross
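
    The calibration/validation splitting used to evaluate such downscaling models can be illustrated with a minimal leave-one-out cross validation of a single-predictor linear model. The synthetic wind-speed/PM10 relationship and all numbers below are invented for demonstration; the project's actual models use circulation type classifications and multiple predictors.

```python
import random

def fit_linear(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys)) /
         sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def loo_cv_mse(xs, ys):
    """Leave-one-out cross validation: refit without each observation,
    predict it, and average the squared errors -- a simple analogue of
    the calibration/validation subintervals used to score downscaling
    models."""
    err = 0.0
    for i in range(len(xs)):
        a, b = fit_linear(xs[:i] + xs[i+1:], ys[:i] + ys[i+1:])
        err += (ys[i] - (a + b * xs[i])) ** 2
    return err / len(xs)

# Synthetic example: daily PM10 (ug/m3) decreasing with wind speed
# (m/s) plus noise -- the relationship and constants are invented.
random.seed(42)
wind = [random.uniform(0.5, 8.0) for _ in range(60)]
pm10 = [40.0 - 3.0 * w + random.gauss(0.0, 4.0) for w in wind]

a, b = fit_linear(wind, pm10)
print(a, b)                    # fitted intercept and (negative) slope
print(loo_cv_mse(wind, pm10))  # out-of-sample squared error
```

    Comparing the cross-validated error against a climatological baseline gives the model skill referred to in the abstract; models that only fit the calibration period inflate apparent skill, which is exactly what the held-out splits guard against.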

  17. The ecological future of the North American bison: Conceiving long-term, large-scale conservation of a species

    USGS Publications Warehouse

    Sanderson, E.W.; Redford, K.; Weber, Bill; Aune, K.; Baldes, Dick; Berger, J.; Carter, Dave; Curtin, C.; Derr, James N.; Dobrott, S.J.; Fearn, Eva; Fleener, Craig; Forrest, Steven C.; Gerlach, Craig; Gates, C. Cormack; Gross, J.E.; Gogan, P.; Grassel, Shaun M.; Hilty, Jodi A.; Jensen, Marv; Kunkel, Kyran; Lammers, Duane; List, R.; Minkowski, Karen; Olson, Tom; Pague, Chris; Robertson, Paul B.; Stephenson, Bob

    2008-01-01

    Many wide-ranging mammal species have experienced significant declines over the last 200 years; restoring these species will require long-term, large-scale recovery efforts. We highlight 5 attributes of a recent range-wide vision-setting exercise for ecological recovery of the North American bison (Bison bison) that are broadly applicable to other species and restoration targets. The result of the exercise, the “Vermejo Statement” on bison restoration, is explicitly (1) large scale, (2) long term, (3) inclusive, (4) fulfilling of different values, and (5) ambitious. It reads, in part, “Over the next century, the ecological recovery of the North American bison will occur when multiple large herds move freely across extensive landscapes within all major habitats of their historic range, interacting in ecologically significant ways with the fullest possible set of other native species, and inspiring, sustaining and connecting human cultures.” We refined the vision into a scorecard that illustrates how individual bison herds can contribute to the vision. We also developed a set of maps and analyzed the current and potential future distributions of bison on the basis of expert assessment. Although more than 500,000 bison exist in North America today, we estimated they occupy <1% of their historical range and in no place express the full range of ecological and social values of previous times. By formulating an inclusive, affirmative, and specific vision through consultation with a wide range of stakeholders, we hope to provide a foundation for conservation of bison, and other wide-ranging species, over the next 100 years.

  18. Characteristics of Al substituted nanowires fabricated by self-aligned growth for future large scale integration interconnects

    NASA Astrophysics Data System (ADS)

    Kudo, Hiroshi; Kurahashi, Teruo

    2011-06-01

    Substituted Al nanowires for use in future large-scale integration interconnects were fabricated by self-aligned growth. The resistivity of an Al substituted nanowire 80 nm in width, 100 nm in height, and 20 μm in length was 4.7 μΩ cm, which is 48% lower than that of an Al nanowire with the same dimensions fabricated using a bottom-up approach. The variation in resistivity was in a narrow range (14%) over a Si wafer. TEM imaging revealed that the Al substituted nanowire had a bamboo-like structure with grains larger than 1.6 μm. The electromigration activation energy was 0.72 eV, which is comparable to that of a pure Al wire with a bamboo-like structure. The product of the critical current density and wire length was 1.3 × 10^3 A/cm at 250 °C, 2.1 times higher than that of a pure Al wire with a polycrystalline structure. The current-density exponent of electromigration acceleration was 2.0, indicating that incubation time dominates the electromigration lifetime. The prolonged incubation time observed in the electromigration test is attributed to the reduction in electromigration-induced mass transport due to the microstructure of the Al substituted nanowire. Even the formation of a small void immediately after incubation may be a fatal defect for nanoscale Al wires.

  19. Future Astronomical Observatories on the Moon

    NASA Technical Reports Server (NTRS)

    Burns, Jack O. (Editor); Mendell, Wendell W. (Editor)

    1988-01-01

    Papers from a workshop considering the topic of astronomical observations from a lunar base are presented. In part 1, the rationale for performing astronomy on the Moon is established and economic factors are considered. Part 2 includes concepts for individual lunar-based telescopes at the shortest X-ray and gamma-ray wavelengths, for high-energy cosmic rays, and at optical and infrared wavelengths. Lunar radio frequency telescopes are considered in part 3, and engineering considerations for lunar base observatories are discussed in part 4. Throughout, the advantages and disadvantages of lunar basing compared to terrestrial and orbital basing of observatories are weighed. The participants concluded that the Moon is very possibly the best location within the inner solar system from which to perform front-line astronomical research.

  20. X6.9-CLASS FLARE-INDUCED VERTICAL KINK OSCILLATIONS IN A LARGE-SCALE PLASMA CURTAIN AS OBSERVED BY THE SOLAR DYNAMICS OBSERVATORY/ATMOSPHERIC IMAGING ASSEMBLY

    SciTech Connect

    Srivastava, A. K.; Goossens, M.

    2013-11-01

    We present rare observational evidence of vertical kink oscillations in a laminar and diffused large-scale plasma curtain as observed by the Atmospheric Imaging Assembly on board the Solar Dynamics Observatory. The X6.9-class flare in active region 11263 on 2011 August 9 induces a global large-scale disturbance that propagates in a narrow lane above the plasma curtain and creates a low density region that appears as a dimming in the observational image data. This large-scale propagating disturbance acts as a non-periodic driver that interacts asymmetrically and obliquely with the top of the plasma curtain and triggers the observed oscillations. In the deeper layers of the curtain, we find evidence of vertical kink oscillations with two periods (795 s and 530 s). On the magnetic surface of the curtain where the density is inhomogeneous due to coronal dimming, non-decaying vertical oscillations are also observed (period ≈ 763-896 s). We infer that the global large-scale disturbance triggers vertical kink oscillations in the deeper layers as well as on the surface of the large-scale plasma curtain. The properties of the excited waves strongly depend on the local plasma and magnetic field conditions.

  1. Planning Our Future Together: Using a Large-Scale Systems Change Process for Educational Reform: Video Discussion Guide.

    ERIC Educational Resources Information Center

    Sauer, John; Walz, Lynn

    This video discussion guide is intended for use in groups working toward large-scale systems changes in schools. It is designed to be appropriate for use in a three-hour workshop, an undergraduate or graduate course, or a training seminar. The guide and proposed workshop are both structured into eight sections. Recommended time allotment,…

  2. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    ERIC Educational Resources Information Center

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  3. Recent results and future challenges for large scale Particle-In-Cell simulations of plasma-based accelerator concepts

    SciTech Connect

    Huang, C.; An, W.; Decyk, V.K.; Lu, W.; Mori, W.B.; Tsung, F.S.; Tzoufras, M.; Morshed, S.; Antomsen, T.; Feng, B.; Katsouleas, T; Fonseca, R.A.; Martins, S.F.; Vieira, J.; Silva, L.O.; Geddes, C.G.R.; Cormier-Michel, E; Vay, J.-L.; Esarey, E.; Leemans, W.P.; Bruhwiler, D.L.; Cowan, B.; Cary, J.R.; Paul, K.

    2009-05-01

    The concept and designs of plasma-based advanced accelerators for high energy physics and photon science are modeled in the SciDAC COMPASS project with a suite of Particle-In-Cell codes and simulation techniques, including the full electromagnetic model, the envelope model, the boosted-frame approach and the quasi-static model. In this paper, we report the progress of the development of these models and techniques and present recent results achieved with large-scale parallel PIC simulations. The simulation needs for modeling plasma-based advanced accelerators at the energy frontier are discussed, and a path towards this goal is outlined.
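    The particle-in-cell cycle shared by the codes above (deposit charge on a grid, solve for the fields, gather them back to the particles, push the particles) can be illustrated with a minimal sketch. This is a generic 1D electrostatic illustration, not code from the COMPASS suite; the nearest-grid-point weighting, periodic boundaries, and units (eps0 = 1) are simplifying assumptions.

    ```python
    import numpy as np

    def pic_step(x, v, q_over_m, grid_n, box_len, dt):
        """One cycle of a minimal 1D electrostatic PIC loop:
        deposit charge, solve the field, gather, push (units with
        eps0 = 1, a neutralizing background, periodic boundaries)."""
        dx = box_len / grid_n
        # 1. Deposit: nearest-grid-point charge assignment.
        idx = np.floor(x / dx + 0.5).astype(int) % grid_n
        rho = np.bincount(idx, minlength=grid_n).astype(float) / dx
        rho -= rho.mean()  # uniform neutralizing background
        # 2. Field solve: dE/dx = rho  ->  E_k = -i rho_k / k in Fourier space.
        k = 2.0 * np.pi * np.fft.fftfreq(grid_n, d=dx)
        rho_k = np.fft.fft(rho)
        E_k = np.zeros_like(rho_k)
        nonzero = k != 0
        E_k[nonzero] = -1j * rho_k[nonzero] / k[nonzero]
        E = np.real(np.fft.ifft(E_k))
        # 3. Gather and push: update velocity, then position.
        v = v + q_over_m * E[idx] * dt
        x = (x + v * dt) % box_len
        return x, v
    ```

    A uniform cold distribution is an equilibrium of this scheme: the deposited charge is cancelled by the background, so the field and velocities stay zero.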

  4. On-orbit assembly and servicing of future space observatories

    NASA Astrophysics Data System (ADS)

    Lillie, C. F.

    2006-06-01

    NASA's experience servicing the Hubble Space Telescope, including the installation of optical elements to compensate for a mirror manufacturing error; replacement of failed avionics and worn-out batteries, gyros, thermal insulation and solar arrays; upgrades to the data handling subsystem; installation of far more capable instruments; and retrofitting the NICMOS experiment with a mechanical cryocooler has clearly demonstrated the advantages of on-orbit servicing. This effort has produced a unique astronomical observatory that is orders of magnitude more capable than when it was launched and can be operated for several times its original design life. The in-space operations capabilities that are developed for NASA's Exploration Program will make it possible to assemble and service spacecraft in space and to service them in cis-lunar and L2 orbits. Future space observatories should be designed to utilize these capabilities. This paper discusses the application of the lessons learned from HST and our plans for servicing the Advanced X-ray Astrophysical Observatory with the Orbital Maneuvering Vehicle and the Space Station Freedom Customer Servicing Facility to future space observatories, such as SAFIR and LifeFinder that are designed to operate in heliocentric orbits. It addresses the use of human and robotic in-space capabilities that would be required for on-orbit assembly and servicing for future space observatories, and describes some of our design concepts for these activities.

  5. The application of LiF:Mg,Cu,P to large scale personnel dosimetry: current status and future directions.

    PubMed

    Moscovitch, M; St John, T J; Cassata, J R; Blake, P K; Rotunda, J E; Ramlo, M; Velbeck, K J; Luo, L Z

    2006-01-01

    LiF:Mg,Cu,P is starting to replace LiF:Mg,Ti in a variety of personnel dosimetry applications. LiF:Mg,Cu,P has superior characteristics compared to LiF:Mg,Ti, including higher sensitivity, improved energy response for photons, lack of supralinearity and insignificant fading. The use of LiF:Mg,Cu,P in large-scale dosimetry programs is of particular interest due to the extreme sensitivity of this material to the maximum readout temperature, and the variety of dosimetry aspects and details that must be considered for successful implementation in routine dosimetry. Here we discuss and explain the various aspects of large-scale LiF:Mg,Cu,P-based dosimetry programs, including the properties of the TL material, a new generation of TLD readers, calibration methodologies, a new generation of dose calculation algorithms based on the use of artificial neural networks, and the overall uncertainty of the dose measurement. The United States Navy (USN) will be the first US dosimetry processor to use this new material for routine applications. Until June 2002, the Navy used two types of thermoluminescent materials for personnel dosimetry, CaF2:Mn and LiF:Mg,Ti. A program to upgrade the system and to implement LiF:Mg,Cu,P started in the mid 1990s and was recently concluded. In 2002, the new system replaced the LiF:Mg,Ti system and is scheduled to start replacing the CaF2:Mn system in 2006. A pilot study to determine the dosimetric performance of the new LiF:Mg,Cu,P-based dosimetry system was recently completed, and the results show the new system to be as good as or better than the current system in all areas tested. As a result, LiF:Mg,Cu,P is scheduled to become the primary personnel dosimeter for the entire US Navy in 2006. PMID:16835277
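    The neural-network dose algorithm referenced above is not detailed in this record. As a purely hypothetical illustration of the idea (a small feed-forward network mapping normalized multi-element TLD responses to a dose estimate), a toy sketch might look like the following; the layer sizes, learning rate, and training scheme are all arbitrary assumptions, not the Navy algorithm.

    ```python
    import numpy as np

    def train_toy_dose_net(responses, doses, hidden=8, lr=0.1, epochs=3000, seed=0):
        """Train a toy one-hidden-layer network (tanh hidden units, linear
        output, plain gradient descent on squared error) mapping normalized
        TLD element responses to a dose estimate. Illustrative only."""
        rng = np.random.default_rng(seed)
        X = np.asarray(responses, dtype=float)
        y = np.asarray(doses, dtype=float).reshape(-1, 1)
        W1 = rng.normal(0.0, 0.5, (X.shape[1], hidden))
        b1 = np.zeros(hidden)
        W2 = rng.normal(0.0, 0.5, (hidden, 1))
        b2 = np.zeros(1)
        for _ in range(epochs):
            h = np.tanh(X @ W1 + b1)        # hidden activations
            pred = h @ W2 + b2              # dose estimate
            err = pred - y
            # Backpropagate mean-squared-error gradients.
            gW2 = h.T @ err / len(X)
            gb2 = err.mean(axis=0)
            gh = (err @ W2.T) * (1.0 - h ** 2)
            gW1 = X.T @ gh / len(X)
            gb1 = gh.mean(axis=0)
            W2 -= lr * gW2; b2 -= lr * gb2
            W1 -= lr * gW1; b1 -= lr * gb1
        return lambda Xn: np.tanh(np.asarray(Xn) @ W1 + b1) @ W2 + b2
    ```

    In practice such a network would be calibrated against reference irradiations; here any smooth synthetic response-to-dose mapping serves to exercise the fit.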

  6. Future technological tests on large-scale mock-ups of ITER blanket modules at IVV-2M reactor

    SciTech Connect

    Zyrianov, A.P.; Tokarev, V.I.; Zlokazov, S.B.

    1994-12-31

    The multisection core of the water-cooled, water-moderated reactor IVV-2M facilitates testing of large-scale mock-ups of ITER breeder blanket modules; the reactor's arrangement in the building places the "in-pile" tritium measurement station "RITM" as close as possible to the core (for in-pile testing of tritium-producing mock-ups). Mock-ups of ceramic and liquid-metal blankets are planned to be tested to the following requirements: mock-up dimensions as close as possible to those of ITER, representative distributions of nuclear power density and temperature fields, tritium release modes under continuous helium purging, and provision for cyclic variation of neutron and thermal loading. Variants for locating a large (~150x200 mm) mock-up of a ceramic blanket and a submerged loop facility containing liquid lithium with a vanadium alloy as the structural material are described. A technological scheme for the "RITM" measurement station to study tritium system operation modes is presented.

  7. Machine learning for large-scale wearable sensor data in Parkinson's disease: Concepts, promises, pitfalls, and futures.

    PubMed

    Kubota, Ken J; Chen, Jason A; Little, Max A

    2016-09-01

    For the treatment and monitoring of Parkinson's disease (PD) to be scientific, a key requirement is that measurement of disease stages and severity is quantitative, reliable, and repeatable. The last 50 years in PD research have been dominated by qualitative, subjective ratings obtained by human interpretation of the presentation of disease signs and symptoms at clinical visits. More recently, "wearable," sensor-based, quantitative, objective, and easy-to-use systems for quantifying PD signs for large numbers of participants over extended durations have been developed. This technology has the potential to significantly improve both clinical diagnosis and management in PD and the conduct of clinical studies. However, the large-scale, high-dimensional character of the data captured by these wearable sensors requires sophisticated signal processing and machine-learning algorithms to transform it into scientifically and clinically meaningful information. Such algorithms that "learn" from data have shown remarkable success in making accurate predictions for complex problems in which human skill has been required to date, but they are challenging to evaluate and apply without a basic understanding of the underlying logic on which they are based. This article contains a nontechnical tutorial review of relevant machine-learning algorithms, also describing their limitations and how these can be overcome. It discusses implications of this technology and a practical road map for realizing the full potential of this technology in PD research and practice. © 2016 International Parkinson and Movement Disorder Society. PMID:27501026

  8. Climate change and large-scale land acquisitions in Africa: Quantifying the future impact on acquired water resources

    NASA Astrophysics Data System (ADS)

    Chiarelli, Davide Danilo; Davis, Kyle Frankel; Rulli, Maria Cristina; D'Odorico, Paolo

    2016-08-01

    Pressure on agricultural land has markedly increased since the start of the century, driven by demographic growth, changes in diet, increasing biofuel demand, and globalization. To better ensure access to adequate land and water resources, many investors and countries began leasing large areas of agricultural land in the global South, a phenomenon often termed "large-scale land acquisition" (LSLA). To date, this global land rush has resulted in the appropriation of 41 million hectares and about 490 km3 of freshwater resources, affecting rural livelihoods and local environments. It remains unclear to what extent land and water acquisitions contribute to the emergence of water-stress conditions in acquired areas, and how these demands for water may be impacted by climate change. Here we analyze 18 African countries - 20 Mha (or 80%) of LSLA for the continent - and estimate that under the present climate 210 km3 year-1 of water would be appropriated if all acquired areas were actively under production. We also find that consumptive use of irrigation water is disproportionately contributed by water-intensive biofuel crops. Using the IPCC A1B scenario, we find only small changes in green (-1.6%) and blue (+2.0%) water demand in targeted areas. With a 3 °C temperature increase, crop yields are expected to decrease by up to 20%, with a consequent increase in the water footprint. When the effect of increasing atmospheric CO2 concentrations is accounted for, crop yields increase by as much as 40%, with a decrease in water footprint of up to 29%. The relative importance of CO2 fertilization and warming will therefore determine water appropriations and changes in water footprint under climate change scenarios.

  9. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    SciTech Connect

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E\\gt 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_{1}^{\\alpha }=(4.4\\pm 1.0)\\times {{10}^{-2}}$, that has a chance probability $P(\\geqslant r_{1}^{\\alpha })=6.4\\times {{10}^{-5}}$, reinforcing the hint previously reported with vertical events alone.
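    The first-harmonic Rayleigh analysis referenced here has a compact form: from the Fourier coefficients a and b of the right-ascension distribution, the amplitude is r1 = sqrt(a^2 + b^2), and for N isotropic events the chance probability of observing at least that amplitude is P(>= r1) = exp(-N r1^2 / 4). A minimal sketch follows; the exposure weighting and corrections used in the actual Auger analysis are omitted.

    ```python
    import numpy as np

    def rayleigh_first_harmonic(alpha_deg):
        """First-harmonic Rayleigh analysis of a right-ascension distribution.

        Returns the amplitude r1 = sqrt(a^2 + b^2) of the first Fourier
        harmonic and the chance probability P(>= r1) = exp(-N r1^2 / 4)
        that an isotropic sky yields at least this amplitude.
        """
        alpha = np.radians(np.asarray(alpha_deg, dtype=float))
        n = alpha.size
        a = (2.0 / n) * np.cos(alpha).sum()   # first cosine coefficient
        b = (2.0 / n) * np.sin(alpha).sum()   # first sine coefficient
        r1 = np.hypot(a, b)
        p_chance = np.exp(-n * r1 ** 2 / 4.0)
        return r1, p_chance
    ```

    For directions all clustered at one right ascension, r1 approaches 2 and the chance probability collapses to zero; for evenly spread directions, r1 approaches 0.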

  10. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    DOE PAGESBeta

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E\gt 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_{1}^{\alpha }=(4.4\pm 1.0)\times {{10}^{-2}}$, that has a chance probability $P(\geqslant r_{1}^{\alpha })=6.4\times {{10}^{-5}}$, reinforcing the hint previously reported with vertical events alone.

  11. CORK Borehole Observatory Meets NEPTUNE Canada Cabled Observatory: First Experiences and Future Plans

    NASA Astrophysics Data System (ADS)

    Heesemann, M.; Davis, E. E.; Scherwath, M.

    2011-12-01

    The connection of the CORK ("Circulation Obviation Retrofit Kit") borehole observatory monitoring Ocean Drilling Program (ODP) borehole 1026B to the NEPTUNE Canada ocean network in September 2009 marks the beginning of a new era of cabled subseafloor observations. The electrical power and real-time data access provided by cables improve the sampling rate, lifetime, and timing accuracy of existing borehole instrumentation. Cabled observatories also provide the opportunity to deploy advanced instruments that consume more power and produce more data than ever before. Using data from the 1026B CORK, we demonstrate how the higher sampling rate of cabled CORK observatories enables us to study phenomena such as ocean weather and hydrologic responses to seismic waves. In an outlook, we show how CORKs and new borehole instruments, planned for future connection to the NEPTUNE Canada ocean network, can help yield critical information on the accumulation of stress and the resulting strain of plate-scale crustal movements. In the future, these CORKs and new geodetic borehole instrumentation will provide a time series of strain signals associated with the Cascadia subduction zone that would not have been possible with remote sensing or land-based monitoring. These CORKs will not only represent a new approach for earthquake research; the high-frequency, real-time data could also directly contribute to earthquake and tsunami early warning systems.

  12. ANALYSIS OF CHARACTERISTIC PARAMETERS OF LARGE-SCALE CORONAL WAVES OBSERVED BY THE SOLAR-TERRESTRIAL RELATIONS OBSERVATORY/EXTREME ULTRAVIOLET IMAGER

    SciTech Connect

    Muhr, N.; Veronig, A. M.; Kienreich, I. W.; Temmer, M.; Vrsnak, B.

    2011-10-01

    The kinematical evolution of four extreme ultraviolet waves, well observed by the Extreme Ultraviolet Imager on board the Solar-Terrestrial Relations Observatory (STEREO), is studied by visually tracking wave fronts as well as by a semi-automatized perturbation profile method, which leads to results matching each other within the error limits. The derived mean velocities of the events under study lie in the range of 220-350 km s^-1. The fastest of the events (2007 May 19) reveals a significant deceleration of ~ -190 m s^-2, while the others are consistent with a constant velocity during wave propagation. The evolution of maximum-intensity values reveals an initial intensification of 20%-70% that decays to original levels within 40-60 minutes, while the widths at half-maximum and full-maximum of the perturbation profiles broaden by a factor of two to four. The integral below the perturbation profile remains basically constant in two cases, while it shows a decrease by a factor of three to four in the other two cases. From the peak perturbation amplitudes, we estimate the corresponding magnetosonic Mach numbers M_ms, which range from 1.08 to 1.21. The perturbation profiles reveal three distinct features behind the propagating wave fronts: coronal dimmings, stationary brightenings, and rarefaction regions. All features appear after the wave passage and only slowly fade away. Our findings indicate that the events under study are weak-shock fast-mode magnetohydrodynamic waves initiated by the CME's lateral expansion.

  13. Genetic influences on schizophrenia and subcortical brain volumes: large-scale proof-of-concept and roadmap for future studies

    PubMed Central

    Anttila, Verneri; Hibar, Derrek P; van Hulzen, Kimm J E; Arias-Vasquez, Alejandro; Smoller, Jordan W; Nichols, Thomas E; Neale, Michael C; McIntosh, Andrew M; Lee, Phil; McMahon, Francis J; Meyer-Lindenberg, Andreas; Mattheisen, Manuel; Andreassen, Ole A; Gruber, Oliver; Sachdev, Perminder S; Roiz-Santiañez, Roberto; Saykin, Andrew J; Ehrlich, Stefan; Mather, Karen A; Turner, Jessica A; Schwarz, Emanuel; Thalamuthu, Anbupalam; Shugart, Yin Yao; Ho, Yvonne YW; Martin, Nicholas G; Wright, Margaret J

    2016-01-01

    Schizophrenia is a devastating psychiatric illness with high heritability. Brain structure and function differ, on average, between schizophrenia cases and healthy individuals. As common genetic associations are emerging for both schizophrenia and brain imaging phenotypes, we can now use genome-wide data to investigate genetic overlap. Here we integrated results from common variant studies of schizophrenia (33,636 cases, 43,008 controls) and volumes of several (mainly subcortical) brain structures (11,840 subjects). We did not find evidence of genetic overlap between schizophrenia risk and subcortical volume measures either at the level of common variant genetic architecture or for single genetic markers. The current study provides proof-of-concept (albeit based on a limited set of structural brain measures), and defines a roadmap for future studies investigating the genetic covariance between structural/functional brain phenotypes and risk for psychiatric disorders. PMID:26854805

  14. Future Large-Scale Surveys of 'Interesting' Stars in The Halo and Thick Disk of the Galaxy

    NASA Astrophysics Data System (ADS)

    Beers, T. C.

    The age of slow, methodical, star-by-star, single-slit spectroscopic observations of rare stars in the halo and thick disk of the Milky Way has come to an end. As the result of the labors of numerous astronomers over the past 40 years, spectroscopic data for some 2000 stars with metallicity less than [Fe/H] = -1.5 has been obtained. Under the assumption of a constant flux of astronomers working in this area (and taking 50 major players over the years), the long-term average yield works out to ONE (1!) such star per astronomer per year. The use of new spectroscopic and photometric survey techniques which obtain large sky coverage to faint magnitudes will enable substantially better "return on investment" in the near future. We review the present state of surveys for low metallicity and field horizontal-branch stars in the Galaxy, and describe several new lines of attack which should open the way to a more than one hundred-fold increase in the numbers of interesting stars with available spectroscopic and photometric information.

  15. Phylogeny Drives Large Scale Patterns in Australian Marine Bioactivity and Provides a New Chemical Ecology Rationale for Future Biodiscovery

    PubMed Central

    Evans-Illidge, Elizabeth A.; Logan, Murray; Doyle, Jason; Fromont, Jane; Battershill, Christopher N.; Ericson, Gavin; Wolff, Carsten W.; Muirhead, Andrew; Kearns, Phillip; Abdo, David; Kininmonth, Stuart; Llewellyn, Lyndon

    2013-01-01

    Twenty-five years of Australian marine bioresources collecting and research by the Australian Institute of Marine Science (AIMS) has explored the breadth of latitudinally and longitudinally diverse marine habitats that comprise Australia’s ocean territory. The resulting AIMS Bioresources Library and associated relational database integrate biodiversity with bioactivity data, and these resources were mined to retrospectively assess biogeographic, taxonomic and phylogenetic patterns in cytotoxic, antimicrobial, and central nervous system (CNS)-protective bioactivity. While the bioassays used were originally chosen to be indicative of pharmaceutically relevant bioactivity, the results have qualified ecological relevance regarding secondary metabolism. In general, metazoan phyla along the deuterostome phylogenetic pathway (e.g. to Chordata) and their ancestors (e.g. Porifera and Cnidaria) had higher percentages of bioactive samples in the assays examined. While taxonomy at the phylum level and higher-order phylogeny groupings helped account for observed trends, taxonomy to genus did not resolve the trends any further. In addition, the results did not identify any biogeographic bioactivity hotspots that correlated with biodiversity hotspots. We conclude with a hypothesis that high-level phylogeny, and therefore the metabolic machinery available to an organism, is a major determinant of bioactivity, while habitat diversity and ecological circumstance are possible drivers in the activation of this machinery and of bioactive secondary metabolism. This study supports the strategy of targeting phyla from the deuterostome lineage (including ancestral phyla) from biodiverse marine habitats and ecological niches in future biodiscovery, at least that which is focused on vertebrate (including human) health. PMID:24040076

  16. The large scale structure of the Universe revealed with high redshift emission-line galaxies: implications for future surveys

    NASA Astrophysics Data System (ADS)

    Antonino Orsi, Alvaro

    2015-08-01

    Nebular emission in galaxies traces their star-formation activity within the last 10 Myr or so. Hence, these objects are typically found in the outskirts of massive clusters, where otherwise environmental effects can effectively stop the star-formation process. In this talk I discuss the nature of emission-line galaxies (ELGs) and its implications for their clustering properties. To account for the relevant physical ingredients that produce nebular emission, I combine semi-analytical models of galaxy formation with a radiative transfer code for Ly-alpha photons and the photoionization and shock code MAPPINGS-III. As a result, the clustering strength of ELGs is found to correlate weakly with the line luminosities. Also, their 2-d clustering displays a weak finger-of-god effect, and the clustering on linear scales is affected by assembly bias. I review the implications of the nature of this galaxy population for future large spectroscopic surveys targeting ELGs to extract cosmological results. In particular, I present forecasts for the ELG population in J-PAS, an 8000 deg^2 survey with 54 narrow-band filters covering the optical range, expected to start in 2016.

  17. A future large-aperture UVOIR space observatory: reference designs

    NASA Astrophysics Data System (ADS)

    Rioux, Norman; Thronson, Harley; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-09-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  18. A Future Large-Aperture UVOIR Space Observatory: Reference Designs

    NASA Technical Reports Server (NTRS)

    Thronson, Harley; Rioux, Norman; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-01-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  19. Impact of idealized future stratospheric aerosol injection on the large-scale ocean and land carbon cycles

    NASA Astrophysics Data System (ADS)

    Tjiputra, J. F.; Grini, A.; Lee, H.

    2016-01-01

    Using an Earth system model, we simulate stratospheric aerosol injection (SAI) on top of the Representative Concentration Pathway 8.5 future scenario. Our idealized method prescribes aerosol concentration, linearly increasing from 2020 to 2100, and thereafter remaining constant until 2200. In the aggressive scenario, the model projects a cooling trend toward 2100 despite warming that persists in the high latitudes. Following SAI termination in 2100, a rapid global warming of 0.35 K yr-1 is simulated in the subsequent 10 years, and the global mean temperature returns to levels close to the reference state, though roughly 0.5 K cooler. In contrast to earlier findings, we show a weak response in the terrestrial carbon sink during SAI implementation in the 21st century, which we attribute to nitrogen limitation. The SAI increases the land carbon uptake in the temperate forest-, grassland-, and shrub-dominated regions. The resultant lower temperatures lead to a reduction in the heterotrophic respiration rate and increase soil carbon retention. Changes in precipitation patterns are key drivers of variability in vegetation carbon. Upon SAI termination, the level of vegetation carbon storage returns to the reference case, whereas the soil carbon remains high. The ocean absorbs nearly 10% more carbon in the geoengineered simulation than in the reference simulation, leading to a ~15 ppm lower atmospheric CO2 concentration in 2100. The largest enhancement in uptake occurs in the North Atlantic. In both hemispheres' polar regions, SAI delays the sea ice melting and, consequently, export production remains low. In the deep water of the North Atlantic, SAI-induced circulation changes accelerate the ocean acidification rate and broaden the affected area.

  20. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  1. Genetic Diversity and Ecological Niche Modelling of Wild Barley: Refugia, Large-Scale Post-LGM Range Expansion and Limited Mid-Future Climate Threats?

    PubMed Central

    Russell, Joanne; van Zonneveld, Maarten; Dawson, Ian K.; Booth, Allan; Waugh, Robbie; Steffenson, Brian

    2014-01-01

    Describing genetic diversity in wild barley (Hordeum vulgare ssp. spontaneum) in geographic and environmental space in the context of current, past and potential future climates is important for conservation and for breeding the domesticated crop (Hordeum vulgare ssp. vulgare). Spatial genetic diversity in wild barley was revealed by both nuclear- (2,505 SNP, 24 nSSR) and chloroplast-derived (5 cpSSR) markers in 256 widely-sampled geo-referenced accessions. Results were compared with MaxEnt-modelled geographic distributions under current, past (Last Glacial Maximum, LGM) and mid-term future (anthropogenic scenario A2, the 2080s) climates. Comparisons suggest large-scale post-LGM range expansion in Central Asia and relatively small, but statistically significant, reductions in range-wide genetic diversity under future climate. Our analyses support the utility of ecological niche modelling for locating genetic diversity hotspots and determine priority geographic areas for wild barley conservation under anthropogenic climate change. Similar research on other cereal crop progenitors could play an important role in tailoring conservation and crop improvement strategies to support future human food security. PMID:24505252

  2. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  3. Large Scale Computing

    NASA Astrophysics Data System (ADS)

    Capiluppi, Paolo

    2005-04-01

    Large Scale Computing is acquiring an important role in the field of data analysis and treatment for many sciences and also for some social activities. The present paper discusses the characteristics of computing when it becomes "large scale" and the current state of the art for particular applications needing such large, distributed resources and organization. High Energy Particle Physics (HEP) experiments are discussed in this respect; in particular, the Large Hadron Collider (LHC) experiments are analyzed. The Computing Models of the LHC experiments represent the current prototype implementation of Large Scale Computing and describe the level of maturity of the possible deployment solutions. Some of the most recent results on measurements of the performance and functionality of the LHC experiments' testing are discussed.

  4. Operations of and Future Plans for the Pierre Auger Observatory

    SciTech Connect

    Abraham, : J.; Abreu, P.; Aglietta, M.; Aguirre, C.; Ahn, E.J.; Allard, D.; Allekotte, I.; Allen, J.; Alvarez-Muniz, J.; Ambrosio, M.; Anchordoqui, L.

    2009-06-01

    These are presentations to be presented at the 31st International Cosmic Ray Conference, in Lodz, Poland during July 2009. It consists of the following presentations: (1) Performance and operation of the Surface Detectors of the Pierre Auger Observatory; (2) Extension of the Pierre Auger Observatory using high-elevation fluorescence telescopes (HEAT); (3) AMIGA - Auger Muons and Infill for the Ground Array of the Pierre Auger Observatory; (4) Radio detection of Cosmic Rays at the southern Auger Observatory; (5) Hardware Developments for the AMIGA enhancement at the Pierre Auger Observatory; (6) A simulation of the fluorescence detectors of the Pierre Auger Observatory using GEANT 4; (7) Education and Public Outreach at the Pierre Auger Observatory; (8) BATATA: A device to characterize the punch-through observed in underground muon detectors and to operate as a prototype for AMIGA; and (9) Progress with the Northern Part of the Pierre Auger Observatory.

  5. Hydrological projections under climate change in the near future by RegCM4 in Southern Africa using a large-scale hydrological model

    NASA Astrophysics Data System (ADS)

    Li, Lu; Diallo, Ismaïla; Xu, Chong-Yu; Stordal, Frode

    2015-09-01

    This study aims to provide model estimates of changes in hydrological elements, such as EvapoTranspiration (ET) and runoff, in Southern Africa in the near future, until 2029. The climate change scenarios are projected by a high-resolution Regional Climate Model (RCM), RegCM4, the latest version of the model developed by the Abdus Salam International Centre for Theoretical Physics (ICTP). The hydrological projections are performed using a large-scale hydrological model (WASMOD-D), which was tested and customized for this region prior to this study. The results reveal that (1) the projected temperature shows an increasing tendency over Southern Africa in the near future, especially east of 25°E, while the precipitation changes vary between months and sub-regions; (2) an increase in runoff (and ET) was found in the eastern part of Southern Africa, i.e. southern Mozambique and Malawi, while a decrease was estimated across the driest region, a wide area encompassing the Kalahari Desert, Namibia, the southwest of South Africa, and Angola; (3) the strongest climate change signals are found over humid tropical areas, i.e. northern Angola, Malawi, and the southern Democratic Republic of the Congo; and (4) large spatial and temporal variability of the climate change signals is found in the near future over Southern Africa. This study presents the main results of work-package 2 (WP2) of the 'Socioeconomic Consequences of Climate Change in Sub-equatorial Africa (SoCoCA)' project, which is funded by the Research Council of Norway.

  6. Long-term precipitation variability in Morocco and the link to the large-scale circulation in recent and future climates

    NASA Astrophysics Data System (ADS)

    Knippertz, P.; Christoph, M.; Speth, P.

    Monthly precipitation data from the Global Historical Climatology Network for 42 stations in Morocco and its vicinity are investigated with respect to baroclinicity, storm track and cyclone activity, moisture transports, North Atlantic Oscillation (NAO) variations, and different circulation types by means of correlation and composite studies. The results are related to a climate change scenario from an ECHAM4/OPYC3 transient greenhouse gas only (GHG) simulation. Precipitation in northwestern Morocco shows a clear link to the baroclinic activity over the North Atlantic during boreal winter (DJF). In large precipitation months the North Atlantic storm track is shifted southward, more westerly and northwesterly circulation situations occur, and moisture transports from the Atlantic are enhanced. Local cyclones and upper-level troughs occur more frequently than in low precipitation months. The negative correlation with the NAO is relatively strong, especially with Gibraltar as a southern pole (-0.71). The northward shift of the storm track and eastward shift of the Azores High predicted by the ECHAM model for increasing GHG concentrations would therefore be associated with decreasing precipitation and potentially serious impacts on the future water supply for parts of Morocco. In the region south of the Atlas Mountains, moisture transports from the Atlantic along the southern flank of the Atlas Mountains, associated with cyclones west of Morocco and the Iberian Peninsula, can be identified as a decisive factor for precipitation. Northeastern Morocco and northwestern Algeria, however, are dominated by the influence of cyclones over the western Mediterranean that are associated with a strong northwesterly moisture transport. As both regions appear to be less dependent on the North Atlantic storm track and more on local processes, a straightforward interpretation of the large-scale changes predicted by the ECHAM4/OPYC3 cannot be done without the

  7. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  8. A Future Large-Aperture UVOIR Space Observatory: Study Overview

    NASA Astrophysics Data System (ADS)

    Postman, Marc; Thronson, Harley A.; Feinberg, Lee; Redding, David; Stahl, H. Philip

    2015-01-01

    The scientific drivers for very high angular resolution coupled with very high sensitivity and wavefront stability in the UV and optical wavelength regime have been well established. These include characterization of exoplanets in the habitable zones of solar type stars, probing the physical properties of the circumgalactic medium around z < 2 galaxies, and resolving stellar populations across a broad range of galactic environments. The 2010 NRC Decadal Survey and the 2013 NASA Science Mission Directorate 30-Year Roadmap identified a large-aperture UVOIR observatory as a priority future space mission. Our joint NASA GSFC/JPL/MSFC/STScI team has extended several earlier studies of the technology and engineering requirements needed to design and build a single filled aperture 10-meter class space-based telescope that can enable these ambitious scientific observations. We present here an overview of our new technical work including a brief summary of the reference science drivers as well as in-depth investigations of the viable telescope architectures, the requirements on thermal control and active wavefront control systems, and the range of possible launch configurations.

  9. Future development of the PLATO Observatory for Antarctic science

    NASA Astrophysics Data System (ADS)

    Ashley, Michael C. B.; Bonner, Colin S.; Everett, Jon R.; Lawrence, Jon S.; Luong-Van, Daniel; McDaid, Scott; McLaren, Campbell; Storey, John W. V.

    2010-07-01

    PLATO is a self-contained robotic observatory built into two 10-foot shipping containers. It has been successfully deployed at Dome A on the Antarctic plateau since January 2008, and has accumulated over 730 days of uptime at the time of writing. PLATO provides 0.5-1 kW of continuous electrical power for a year from diesel engines running on Jet-A1, supplemented during the summertime with solar panels. One of the 10-foot shipping containers houses the power system and fuel; the other provides a warm environment for instruments. Two Iridium satellite modems allow 45 MB/day of data to be transferred across the internet. Future enhancements to PLATO, currently in development, include a more modular design, lithium iron-phosphate batteries, higher power output, and a lightweight low-power version for field deployment from a Twin Otter aircraft. Technologies used in PLATO include a CAN (Controller Area Network) bus, high-reliability PC/104 computers, ultracapacitors for starting the engines, and fault-tolerant redundant design.

  10. A robust methodology for conducting large-scale assessments of current and future water availability and use: A case study in Tasmania, Australia

    NASA Astrophysics Data System (ADS)

    Post, D. A.; Chiew, F. H. S.; Teng, J.; Viney, N. R.; Ling, F. L. N.; Harrington, G.; Crosbie, R. S.; Graham, B.; Marvanek, S.; McLoughlin, R.

    2012-01-01

    This paper describes a robust methodology for determining current surface water and groundwater availability and use, as well as future changes due to climate and land-use changes. It is based on the methodology developed by CSIRO to deliver four large-scale water availability assessments conducted in the Murray-Darling Basin, northern Australia, south-west Western Australia, and Tasmania. It focuses on the application of the technique and results from Tasmania, providing a representative example of the approach used. The genesis of this work was the explicit desire by Australian State and Commonwealth governments to use the outputs of these water availability assessments to assist the formation of state and federal government water policy. For example, the results of the work have already been utilised as a key technical input to decision making on funding for proposed irrigation projects in Tasmania. Outputs from the other three study areas have been used to assist in developing a water resources plan for the Murray-Darling Basin, to guide infrastructure development in northern Australia, and to plan for reductions in water availability due to climate change in south-west Western Australia. The methodology assesses current water availability through the application of rainfall-runoff and river models, and recharge and groundwater models. These were calibrated to streamflow records and groundwater levels, and parameterised using estimates of current surface and groundwater extractions and use. Having derived an estimate of current water availability, the impacts of future climate change on water availability were determined by deriving projected changes in rainfall and potential evaporation from 15 IPCC AR4 global climate models. The changes in rainfall were then dynamically downscaled using the CSIRO-CCAM model over the study area (50,000 km²). The future climate sequence was then derived by modifying the historical 84-year climate sequence based

  11. The Pierre Auger Observatory: Mass composition results and future plans

    NASA Astrophysics Data System (ADS)

    Hervé, A. E.; Pierre Auger Collaboration

    2016-07-01

    The Pierre Auger Observatory has been designed to study ultra-high energy cosmic rays. The study of their mass composition can help constrain models concerning their nature and origin. We discuss the different methods of deriving the mass composition of the primary cosmic rays. The methods use different parameters that describe various characteristics of the shower development. We will also discuss the prospects expected from an upgrade of the Pierre Auger Observatory.

  12. An Observatory to Enhance the Preparation of Future California Teachers

    NASA Astrophysics Data System (ADS)

    Connolly, L.; Lederer, S.

    2004-12-01

    With a major grant from the W. M. Keck Foundation, California State University, San Bernardino is establishing a state-of-the-art teaching astronomical observatory. The Observatory will be fundamental to an innovative undergraduate physics and astronomy curriculum for Physics and Liberal Studies majors and will be integrated into our General Education program. The critical need for a research and educational observatory is linked to changes in California's Science Competencies for teacher certification. Development of the Observatory will also complement a new infusion of NASA funding and equipment support for our growing astronomy education programs and the University's established Strategic Plan for excellence in education and teacher preparation. The Observatory will consist of two domed towers. One tower will house a 20" Ritchey-Chretien telescope equipped with a CCD camera in conjunction with either UBVRI broadband filters or a spectrometer for evening laboratories and student research projects. The second tower will house the university's existing 12" Schmidt-Cassegrain optical telescope coupled with a CCD camera and an array of filters. A small-aperture solar telescope will be attached to the 12" for observing solar prominences, while a Mylar filter can be attached to the 12" for sunspot viewing. We have been very fortunate to receive a challenge grant of $600,000 from the W. M. Keck Foundation to equip the two domed towers; we continue to seek a further $800,000 to meet our construction needs. Funding is also provided by California State University, San Bernardino.

  13. Large scale tracking algorithms.

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
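The combinatorial explosion mentioned in this abstract can be made concrete by counting measurement-to-track association hypotheses in a single sensor scan. The sketch below is purely illustrative and not from the report; it simplifies by allowing each measurement to match at most one track and omitting missed-detection and false-alarm modeling:

```python
from math import comb, factorial

def association_hypotheses(n_tracks, n_measurements):
    """Count one-to-one assignments of k measurements to k tracks,
    summed over all k: C(n_tracks, k) * C(n_measurements, k) * k!."""
    return sum(comb(n_tracks, k) * comb(n_measurements, k) * factorial(k)
               for k in range(min(n_tracks, n_measurements) + 1))

# Hypothesis count grows explosively with scene density:
for n in (2, 5, 10):
    print(n, association_hypotheses(n, n))
```

Even this simplified count exceeds a million hypotheses for ten closely spaced objects, which is why multi-hypothesis trackers must prune aggressively.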

  14. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L.; Rickert, M.

    1997-04-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual persons' behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed of much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches depend both on the specific questions and on the prospective user community. The approaches range from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.
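The update-rate figure quoted in this abstract follows from simple arithmetic; the back-of-the-envelope sketch below is a hypothetical illustration (the function name and parameters are not from the paper):

```python
def required_updates_per_sec(n_vehicles, timestep_s, speedup):
    """Vehicle updates per wall-clock second needed to simulate
    n_vehicles with the given timestep at `speedup` times real time."""
    return n_vehicles * speedup / timestep_s

# ~1 million travellers, 1 s timestep, 10x faster than real time
# (forecasting must outrun reality to be useful):
print(required_updates_per_sec(1_000_000, 1.0, 10))  # 10000000.0
```

At a 1-second timestep, even plain real time (speedup = 1) already requires the quoted 1 million vehicle updates per second.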

  15. The Renovation and Future Capabilities of the Thacher Observatory

    NASA Astrophysics Data System (ADS)

    O'Neill, Katie; Osuna, Natalie; Edwards, Nick; Klink, Douglas; Swift, Jonathan; Vyhnal, Chris; Meyer, Kurt

    2016-01-01

    The Thacher School is in the process of renovating the campus observatory with a new meter-class telescope and full automation capabilities for the purpose of scientific research and education. New equipment on site has provided a preliminary site characterization, including seeing and V-band sky brightness measurements. These data, along with commissioning data from the MINERVA project (which uses comparable hardware), are used to estimate the capabilities of the observatory once renovation is complete. Our V-band limiting magnitude is expected to be better than 21.3 for a one-minute integration time, and we estimate that milli-magnitude precision photometry will be possible for a V=14.5 point source over approximately 5-minute timescales. The quick response, autonomous operation, and multi-band photometric capabilities of the renovated observatory will make it a powerful follow-up science facility for exoplanets, eclipsing binaries, near-Earth objects, stellar variability, and supernovae.
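A limiting magnitude quoted for a fixed integration time can be scaled to other exposure lengths. The sketch below is an idealized illustration assuming the background-limited regime (SNR grows as the square root of exposure time), which may not hold exactly for this site:

```python
import math

def mlim_gain(t_ratio):
    """Gain in limiting magnitude from a t_ratio-times longer exposure,
    assuming background-limited imaging: SNR scales as sqrt(t), so
    delta_m = 2.5 * log10(sqrt(t_ratio)) = 1.25 * log10(t_ratio)."""
    return 1.25 * math.log10(t_ratio)

# e.g. a 16x longer integration deepens the limit by ~1.5 mag
print(round(mlim_gain(16), 2))
```

Under this assumption, going from a 1-minute to a 16-minute integration would push a 21.3 mag limit to roughly 22.8 mag.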

  16. Precision medicine in the age of big data: The present and future role of large-scale unbiased sequencing in drug discovery and development.

    PubMed

    Vicini, P; Fields, O; Lai, E; Litwack, E D; Martin, A-M; Morgan, T M; Pacanowski, M A; Papaluca, M; Perez, O D; Ringel, M S; Robson, M; Sakul, H; Vockley, J; Zaks, T; Dolsten, M; Søgaard, M

    2016-02-01

    High throughput molecular and functional profiling of patients is a key driver of precision medicine. DNA and RNA characterization has been enabled at unprecedented cost and scale through rapid, disruptive progress in sequencing technology, but challenges persist in data management and interpretation. We analyze the state of the art of large-scale unbiased sequencing (LUS) in drug discovery and development, including technology, application, ethical, regulatory, policy and commercial considerations, and discuss issues of LUS implementation in clinical and regulatory practice. PMID:26536838

  17. The Pierre Auger Observatory: Present Status and Future Prospects

    SciTech Connect

    Petrera, Sergio

    2005-10-12

    The Pierre Auger Observatory is in an advanced stage of construction at its southern site near Malargue, Argentina. This progress report mainly focuses on hybrid events, a remarkable subset of cosmic ray events which are simultaneously detected by both the Surface Detector and Fluorescence Detector subsystems. The hybrid method and its performance are presented.

  18. The LIGO Gravitational Wave Observatories:. Recent Results and Future Plans

    NASA Astrophysics Data System (ADS)

    Harry, G. M.; Adhikari, R.; Ballmer, S.; Bayer, K.; Betzwieser, J.; Bochner, B.; Burgess, R.; Cadonati, L.; Chatterji, S.; Corbitt, T.; Csatorday, P.; Fritschel, P.; Goda, K.; Hefetz, Y.; Katsavounidis, E.; Lawrence, R.; Macinnis, M.; Marin, A.; Mason, K.; Mavalvala, N.; Mittleman, R.; Ottaway, D. J.; Pratt, M.; Regimbau, T.; Richman, S.; Rollins, J.; Shoemaker, D. H.; Smith, M.; van Putten, M.; Weiss, R.; Aulbert, C.; Berukoff, S. J.; Cutler, C.; Grunewald, S.; Itoh, Y.; Krishnan, B.; Machenschalk, B.; Mohanty, S.; Mukherjee, S.; Naundorf, H.; Papa, M. A.; Schutz, B. F.; Sintes, A. M.; Williams, P. R.; Colacino, C.; Danzmann, K.; Freise, A.; Grote, H.; Heinzel, G.; Kawabe, K.; Kloevekorn, P.; Lück, H.; Mossavi, K.; Nagano, S.; Rüdiger, A.; Schilling, R.; Smith, J. R.; Weidner, A.; Willke, B.; Winkler, W.; Cusack, B. J.; McClelland, D. E.; Scott, S. M.; Searle, A. C.; Drever, R. W. P.; Tinto, M.; Williams, R.; Buonanno, A.; Chen, Y.; Thorne, K. S.; Vallisneri, M.; Abbott, B.; Anderson, S. B.; Araya, M.; Armandula, H.; Asiri, F.; Barish, B. C.; Barnes, M.; Barton, M. A.; Bhawal, B.; Billingsley, G.; Black, E.; Blackburn, K.; Bogue, L.; Bork, R.; Busby, D.; Cardenas, L.; Chandler, A.; Chapsky, J.; Charlton, P.; Coyne, D.; Creighton, T. D.; D'Ambrosio, E.; Desalvo, R.; Ding, H.; Edlund, J.; Ehrens, P.; Etzel, T.; Evans, M.; Farnham, D.; Fine, M.; Gillespie, A.; Grimmett, D.; Hartunian, A.; Heefner, J.; Hoang, P.; Hrynevych, M.; Ivanov, A.; Jones, L.; Jungwirth, D.; Kells, W.; King, C.; King, P.; Kozak, D.; Lazzarini, A.; Lei, M.; Libbrecht, K.; Lindquist, P.; Liu, S.; Logan, J.; Lyons, T. T.; Mageswaran, M.; Mailand, K.; Majid, W.; Mann, F.; Márka, S.; Maros, E.; Mason, J.; Meshkov, S.; Miyakawa, O.; Miyoki, S.; Mours, B.; Nocera, F.; Ouimette, D.; Pedraza, M.; Rao, S. R.; Redding, D.; Regehr, M. W.; Reilly, K. T.; Reithmaier, K.; Robison, L.; Romie, J.; Rose, D.; Russell, P.; Salzman, I.; Sanders, G. 
H.; Sannibale, V.; Schmidt, V.; Sears, B.; Seel, S.; Shawhan, P.; Sievers, L.; Smith, M. R.; Spero, R.; Sumner, M. C.; Sylvestre, J.; Takamori, A.; Tariq, H.; Taylor, R.; Tilav, S.; Torrie, C.; Tyler, W.; Vass, S.; Wallace, L.; Ware, B.; Webber, D.; Weinstein, A.; Wen, L.; Whitcomb, S. E.; Willems, P. A.; Wilson, A.; Yamamoto, H.; Zhang, L.; Zweizig, J.; Ganezer, K. S.; Babak, S.; Balasubramanian, R.; Churches, D.; Davies, R.; Sathyaprakash, B.; Taylor, I.; Christensen, N.; Ebeling, C.; Flanagan, É.; Nash, T.; Penn, S.; Dhurandar, S.; Nayak, R.; Sengupta, A. S.; Barker, D.; Barker-Patton, C.; Bland-Weaver, B.; Cook, D.; Gray, C.; Guenther, M.; Hindman, N.; Landry, M.; Lubiński, M.; Matherny, O.; Matone, L.; McCarthy, R.; Mendell, G.; Moreno, G.; Myers, J.; Parameswariah, V.; Raab, F.; Radkins, H.; Ryan, K.; Savage, R.; Schwinberg, P.; Sigg, D.; Vorvick, C.; Worden, J.; Abbott, R.; Carter, K.; Coles, M.; Evans, T.; Frolov, V.; Fyffe, M.; Gretarsson, A. M.; Hammond, M.; Hanson, J.; Kern, J.; Khan, A.; Kovalik, J.; Langdale, J.; Lormand, M.; O'Reilly, B.; Overmier, H.; Parameswariah, C.; Riesen, R.; Rizzi, A.; Roddy, S.; Sibley, A.; Stapfer, G.; Traylor, G.; Watts, K.; Wooley, R.; Yakushin, I.; Zucker, M.; Chickarmane, V.; Daw, E.; Giaime, J. A.; González, G.; Hamilton, W. O.; Johnson, W. W.; Wen, S.; Zotov, N.; McHugh, M.; Whelan, J. T.; Walther, H.; Ageev, A.; Bilenko, I. A.; Braginsky, V. B.; Mitrofanov, V. P.; Tokmakov, K. V.; Vyachanin, S. P.; Camp, J. B.; Kawamura, S.; Belczynski, K.; Grandclément, P.; Kalogera, V.; Kim, C.; Nutzman, P.; Olson, T.; Yoshida, S.; Beausoleil, R.; Bullington, A.; Byer, R. L.; Debra, D.; Fejer, M. M.; Gustafson, E.; Hardham, C.; Hennessy, M.; Hua, W.; Lantz, B.; Robertson, N. A.; Saulson, P. R.; Finn, L. S.; Hepler, N.; Owen, B. J.; Rotthoff, E.; Schlaufman, K.; Shapiro, C. A.; Stuver, A.; Summerscales, T.; Sutton, P. J.; Tibbits, M.; Winjum, B. J.; Anderson, W. G.; Díaz, M.; Johnston, W.; Romano, J. 
D.; Torres, C.; Ugolini, D.; Aufmuth, P.; Brozek, S.; Fallnich, C.; Goßler, S.; Heng, I. S.; Heurs, M.; Kötter, K.; Leonhardt, V.; Malec, M.; Quetschke, V.; Schrempel, M.; Traeger, S.; Weiland, U.; Welling, H.; Zawischa, I.; Ingley, R.; Messenger, C.; Vecchio, A.; Amin, R.; Castiglione, J.; Coldwell, R.; Delker, T.; Klimenko, S.; Mitselmakher, G.; Mueller, G.; Rakhmanov, M.; Reitze, D. H.; Rong, H.; Sazonov, A.; Shu, Q. Z.; Tanner, D. B.; Whiting, B. F.; Wise, S.; Barr, B.; Bennett, R.; Cagnoli, G.; Cantley, C. A.; Casey, M. M.; Crooks, D. R. M.; Dupuis, R. J.; Elliffe, E. J.; Grant, A.; Heptonstall, A.; Hewitson, M.; Hough, J.; Jennrich, O.; Killbourn, S.; Killow, C. J.; McNamara, P.; Newton, G.; Pitkin, M.; Plissi, M.; Robertson, D. I.; Rowan, S.; Skeldon, K.; Sneddon, P.; Strain, K. A.; Ward, H.; Woan, G.; Chin, D.; Gustafson, R.; Riles, K.; Brau, J. E.; Frey, R.; Ito, M.; Leonor, I.

    2006-02-01

    The LIGO interferometers are operating as gravitational wave observatories, with a noise level within an order of magnitude of the goal, and the first scientific data have recently been taken. These data have been analyzed for four different categories of gravitational wave sources: millisecond bursts, inspiralling binary neutron stars, periodic waves from a known pulsar, and a stochastic background. Research and development is also underway for the next generation LIGO detector, Advanced LIGO.

  19. The new Arecibo Observatory Remote Optical Facility (AO-ROF) in Culebra Island, Puerto Rico: Current Status and Future Projects

    NASA Astrophysics Data System (ADS)

    Santos, P. T.

    2015-12-01

    The Arecibo Observatory Remote Optical Facility (AO-ROF) on the island of Culebra was established to mitigate the persistent cloud, fog, and rain that have hampered observations at the Arecibo Observatory (AO) during major optical campaigns. Given Culebra Island's favorable geographical and climatological characteristics, such as its low elevation and geographic location, it appears to have steadier weather conditions than Arecibo and therefore offers more availability for optical observations. From Culebra, optical instruments can observe the same thermospheric volume over AO sampled by the Incoherent Scatter Radar (ISR). This capability will become especially important when the High Frequency (HF) facility is in operation. Small and large scale irregularities created by the HF facility can be readily observed and tracked from the Culebra site, and simultaneous observations from AO of the same atmospheric volume will permit direct vector measurements of the dynamical evolution of the irregularities. This work presents a discussion of the current status of the AO-ROF facility, as well as future projects.

  20. Challenges for Large Scale Simulations

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias

    2010-03-01

    With computational approaches becoming ubiquitous the growing impact of large scale computing on research influences both theoretical and experimental work. I will review a few examples in condensed matter physics and quantum optics, including the impact of computer simulations in the search for supersolidity, thermometry in ultracold quantum gases, and the challenging search for novel phases in strongly correlated electron systems. While only a decade ago such simulations needed the fastest supercomputers, many simulations can now be performed on small workstation clusters or even a laptop: what was previously restricted to a few experts can now potentially be used by many. Only part of the gain in computational capabilities is due to Moore's law and improvement in hardware. Equally impressive is the performance gain due to new algorithms - as I will illustrate using some recently developed algorithms. At the same time modern peta-scale supercomputers offer unprecedented computational power and allow us to tackle new problems and address questions that were impossible to solve numerically only a few years ago. While there is a roadmap for future hardware developments to exascale and beyond, the main challenges are on the algorithmic and software infrastructure side. Among the problems that face the computational physicist are: the development of new algorithms that scale to thousands of cores and beyond, a software infrastructure that lifts code development to a higher level and speeds up the development of new simulation programs for large scale computing machines, tools to analyze the large volume of data obtained from such simulations, and as an emerging field provenance-aware software that aims for reproducibility of the complete computational workflow from model parameters to the final figures. 
Interdisciplinary collaborations and collective efforts will be required, in contrast to the cottage-industry culture currently present in many areas of computational

  1. Large Scale Commodity Clusters for Lattice QCD

    SciTech Connect

    A. Pochinsky; W. Akers; R. Brower; J. Chen; P. Dreher; R. Edwards; S. Gottlieb; D. Holmgren; P. Mackenzie; J. Negele; D. Richards; J. Simone; W. Watson

    2002-06-01

    We describe the construction of large scale clusters for lattice QCD computing being developed under the umbrella of the U.S. DoE SciDAC initiative. We discuss the study of floating point and network performance that drove the design of the cluster, and present our plans for future multi-Terascale facilities.

  2. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large scale systems, such as power networks, communication networks, and economic or ecological systems, were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large scale systems, and tools specific to the class of large systems are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification made of the different existing approaches to dealing with large scale systems. A very similar classification is used, even though the papers surveyed are somewhat different from the ones reviewed in other papers. Special attention is brought to the applicability of the existing methods to controlling large mechanical systems like large space structures. Some recent developments are added to this survey.

  3. Searches for large-scale anisotropy in the arrival directions of cosmic rays detected above energy of 10^19 eV at the Pierre Auger observatory and the telescope array

    SciTech Connect

    Aab, A.; Abreu, P.; Andringa, S.; Aglietta, M.; Ahn, E. J.; Al Samarai, I.; Albuquerque, I. F. M.; Allekotte, I.; Asorey, H.; Allen, J.; Allison, P.; Almela, A.; Castillo, J. Alvarez; Alvarez-Muñiz, J.; Batista, R. Alves; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Arqueros, F.; Collaboration: Pierre Auger Collaboration; Telescope Array Collaboration; and others

    2014-10-20

    Spherical harmonic moments are well-suited for capturing anisotropy at any scale in the flux of cosmic rays. An unambiguous measurement of the full set of spherical harmonic coefficients requires full-sky coverage. This can be achieved by combining data from observatories located in both the northern and southern hemispheres. To this end, a joint analysis using data recorded at the Telescope Array and the Pierre Auger Observatory above 10^19 eV is presented in this work. The resulting multipolar expansion of the flux of cosmic rays allows us to perform a series of anisotropy searches, and in particular to report on the angular power spectrum of cosmic rays above 10^19 eV. No significant deviation from isotropic expectations is found throughout the analyses performed. Upper limits on the amplitudes of the dipole and quadrupole moments are derived as a function of the direction in the sky, varying between 7% and 13% for the dipole and between 7% and 10% for a symmetric quadrupole.
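The lowest multipole searched for in such analyses, the dipole, can be estimated directly from event arrival directions when exposure is uniform over the full sky. The snippet below is an illustrative sketch under that idealized assumption only; the actual joint analysis weights events by each observatory's directional exposure:

```python
import numpy as np

def dipole_amplitude(ra, dec):
    """Estimate the dipole amplitude from arrival directions (radians),
    assuming uniform full-sky exposure: d = 3 * |<n>|, where <n> is the
    mean unit vector of the events."""
    n = np.column_stack([np.cos(dec) * np.cos(ra),
                         np.cos(dec) * np.sin(ra),
                         np.sin(dec)])
    return 3.0 * np.linalg.norm(n.mean(axis=0))

# Isotropic mock sky: the recovered amplitude should be near zero,
# limited by shot noise (~3/sqrt(N) for N events).
rng = np.random.default_rng(0)
ra = rng.uniform(0, 2 * np.pi, 100_000)
dec = np.arcsin(rng.uniform(-1, 1, 100_000))
print(round(dipole_amplitude(ra, dec), 3))
```

The shot-noise floor is why percent-level dipole limits, as reported here, require very large event samples combined across both hemispheres.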

  4. Searches for Large-scale Anisotropy in the Arrival Directions of Cosmic Rays Detected above Energy of 10^19 eV at the Pierre Auger Observatory and the Telescope Array

    NASA Astrophysics Data System (ADS)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Barber, K. B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dorofeev, A.; Dorosti Hasankiadeh, Q.; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fox, B. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Fuji, T.; Gaior, R.; García, B.; Garcia Roca, S. 
T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; González, N.; Gookin, B.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Islo, K.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; La Rosa, G.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Maccarone, M. C.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J. A. J.; Matthews, J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Messina, S.; Meyhandan, R.; Mićanović, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. 
A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morello, C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, L.; Ochilo, L.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pękala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Peters, C.; Petrera, S.; Petrolini, A.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez Cabo, I.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanič, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. A.; Tapia, A.; Tartare, M.; Theodoro, V. 
M.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Pierre Auger Collaboration; Abbasi, R. U.; Abe, M.; Abu-Zayyad, T.; Allen, M.; Anderson, R.; Azuma, R.; Barcikowski, E.; Belz, J. W.; Bergman, D. R.; Blake, S. A.; Cady, R.; Chae, M. J.; Cheon, B. G.; Chiba, J.; Chikawa, M.; Cho, W. R.; Fujii, T.; Fukushima, M.; Goto, T.; Hanlon, W.; Hayashi, Y.; Hayashida, N.; Hibino, K.; Honda, K.; Ikeda, D.; Inoue, N.; Ishii, T.; Ishimori, R.; Ito, H.; Ivanov, D.; Jui, C. C. H.; Kadota, K.; Kakimoto, F.; Kalashev, O.; Kasahara, K.; Kawai, H.; Kawakami, S.; Kawana, S.; Kawata, K.; Kido, E.; Kim, H. B.; Kim, J. H.; Kim, J. H.; Kitamura, S.; Kitamura, Y.; Kuzmin, V.; Kwon, Y. J.; Lan, J.; Lim, S. I.; Lundquist, J. P.; Machida, K.; Martens, K.; Matsuda, T.; Matsuyama, T.; Matthews, J. N.; Minamino, M.; Mukai, K.; Myers, I.; Nagasawa, K.; Nagataki, S.; Nakamura, T.; Nonaka, T.; Nozato, A.; Ogio, S.; Ogura, J.; Ohnishi, M.; Ohoka, H.; Oki, K.; Okuda, T.; Ono, M.; Oshima, A.; Ozawa, S.; Park, I. H.; Pshirkov, M. S.; Rodriguez, D. C.; Rubtsov, G.; Ryu, D.; Sagawa, H.; Sakurai, N.; Sampson, A. L.; Scott, L. M.; Shah, P. 
D.; Shibata, F.; Shibata, T.; Shimodaira, H.; Shin, B. K.; Smith, J. D.; Sokolsky, P.; Springer, R. W.; Stokes, B. T.; Stratton, S. R.; Stroman, T. A.; Suzawa, T.; Takamura, M.; Takeda, M.; Takeishi, R.; Taketa, A.; Takita, M.; Tameda, Y.; Tanaka, H.; Tanaka, K.; Tanaka, M.; Thomas, S. B.; Thomson, G. B.; Tinyakov, P.; Tkachev, I.; Tokuno, H.; Tomida, T.; Troitsky, S.; Tsunesada, Y.; Tsutsumi, K.; Uchihori, Y.; Udo, S.; Urban, F.; Vasiloff, G.; Wong, T.; Yamane, R.; Yamaoka, H.; Yamazaki, K.; Yang, J.; Yashiro, K.; Yoneda, Y.; Yoshida, S.; Yoshii, H.; Zollinger, R.; Zundel, Z.; Telescope Array Collaboration

    2014-10-01

    Spherical harmonic moments are well-suited for capturing anisotropy at any scale in the flux of cosmic rays. An unambiguous measurement of the full set of spherical harmonic coefficients requires full-sky coverage. This can be achieved by combining data from observatories located in both the northern and southern hemispheres. To this end, a joint analysis using data recorded at the Telescope Array and the Pierre Auger Observatory above 10^19 eV is presented in this work. The resulting multipolar expansion of the flux of cosmic rays allows us to perform a series of anisotropy searches, and in particular to report on the angular power spectrum of cosmic rays above 10^19 eV. No significant deviation from isotropic expectations is found throughout the analyses performed. Upper limits on the amplitudes of the dipole and quadrupole moments are derived as a function of the direction in the sky, varying between 7% and 13% for the dipole and between 7% and 10% for a symmetric quadrupole.
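    As an illustration of the analysis technique described in this abstract, the angular power spectrum C_ℓ can be estimated from spherical-harmonic coefficients a_ℓm. The sketch below uses NumPy and synthetic coefficients; the variable names and the toy dipole amplitude are illustrative, not values from the paper.

```python
import numpy as np

def angular_power_spectrum(alm, lmax):
    """Estimate C_ell = sum_m |a_lm|^2 / (2*ell + 1) from a list of
    coefficient arrays, where alm[ell] holds the 2*ell + 1 values for m."""
    c_ell = np.zeros(lmax + 1)
    for ell in range(lmax + 1):
        c_ell[ell] = np.sum(np.abs(alm[ell]) ** 2) / (2 * ell + 1)
    return c_ell

# Synthetic sky: an isotropic monopole plus a weak dipole, no quadrupole.
rng = np.random.default_rng(0)
alm = [
    np.array([1.0 + 0j]),                 # a_00 (monopole)
    0.1 * (rng.standard_normal(3) + 0j),  # a_1m (dipole)
    np.zeros(5, dtype=complex),           # a_2m (quadrupole absent)
]
spectrum = angular_power_spectrum(alm, lmax=2)
```

    A full-sky analysis like the one in the paper would also correct for non-uniform exposure before estimating the moments; that step is omitted here.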

  5. Searches for Large-Scale Anisotropy in the Arrival Directions of Cosmic Rays Detected above Energy of $10^{19}$ eV at the Pierre Auger Observatory and the Telescope Array

    SciTech Connect

    Aab, Alexander; et al.

    2014-10-07

    Spherical harmonic moments are well-suited for capturing anisotropy at any scale in the flux of cosmic rays. An unambiguous measurement of the full set of spherical harmonic coefficients requires full-sky coverage. This can be achieved by combining data from observatories located in both the northern and southern hemispheres. To this end, a joint analysis using data recorded at the Telescope Array and the Pierre Auger Observatory above 10^19 eV is presented in this work. The resulting multipolar expansion of the flux of cosmic rays allows us to perform a series of anisotropy searches, and in particular to report on the angular power spectrum of cosmic rays above 10^19 eV. No significant deviation from isotropic expectations is found throughout the analyses performed. Upper limits on the amplitudes of the dipole and quadrupole moments are derived as a function of the direction in the sky, varying between 7% and 13% for the dipole and between 7% and 10% for a symmetric quadrupole.

  6. Galaxy clustering on large scales.

    PubMed

    Efstathiou, G

    1993-06-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20h^-1 Mpc, where the Hubble constant H0 = 100h km·s^-1·Mpc^-1; 1 pc = 3.09 × 10^16 m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400

  7. Microfluidic large-scale integration.

    PubMed

    Thorsen, Todd; Maerkl, Sebastian J; Quake, Stephen R

    2002-10-18

    We developed high-density microfluidic chips that contain plumbing networks with thousands of micromechanical valves and hundreds of individually addressable chambers. These fluidic devices are analogous to electronic integrated circuits fabricated using large-scale integration. A key component of these networks is the fluidic multiplexor, which is a combinatorial array of binary valve patterns that exponentially increases the processing power of a network by allowing complex fluid manipulations with a minimal number of inputs. We used these integrated microfluidic networks to construct the microfluidic analog of a comparator array and a microfluidic memory storage device whose behavior resembles random-access memory. PMID:12351675
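    The exponential addressing described above can be illustrated with a short calculation: a binary multiplexor needs only 2·log2(N) control lines to address N flow channels, one pair of lines (bit and complement) per address bit. The sketch below is a schematic model with hypothetical line names and valve polarity, not the actual chip layout from the paper.

```python
import math

def mux_control_pattern(channel, n_channels):
    """Return the control lines to pressurize so that only `channel`
    stays open in a binary fluidic multiplexor: one (line, complement)
    pair per address bit.  Line names and polarity are illustrative."""
    n_bits = math.ceil(math.log2(n_channels))
    lines = []
    for bit in range(n_bits):
        if (channel >> bit) & 1:
            lines.append(f"bit{bit}")      # pressurize the 'true' line
        else:
            lines.append(f"bit{bit}_bar")  # pressurize the complement line
    return lines, 2 * n_bits               # lines chosen, total lines on chip

# 16 channels need only 2 * log2(16) = 8 control lines.
pattern, total_lines = mux_control_pattern(channel=5, n_channels=16)
```

    The same scheme scales to the thousands of valves mentioned in the abstract: 1024 channels require only 20 control inputs.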

  8. A Future Large-Aperture UVOIR Space Observatory: Key Technologies and Capabilities

    NASA Technical Reports Server (NTRS)

    Bolcar, Matthew Ryan; Stahle, Carl M.; Balasubramaniam, Kunjithapatham; Clampin, Mark; Feinberg, Lee D.; Mosier, Gary E.; Quijada, Manuel A.; Rauscher, Bernard J.; Redding, David C.; Rioux, Norman M.; Shaklan, Stuart B.; Stahl, H. Philip; Thronson, Harley A.

    2015-01-01

    We present the key technologies and capabilities that will enable a future, large-aperture ultraviolet/optical/infrared (UVOIR) space observatory. These include starlight suppression systems, vibration isolation and control systems, lightweight mirror segments, detector systems, and mirror coatings. These capabilities will provide major advances over current and near-future observatories in sensitivity, angular resolution, and starlight suppression. The goals adopted in our study for the starlight suppression system are 10^-10 contrast with an inner working angle of 20 milliarcsec and a broad bandpass. We estimate that a vibration isolation and control system that achieves a total system vibration isolation of 140 dB for a vibration-isolated mass of 5000 kg is required to achieve the high wavefront-error stability needed for exoplanet coronagraphy. Technology challenges for lightweight mirror segments include diffraction-limited optical quality and high wavefront-error stability as well as low cost, low mass, and rapid fabrication. Key challenges for the detector systems include visible-blind, high-quantum-efficiency UV arrays; photon-counting visible and NIR arrays for coronagraphic spectroscopy and starlight wavefront sensing and control; and detectors with deep full wells, low persistence, and radiation tolerance to enable transit imaging and spectroscopy at all wavelengths. Finally, mirror coatings with high reflectivity (>90%), high uniformity (<1%) and low polarization (<1%) that are scalable to large-diameter mirror substrates will be essential for ensuring that both high-throughput UV observations and high-contrast observations can be performed by the same observatory.
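    As a quick check on the vibration-isolation goal quoted above, a decibel figure converts to a linear attenuation factor as 10^(dB/20); the amplitude convention is an assumption here, since the abstract does not state whether the figure is amplitude or power.

```python
def isolation_factor(db):
    """Linear amplitude attenuation for an isolation figure in dB,
    using the amplitude convention dB = 20 * log10(factor)."""
    return 10 ** (db / 20)

# The 140 dB system goal corresponds to a 1e7x amplitude reduction.
attenuation = isolation_factor(140)
```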

  9. National Ecological Observatory Network's (NEON) future role in US carbon cycling and budgets

    NASA Astrophysics Data System (ADS)

    Loescher, H. W.

    2015-12-01

    The US National Ecological Observatory Network (NEON) is a National Science Foundation investment designed to observe the impacts of large-scale environmental changes on the nation's ecosystems for 30 years with rigorous consistency. NEON does this through the construction (and operation) of new physical and data infrastructure distributed across the North American continent, including 47 terrestrial and 32 aquatic sites. Key to its design is its ability to provide ecosystem-scale measurements of carbon stores, fluxes, and processes, along with the means to scale them from local to regional scales via remote-sensing aircraft. NEON will collect these carbon data as a facility and provide them openly. NEON will not perform any high-level synthesis; rather, the carbon data are an open resource for the research, private, and public communities alike. These data are also harmonized with other international carbon-based infrastructures to facilitate cross-continental understanding and global carbon syntheses. Products, engagement, and harmonization of data to facilitate syntheses will be discussed.

  10. Nonthermal Components in the Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Miniati, Francesco

    2004-12-01

    I address the issue of nonthermal processes in the large-scale structure of the universe. After reviewing the properties of cosmic shocks and their role as particle accelerators, I discuss the main observational results, from radio to γ-ray, and describe the processes that are thought to be responsible for the observed nonthermal emissions. Finally, I emphasize the important role of γ-ray astronomy for progress in the field. Non-detections at these photon energies have already allowed us to draw important conclusions. Future observations will tell us more about the physics of the intracluster medium, shock dissipation and CR acceleration.

  11. Large scale topography of Io

    NASA Technical Reports Server (NTRS)

    Gaskell, R. W.; Synnott, S. P.

    1987-01-01

    To investigate the large-scale topography of the Jovian satellite Io, both limb observations and stereographic techniques applied to landmarks are used. The raw data for this study consist of Voyager 1 images of Io: 800×800 arrays of picture elements, each of which can take on 256 possible brightness values. In analyzing these data it was necessary to identify and locate landmarks and limb points on the raw images, remove the image distortions caused by the camera electronics, and translate the corrected locations into positions relative to a reference geoid. Minimizing the uncertainty in the corrected locations is crucial to the success of this project. In the highest-resolution frames, an error of a tenth of a pixel in image-space location can lead to a 300 m error in true location. In the lowest-resolution frames, the same error can lead to an uncertainty of several km.
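    The pixel-to-surface error scaling mentioned in the abstract follows from a small-angle camera model: ground scale per pixel ≈ range × IFOV. The numbers below are illustrative, chosen to reproduce the ~3 km/pixel scale implied by the quoted 300 m figure; they are not actual Voyager camera parameters.

```python
def ground_error_km(range_km, ifov_rad, pixel_error):
    """Surface position error from an image-space centroiding error,
    using a small-angle pinhole model: km/pixel = range * IFOV."""
    km_per_pixel = range_km * ifov_rad
    return km_per_pixel * pixel_error

# With an assumed 300,000 km range and 1e-5 rad/pixel IFOV, the scale
# is ~3 km/pixel, so a 0.1-pixel error corresponds to ~300 m.
err_km = ground_error_km(range_km=300_000, ifov_rad=1e-5, pixel_error=0.1)
```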

  12. Large Scale Homing in Honeybees

    PubMed Central

    Pahl, Mario; Zhu, Hong; Tautz, Jürgen; Zhang, Shaowu

    2011-01-01

    Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses and by integrating their outbound flight paths. In order to investigate the capacity of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large scale homing is facilitated by global landmarks acting as beacons, and possibly the entire skyline panorama. PMID:21602920

  13. Take a look at the ancient observatories in Iran and prospects for the future

    NASA Astrophysics Data System (ADS)

    Kayanikhoo, F.; Bahrani, F.

    2014-12-01

    In this article, we introduce the ancient observatories of Iran and examine the applications of two of them in ancient times. We then present one of Iran's robotic observatories, located at the University of Kashan, and describe the features of the Iranian National Observatory, a robotic observatory currently under construction.

  14. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses nanolaminate foils to form lightweight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optics systems, there has not been a way to utilize the advantages of lithography and batch fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel-plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated-circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed-circuit-board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils, allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described, starting with a 3 x 3 device using conventional metal foils and epoxy and progressing to a 10-across all-metal device with nanolaminate mirror surfaces.
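    The parallel-plate electrostatic actuators described above are commonly modeled by the attractive pressure P = ε0·V²/(2g²), with snap-down ("pull-in") occurring at roughly one third of the initial gap. The sketch below uses this standard model with illustrative dimensions, not the actual geometry of the devices in the paper.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def electrostatic_pressure(voltage, gap):
    """Attractive pressure (N/m^2) between parallel-plate electrodes,
    P = eps0 * V^2 / (2 * gap^2)."""
    return EPS0 * voltage ** 2 / (2 * gap ** 2)

def pull_in_voltage(k, area, gap0):
    """Voltage at which a spring-suspended plate snaps down (at ~gap0/3):
    V_pi = sqrt(8 * k * gap0^3 / (27 * eps0 * area))."""
    return math.sqrt(8 * k * gap0 ** 3 / (27 * EPS0 * area))

p = electrostatic_pressure(voltage=100.0, gap=10e-6)            # ~440 N/m^2
v_pi = pull_in_voltage(k=10.0, area=(100e-6) ** 2, gap0=10e-6)  # ~180 V
```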

  15. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  16. An Engineering Design Reference Mission for a Future Large-Aperture UVOIR Space Observatory

    NASA Astrophysics Data System (ADS)

    Thronson, Harley A.; Bolcar, Matthew R.; Clampin, Mark; Crooke, Julie A.; Redding, David; Rioux, Norman; Stahl, H. Philip

    2016-01-01

    From the 2010 NRC Decadal Survey and the NASA Thirty-Year Roadmap, Enduring Quests, Daring Visions, to the recent AURA report, From Cosmic Birth to Living Earths, multiple community assessments have recommended development of a large-aperture UVOIR space observatory capable of achieving a broad range of compelling scientific goals. Of these priority science goals, the most technically challenging is the search for spectroscopic biomarkers in the atmospheres of exoplanets in the solar neighborhood. Here we present an engineering design reference mission (EDRM) for the Advanced Technology Large-Aperture Space Telescope (ATLAST), which was conceived from the start as capable of breakthrough science paired with an emphasis on cost control and cost effectiveness. An EDRM allows the engineering design trade space to be explored in depth to determine what are the most demanding requirements and where there are opportunities for margin against requirements. Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. The ATLAST observatory is designed to operate at a Sun-Earth L2 orbit, which provides a stable thermal environment and excellent field of regard. Our reference designs have emphasized a serviceable 36-segment 9.2 m aperture telescope that stows within a five-meter diameter launch vehicle fairing. As part of our cost-management effort, this particular reference mission builds upon the engineering design for JWST. Moreover, it is scalable to a variety of launch vehicle fairings. Performance needs developed under the study are traceable to a variety of additional reference designs, including options for a monolithic primary mirror.
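    For context on the aperture quoted above, the Rayleigh diffraction limit θ = 1.22·λ/D gives the angular resolution achievable with the 9.2 m reference telescope; the 500 nm wavelength below is an illustrative choice, not a requirement from the study.

```python
import math

MAS_PER_RAD = math.degrees(1) * 3600 * 1000  # radians -> milliarcseconds

def diffraction_limit_mas(wavelength_m, aperture_m):
    """Rayleigh-criterion angular resolution, theta = 1.22 * lambda / D,
    returned in milliarcseconds."""
    return 1.22 * wavelength_m / aperture_m * MAS_PER_RAD

# ~14 mas at 500 nm for the 9.2 m reference aperture.
theta_mas = diffraction_limit_mas(wavelength_m=500e-9, aperture_m=9.2)
```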

  17. Large-Scale Astrophysical Visualization on Smartphones

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  18. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  19. The ISS as a Testbed for Future Large Astronomical Observatories: The OpTIIX Demonstration Program

    NASA Technical Reports Server (NTRS)

    Burdick, G.; Callen, P.; Ess, K.; Liu, F.; Postman, M.; Sparks, W.; Seery, B.; Thronson, H.

    2012-01-01

    Future large (diameters in excess of approx. 10 m) astronomical observatories in space will need to employ advanced technologies if they are to be affordable. Many of these technologies are ready to be validated on orbit and the International Space Station (ISS) provides a suitable platform for such demonstrations. These technologies include low-cost, low-density, highly deformable mirror segments, coupled with advanced sensing and control methods. In addition, the ISS offers available telerobotic assembly techniques to build an optical testbed that embodies this new cost-effective approach to assemble and achieve diffraction-limited optical performance for very large space telescopes. Given the importance that NASA attaches to the recommendations of the National Academy of Sciences "Decadal Survey" process, essential capabilities and technologies will be demonstrated well in advance of the next Survey, which commences in 2019. To achieve this objective, the Jet Propulsion Laboratory (JPL), NASA Johnson Space Center (JSC), NASA Goddard Space Flight Center (GSFC), and the Space Telescope Science Institute (STScI) are carrying out a Phase A/B study of the Optical Testbed and Integration on ISS eXperiment (OpTIIX). The overarching goal is to demonstrate well before the end of this decade key capabilities intended to enable very large optical systems in the decade of the 2020s. Such a demonstration will retire technical risk in the assembly, alignment, calibration, and operation of future space observatories. The OpTIIX system, as currently designed, is a six-hexagon element, segmented visual-wavelength telescope with an edge-to-edge aperture of 1.4 m, operating at its diffraction limit,

  20. Advanced situation awareness with localised environmental community observatories in the Future Internet

    NASA Astrophysics Data System (ADS)

    Sabeur, Z. A.; Denis, H.; Nativi, S.

    2012-04-01

    The phenomenal advances in information and communication technologies over the last decade offer unprecedented connectivity, with real potential for "smart living" among large segments of the human population around the world. In particular, voluntary groups (VGs) and individuals with an interest in monitoring the state of their local environment can be connected through the internet and collaboratively generate important localised environmental observations. These can be considered the Community Observatories (COs) of the Future Internet (FI). However, a set of FI enablers needs to be deployed for these communities to become effective COs. For example, these communities will require access to services for the intelligent processing of heterogeneous data and the capture of advanced situation awareness about the environment. This enablement will unlock the communities' true potential for participating in localised monitoring of the environment, in addition to their contribution to the creation of business enterprise. Among the eight Usage Area (UA) projects of the FP7 FI-PPP programme, the ENVIROFI Integrated Project focuses on the specifications of the Future Internet enablers of the Environment UA. The specifications are developed across multiple environmental domains in the context of user needs for the development of mash-up applications in the Future Internet. They will give users access to real-time, on-demand fused information with advanced situation awareness about the environment at localised scales. The mash-up applications will access rich spatio-temporal information from structured fusion services, which aggregate CO information with data from existing environmental monitoring stations established by research organisations and private enterprise. These applications are being developed in ENVIROFI for the atmospheric, marine, and biodiversity domains, together with a potential to be extended to other

  1. Survey of decentralized control methods. [for large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  2. The SOFIA Airborne Infrared Observatory - first science highlights and future science potential

    NASA Astrophysics Data System (ADS)

    Zinnecker, H.

    2014-10-01

    SOFIA, short for Stratospheric Observatory for Infrared Astronomy, is a Boeing 747SP aircraft with a 2.7 m telescope flying as high as 45,000 ft in the stratosphere, above 99 percent of the precipitable water vapor. SOFIA normally operates from its base in Palmdale, California, and a typical observing flight lasts 10 hours before returning to base. SOFIA started astronomical observations in December 2010 and completed some 30 early-science flights in 2011, delivering a number of exciting results and discoveries, both in mid-infrared imaging (5-40 μm) and in far-infrared (THz) heterodyne high-resolution spectroscopy, which were published in mid-2012 in special issues of ApJ Letters and A&A, respectively. Meanwhile, in July 2013, as part of Cycle 1, SOFIA deployed to New Zealand for a total of 9 flights (all of them successful) and observed key targets in the southern hemisphere at THz frequencies, including star-forming regions in the Large and Small Magellanic Clouds. In this talk, I will present a few highlights of SOFIA early science and its future potential, when the full suite of 7 instruments will be implemented by the time of full operations in 2015. As Herschel ran out of cryogens in April 2013, SOFIA will be the premier far-infrared astronomical facility for many years to come. Synergies with ALMA and CCAT must be explored. SOFIA is a major bilateral project between NASA and the German Space Agency (DLR); however, as an international observatory it offers observing time to the whole astronomical community worldwide, not only to the US and German primary partners.

  3. Development of a TES-Based Anti-Coincidence Detector for Future X-ray Observatories

    NASA Technical Reports Server (NTRS)

    Bailey, Catherine

    2011-01-01

    Microcalorimeters onboard future x-ray observatories require an anti-coincidence detector to remove environmental backgrounds. In order to most effectively integrate this anti-coincidence detector with the main microcalorimeter array, both instruments should use similar read-out technology. The detectors used in the Cryogenic Dark Matter Search (CDMS) use a phonon measurement technique that is well suited for an anti-coincidence detector with a microcalorimeter array using SQUID readout. This technique works by using a transition-edge sensor (TES) connected to superconducting collection fins to measure the athermal phonon signal produced when an event occurs in the substrate crystal. Energy from the event propagates through the crystal to the superconducting collection fins, creating quasiparticles, which are then trapped as they enter the TES, where they produce a signal. We are currently developing a prototype anti-coincidence detector for future x-ray missions and have recently fabricated test devices with Mo/Au TESs and Al collection fins. We will present results from the first tests of these devices, which provide a proof of concept that quasiparticle trapping occurs in these materials.

  4. Large-Scale Reform Comes of Age

    ERIC Educational Resources Information Center

    Fullan, Michael

    2009-01-01

    This article reviews the history of large-scale education reform and makes the case that large-scale or whole-system reform policies and strategies are becoming increasingly evident. The review briefly addresses the pre-1997 period, concluding that while the pressure for reform was mounting, there were very few examples of deliberate or…

  5. Large-scale infrared scene projectors

    NASA Astrophysics Data System (ADS)

    Murray, Darin A.

    1999-07-01

    Large-scale infrared scene projectors typically have unique opto-mechanical characteristics associated with their application. This paper outlines two large-scale zoom lens assemblies with different environmental and package constraints. Various challenges and their respective solutions are discussed and presented.

  6. Large-Scale periodic solar velocities: An observational study

    NASA Technical Reports Server (NTRS)

    Dittmer, P. H.

    1977-01-01

    Observations of large-scale solar velocities were made using the mean field telescope and Babcock magnetograph of the Stanford Solar Observatory. Observations were made in the magnetically insensitive ion line at 5124 Å, with light from the center (limb) of the disk right (left) circularly polarized, so that the magnetograph measures the difference in wavelength between center and limb. Computer calculations are made of the wavelength difference produced by global pulsations for spherical harmonics up to second order and of the signal produced by displacing the solar image relative to polarizing optics or diffraction grating.

  7. Seismic observations at the Sodankylä Geophysical Observatory: history, present, and the future

    NASA Astrophysics Data System (ADS)

    Kozlovskaya, Elena; Narkilahti, Janne; Nevalainen, Jouni; Hurskainen, Riitta; Silvennoinen, Hanna

    2016-08-01

    Instrumental seismic observations in northern Finland started in the 1950s. They were originally initiated by the Institute of Seismology of the University of Helsinki (ISUH), but the staff of Sodankylä Geophysical Observatory (SGO) and later geophysicists of the University of Oulu (UO) were involved in the development of seismological observations and research in northern Finland from the very beginning. This close cooperation between seismologists and the technical staff of ISUH, UO, and SGO continued in many significant international projects and enabled a high level of seismological research in Finland. In our paper, we present the history and current status of seismic observations and seismological research in northern Finland at the UO and SGO. These include both seismic observations at permanent seismic stations and temporary seismic experiments with portable seismic equipment. We describe the present seismic instrumentation and major research topics of the seismic group at SGO and discuss plans for future development of permanent seismological observations and portable seismic instrumentation at SGO as part of the European Plate Observing System (EPOS) research infrastructure. We also present the research topics of the recently organized Laboratory of Applied Seismology, and show examples of seismic observations performed by new seismic equipment located at this laboratory and selected results of time-lapse seismic body wave travel-time tomography using the data of microseismic monitoring in the Pyhäsalmi Mine (northern Finland).

  8. Molecular clouds and the large-scale structure of the galaxy

    NASA Technical Reports Server (NTRS)

    Thaddeus, Patrick; Stacy, J. Gregory

    1990-01-01

    The application of molecular radio astronomy to the study of the large-scale structure of the Galaxy is reviewed, and the distribution and characteristic properties of the Galactic population of Giant Molecular Clouds (GMCs), derived primarily from analysis of the Columbia CO survey, and their relation to tracers of Population I and major spiral features are described. The properties of the local molecular interstellar gas are summarized. The CO observing programs currently underway with the Center for Astrophysics 1.2 m radio telescope are described, with an emphasis on projects relevant to future comparison with high-energy gamma-ray observations. Several areas are discussed in which high-energy gamma-ray observations by the EGRET (Energetic Gamma-Ray Experiment Telescope) experiment aboard the Gamma Ray Observatory will directly complement radio studies of the Milky Way, with the prospect of significant progress on fundamental issues related to the structure and content of the Galaxy.

  9. Ali Observatory in Tibet: a unique northern site for future CMB ground-based observations

    NASA Astrophysics Data System (ADS)

    Su, Meng

    2015-08-01

    Ground-based CMB observations have been performed at the South Pole and in the Atacama desert in Chile. However, a significant fraction of the sky cannot be observed from these two sites alone. Full sky coverage from the ground will ultimately require a northern site for CMB observation, in particular CMB polarization. Besides the long-considered site in Greenland, the high-altitude Tibetan plateau provides another opportunity. I will describe the Ali Observatory in Tibet, located at N32°19', E80°01', as a potential site for ground-based CMB observations. The new site sits at almost 5100 m on a mountain near Gar town, an excellent location for both infrared and submillimeter observations. A site study using the long-term database of ground weather stations and archival satellite data has been performed. The site has sufficient relative height on the plateau and is accessible by car: the town of Shiquanhe and a recently opened airport are each about 40 minutes away by road, and the site has road access, electricity, and optical fiber with fast internet. Preliminary measurements suggest precipitable water vapor below 0.5 mm for roughly a quarter of the year, and long-term monitoring is under development. In addition, surrounding higher sites are also available and could be further developed if necessary. Ali provides unique northern sky coverage; together with the South Pole and the Atacama desert, future CMB observations will be able to cover the full sky from the ground.

  10. Potential of a Future Large Aperture UVOIR Space Observatory for Breakthrough Observations of Star and Planet Formation

    NASA Astrophysics Data System (ADS)

    Danchi, William C.; Grady, Carol A.; Padgett, Deborah

    2015-01-01

    A future large aperture space observatory operating from the UV to the near-infrared with a diameter between 10 and 15 meters will provide a unique opportunity for observations of star and planet formation, from nearby moving groups and associations to star formation in galaxies in the local universe. Our newly formed working group will examine the unique opportunities that such a telescope will give observers in a post-JWST/WFIRST-AFTA era that includes extremely large ground-based observatories such as the TMT, E-ELT, ALMA, and the VLTI. Given a potential suite of instruments for this observatory we will discuss some of the key areas of star and planet formation science where breakthroughs might occur.

  11. Curvature constraints from large scale structure

    NASA Astrophysics Data System (ADS)

    Di Dio, Enea; Montanari, Francesco; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-06-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter ΩK with future galaxy surveys are affected by relativistic effects, which influence observations of the large scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms and terms involving the gravitational potential. As an application, we consider angle and redshift dependent power spectra, which are especially well suited for model independent cosmological constraints. We compute our results for a representative deep, wide and spectroscopic survey, and our results show the impact of relativistic corrections on spatial curvature parameter estimation. We show that constraints on the curvature parameter may be strongly biased if, in particular, cosmic magnification is not included in the analysis. Other relativistic effects turn out to be subdominant in the studied configuration. We analyze how the shift in the estimated best-fit value for the curvature and other cosmological parameters depends on the magnification bias parameter, and find that significant biases are to be expected if this term is not properly considered in the analysis.
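    The Fisher-matrix machinery behind forecasts like this one is compact enough to sketch. Below is a minimal toy version in which every number (the derivative matrix, the errors, the systematic shift) is purely illustrative and not taken from the paper; it includes the standard estimate of the parameter bias incurred when a systematic term, such as magnification, is omitted from the model.

```python
import numpy as np

# Toy Fisher-matrix forecast.  All numbers below are illustrative
# placeholders, not values from the paper.
J = np.array([[1.0, 0.5],
              [0.2, 1.5],
              [0.8, 0.3]])          # d(observable)/d(parameter): 3 data points, 2 params
sigma = np.array([0.1, 0.2, 0.15])  # 1-sigma errors on each data point
Cinv = np.diag(1.0 / sigma**2)      # inverse data covariance (diagonal toy case)

F = J.T @ Cinv @ J                  # Fisher matrix
cov = np.linalg.inv(F)              # forecast parameter covariance
errors = np.sqrt(np.diag(cov))      # marginalized 1-sigma parameter errors

# Standard Fisher-bias estimate: the shift in the best-fit parameters
# if a systematic contribution delta_d (e.g. neglected magnification)
# is left out of the model.
delta_d = np.array([0.05, -0.02, 0.03])
bias = cov @ (J.T @ Cinv @ delta_d)
```

    The last line is the generic linear-response bias formula; the paper's actual analysis uses angle- and redshift-dependent power spectra in place of this toy data vector.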

  12. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and in addition to atlases of the human includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  13. Synthesis of small and large scale dynamos

    NASA Astrophysics Data System (ADS)

    Subramanian, Kandaswamy

    Using a closure model for the evolution of magnetic correlations, we uncover an interesting plausible saturated state of the small-scale fluctuation dynamo (SSD) and a novel analogy between quantum mechanical tunnelling and the generation of large-scale fields. Large-scale fields develop via the α-effect, but as magnetic helicity can only change on a resistive timescale, the time it takes to organize the field into large scales increases with magnetic Reynolds number. This is very similar to results obtained from simulations using the full MHD equations.

  14. Modeling of Carbon Tetrachloride Flow and Transport in the Subsurface of the 200 West Disposal Sites: Large-Scale Model Configuration and Prediction of Future Carbon Tetrachloride Distribution Beneath the 216-Z-9 Disposal Site

    SciTech Connect

    Oostrom, Mart; Thorne, Paul D.; Zhang, Z. F.; Last, George V.; Truex, Michael J.

    2008-12-17

    Three-dimensional simulations considered the migration of dense nonaqueous-phase liquid (DNAPL), consisting of CT and co-disposed organics, in the subsurface as a function of the properties and distribution of subsurface sediments and of the properties and disposal history of the waste. Simulations of CT migration were conducted using the Water-Oil-Air mode of the Subsurface Transport Over Multiple Phases (STOMP) simulator. A large-scale model was configured to model CT and waste-water discharge from the major CT and waste-water disposal sites.

  15. International space station. Large scale integration approach

    NASA Astrophysics Data System (ADS)

    Cohen, Brad

    The International Space Station is the most complex large scale integration program in development today. The approach developed for specification, subsystem development, and verification lays a firm basis on which future programs of this nature can be built. The International Space Station is composed of many critical items, hardware and software, built by numerous International Partners, NASA Institutions, and U.S. Contractors, and is launched over a period of five years. Each launch creates a unique configuration that must be safe, survivable, operable, and able to support ongoing assembly (assemblable) to arrive at the assembly-complete configuration in 2003. Integrating each of the modules into a viable spacecraft while continuing the assembly is a challenge in itself. Added to this challenge are the severe schedule constraints and the lack of an "Iron Bird", which prevents assembly and checkout of each on-orbit configuration prior to launch. This paper will focus on the following areas: 1) The specification development process, explaining how the requirements and specifications were derived using a modular concept driven by launch vehicle capability; each module is composed of components of subsystems rather than completed subsystems. 2) The approach to stage specifications (each stage consists of the launched module added to the current on-orbit spacecraft); specifically, how each launched module and stage ensures support of the current and future elements of the assembly. 3) The verification approach, which, due to the schedule constraints, is primarily analysis supported by testing; specifically, how the interfaces are ensured to mate and function on-orbit when they cannot be mated before launch. 4) Lessons learned: where can we improve this complex system design and integration task?

  16. Acoustic Studies of the Large Scale Ocean Circulation

    NASA Technical Reports Server (NTRS)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  17. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  18. Future space missions and ground observatory for measurements of coronal magnetic fields

    NASA Astrophysics Data System (ADS)

    Fineschi, Silvano; Gibson, Sarah; Bemporad, Alessandro; Zhukov, Andrei; Damé, Luc; Susino, Roberto; Larruquert, Juan

    2016-07-01

    This presentation gives an overview of the near-future perspectives for probing coronal magnetism from space missions (i.e., SCORE and ASPIICS) and a ground-based observatory (ESCAPE). Spectro-polarimetric imaging of coronal emission lines in the visible-light wavelength band provides an important diagnostic tool for coronal magnetism. The interpretation, in terms of the Hanle and Zeeman effects, of the line polarization in forbidden emission lines yields information on the direction and strength of the coronal magnetic field. As a study case, this presentation will describe the Torino Coronal Magnetograph (CorMag) for spectro-polarimetric observation of the FeXIV 530.3 nm forbidden emission line. CorMag consists of a Liquid Crystal (LC) Lyot filter and an LC linear polarimeter. The CorMag filter is part of the ESCAPE experiment to be based at the French-Italian Concordia base in Antarctica. The linear polarization by resonance scattering of coronal permitted line emission in the ultraviolet (UV) can be modified by magnetic fields through the Hanle effect. Space-based UV spectro-polarimeters would provide an additional tool for the diagnostics of coronal magnetism. As a case study of space-borne UV spectro-polarimeters, this presentation will describe the future upgrade of the Sounding-rocket Coronagraphic Experiment (SCORE) to include a new-generation, high-efficiency UV polarizer with the capability of imaging polarimetry of the HI Lyman-α line at 121.6 nm. SCORE is a multi-wavelength imager for the emission lines HeII 30.4 nm and HI 121.6 nm and for the visible-light broad-band emission of the polarized K-corona. SCORE flew successfully in 2009; the second launch is scheduled for 2016. Proba-3 is another future solar mission that would provide the opportunity to diagnose the coronal magnetic field. Proba-3 is the first precision formation-flying mission, to be launched in 2019. A pair of satellites will fly together maintaining a fixed configuration as a 'large rigid

  19. The Future of the Plate Boundary Observatory in the GAGE Facility and beyond 2018

    NASA Astrophysics Data System (ADS)

    Mattioli, G. S.; Bendick, R. O.; Foster, J. H.; Freymueller, J. T.; La Femina, P. C.; Miller, M. M.; Rowan, L.

    2014-12-01

    The Geodesy Advancing Geosciences and Earthscope (GAGE) Facility, which operates the Plate Boundary Observatory (PBO), builds on UNAVCO's strong record of facilitating research and education in the geosciences and geodesy-related engineering fields. Precise positions and velocities for the PBO's ~1100 continuous GPS stations and other PBO data products are used to address a wide range of scientific and technical issues across North America. A large US and international community of scientists, surveyors, and civil engineers access PBO data streams, software, and other on-line resources daily. In a global society that is increasingly technology-dependent, consistently risk-averse, and often natural resource-limited, communities require geodetic research, education, and infrastructure to make informed decisions about living on a dynamic planet. The western U.S. and Alaska, where over 95% of the PBO sensor assets are located, have recorded significant geophysical events such as earthquakes, volcanic eruptions, and tsunamis. UNAVCO community science provides first-order constraints on geophysical processes to support hazards mapping and zoning, and forms the basis for earthquake and tsunami early warning applications currently under development. The future of PBO was discussed at a NSF-sponsored three-day workshop held in September 2014 in Breckenridge, CO. Over 40 invited participants and community members, including representatives from interested stakeholder groups, UNAVCO staff, and members of the PBO Working Group and Geodetic Infrastructure Advisory Committee, participated in the workshop, which included retrospective and prospective plenary presentations and breakout sessions focusing on specific scientific themes. We will present some of the findings of that workshop in order to continue a dialogue about policies and resources for long-term earth observing networks. How PBO fits into the recently released U.S. National Plan for Civil Earth Observations will also be

  20. The Signature of Large Scale Structures on the Very High Energy Gamma-Ray Sky

    SciTech Connect

    Cuoco, A.; Hannestad, S.; Haugbolle, T.; Miele, G.; Serpico, P.D.; Tu, H.; /Aarhus U. /UC, Irvine

    2006-12-01

    If the diffuse extragalactic gamma ray emission traces the large scale structures of the universe, peculiar anisotropy patterns are expected in the gamma ray sky. In particular, because of the cutoff distance introduced by the absorption of 0.1-10 TeV photons on the infrared/optical background, prominent correlations with the local structures within a range of a few hundred Mpc should be present. We provide detailed predictions of the signal based on the PSCz map of the local universe. We also use mock N-body catalogues complemented with the halo model of structures to study some statistical features of the expected signatures. The results are largely independent of cosmological details, and depend mostly on the index of correlation (or bias) of the sources with respect to the large scale distribution of galaxies. For instance, the predicted signal in the case of a quadratic correlation (as may happen for a dark matter annihilation contribution to the diffuse gamma flux) differs substantially from the linear correlation case, providing a complementary tool to unveil the nature of the sources of the diffuse gamma ray emission. The chances of present and future space- and ground-based observatories to measure these features are discussed.

  1. A virtual observatory in a real world: building capacity for an uncertain future

    NASA Astrophysics Data System (ADS)

    Blair, Gordon; Buytaert, Wouter; Emmett, Bridget; Freer, Jim; Gurney, Robert; Haygarth, Phil; McDonald, Adrian; Rees, Gwyn; Tetzlaff, Doerthe

    2010-05-01

    Environmental managers and policy makers face a challenging future trying to accommodate growing expectations of environmental well-being, while subject to maturing regulation, constrained budgets and a public scrutiny that expects easier and more meaningful access. Supporting such a challenge requires new tools and new approaches. The VO is a new initiative from the Natural Environment Research Council (NERC) designed to deliver proof of concept for these new tools and approaches. The VO is at an early stage, and we first evaluate the role of existing 'observatories' in the UK and elsewhere, both to learn good practice (and, just as valuable, errors) and to define boundaries. A series of exemplar 'big catchment science questions' are posed, distinguishing between science and society positions, and the prospects for their solution are assessed. The VO vision of being driven by these questions is outlined, as are the seven key ambitions, namely:
    i. being driven by the need to contribute to the solution of major environmental issues that impinge on, or link to, catchment science
    ii. having the flexibility and adaptability to address future problems not yet defined or fully clarified
    iii. being able to communicate issues and solutions to a range of audiences
    iv. supporting easy access by a variety of users
    v. drawing meaningful information from data and models and identifying the constraints on application in terms of errors, uncertainties, etc.
    vi. adding value and cost effectiveness to current investigations by supporting transfer and scale adjustment, thus limiting the repetition of expensive field monitoring addressing essentially the same issues in varying locations
    vii. promoting effective interfacing of robust science with a variety of end users by using terminology or measures familiar to the user (or required by regulation), including financial and carbon accounting, whole-life or fixed-period costing, risk as probability or as disability adjusted life years

  2. Large-scale inhomogeneities and galaxy statistics

    NASA Technical Reports Server (NTRS)

    Schaeffer, R.; Silk, J.

    1984-01-01

    The density fluctuations associated with the formation of large-scale cosmic pancake-like and filamentary structures are evaluated using the Zel'dovich approximation for the evolution of nonlinear inhomogeneities in the expanding universe. It is shown that the large-scale nonlinear density fluctuations in the galaxy distribution due to pancakes modify the standard scale-invariant correlation function xi(r) at scales comparable to the coherence length of adiabatic fluctuations. The typical contribution of pancakes and filaments to the J3 integral, and more generally to the moments of galaxy counts in a volume of approximately (15-40 h^-1 Mpc)^3, provides a statistical test for the existence of large scale inhomogeneities. An application to several recent three dimensional data sets shows that despite large observational uncertainties over the relevant scales characteristic features may be present that can be attributed to pancakes in most, but not all, of the various galaxy samples.
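    The Zel'dovich approximation used here moves particles ballistically along their initial displacement field, x(q, t) = q + D(t) psi(q), with pancakes forming at shell crossing where dx/dq -> 0. The following toy one-dimensional sketch (a single displacement mode with an illustrative amplitude, not a configuration from the paper) shows the density running away near the forming pancake:

```python
import numpy as np

# Toy 1-D Zel'dovich approximation: particles at Lagrangian coordinate q
# move to x(q, t) = q + D(t) * psi(q).  The single-mode displacement
# amplitude A is an illustrative value only.
n = 512
L = 1.0
A = 0.05
q = np.linspace(0.0, L, n, endpoint=False)
psi = A * np.sin(2.0 * np.pi * q / L)

def eulerian_positions(D):
    """Map Lagrangian q to Eulerian x for growth factor D."""
    return q + D * psi

def local_density(D):
    """rho/rho_bar = 1/|dx/dq|; diverges at shell crossing."""
    dxdq = 1.0 + D * A * (2.0 * np.pi / L) * np.cos(2.0 * np.pi * q / L)
    return 1.0 / np.abs(dxdq)

# Just below shell crossing (D_cross = 1/(A*2*pi) ~ 3.18) the density
# contrast at the forming "pancake" is already an order of magnitude:
rho = local_density(3.0)
```

    At D = 3 the peak overdensity already exceeds 17 in this toy model, while the map q -> x remains single-valued; past D_cross the map folds and streams cross, which is where the pancake statistics discussed in the abstract originate.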

  3. Tephra Studies by the Alaska Volcano Observatory: Present and Future Research

    NASA Astrophysics Data System (ADS)

    Waythomas, C. F.; Wallace, K. L.

    2004-12-01

    Tephra from Aleutian arc volcanoes constitutes an important volcanic hazard for Alaska, western Canada, and some parts of the conterminous U.S. where even small amounts of airborne ash may have dire consequences for jet aircraft traversing North Pacific and western U.S. air routes. Motivated by the need to address volcanic ash hazards on a regional scale, we have initiated a program of tephra studies within the auspices of the Alaska Volcano Observatory (AVO) of the U.S. Geological Survey. A concentrated focus on tephra problems and a new laboratory facility within AVO will help facilitate studies of Quaternary age tephra at Alaskan volcanoes by providing a regional center for laboratory analyses of volcanic ash and standardized web-based reporting and archiving of tephra data. In its first year of operation, the laboratory has been engaged in research at Veniaminof, Mt. Spurr, and Augustine volcanoes, has sponsored research on Holocene tephra deposits of upper Cook Inlet, and has initiated analytical studies of tephra deposits on Adak and Kanaga Islands in the western Aleutians. The objective of these studies is to develop multiparameter techniques for characterization and correlation of tephra deposits, establish radiocarbon-controlled tephrostratigraphic frameworks, and to evaluate the magnitude and frequency of tephra-producing eruptions. In the upper Cook Inlet region of Alaska, we and our colleagues have begun developing a comprehensive record of ash fall by systematically selecting and coring shallow lakes and evaluating the tephra preserved in the lacustrine sediment. Sediment cores from these lakes contain numerous tephra deposits of Holocene age in datable context that can be correlated with proximal tephra deposits on the flanks of their source volcanoes. By combining tephra data from lacustrine deposits and natural exposures we hope to develop a robust geologic catalog of tephra deposits that will enable long-distance correlation of tephras, provide

  4. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  5. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    SciTech Connect

    Nusser, Adi; Branchini, Enzo; Davis, Marc E-mail: branchin@fis.uniroma3.it

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
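    The conversion underlying such forecasts is the standard relation v_t = 4.74 * mu * d (v_t in km/s for mu in arcsec/yr and d in pc). A minimal sketch, with illustrative numbers, of why large-scale flows demand micro-arcsecond-per-year astrometry:

```python
# Transverse velocity from proper motion:
#   v_t [km/s] = 4.74047 * mu [arcsec/yr] * d [pc]
# The constant is 1 AU/yr expressed in km/s.
K = 4.74047

def transverse_velocity(mu_mas_per_yr, d_mpc):
    """Transverse velocity in km/s for mu in mas/yr, distance in Mpc."""
    return K * (mu_mas_per_yr * 1e-3) * (d_mpc * 1e6)

def required_proper_motion(v_kms, d_mpc):
    """Proper motion (mas/yr) produced by v_kms at distance d_mpc."""
    return v_kms / (K * 1e3 * d_mpc)

# Illustrative case: a 300 km/s peculiar flow at 20 Mpc corresponds to
# a proper motion of only ~3 micro-arcsec/yr.
mu = required_proper_motion(300.0, 20.0)
```

    The resulting ~3 μas/yr signal is why only Gaia-class astrometry, accumulated over many bright galaxies, can hope to reach the large-scale flow regime the abstract discusses.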

  6. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared through a decade marked by a rapid expansion of funds and manpower in the first half and an almost equally rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  7. A Large Scale Computer Terminal Output Controller.

    ERIC Educational Resources Information Center

    Tucker, Paul Thomas

    This paper describes the design and implementation of a large scale computer terminal output controller which supervises the transfer of information from a Control Data 6400 Computer to a PLATO IV data network. It discusses the cost considerations leading to the selection of educational television channels rather than telephone lines for…

  8. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  9. ARPACK: Solving large scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Lehoucq, Rich; Maschhoff, Kristi; Sorensen, Danny; Yang, Chao

    2013-11-01

    ARPACK is a collection of Fortran77 subroutines designed to solve large scale eigenvalue problems. The package is designed to compute a few eigenvalues and corresponding eigenvectors of a general n by n matrix A. It is most appropriate for large sparse or structured matrices A, where structured means that a matrix-vector product w ← Av requires order n rather than the usual order n² floating point operations.
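ARPACK is most commonly reached today through SciPy's wrappers; a minimal sketch (our example, not from the record) using `scipy.sparse.linalg.eigsh`, the symmetric ARPACK driver, on a sparse operator that is only ever touched through matrix-vector products:

```python
from scipy.sparse import diags
from scipy.sparse.linalg import eigsh  # ARPACK's symmetric (Lanczos) driver

# 1-D discrete Laplacian: large, sparse, tridiagonal. ARPACK never forms a
# dense factorization; it only needs products w <- A v.
n = 1000
A = diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))

# Compute just the 4 largest-magnitude eigenvalues (close to 4 for this
# operator), rather than the full spectrum.
vals = eigsh(A, k=4, which='LM', return_eigenvectors=False)
```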

  10. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-07-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  11. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-04-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  12. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests.

  13. Large Scale Deformation of the Western U.S. Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2002-01-01

    Over the past couple of years, with support from NASA, we used a large collection of data from GPS, VLBI, SLR, and DORIS networks which span the Western U.S. Cordillera (WUSC) to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work was roughly divided into an analysis of these space geodetic observations to infer the deformation field across and within the entire plate boundary zone, and an investigation of the implications of this deformation field regarding plate boundary dynamics. Following the determination of the first generation WUSC velocity solution, we placed high priority on the dissemination of the velocity estimates. With in-kind support from the Smithsonian Astrophysical Observatory, we constructed a web-site which allows anyone to access the data, and to determine their own velocity reference frame.

  15. World Space Observatory-UltraViolet: International Space Mission for the Nearest Future

    NASA Astrophysics Data System (ADS)

    Sachkov, M.; Gómez de Castro, A. I.; Pagano, I.; Torres, F.; Zaiko, Y.; Shustov, B.

    2009-03-01

    The World Space Observatory UltraViolet (WSO-UV) project is an international space observatory designed for observations in the ultraviolet domain where some of the most important astrophysical processes can be efficiently studied with unprecedented sensitivity. WSO-UV is a multipurpose observatory, consisting of a 170 cm aperture telescope, capable of high-resolution spectroscopy, long slit low-resolution spectroscopy, and deep UV and optical imaging. With a nominal mission lifetime of 5 years, and a planned extension to 10 years, from a geosynchronous orbit with an inclination of 51.8 degrees, the WSO-UV will provide observations that are of exceptional importance for the study of many astrophysical problems. WSO-UV is implemented in the framework of a collaboration between Russia (chair), China, Germany, Italy, Spain, and Ukraine. This article is the first of three papers in this proceedings dedicated to the WSO-UV project. This paper gives general information on the WSO-UV project and its status.

  16. Alignment of quasar polarizations with large-scale structures

    NASA Astrophysics Data System (ADS)

    Hutsemékers, D.; Braibant, L.; Pelgrims, V.; Sluse, D.

    2014-12-01

    We have measured the optical linear polarization of quasars belonging to Gpc scale quasar groups at redshift z ~ 1.3. Out of 93 quasars observed, 19 are significantly polarized. We found that quasar polarization vectors are either parallel or perpendicular to the directions of the large-scale structures to which they belong. Statistical tests indicate that the probability that this effect can be attributed to randomly oriented polarization vectors is on the order of 1%. We also found that quasars with polarization perpendicular to the host structure preferentially have large emission line widths while objects with polarization parallel to the host structure preferentially have small emission line widths. Considering that quasar polarization is usually either parallel or perpendicular to the accretion disk axis depending on the inclination with respect to the line of sight, and that broader emission lines originate from quasars seen at higher inclinations, we conclude that quasar spin axes are likely parallel to their host large-scale structures. Based on observations made with ESO Telescopes at the La Silla Paranal Observatory under program ID 092.A-0221. Table 1 is available in electronic form at http://www.aanda.org

  17. Fractals and cosmological large-scale structure

    NASA Technical Reports Server (NTRS)

    Luo, Xiaochun; Schramm, David N.

    1992-01-01

    Observations of galaxy-galaxy and cluster-cluster correlations as well as other large-scale structure can be fit with a 'limited' fractal with dimension D of about 1.2. This is not a 'pure' fractal out to the horizon: the distribution shifts from power law to random behavior at some large scale. If the observed patterns and structures are formed through an aggregation growth process, the fractal dimension D can serve as an interesting constraint on the properties of the stochastic motion responsible for limiting the fractal structure. In particular, it is found that the observed fractal should have grown from two-dimensional sheetlike objects such as pancakes, domain walls, or string wakes. This result is generic and does not depend on the details of the growth process.
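The fractal dimension D discussed above can be estimated from point data; a toy box-counting sketch (our construction, not the correlation analysis of the record). For points scattered along a line segment embedded in the plane, the estimate should approach D = 1:

```python
import math
import random

# Box-counting estimate of fractal dimension: count occupied grid boxes at
# two box sizes and take the log-log slope.
random.seed(1)
points = [(random.random(), 0.5) for _ in range(20_000)]  # a 1-D set in 2-D

def box_count(pts, eps):
    """Number of eps-sized grid boxes containing at least one point."""
    return len({(int(x / eps), int(y / eps)) for x, y in pts})

e1, e2 = 0.1, 0.01
d_est = math.log(box_count(points, e2) / box_count(points, e1)) / math.log(e1 / e2)
```

With enough points the estimate converges to the dimension of the underlying set, here 1.0; for a galaxy catalog the same slope, over an appropriate range of scales, would give the "limited" fractal dimension described above.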

  18. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  19. Large-scale objective phenotyping of 3D facial morphology

    PubMed Central

    Hammond, Peter; Suttie, Michael

    2012-01-01

    Abnormal phenotypes have played significant roles in the discovery of gene function, but organized collection of phenotype data has been overshadowed by developments in sequencing technology. In order to study phenotypes systematically, large-scale projects with standardized objective assessment across populations are considered necessary. The report of the 2006 Human Variome Project meeting recommended documentation of phenotypes through electronic means by collaborative groups of computational scientists and clinicians using standard, structured descriptions of disease-specific phenotypes. In this report, we describe progress over the past decade in 3D digital imaging and shape analysis of the face, and future prospects for large-scale facial phenotyping. Illustrative examples are given throughout using a collection of 1107 3D face images of healthy controls and individuals with a range of genetic conditions involving facial dysmorphism. PMID:22434506

  20. Large-scale extraction of proteins.

    PubMed

    Cunha, Teresa; Aires-Barros, Raquel

    2002-01-01

    The production of foreign proteins using selected hosts with the necessary posttranslational modifications is one of the key successes in modern biotechnology. This methodology allows the industrial production of proteins that otherwise are produced in small quantities. However, the separation and purification of these proteins from the fermentation media constitutes a major bottleneck for the widespread commercialization of recombinant proteins. The major production costs (50-90%) for a typical biological product reside in the purification strategy. There is a need for efficient, effective, and economic large-scale bioseparation techniques to achieve high purity and high recovery while maintaining the biological activity of the molecule. Aqueous two-phase systems (ATPS) allow process integration, as separation and concentration of the target protein are achieved simultaneously, with subsequent removal and recycling of the polymer. The ease of scale-up combined with the high partition coefficients obtained allows its potential application in large-scale downstream processing of proteins produced by fermentation. The equipment and the methodology for aqueous two-phase extraction of proteins on a large scale using mixer-settler and column contactors are described. The operation of the columns, either stagewise or differential, is summarized. A brief description of the methods used to account for mass transfer coefficients, hydrodynamic parameters of hold-up, drop size, and velocity, back mixing in the phases, and flooding performance, required for column design, is also provided. PMID:11876297

  1. Large scale processes in the solar nebula.

    NASA Astrophysics Data System (ADS)

    Boss, A. P.

    Most proposed chondrule formation mechanisms involve processes occurring inside the solar nebula, so the large scale (roughly 1 to 10 AU) structure of the nebula is of general interest for any chondrule-forming mechanism. Chondrules and Ca, Al-rich inclusions (CAIs) might also have been formed as a direct result of the large scale structure of the nebula, such as passage of material through high temperature regions. While recent nebula models do predict the existence of relatively hot regions, the maximum temperatures in the inner planet region may not be high enough to account for chondrule or CAI thermal processing, unless the disk mass is considerably greater than the minimum mass necessary to restore the planets to solar composition. Furthermore, it does not seem to be possible to achieve both rapid heating and rapid cooling of grain assemblages in such a large scale furnace. However, if the accretion flow onto the nebula surface is clumpy, as suggested by observations of variability in young stars, then clump-disk impacts might be energetic enough to launch shock waves which could propagate through the nebula to the midplane, thermally processing any grain aggregates they encounter, and leaving behind a trail of chondrules.

  2. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    NASA Astrophysics Data System (ADS)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops of multiple different hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch free, progressing well and yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost savings potential. Very significant among these was the substantial power savings, which resulted in strong broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, and virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  3. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

    Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat [1-3]. Multi-agent simulations in particular are now commonplace in many fields [4, 5]. By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects where closed form solutions are difficult to solve or impossible to derive [6]. To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact - for example, agent height is related to agent weight - they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application [7, 8]. Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large scale social simulations [9-11]. One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more

  4. A permanent free tropospheric observatory at Pico summit in the Azores Islands? Past measurements (2001-2005) and future plans

    NASA Astrophysics Data System (ADS)

    Honrath, R. E.; Fialho, P.; Helmig, D.; Val Martin, M.; Owen, R. C.; Kleissl, J.; Strane, J. M.; Dziobak, M. P.; Tanner, D. M.; Barata, F.

    2005-12-01

    Pico mountain in the Azores Islands provides a base for continuous, free tropospheric measurements that is unique in the central North Atlantic region. The PICO-NARE station was installed there in 2001 as a temporary observatory. However, the location proved ideal for studies of aged anthropogenic (N. American) and boreal fire (N. American and Russian) emissions, as well as for less frequent interception of European and African plumes. As a result, station operation was continued through summer 2005, and we are planning for continuing operation and conversion into a permanent Portuguese GAW station in the future. This poster will provide an overview of the station, the measurements made there, typical transport pathways to the station and interannual variability in transport, and an overview of the full suite of multi-season observations and key findings from measurements to date. In addition, data availability and near-term and long-term plans for the station's future will be discussed.

  5. Large scale anisotropy of UHECRs for the Telescope Array

    SciTech Connect

    Kido, E.

    2011-09-22

    The origin of Ultra High Energy Cosmic Rays (UHECRs) is one of the most interesting questions in astroparticle physics. Despite the efforts of previous measurements, there is as yet no consensus on either the origin or the mechanism of UHECR generation and propagation. In this context, the Telescope Array (TA) experiment is expected to play an important role as the largest detector in the northern hemisphere, consisting of an array of surface particle detectors (SDs), fluorescence detectors (FDs), and other important calibration devices. We searched for large scale anisotropy using SD data of TA. UHECRs are expected to be restricted to the GZK horizon if their composition is protons, so the observed arrival directions should exhibit local large scale anisotropy if UHECR sources are astrophysical objects. We used the SD data set from 11 May 2008 to 7 September 2010 to search for large-scale anisotropy. The discrimination power between LSS and isotropy is not yet sufficient, but the statistics of TA are expected to discriminate between them at about the 95% confidence level on average in the near future.

  6. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPU) are currently used as a cost-effective platform for computer simulations and big-data processing. Large scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve an excellent efficiency for applications in statistical mechanics, particle dynamics and networks analysis by using suitable memory access patterns and mechanisms like CUDA streams, profiling tools, etc. Similar concepts and techniques may also be applied to other problems like the solution of Partial Differential Equations.

  7. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  8. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  9. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  10. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  11. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used to calculate the band structures of the proposed metamaterials.
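The finite-difference time-domain idea behind such calculations can be sketched in one dimension (a toy setup of our own, not the record's geometry or materials): a scalar wave stepped explicitly through a periodic two-material medium with alternating wave speeds.

```python
import numpy as np

# Leapfrog FDTD update for u_tt = c(x)^2 u_xx with periodic boundaries.
nx, nt = 400, 800
dx, dt = 1.0, 0.2                      # CFL number c*dt/dx <= 0.6 < 1: stable
c = np.where((np.arange(nx) // 40) % 2 == 0, 1.0, 3.0)  # alternating layers

u_prev = np.zeros(nx)
u = np.zeros(nx)
u[nx // 2] = 1.0                       # initial impulse at the centre
for _ in range(nt):
    lap = np.roll(u, 1) - 2.0 * u + np.roll(u, -1)      # discrete Laplacian
    u_next = 2.0 * u - u_prev + (c * dt / dx) ** 2 * lap
    u_prev, u = u, u_next
```

Band-structure calculations of the kind described above record such a field over time and Fourier-analyze it for each Bloch wavevector; frequency ranges where no modes survive are the band gaps.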

  12. Large-scale Globally Propagating Coronal Waves

    NASA Astrophysics Data System (ADS)

    Warmuth, Alexander

    2015-09-01

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous space-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  13. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.

  14. Global food insecurity. Treatment of major food crops with elevated carbon dioxide or ozone under large-scale fully open-air conditions suggests recent models may have overestimated future yields

    PubMed Central

    Long, Stephen P; Ainsworth, Elizabeth A; Leakey, Andrew D.B; Morgan, Patrick B

    2005-01-01

    Predictions of yield for the globe's major grain and legume arable crops suggest that, with a moderate temperature increase, production may increase in the temperate zone, but decline in the tropics. In total, global food supply may show little change. This security comes from inclusion of the direct effect of rising carbon dioxide (CO2) concentration, [CO2], which significantly stimulates yield by decreasing photorespiration in C3 crops and transpiration in all crops. Evidence for a large response to [CO2] is largely based on studies made within chambers at small scales, which would be considered unacceptable for standard agronomic trials of new cultivars or agrochemicals. Yet, predictions of the globe's future food security are based on such inadequate information. Free-Air Concentration Enrichment (FACE) technology now allows investigation of the effects of rising [CO2] and ozone on field crops under fully open-air conditions at an agronomic scale. Experiments with rice, wheat, maize and soybean show smaller increases in yield than anticipated from studies in chambers. Experiments with increased ozone show large yield losses (20%), which are not accounted for in projections of global food security. These findings suggest that current projections of global food security are overoptimistic. The fertilization effect of CO2 is less than that used in many models, while rising ozone will cause large yield losses in the Northern Hemisphere. Unfortunately, FACE studies have been limited in geographical extent and interactive effects of CO2, ozone and temperature have yet to be studied. Without more extensive study of the effects of these changes at an agronomic scale in the open air, our ever-more sophisticated models will continue to have feet of clay. PMID:16433090

  15. Simulating subsurface heterogeneity improves large-scale water resources predictions

    NASA Astrophysics Data System (ADS)

    Hartmann, A. J.; Gleeson, T.; Wagener, T.; Wada, Y.

    2014-12-01

    Heterogeneity is abundant everywhere across the hydrosphere. It exists in the soil, the vadose zone and the groundwater. In large-scale hydrological models, subsurface heterogeneity is usually not considered. Instead, average or representative values are chosen for each of the simulated grid cells, not incorporating any sub-grid variability. This may lead to unreliable predictions when the models are used for assessing future water resources availability, floods or droughts, or when they are used for recommendations for more sustainable water management. In this study we use a novel, large-scale model that takes into account sub-grid heterogeneity for the simulation of groundwater recharge by using statistical distribution functions. We choose all regions over Europe that are composed of carbonate rock (~35% of the total area) because the well-understood dissolvability of carbonate rocks (karstification) allows for assessing the strength of subsurface heterogeneity. Applying the model with historic data and future climate projections, we show that subsurface heterogeneity lowers the vulnerability of groundwater recharge to hydro-climatic extremes and to future climate change. Comparing our simulations with the PCR-GLOBWB model, we quantify the deviations of simulations for different sub-regions in Europe.

  16. Space-based gravitational wave observatories: Learning from the past, moving towards the future

    NASA Astrophysics Data System (ADS)

    Mueller, Guido; Cornish, Neil

    2014-03-01

    This century began with a planned launch of the joint NASA/ESA Laser Interferometer Space Antenna in 2011. In a remarkable reversal of fate, 2011 instead saw the end of the NASA/ESA partnership and the termination of the LISA project. This was despite the very high scientific ranking of a mHz gravitational wave observatory in both the US and Europe, and significant progress in technology development, mostly spearheaded by industrial studies in Europe. The first half of the current decade continues to be dominated by struggles of the international community to get a LISA-like mission back on track for a launch in the next decade. Following a second place in ESA's L1 selection, the science theme ``The Gravitational Universe'' has now been selected as the L3 mission in Europe, which is scheduled to launch in 2034 assuming no further delays or re-plans for the L1-L2-L3 mission sequence. On a more optimistic note, the upcoming launch of the LISA Pathfinder in 2015 and the first direct detections of gravitational waves by Advanced LIGO and by pulsar timing later in this decade may provide the necessary impetus to accelerate the development of a space-based gravitational wave detector.

  17. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results give the experimentally-testable predictions for glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.

  18. Large-scale databases of proper names.

    PubMed

    Conley, P; Burgess, C; Hage, D

    1999-05-01

    Few tools for research in proper names have been available--specifically, there is no large-scale corpus of proper names. Two corpora of proper names were constructed, one based on U.S. phone book listings, the other derived from a database of Usenet text. Name frequencies from both corpora were compared with human subjects' reaction times (RTs) to the proper names in a naming task. Regression analysis showed that the Usenet frequencies contributed to predictions of human RT, whereas phone book frequencies did not. In addition, semantic neighborhood density measures derived from the HAL corpus were compared with the subjects' RTs and found to be a better predictor of RT than was frequency in either corpus. These new corpora are freely available on line for download. Potentials for these corpora range from using the names as stimuli in experiments to using the corpus data in software applications. PMID:10495803
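
    The regression of naming reaction times on corpus frequency described above can be sketched with a minimal ordinary-least-squares fit. This is an illustration only: the data below are synthetic (generated so that higher log-frequency predicts faster naming), not the Conley et al. corpora or RTs.

    ```python
    import random

    def ols(x, y):
        """Ordinary least squares fit y = a + b*x; returns (intercept, slope)."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        b = sxy / sxx
        return my - b * mx, b

    random.seed(1)
    # Synthetic data: higher log-frequency names are named faster (lower RT).
    log_freq = [random.uniform(0.0, 6.0) for _ in range(200)]
    rt_ms = [700.0 - 25.0 * f + random.gauss(0.0, 30.0) for f in log_freq]

    intercept, slope = ols(log_freq, rt_ms)
    print(round(slope, 1))  # negative slope: frequency predicts faster naming
    ```

    A negative slope is the pattern the abstract reports for the Usenet frequencies; the paper's actual analysis additionally compares predictors (phone-book frequency, HAL neighborhood density) across regression models.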

  19. Estimation of large-scale dimension densities.

    PubMed

    Raab, C; Kurths, J

    2001-07-01

    We propose a technique to calculate large-scale dimension densities in both higher-dimensional spatio-temporal systems and low-dimensional systems from only a few data points, where known methods usually have an unsatisfactory scaling behavior. This is mainly due to boundary and finite-size effects. With our rather simple method, we normalize boundary effects and get a significant correction of the dimension estimate. This straightforward approach is based on rather general assumptions. So even weak coherent structures obtained from small spatial couplings can be detected with this method, which is impossible by using the Lyapunov-dimension density. We demonstrate the efficiency of our technique for coupled logistic maps, coupled tent maps, the Lorenz attractor, and the Roessler attractor. PMID:11461376
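
    For context, the standard correlation-sum (Grassberger-Procaccia) dimension estimate that the paper improves upon can be sketched on a toy data set whose dimension is known. Here points lie on the unit circle, so the estimate should come out near 1 (this is the plain estimator, without the paper's boundary-effect normalization):

    ```python
    import math
    import random

    random.seed(0)

    # Sample points uniformly on the unit circle: a curve, so dimension 1.
    pts = []
    for _ in range(800):
        t = random.uniform(0.0, 2.0 * math.pi)
        pts.append((math.cos(t), math.sin(t)))

    def correlation_sum(points, r):
        """Fraction of point pairs closer than r: C(r) in Grassberger-Procaccia."""
        n = len(points)
        close = 0
        for i in range(n):
            xi, yi = points[i]
            for j in range(i + 1, n):
                xj, yj = points[j]
                if math.hypot(xi - xj, yi - yj) < r:
                    close += 1
        return close / (n * (n - 1) / 2)

    # C(r) ~ r**D for small r, so D follows from a two-point log-log slope.
    r1, r2 = 0.05, 0.2
    d_est = math.log(correlation_sum(pts, r2) / correlation_sum(pts, r1)) \
            / math.log(r2 / r1)
    print(round(d_est, 2))  # close to 1 for a curve
    ```

    The scaling problems the abstract mentions appear when few points or small spatial couplings are available; the authors' contribution is a normalization of the boundary effects that bias exactly this kind of estimate.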

  20. The challenge of large-scale structure

    NASA Astrophysics Data System (ADS)

    Gregory, S. A.

    1996-03-01

    The tasks that I have assumed for myself in this presentation include three separate parts. The first, appropriate to the particular setting of this meeting, is to review the basic work of the founding of this field; the appropriateness comes from the fact that W. G. Tifft made immense contributions that are not often realized by the astronomical community. The second task is to outline the general tone of the observational evidence for large scale structures. (Here, in particular, I cannot claim to be complete. I beg forgiveness from any workers who are left out by my oversight for lack of space and time.) The third task is to point out some of the major aspects of the field that may represent the clues by which some brilliant sleuth will ultimately figure out how galaxies formed.

  1. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with systems theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  2. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  3. Batteries for Large Scale Energy Storage

    SciTech Connect

    Soloveichik, Grigorii L.

    2011-07-15

    In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β″-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.

  4. Large scale water lens for solar concentration.

    PubMed

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials, normal tap water and hyper-elastic linear low density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. Their optical properties were modeled with a raytracing software based on the lens shape. We have achieved a good match of experimental and theoretical data by considering wavelength dependent concentration factor, absorption and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation. PMID:26072893
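
    As a rough point of reference for the quantities measured above, an idealized thin plano-convex lens gives a back-of-the-envelope focal length and geometric concentration factor. This is only a sketch: the paper models the actual foil shape with raytracing and finite elements, and the dimensions below are hypothetical.

    ```python
    # Thin plano-convex lens estimate for a water lens: f = R / (n - 1).
    n_water = 1.33   # refractive index of water in the visible
    R = 0.5          # radius of curvature of the bulging foil, metres (hypothetical)

    f = R / (n_water - 1.0)

    aperture = 1.0   # lens diameter, metres (hypothetical)
    spot = 0.05      # focal-spot diameter, metres (hypothetical)
    concentration = (aperture / spot) ** 2  # geometric concentration factor

    print(round(f, 2), round(concentration))
    ```

    In the real lens both f and the spot size shift with water volume, since the load changes the foil's curvature; that coupling is what the paper extracts from images and finite element simulation.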

  5. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as ``nuclear pasta'' are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm⁻³, proton fractions 0.05

  6. Large-scale simulations of reionization

    SciTech Connect

    Kohler, Katharina; Gnedin, Nickolay Y.; Hamilton, Andrew J.S.; /JILA, Boulder

    2005-11-01

    We use cosmological simulations to explore the large-scale effects of reionization. Since reionization is a process that involves a large dynamic range--from galaxies to rare bright quasars--we need to be able to cover a significant volume of the universe in our simulation without losing the important small scale effects from galaxies. Here we have taken an approach that uses clumping factors derived from small scale simulations to approximate the radiative transfer on the sub-cell scales. Using this technique, we can cover a simulation size up to 1280 h⁻¹ Mpc with 10 h⁻¹ Mpc cells. This allows us to construct synthetic spectra of quasars similar to observed spectra of SDSS quasars at high redshifts and compare them to the observational data. These spectra can then be analyzed for HII region sizes, the presence of the Gunn-Peterson trough, and the Lyman-α forest.
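
    The sub-cell clumping factor mentioned in the abstract is conventionally defined as C = <n²>/<n>², which rescales the effective recombination rate relative to a uniform medium. As an illustrative sketch (the density samples below are hypothetical, not from the paper's simulations):

    ```python
    # Clumping factor C = <n^2> / <n>^2 from a set of sub-cell gas densities.
    # C > 1 means recombinations are enhanced relative to a uniform medium.
    densities = [0.2, 0.5, 1.0, 1.0, 2.0, 5.3]  # hypothetical sub-cell samples

    mean_n = sum(densities) / len(densities)
    mean_n2 = sum(n * n for n in densities) / len(densities)
    clumping = mean_n2 / mean_n ** 2
    print(round(clumping, 2))  # -> 2.06 for these samples
    ```

    For a perfectly uniform medium every sample equals the mean and C = 1; any spread in density pushes C above 1, which is why ignoring sub-cell structure underestimates recombinations.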

  7. The XMM Large Scale Structure Survey

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite

    2005-10-01

    We propose to complete, by an additional 5 deg², the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg². The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.

  8. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  9. Estimation of large-scale dimension densities

    NASA Astrophysics Data System (ADS)

    Raab, Corinna; Kurths, Jürgen

    2001-07-01

    We propose a technique to calculate large-scale dimension densities in both higher-dimensional spatio-temporal systems and low-dimensional systems from only a few data points, where known methods usually have an unsatisfactory scaling behavior. This is mainly due to boundary and finite-size effects. With our rather simple method, we normalize boundary effects and get a significant correction of the dimension estimate. This straightforward approach is based on rather general assumptions. So even weak coherent structures obtained from small spatial couplings can be detected with this method, which is impossible by using the Lyapunov-dimension density. We demonstrate the efficiency of our technique for coupled logistic maps, coupled tent maps, the Lorenz attractor, and the Roessler attractor.

  10. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: (1) several commercial DBMS systems have demonstrated storage and ad-hoc query access to Terabyte data sets; (2) several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme; (3) several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data; and (4) in some cases, performance is a moot issue, in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  11. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  12. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
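
    The core SQP step can be illustrated on a toy equality-constrained problem. This sketch uses the exact Hessian and a dense KKT solve, not the abstract's reduced-Hessian quasi-Newton machinery, and the problem itself is invented for illustration: minimize x² + y² subject to x + y = 1.

    ```python
    # Each SQP iteration solves the KKT system  [H A^T; A 0] [p; lam] = [-g; -c].

    def solve(A, b):
        """Gaussian elimination with partial pivoting for a small dense system."""
        n = len(b)
        M = [row[:] + [b[i]] for i, row in enumerate(A)]
        for k in range(n):
            piv = max(range(k, n), key=lambda r: abs(M[r][k]))
            M[k], M[piv] = M[piv], M[k]
            for r in range(k + 1, n):
                fac = M[r][k] / M[k][k]
                for c in range(k, n + 1):
                    M[r][c] -= fac * M[k][c]
        x = [0.0] * n
        for k in range(n - 1, -1, -1):
            x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
        return x

    x, y = 3.0, -2.0  # arbitrary starting point
    for _ in range(5):
        g = [2 * x, 2 * y]           # gradient of the objective
        c = x + y - 1.0              # constraint residual
        kkt = [[2.0, 0.0, 1.0],      # Hessian of the Lagrangian is 2*I here
               [0.0, 2.0, 1.0],
               [1.0, 1.0, 0.0]]      # constraint Jacobian A = [1, 1]
        p1, p2, lam = solve(kkt, [-g[0], -g[1], -c])
        x, y = x + p1, y + p2

    print(x, y)  # -> 0.5 0.5
    ```

    Because the toy objective is quadratic and the constraint linear, one step reaches the minimizer; the large-scale methods in the abstract replace the exact Hessian with a reduced quasi-Newton approximation and solve the QP subproblems only approximately.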

  13. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  14. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-02-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  15. OceanSITES format and Ocean Observatory Output harmonisation: past, present and future

    NASA Astrophysics Data System (ADS)

    Pagnani, Maureen; Galbraith, Nan; Diggs, Stephen; Lankhorst, Matthias; Hidas, Marton; Lampitt, Richard

    2015-04-01

    The Global Ocean Observing System (GOOS) initiative was launched in 1991, and was the first step in creating a global view of ocean observations. In 1999 oceanographers at the OceanObs conference envisioned a 'global system of eulerian observatories' which evolved into the OceanSITES project. OceanSITES has been generously supported by individual oceanographic institutes and agencies across the globe, as well as by the WMO-IOC Joint Technical Commission for Oceanography and Marine Meteorology (under JCOMMOPS). The project is directed by the needs of research scientists, but has a strong data management component, with an international team developing content standards, metadata specifications, and NetCDF templates for many types of in situ oceanographic data. The OceanSITES NetCDF format specification is intended as a robust data exchange and archive format specifically for time-series observatory data from the deep ocean. First released in February 2006, it has evolved to build on and extend internationally recognised standards such as the Climate and Forecast (CF) standard, BODC vocabularies, ISO formats and vocabularies, and in version 1.3, released in 2014, ACDD (Attribute Convention for Dataset Discovery). The success of the OceanSITES format has inspired other observational groups, such as autonomous vehicles and ships of opportunity, to also use the format and today it is fulfilling the original concept of providing a coherent set of data from eulerian observatories. Data in the OceanSITES format is served by two Global Data Assembly Centres (GDACs), one at Coriolis, in France, at ftp://ftp.ifremer.fr/ifremer/oceansites/ and one at the US NDBC, at ftp://data.ndbc.noaa.gov/data/oceansites/. These two centres serve over 26,800 OceanSITES format data files from 93 moorings. The use of standardised and controlled features enables the files held at the OceanSITES GDACs to be electronically discoverable and ensures the widest access to the data. The Ocean
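
    The standardised naming described above is part of what makes GDAC holdings machine-discoverable. As an illustrative sketch, OceanSITES data files follow a convention of the form OS_<PlatformCode>_<DeploymentCode>_<DataMode>_<PARTX>.nc (per our reading of the OceanSITES data format reference manual; field meanings and edge cases should be checked against the manual, and the example file name below is hypothetical):

    ```python
    # Sketch: parse an OceanSITES-style file name,
    # OS_<PlatformCode>_<DeploymentCode>_<DataMode>_<PARTX>.nc
    # Data-mode codes assumed from the format manual: R/P/D/M.

    DATA_MODES = {"R": "real-time", "P": "provisional",
                  "D": "delayed-mode", "M": "mixed"}

    def parse_oceansites_name(filename):
        stem = filename.removesuffix(".nc")
        prefix, platform, deployment, mode, part = stem.split("_", 4)
        if prefix != "OS":
            raise ValueError("not an OceanSITES file name: " + filename)
        return {"platform": platform, "deployment": deployment,
                "data_mode": DATA_MODES.get(mode, "unknown"), "part": part}

    # Hypothetical example file name:
    info = parse_oceansites_name("OS_PAPA_2010_D_CTD.nc")
    print(info["platform"], info["data_mode"])  # -> PAPA delayed-mode
    ```

    A harvester walking a GDAC FTP tree could use such a parser to index files by platform and data mode before opening any NetCDF content.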

  16. OceanSITES format and Ocean Observatory Output harmonisation: past, present and future

    NASA Astrophysics Data System (ADS)

    Pagnani, Maureen; Galbraith, Nan; Diggs, Stephen; Lankhorst, Matthias; Hidas, Marton; Lampitt, Richard

    2015-04-01

    The Global Ocean Observing System (GOOS) initiative was launched in 1991, and was the first step in creating a global view of ocean observations. In 1999 oceanographers at the OceanObs conference envisioned a 'global system of eulerian observatories' which evolved into the OceanSITES project. OceanSITES has been generously supported by individual oceanographic institutes and agencies across the globe, as well as by the WMO-IOC Joint Technical Commission for Oceanography and Marine Meteorology (under JCOMMOPS). The project is directed by the needs of research scientists, but has a strong data management component, with an international team developing content standards, metadata specifications, and NetCDF templates for many types of in situ oceanographic data. The OceanSITES NetCDF format specification is intended as a robust data exchange and archive format specifically for time-series observatory data from the deep ocean. First released in February 2006, it has evolved to build on and extend internationally recognised standards such as the Climate and Forecast (CF) standard, BODC vocabularies, ISO formats and vocabularies, and in version 1.3, released in 2014, ACDD (Attribute Convention for Dataset Discovery). The success of the OceanSITES format has inspired other observational groups, such as autonomous vehicles and ships of opportunity, to also use the format and today it is fulfilling the original concept of providing a coherent set of data from eulerian observatories. Data in the OceanSITES format is served by two Global Data Assembly Centres (GDACs), one at Coriolis, in France, at ftp://ftp.ifremer.fr/ifremer/oceansites/ and one at the US NDBC, at ftp://data.ndbc.noaa.gov/data/oceansites/. These two centres serve over 26,800 OceanSITES format data files from 93 moorings. The use of standardised and controlled features enables the files held at the OceanSITES GDACs to be electronically discoverable and ensures the widest access to the data. The Ocean

  17. Effects of subsurface heterogeneity on large-scale hydrological predictions

    NASA Astrophysics Data System (ADS)

    Hartmann, Andreas; Gleeson, Tom; Wagener, Thorsten; Wada, Yoshihide

    2015-04-01

    Heterogeneity is abundant everywhere across the hydrosphere. It exists in the soil, the vadose zone and the groundwater, producing preferential flow and complex threshold behavior. In large-scale hydrological models, subsurface heterogeneity is usually not considered. Instead, average or representative values are chosen for each of the simulated grid cells, not incorporating any sub-grid variability. This may lead to unreliable predictions when the models are used for assessing future water resources availability, floods or droughts, or when they are used for recommendations for more sustainable water management. In this study we use a novel, large-scale model that takes into account sub-grid heterogeneity for the simulation of groundwater recharge by using statistical distribution functions. We choose all regions over Europe that are composed of carbonate rock (~35% of the total area) because the well-understood dissolvability of carbonate rocks (karstification) allows for assessing the strength of subsurface heterogeneity. Applying the model with historic data and future climate projections, we show that subsurface heterogeneity results in (1) larger present-day groundwater recharge and (2) a greater vulnerability to climate in terms of long-term decrease and hydrological extremes.
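
    The finding that sub-grid heterogeneity increases simulated recharge can be illustrated with a toy Monte Carlo sketch (not the paper's model; all parameter values below are hypothetical): because recharge is a nonlinear, thresholded function of storage capacity, averaging the parameter first gives a different answer than averaging the response over a sub-grid distribution.

    ```python
    import math
    import random

    random.seed(42)

    # Toy recharge rule: water exceeding a sub-cell's storage capacity recharges.
    precip = 50.0          # mm per event, uniform over the grid cell (hypothetical)
    mean_capacity = 60.0   # mm, grid-cell mean storage capacity (hypothetical)

    def recharge(capacity):
        return max(0.0, precip - capacity)

    # Homogeneous cell: one representative capacity -> no recharge at all.
    homogeneous = recharge(mean_capacity)

    # Heterogeneous cell: capacities drawn from a skewed sub-grid distribution.
    sigma = 0.8
    mu = math.log(mean_capacity) - sigma ** 2 / 2  # keeps the lognormal mean at 60
    samples = [random.lognormvariate(mu, sigma) for _ in range(10000)]
    heterogeneous = sum(recharge(c) for c in samples) / len(samples)

    print(homogeneous, round(heterogeneous, 1))  # heterogeneity yields recharge > 0
    ```

    Both cells have the same mean capacity, yet only the heterogeneous one produces recharge, because some sub-cells fill and spill even when the average does not (Jensen's inequality for the convex max(0, ·) response).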

  18. Reconstruction of the solar coronal magnetic field, from active region to large scale

    NASA Astrophysics Data System (ADS)

    Amari, T.; Canou, A.; Delyon, F.; Aly, J. J.; Frey, P.; Alauzet, F.

    2011-12-01

    The low solar corona is dominated by the magnetic field, which is created inside the Sun by a dynamo process and then emerges into the atmosphere. This magnetic field plays an important role in most structures and phenomena observed at various wavelengths, such as prominences, small and large scale eruptive events, and continuous heating of the plasma, and therefore it is important to understand its three-dimensional properties in order to elaborate efficient theoretical models. Unfortunately, the magnetic field is difficult to measure locally in the hot and tenuous corona. But this can be done at the level of the cooler and denser photosphere, and several instruments with high resolution vector magnetographs are currently available (THEMIS, the Imaging Vector Magnetograph (IVM), the Advanced Stokes Polarimeter (ASP), SOLIS, HINODE, the Solar Dynamics Observatory (SDO)), or will shortly be provided by future telescopes such as EST and solar missions such as SOLAR-ORBITER. This has led solar physicists to develop an approach which consists in "reconstructing" the coronal magnetic field from boundary data given on the photosphere. We will discuss some of the issues encountered in solving this problem, as well as our recent progress and results at active-region scale and at larger scales, up to that of the full Sun.

  19. A large-scale dataset of solar event reports from automated feature recognition modules

    NASA Astrophysics Data System (ADS)

    Schuh, Michael A.; Angryk, Rafal A.; Martens, Petrus C.

    2016-05-01

    The massive repository of images of the Sun captured by the Solar Dynamics Observatory (SDO) mission has ushered in the era of Big Data for Solar Physics. In this work, we investigate the entire public collection of events reported to the Heliophysics Event Knowledgebase (HEK) from automated solar feature recognition modules operated by the SDO Feature Finding Team (FFT). With the SDO mission recently surpassing five years of operations, and over 280,000 event reports for seven types of solar phenomena, we present the broadest and most comprehensive large-scale dataset of the SDO FFT modules to date. We also present numerous statistics on these modules, providing valuable contextual information for better understanding and validation of the individual event reports and of the dataset as a whole. After extensive data cleaning through exploratory data analysis, we highlight several opportunities for knowledge discovery from data (KDD). Through the important prerequisite analyses presented here, the results of KDD from Solar Big Data will be more reliable and better understood. As the SDO mission remains operational over the coming years, these datasets will continue to grow in size and value. Future versions of this dataset will be analyzed in the general framework established in this work and maintained publicly online for easy access by the community.

  20. Industrial Large Scale Applications of Superconductivity -- Current and Future Trends

    NASA Astrophysics Data System (ADS)

    Amm, Kathleen

    2011-03-01

    Since the initial development of NbTi and Nb3Sn superconducting wires in the early 1960s, superconductivity has developed a broad range of industrial applications in research, medicine and energy. Superconductivity has been used extensively in low field and high field NMR spectrometers and in MRI systems, and has been demonstrated in many power applications, including power cables, transformers, fault current limiters, and motors and generators. To date, the most commercially successful application of superconductivity has been the high field magnets required for magnetic resonance imaging (MRI), with a global market well in excess of $4 billion, excluding the service industry. The unique ability of superconductors to carry large currents with no losses enabled high field MRI and its unique clinical capabilities in imaging soft tissue. High field MRI with superconducting magnets was adopted rapidly because superconductivity was a key enabler of high field magnets, with their high field uniformity and image quality. With over 30 years of development of MRI systems and applications, MRI has become a robust clinical tool that is ever expanding into new and developing markets, and continued innovation in system design is addressing these market needs. One of the key questions that innovators in industrial superconducting magnet design must consider today is: what application of superconductivity may lead to a market on the scale of MRI? What are the key considerations for where superconductivity can provide a unique solution, as it did in the case of MRI? Many companies in the superconducting industry today are investigating possible technologies that may become the next large market, like MRI.

  1. Future Large-Scale Projects and Programmes in Astronomy and Astrophysics

    NASA Astrophysics Data System (ADS)

    Corbett, I.

    2004-03-01

    This workshop was proposed by Germany, which invited ESO to act as host, and took place on December 1-3 at the Deutsches Museum (December 1) and at the Ludwig-Maximilians-Universität (December 2-3). It was attended by government-appointed delegates from fifteen Global Science Forum Member countries and Observers, three non-OECD countries, representatives of ESO, the President of the International Astronomical Union, invited speakers, and the OECD secretariat, and was chaired by Ian Corbett of ESO.

  2. The TMT International Observatory: A quick overview of future opportunities for planetary science exploration

    NASA Astrophysics Data System (ADS)

    Dumas, Christophe; Dawson, Sandra; Otarola, Angel; Skidmore, Warren; Squires, Gordon; Travouillon, Tony; Greathouse, Thomas K.; Li, Jian-Yang; Lu, Junjun; Marchis, Frank; Meech, Karen J.; Wong, Michael H.

    2015-11-01

    The construction of the Thirty-Meter-Telescope International Observatory (TIO) is scheduled to take about eight years, with first light currently planned for 2023/24 and the start of science operations soon after. Its innovative design, the unequalled astronomical quality of its location, and the scientific capabilities that will be offered by its suite of instruments all contribute to position TIO as a major ground-based facility of the next decade. In this talk, we will review the expected observing performances of the facility, which will combine adaptive-optics-corrected wavefronts with powerful imaging and spectroscopic capabilities. TMT will enable ground-based exploration of our solar system - and planetary systems at large - at dramatically enhanced sensitivity and spatial resolution across the visible and near-/thermal-infrared regimes. This sharpened vision, spanning the study of planetary atmospheres, ring systems, (cryo-)volcanic activity, small-body populations (asteroids, comets, trans-Neptunian objects), and exoplanets, will shed new light on the processes involved in the formation and evolution of our solar system, including the search for life outside the Earth, and will expand our understanding of the physical and chemical properties of extra-solar planets, complementing TIO's direct studies of planetary systems around other stars. TIO operations will meet a wide range of observing needs. Observing support associated with "classical" and "queue" modes will be offered (including some flavors of remote observing). The TIO schedule will integrate observing programs so as to optimize scientific output and take into account the stringent observing-time constraints often encountered for observations of our solar system, such as the scheduling of target-of-opportunity observations, the implementation of short observing runs, or the support of long-term "key science" programmes. Complementary information about TIO, and the

  3. Gravity and large-scale nonlocal bias

    NASA Astrophysics Data System (ADS)

    Chan, Kwan Chuen; Scoccimarro, Román; Sheth, Ravi K.

    2012-04-01

    For Gaussian primordial fluctuations, the relationship between galaxy and matter overdensities (bias) is most often assumed to be local at the time of observation in the large-scale limit. This hypothesis is, however, unstable under time evolution, as we prove under several (increasingly realistic) sets of assumptions. In the simplest toy model, galaxies are created locally and linearly biased at a single formation time, and subsequently move with the dark matter (no velocity bias), conserving their comoving number density (no merging). We show that, after this formation time, the bias becomes unavoidably nonlocal and nonlinear at large scales. We identify the nonlocal gravitationally induced fields in which the galaxy overdensity can be expanded, showing that they can be constructed out of the invariants of the deformation tensor (Galileons), the main signature of which is a quadrupole field in second-order perturbation theory. In addition, we show that this result persists if we include an arbitrary evolution of the comoving number density of tracers. We then include velocity bias and show that new contributions appear; these are related to the breaking of Galilean invariance of the bias relation, a dipole field being the signature at second order. We test these predictions by studying the dependence of halo overdensities in cells of fixed dark matter density: measurements in simulations show that departures from the mean bias relation are strongly correlated with the nonlocal gravitationally induced fields identified by our formalism, suggesting that the halo distribution at the present time is indeed more closely related to the mass distribution at an earlier rather than the present time. However, the nonlocality seen in the simulations is not fully captured by assuming local bias in Lagrangian space. The effects of nonlocal bias seen in the simulations are most important for the most biased halos, as expected from our predictions. Accounting for these

  4. Large-Scale Statistics for Cu Electromigration

    NASA Astrophysics Data System (ADS)

    Hauschildt, M.; Gall, M.; Hernandez, R.

    2009-06-01

    Even after the successful introduction of Cu-based metallization, the electromigration failure risk has remained one of the important reliability concerns for advanced process technologies. The observation of strong bimodality for the electron up-flow direction in dual-inlaid Cu interconnects has added complexity, but is now widely accepted. The failure voids can occur either within the via ("early" mode) or within the trench ("late" mode). More recently, bimodality has been reported also in down-flow electromigration, leading to very short lifetimes due to small, slit-shaped voids under vias. For a more thorough investigation of these early failure phenomena, specific test structures were designed based on the Wheatstone Bridge technique. The use of these structures enabled an increase of the tested sample size to close to 675,000, allowing a direct analysis of electromigration failure mechanisms in the single-digit ppm regime. Results indicate that down-flow electromigration exhibits bimodality at very small percentage levels, not readily identifiable with standard testing methods. The activation energy for the down-flow early failure mechanism was determined to be 0.83±0.02 eV. Within the small error bounds of this large-scale statistical experiment, this value is deemed to be significantly lower than the usually reported activation energy of 0.90 eV for electromigration-induced diffusion along Cu/SiCN interfaces. Due to the advantages of the Wheatstone Bridge technique, we were also able to extend the experimental temperature range down to 150 °C, quite close to typical operating conditions of up to 125 °C. As a result of the lowered activation energy, we conclude that the down-flow early failure mode may control the chip lifetime at operating conditions. The slit-like character of the early failure void morphology also raises concerns about the validity of the Blech effect for this mechanism. A very small amount of Cu depletion may cause failure even before a
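The significance of the lowered activation energy can be illustrated with the Arrhenius factor of Black's equation. Only the 0.83 eV / 0.90 eV values and the 125 °C operating limit come from the abstract; the stress temperature below is an assumed placeholder, and the current-density term is omitted:

```python
import math

# Hedged sketch: extrapolating lifetime from an assumed accelerated-test
# temperature down to operating conditions using only the Arrhenius factor
# of Black's equation, for the two activation energies discussed above.
K_B = 8.617e-5             # Boltzmann constant, eV/K
T_STRESS = 300.0 + 273.15  # assumed accelerated-test temperature, K
T_OP = 125.0 + 273.15      # operating temperature from the abstract, K

def lifetime_scaling(ea_ev):
    """Ratio of median lifetime at T_OP to that at T_STRESS."""
    return math.exp(ea_ev / K_B * (1.0 / T_OP - 1.0 / T_STRESS))

af_083 = lifetime_scaling(0.83)  # early failure mode measured in this work
af_090 = lifetime_scaling(0.90)  # commonly reported Cu/SiCN interface value
```

With these temperatures the 0.83 eV mode extrapolates to roughly half the lifetime that a 0.90 eV assumption would predict (af_090/af_083 ≈ 1.9), which is why the early mode can end up controlling chip lifetime at use conditions.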

  5. Solving large scale structure in ten easy steps with COLA

    SciTech Connect

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J. E-mail: matiasz@ias.edu

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
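The key idea — integrating only the residual relative to an analytically known approximate trajectory, so that few timesteps suffice — can be shown on a one-dimensional toy problem. This is a sketch of the principle, not the actual COLA algorithm: here the linear part of a model equation plays the role of LPT:

```python
import math

# Toy sketch of the COLA idea: for the model problem x'' = x + EPS*x**2 the
# linear part ("LPT" analogue) is solved analytically; a leapfrog integrator
# then evolves only the residual r = x - x_lin with a handful of timesteps.
EPS, X0, V0, T, NSTEP = 0.01, 0.1, 0.1, 3.0, 10

def x_lin(t):
    """Exact solution of the linear problem x'' = x."""
    return X0 * math.cosh(t) + V0 * math.sinh(t)

def leapfrog(accel, x, v, dt, nstep):
    """Kick-drift-kick leapfrog for x'' = accel(t, x)."""
    t = 0.0
    for _ in range(nstep):
        v += 0.5 * dt * accel(t, x)
        x += dt * v
        t += dt
        v += 0.5 * dt * accel(t, x)
    return x

full = lambda t, x: x + EPS * x * x       # full (nonlinear) force

# direct coarse-step integration of the full problem
x_direct = leapfrog(full, X0, V0, T / NSTEP, NSTEP)

# COLA-style: same coarse steps, but only for the residual, whose force is
# the full force minus the force already accounted for by the linear solution
def accel_res(t, r):
    return full(t, x_lin(t) + r) - x_lin(t)

x_cola = x_lin(T) + leapfrog(accel_res, 0.0, 0.0, T / NSTEP, NSTEP)

# fine-step reference solution
x_ref = leapfrog(full, X0, V0, T / 20000, 20000)
```

Because the fast linear growth is handled analytically, the coarse-step COLA-style result lands closer to the fine-step reference than the coarse-step direct integration does — the same trade that lets the real COLA code use only 10 timesteps.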

  6. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  7. LARGE-SCALE CO2 TRANSPORTATION AND DEEP OCEAN SEQUESTRATION

    SciTech Connect

    Hamid Sarv

    1999-03-01

    Technical and economical feasibility of large-scale CO2 transportation and ocean sequestration at depths of 3000 meters or greater was investigated. Two options were examined for transporting and disposing of the captured CO2. In one case, CO2 was pumped from a land-based collection center through long pipelines laid on the ocean floor. Another case considered oceanic tanker transport of liquid carbon dioxide to an offshore floating structure for vertical injection to the ocean floor. In the latter case, a novel concept based on subsurface towing of a 3000-meter pipe and attaching it to the offshore structure was considered. Budgetary cost estimates indicate that for distances greater than 400 km, tanker transportation and offshore injection through a 3000-meter vertical pipe provide the best method for delivering liquid CO2 to deep ocean floor depressions. For shorter distances, CO2 delivery by parallel-laid subsea pipelines is more cost-effective. Estimated costs for 500-km transport and storage at a depth of 3000 meters by subsea pipelines and tankers were 1.5 and 1.4 dollars per ton of stored CO2, respectively. At these prices, the economics of ocean disposal are highly favorable. Future work should focus on addressing technical issues that are critical to the deployment of a large-scale CO2 transportation and disposal system. Pipe corrosion, structural design of the transport pipe, and dispersion characteristics of sinking CO2 effluent plumes have been identified as areas that require further attention. Our planned activities in the next Phase include laboratory-scale corrosion testing, structural analysis of the pipeline, analytical and experimental simulations of CO2 discharge and dispersion, and a conceptual economic and engineering evaluation of large-scale implementation.

  8. Large Scale Computer Simulation of Erythrocyte Membranes

    NASA Astrophysics Data System (ADS)

    Harvey, Cameron; Revalee, Joel; Laradji, Mohamed

    2007-11-01

    The cell membrane is crucial to the life of the cell. Apart from partitioning the inner and outer environments of the cell, it also acts as a support for complex and specialized molecular machinery, important both for the mechanical integrity of the cell and for its multitude of physiological functions. Due to its relative simplicity, the red blood cell has been a favorite experimental prototype for investigations of the structural and functional properties of the cell membrane. The erythrocyte membrane is a composite quasi-two-dimensional structure composed essentially of a self-assembled fluid lipid bilayer and a polymerized protein meshwork, referred to as the cytoskeleton or membrane skeleton. In the case of the erythrocyte, the polymer meshwork is mainly composed of spectrin, anchored to the bilayer through specialized proteins. Using a coarse-grained model of self-assembled lipid membranes, recently developed by us, with implicit solvent and soft-core potentials, we simulated large-scale red-blood-cell bilayers with dimensions of ~10^-1 μm^2, with an explicit cytoskeleton. Our aim is to investigate the renormalization of the elastic properties of the bilayer due to the underlying spectrin meshwork.

  9. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers which was indicative of significant oxidation. Footprints of downwind dissemination of the fire released fibers were measured to 19.1 km from the fire.

  10. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large-scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security for the local populations.
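The accounting behind such "people fed" estimates is simple to sketch. All input values below are assumed placeholders for illustration, not the authors' data:

```python
# Back-of-the-envelope version of the accounting described above: people fed
# equals the calories the acquired land can produce divided by a per-capita
# calorie requirement. Every number here is an assumed placeholder.
AREA_HA = 40e6                   # assumed total acquired cropland, hectares
YIELD_KCAL_PER_HA = 10e6         # assumed attainable calorie yield, kcal/ha/yr
KCAL_PER_PERSON_YR = 2400 * 365  # assumed dietary requirement per person

people_fed = AREA_HA * YIELD_KCAL_PER_HA / KCAL_PER_PERSON_YR
```

With these placeholders the estimate comes out near half a billion people — the same order of magnitude as the 300-550 million figure quoted in the abstract, though the real analysis is deal-by-deal and crop-specific.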

  11. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, the modeling of structural dynamic response, and the design of innovative structural solutions. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  12. Large-scale wind turbine structures

    NASA Astrophysics Data System (ADS)

    Spera, David A.

    1988-05-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, the modeling of structural dynamic response, and the design of innovative structural solutions. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  13. Future observations of planetary rings from groundbased observatories and earth-orbiting satellites

    NASA Technical Reports Server (NTRS)

    Smith, B. A.

    1984-01-01

    Three of the outer planets of our solar system are known to possess ring systems and, among the three known systems, all have one or more components that are detectable from the vicinity of the earth. Among the more promising methods for continuing study are direct imaging and occultations, obtainable both from the ground and from earth-orbiting facilities such as Space Telescope. Relevant future observations made from the vicinity of earth are expected to provide new knowledge of the dynamical and photometric properties of these outer solar system planetary rings.

  14. Terminology of Large-Scale Waves in the Solar Atmosphere

    NASA Astrophysics Data System (ADS)

    Vršnak, Bojan

    2005-03-01

    This is the fourth in a series of essays on terms used in solar-terrestrial physics that are thought to be in need of clarification. Terms are identified and essays are commissioned by a committee chartered by Division II (Sun and Heliosphere) of the International Astronomical Union. Terminology Committee members include Ed Cliver (chair), Jean-Louis Bougeret, Hilary Cane, Takeo Kosugi, Sara Martin, Rainer Schwenn, and Lidia van Driel-Gesztelyi. Authors are asked to review the origins of terms and their current usage/misusage. The goals are to inform the community and to open a discussion. The following article by Bojan Vršnak focuses on terms used to describe large-scale waves in the solar atmosphere, an area of research that has been given great impetus by the images of waves from the Extreme ultraviolet Imaging Telescope (EIT) on board the Solar and Heliospheric Observatory (SOHO). The committee welcomes suggestions for other terms to address in this forum.

  15. Investigation of Coronal Large Scale Structures Utilizing Spartan 201 Data

    NASA Technical Reports Server (NTRS)

    Guhathakurta, Madhulika

    1998-01-01

    Spartan 201, a small satellite carrying two telescopes, was launched from the Space Shuttle on April 8, 1993, September 8, 1994, September 7, 1995 and November 20, 1997. The main objective of the mission was to answer some of the most fundamental unanswered questions of solar physics: what accelerates the solar wind and what heats the corona? The two telescopes are (1) the Ultraviolet Coronal Spectrometer (UVCS), provided by the Smithsonian Astrophysical Observatory, which uses ultraviolet emissions from neutral hydrogen and ions in the corona to determine velocities of the coronal plasma within the solar wind source region, and the temperature and density distributions of protons, and (2) the White Light Coronagraph (WLC), provided by NASA's Goddard Space Flight Center, which measures visible light to determine the density distribution of coronal electrons within the same region. The PI has had the primary responsibility for the development and application of computer codes necessary for scientific data analysis activities and for instrument calibration of the white-light coronagraph for the entire Spartan mission. The PI was responsible for the science output from the WLC instrument. The PI has also been involved in the investigation of coronal density distributions in large-scale structures by use of numerical models which are (mathematically) sufficient to reproduce the details of the observed brightness and polarized-brightness distributions found in Spartan 201 data.

  16. An informal paper on large-scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Ho, Y. C.

    1975-01-01

    Large scale systems are defined as systems requiring more than one decision maker to control the system. Decentralized control and decomposition are discussed for large scale dynamic systems. Information and many-person decision problems are analyzed.

  17. Long-lived space observatories for astronomy and astrophysics

    NASA Technical Reports Server (NTRS)

    Savage, Blair D.; Becklin, Eric E.; Beckwith, Steven V. W.; Cowie, Lennox L.; Dupree, Andrea K.; Elliot, James L.; Gallagher, John S.; Helfand, David J.; Jenkins, Edward F.; Johnston, Kenneth J.

    1987-01-01

    NASA's plan to build and launch a fleet of long-lived space observatories that include the Hubble Space Telescope (HST), the Gamma Ray Observatory (GRO), the Advanced X Ray Astrophysics Observatory (AXAF), and the Space Infrared Telescope Facility (SIRTF) are discussed. These facilities are expected to have a profound impact on the sciences of astronomy and astrophysics. The long-lived observatories will provide new insights about astronomical and astrophysical problems that range from the presence of planets orbiting nearby stars to the large-scale distribution and evolution of matter in the universe. An important concern to NASA and the scientific community is the operation and maintenance cost of the four observatories described above. The HST cost about $1.3 billion (1984 dollars) to build and is estimated to require $160 million (1986 dollars) a year to operate and maintain. If HST is operated for 20 years, the accumulated costs will be considerably more than those required for its construction. Therefore, it is essential to plan carefully for observatory operations and maintenance before a long-lived facility is constructed. The primary goal of this report is to help NASA develop guidelines for the operations and management of these future observatories so as to achieve the best possible scientific results for the resources available. Eight recommendations are given.

  18. UAV Data Processing for Large Scale Topographical Mapping

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2014-06-01

    data acquisition in the future, in which it can support the national large-scale topographical mapping program up to the 1:1.000 map scale.

  19. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing code and, equally important, on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations.
The hybrid automatic differentiation method was applied to a first
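The direct and adjoint approaches summarized above can be sketched on a toy steady-state problem A(p)u = b with objective J = cᵀu: one adjoint solve Aᵀλ = c yields dJ/dp = −λᵀ(∂A/∂p)u, regardless of how many parameters p there are. A minimal illustration on a hypothetical 2×2 system (not from the report), checked against a finite-difference estimate:

```python
# Adjoint sensitivity for a steady-state system A(p) u = b with J = c^T u.
# dJ/dp = -lambda^T (dA/dp) u, where A^T lambda = c (adjoint equation).
# Hypothetical 2x2 example; pure Python, no external dependencies.

def solve2(A, b):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
    return [(b[0]*A[1][1] - b[1]*A[0][1]) / det,
            (A[0][0]*b[1] - A[1][0]*b[0]) / det]

def transpose(A):
    return [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]

def A_of_p(p):
    return [[2.0 + p, 1.0], [1.0, 3.0]]

def dA_dp():
    return [[1.0, 0.0], [0.0, 0.0]]

b = [1.0, 2.0]
c = [1.0, 1.0]
p = 0.5

u = solve2(A_of_p(p), b)                # state solve
lam = solve2(transpose(A_of_p(p)), c)   # adjoint solve
dA = dA_dp()
dJdp_adj = -sum(lam[i] * sum(dA[i][j] * u[j] for j in range(2))
                for i in range(2))

# Verify against a central finite-difference estimate of dJ/dp.
eps = 1e-6
J = lambda pp: sum(ci*ui for ci, ui in zip(c, solve2(A_of_p(pp), b)))
dJdp_fd = (J(p + eps) - J(p - eps)) / (2 * eps)
assert abs(dJdp_adj - dJdp_fd) < 1e-6
print(dJdp_adj)
```

For many parameters the adjoint route costs just this single extra solve, whereas the direct route needs one solve per parameter, which is why adjoints dominate in large-scale optimization.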

  20. Simulating the large-scale structure of HI intensity maps

    NASA Astrophysics Data System (ADS)

    Seehars, Sebastian; Paranjape, Aseem; Witzemann, Amadeus; Refregier, Alexandre; Amara, Adam; Akeret, Joel

    2016-03-01

    Intensity mapping of neutral hydrogen (HI) is a promising observational probe of cosmology and large-scale structure. We present wide-field simulations of HI intensity maps based on N-body simulations of a 2.6 Gpc/h box with 2048³ particles (particle mass 1.6 × 10¹¹ M⊙/h). Using a conditional mass function to populate the simulated dark matter density field with halos below the mass resolution of the simulation (10⁸ M⊙/h < M_halo < 10¹³ M⊙/h), we assign HI to those halos according to a phenomenological halo-to-HI mass relation. The simulations span a redshift range of 0.35 ≲ z ≲ 0.9 in redshift bins of width Δz ≈ 0.05 and cover a quarter of the sky at an angular resolution of about 7'. We use the simulated intensity maps to study the impact of non-linear effects and redshift space distortions on the angular clustering of HI. Focusing on the autocorrelations of the maps, we apply and compare several estimators for the angular power spectrum and its covariance. We verify that these estimators agree with analytic predictions on large scales and study the validity of approximations based on Gaussian random fields, particularly in the context of the covariance. We discuss how our results and the simulated maps can be useful for planning and interpreting future HI intensity mapping surveys.
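The Gaussian-random-field approximation for the power spectrum covariance that the authors test is commonly written, for a partial-sky survey, as Var(Ĉ_ℓ) ≈ 2C_ℓ² / ((2ℓ+1) f_sky). A small sketch, where f_sky = 1/4 matches the quarter-sky coverage above but the C_ℓ values are illustrative, not the simulation's:

```python
# Gaussian-field approximation for the sample variance of an angular
# power spectrum estimator: Var(C_l) ~ 2 C_l^2 / ((2l + 1) f_sky).
# Illustrative C_l values only; not the simulated HI spectra.

def gaussian_cl_variance(cl, ell, f_sky):
    """Variance of a measured C_l under the Gaussian approximation."""
    return 2.0 * cl**2 / ((2 * ell + 1) * f_sky)

f_sky = 0.25  # quarter-sky coverage, as in the simulated maps
for ell, cl in [(10, 1.0e-6), (100, 2.0e-7), (1000, 5.0e-9)]:
    sigma = gaussian_cl_variance(cl, ell, f_sky) ** 0.5
    print(ell, sigma / cl)  # fractional error shrinks as ell grows
```

Comparing this diagonal approximation against the estimator covariance measured across many simulated maps is exactly the kind of validity check the abstract describes.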

  1. Large-scale mapping of mutations affecting zebrafish development

    PubMed Central

    Geisler, Robert; Rauch, Gerd-Jörg; Geiger-Rudolph, Silke; Albrecht, Andrea; van Bebber, Frauke; Berger, Andrea; Busch-Nentwich, Elisabeth; Dahm, Ralf; Dekens, Marcus PS; Dooley, Christopher; Elli, Alexandra F; Gehring, Ines; Geiger, Horst; Geisler, Maria; Glaser, Stefanie; Holley, Scott; Huber, Matthias; Kerr, Andy; Kirn, Anette; Knirsch, Martina; Konantz, Martina; Küchler, Axel M; Maderspacher, Florian; Neuhauss, Stephan C; Nicolson, Teresa; Ober, Elke A; Praeg, Elke; Ray, Russell; Rentzsch, Brit; Rick, Jens M; Rief, Eva; Schauerte, Heike E; Schepp, Carsten P; Schönberger, Ulrike; Schonthaler, Helia B; Seiler, Christoph; Sidi, Samuel; Söllner, Christian; Wehner, Anja; Weiler, Christian; Nüsslein-Volhard, Christiane

    2007-01-01

    Background Large-scale mutagenesis screens in the zebrafish employing the mutagen ENU have isolated several hundred mutant loci that represent putative developmental control genes. In order to realize the potential of such screens, systematic genetic mapping of the mutations is necessary. Here we report on a large-scale effort to map the mutations generated in mutagenesis screening at the Max Planck Institute for Developmental Biology by genome scanning with microsatellite markers. Results We have selected a set of microsatellite markers and developed methods and scoring criteria suitable for efficient, high-throughput genome scanning. We have used these methods to successfully obtain a rough map position for 319 mutant loci from the Tübingen I mutagenesis screen and subsequent screening of the mutant collection. For 277 of these the corresponding gene is not yet identified. Mapping was successful for 80% of the tested loci. By comparing 21 mutation and gene positions of cloned mutations we have validated the correctness of our linkage group assignments and estimated the standard error of our map positions to be approximately 6 cM. Conclusion By obtaining rough map positions for over 300 zebrafish loci with developmental phenotypes, we have generated a dataset that will be useful not only for cloning of the affected genes, but also for suggesting allelism of mutations with similar phenotypes that will be identified in future screens. Furthermore, this work validates the usefulness of our methodology for rapid, systematic and inexpensive microsatellite mapping of zebrafish mutations. PMID:17212827

  2. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in the chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid, accelerating, turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, because of the increase in the number of moles of gas. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to the surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means of chamber overpressure mitigation for those tests producing the most total heat release and was thus determined to be a feasible mitigation
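The reported link between heat generation and chamber pressure rise follows from the constant-volume ideal gas law: with the gas inventory fixed, ΔP ≈ P₀ΔT/T₀. A back-of-the-envelope sketch, where every number is an assumption for illustration, not an SFSDP test condition:

```python
# Order-of-magnitude estimate of sealed-chamber pressure rise from a
# combustion heat release, treating the contents as an ideal gas heated
# at constant volume. All values below are assumed, illustrative numbers.

R = 8.314        # gas constant, J/(mol K)
V = 0.1          # chamber volume, m^3 (assumed)
T0 = 300.0       # initial gas temperature, K (assumed)
P0 = 70.0e3      # sub-atmospheric starting pressure, Pa (assumed)
Q = 5.0e3        # heat release retained in the gas, J (assumed)

n = P0 * V / (R * T0)         # moles of gas sealed in the chamber
cv = 2.5 * R                  # molar heat capacity of a diatomic gas
dT = Q / (n * cv)             # bulk gas temperature rise
P1 = n * R * (T0 + dT) / V    # constant-volume pressure after heating
print(round(P1 - P0), "Pa rise")
```

The moles-of-gas contribution the abstract calls much smaller would add a further Δn·R·T₀/V term on top of this thermal rise.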

  3. Synchronization of coupled large-scale Boolean networks

    SciTech Connect

    Li, Fangfei

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
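In the drive-response setting, complete synchronization means the two networks' state trajectories coincide after finite time. A toy illustration in this spirit (the dynamics and coupling below are made up for the sketch, not the paper's example; the coupling simply feeds the drive state into the response update):

```python
# Complete synchronization of two coupled Boolean networks, toy example.
# The response network is coupled so that it applies the drive's update
# rule to the drive's state, so the trajectories coincide after one step.

def drive_step(x):
    """Drive network update; state x = (x1, x2)."""
    x1, x2 = x
    return (x2, x1 and x2)

def response_step(y, x):
    """Response update, coupled to the drive state x."""
    return drive_step(x)

x, y = (True, False), (False, False)  # different initial states
states = []
for t in range(8):
    # Tuple assignment evaluates the right side first, so the response
    # sees the drive's *previous* state, as a synchronous update requires.
    x, y = drive_step(x), response_step(y, x)
    states.append(x == y)

assert all(states)  # synchronized from the first step onward
print(states)
```

For networks too large to simulate state-by-state like this, the aggregation algorithm reviewed in the paper is what makes the synchronization analysis tractable.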

  4. Space Active Optics: toward optimized correcting mirrors for future large spaceborne observatories

    NASA Astrophysics Data System (ADS)

    Laslandes, Marie; Hugot, Emmanuel; Ferrari, Marc; Lemaitre, Gérard; Liotard, Arnaud

    2011-10-01

    Wave-front correction in optical instruments is often needed, whether to compensate for optical path differences, off-axis aberrations, or mirror deformations. Active optics techniques are developed to allow efficient corrections with deformable mirrors. In this paper, we will present the design of dedicated deformation systems which could be used in space telescopes and instruments in order to improve their performance while relaxing specifications on the stability of the global system. A first section will be dedicated to the design and performance analysis of an active mirror specifically designed to compensate for aberrations that might appear in future 3m-class space telescopes, due to lightweight primary mirrors, thermal variations or weightless conditions. A second section will be dedicated to a brand-new design of active mirror, able to compensate for given combinations of aberrations with a single actuator. If the aberrations to be corrected in an instrument and their evolution are known in advance, an optimal system geometry can be determined thanks to elasticity theory and Finite Element Analysis.

  5. Large-scale signal detection: A unified perspective.

    PubMed

    Mukhopadhyay, Subhadeep

    2016-06-01

    There is an overwhelmingly large literature, and many algorithms are already available, on "large-scale inference problems" based on different modeling techniques and cultures. Our primary goal in this article is not to add one more methodology to the existing toolbox but instead (i) to clarify how these different simultaneous-inference methods are connected, (ii) to provide an alternative, more intuitive derivation of the formulas that leads to simpler expressions, and (iii) to develop a unified algorithm for practitioners. A detailed discussion on representation, estimation, inference, and model selection is given. Applications to a variety of real and simulated datasets show promise. We end with several future research directions. PMID:26433744
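As a concrete instance of the simultaneous-inference procedures this literature covers, here is the standard Benjamini-Hochberg step-up rule, which controls the false discovery rate at level α across many tests (the p-values below are illustrative):

```python
# Benjamini-Hochberg step-up procedure: a standard large-scale
# simultaneous-inference tool. Given m p-values, reject the hypotheses
# with the k smallest p-values, where k is the largest rank such that
# p_(k) <= alpha * k / m.

def benjamini_hochberg(pvals, alpha=0.05):
    """Return sorted indices of hypotheses rejected at FDR level alpha."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])
    k = 0
    for rank, i in enumerate(order, start=1):
        if pvals[i] <= alpha * rank / m:
            k = rank                     # step-up: keep the largest rank
    return sorted(order[:k])

pvals = [0.001, 0.008, 0.039, 0.041, 0.042,
         0.06, 0.074, 0.205, 0.212, 0.216]
print(benjamini_hochberg(pvals, alpha=0.05))
```

Note the step-up character: a p-value may be rejected even if it individually fails its threshold, as long as some larger-ranked p-value passes its own.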

  6. Large-scale coastal evolution of Louisiana's barrier islands

    USGS Publications Warehouse

    List, Jeffrey H.; Jaffe, Bruce E.; Sallenger, Asbury H., Jr.

    1991-01-01

    The prediction of large-scale coastal change is an extremely important, but distant, goal. Here we describe some of our initial efforts in this direction, using historical bathymetric information along a 150 km reach of the rapidly evolving barrier island coast of Louisiana. Preliminary results suggest that the relative sea level rise rate, though extremely high in the area, has played a secondary role in coastal erosion over the last 100 years, with longshore transport of sand-sized sediment being the primary cause. Prediction of future conditions is hampered by a general lack of understanding of erosion processes; however, an examination of the changing volumes of sand stored in a large ebb-tidal delta system suggests a continued high rate of shoreline retreat driven by the longshore redistribution of sand.

  7. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    NASA Astrophysics Data System (ADS)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  8. Planning under uncertainty solving large-scale stochastic linear programs

    SciTech Connect

    Infanger, G. (Dept. of Operations Research; Technische Univ. Vienna, Inst. fuer Energiewirtschaft)

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but up to recently seemed to be intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results of large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
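The Monte Carlo sampling side of this approach can be sketched on the smallest two-stage stochastic program with recourse, a newsvendor model: sample demand scenarios, average the second-stage profit over them, and optimize the first-stage order against that sample average. A toy sketch (model and numbers are illustrative, not the paper's test problems):

```python
# Sample-average (Monte Carlo) approximation of a two-stage stochastic
# program with recourse: a newsvendor chooses an order quantity before
# demand is known, then sells / salvages once demand is revealed.

import random

random.seed(0)
cost, price, salvage = 1.0, 2.0, 0.5  # first- and second-stage economics

def expected_profit(order, demands):
    """Average two-stage profit over the sampled demand scenarios."""
    total = 0.0
    for d in demands:
        sold = min(order, d)                      # second-stage recourse
        total += price * sold + salvage * (order - sold) - cost * order
    return total / len(demands)

demands = [random.uniform(50, 150) for _ in range(2000)]  # MC scenarios
best = max(range(50, 151), key=lambda q: expected_profit(q, demands))
print("approximately optimal order:", best)
```

The analytic optimum is the critical fractile (price − cost)/(price − salvage) = 2/3, i.e. an order near 117 for demand uniform on [50, 150]; the sample-average solution converges to it as the scenario count grows, and importance sampling of the kind described above reduces the number of scenarios needed.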

  9. Successful Physician Training Program for Large Scale EMR Implementation

    PubMed Central

    Stevens, L.A.; Mailes, E.S.; Goad, B.A.; Longhurst, C.A.

    2015-01-01

    Summary End-user training is an essential element of electronic medical record (EMR) implementation and frequently suffers from minimal institutional investment. In addition, discussion of successful EMR training programs for physicians is limited in the literature. The authors describe a successful physician-training program at Stanford Children’s Health as part of a large scale EMR implementation. Evaluations of classroom training, obtained at the conclusion of each class, revealed high physician satisfaction with the program. Free-text comments from learners focused on duration and timing of training, the learning environment, quality of the instructors, and specificity of training to their role or department. Based upon participant feedback and institutional experience, best practice recommendations, including physician engagement, curricular design, and assessment of proficiency and recognition, are suggested for future provider EMR training programs. The authors strongly recommend the creation of coursework to group providers by common workflow. PMID:25848415

  10. Climatological context for large-scale coral bleaching

    NASA Astrophysics Data System (ADS)

    Barton, A. D.; Casey, K. S.

    2005-12-01

    Large-scale coral bleaching was first observed in 1979 and has occurred throughout virtually all of the tropics since that time. Severe bleaching may result in the loss of live coral and in a decline of the integrity of the impacted coral reef ecosystem. Despite the extensive scientific research and increased public awareness of coral bleaching, uncertainties remain about the past and future of large-scale coral bleaching. In order to reduce these uncertainties and place large-scale coral bleaching in the longer-term climatological context, specific criteria and methods for using historical sea surface temperature (SST) data to examine coral bleaching-related thermal conditions are proposed by analyzing three 132-year SST reconstructions: ERSST, HadISST1, and GISST2.3b. These methodologies are applied to case studies at Discovery Bay, Jamaica (77.27°W, 18.45°N), Sombrero Reef, Florida, USA (81.11°W, 24.63°N), Academy Bay, Galápagos, Ecuador (90.31°W, 0.74°S), Pearl and Hermes Reef, Northwest Hawaiian Islands, USA (175.83°W, 27.83°N), Midway Island, Northwest Hawaiian Islands, USA (177.37°W, 28.25°N), Davies Reef, Australia (147.68°E, 18.83°S), and North Male Atoll, Maldives (73.35°E, 4.70°N). The results of this study show that (1) the historical SST data provide a useful long-term record of thermal conditions in reef ecosystems, giving important insight into the thermal history of coral reefs, and (2) while coral bleaching and anomalously warm SSTs have occurred over much of the world in recent decades, case studies in the Caribbean, Northwest Hawaiian Islands, and parts of other regions such as the Great Barrier Reef exhibited SST conditions and cumulative thermal stress prior to 1979 that were comparable to those conditions observed during the strong, frequent coral bleaching events since 1979. This climatological context and knowledge of past environmental conditions in reef ecosystems may foster a better understanding of how coral reefs will
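One common way to quantify the cumulative thermal stress discussed above is in the style of NOAA's Degree Heating Weeks: accumulate weekly SST anomalies of at least 1 °C above the climatological maximum monthly mean (MMM) over a rolling 12-week window. The abstract does not specify its metric, so this sketch, with synthetic temperatures, shows just one standard choice:

```python
# Cumulative thermal stress from a weekly SST series, in the style of
# NOAA's Degree Heating Weeks: sum anomalies >= 1 degC above the maximum
# monthly mean (MMM) over a rolling 12-week window.
# Synthetic SSTs for illustration; not data from the SST reconstructions.

def degree_heating_weeks(sst_weekly, mmm):
    """DHW (degC-weeks) at each week of the series."""
    dhw = []
    for i in range(len(sst_weekly)):
        window = sst_weekly[max(0, i - 11): i + 1]   # last 12 weeks
        dhw.append(sum(t - mmm for t in window if t - mmm >= 1.0))
    return dhw

mmm = 28.0  # assumed climatological maximum monthly mean, degC
sst = [27.5, 28.2, 29.1, 29.4, 29.6, 29.2, 28.8, 28.1, 27.9, 27.6]
print(max(degree_heating_weeks(sst, mmm)))  # peak stress, degC-weeks
```

Applying a metric like this to each 132-year reconstruction at the case-study sites is one way to compare pre-1979 and post-1979 cumulative stress on an equal footing.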

  11. Activities with Goto 45-CM Reflector at Bosscha Observatory, Lembang, Indonesia: Results and Aspects for Future Development

    NASA Astrophysics Data System (ADS)

    Malasan, H. L.

    In 1989 a 45-cm telescope of Cassegrain type was installed, tested, and commissioned at the Bosscha Observatory, Institut Teknologi Bandung. It was immediately put into use for UBV photometric observations of close binary systems. While the main function of the telescope was photometric observation, the versatile design inherent to a reflector made it possible to include a spectrograph whose spectral dispersion could match the MK spectral classification. Activities related both to education and research conducted with this reflector since its installation comprise scientific work (photometry, spectroscopy, imagery) and experiments in instrumentation (fiber-fed spectrograph, in situ testing of a CCD camera). An important side result of the photometric observations is an atmospheric study based on long-term atmospheric extinction coefficients. A multidisciplinary approach, involving meteorologists and mathematicians, to the study of natural and anthropogenic pollution of the atmosphere over Lembang has recently been undertaken. At present, however, the telescope suffers from obsolete technology in its control functions. This has hampered its full utilization, and therefore a plan has been made to upgrade and extend the capability of the telescope. The background, activities, and results, with emphasis on the collaborative work, will be presented. Aspects of future development of the telescope and its auxiliary instruments will be discussed.

  12. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear-align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes an invention in large-area, low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to the brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  13. Probing the imprint of interacting dark energy on very large scales

    NASA Astrophysics Data System (ADS)

    Duniya, Didam G. A.; Bertacca, Daniele; Maartens, Roy

    2015-03-01

    The observed galaxy power spectrum acquires relativistic corrections from light-cone effects, and these corrections grow on very large scales. Future galaxy surveys in optical, infrared and radio bands will probe increasingly large wavelength modes and reach higher redshifts. In order to exploit the new data on large scales, an accurate analysis requires inclusion of the relativistic effects. This is especially the case for primordial non-Gaussianity and for extending tests of dark energy models to horizon scales. Here we investigate the latter, focusing on models where the dark energy interacts nongravitationally with dark matter. Interaction in the dark sector can also lead to large-scale deviations in the power spectrum. If the relativistic effects are ignored, the imprint of interacting dark energy will be incorrectly identified and thus lead to a bias in constraints on interacting dark energy on very large scales.

  14. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations quickly become unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being
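The flavor of these fast algorithms can be seen in one dimension, where sorting plays the role of a space-partitioning tree: a kernel with finite support lets a KDE skip every point outside [q − h, q + h], cutting the naive O(N²) all-pairs cost. This is only a 1-D stand-in for the pruning behind multitree methods, which use kd-trees and similar structures in higher dimensions:

```python
# Naive O(N^2) kernel density estimation vs. a pruned variant that uses
# a sorted array to skip points outside the kernel's finite support --
# a 1-D stand-in for the space-partitioning pruning of multitree methods.

import bisect, random

def kde_naive(queries, data, h):
    out = []
    for q in queries:
        # Epanechnikov kernel: 0.75 * (1 - u^2) on |u| <= 1, else 0.
        s = sum(max(0.0, 1 - ((q - x) / h) ** 2) for x in data)
        out.append(0.75 * s / (len(data) * h))
    return out

def kde_pruned(queries, sorted_data, h):
    out = []
    for q in queries:
        lo = bisect.bisect_left(sorted_data, q - h)    # prune: only points
        hi = bisect.bisect_right(sorted_data, q + h)   # in support matter
        s = sum(1 - ((q - x) / h) ** 2 for x in sorted_data[lo:hi])
        out.append(0.75 * s / (len(sorted_data) * h))
    return out

random.seed(1)
data = sorted(random.gauss(0, 1) for _ in range(5000))
queries = [-1.0, 0.0, 2.5]
a, b = kde_naive(queries, data, 0.3), kde_pruned(queries, data, 0.3)
assert all(abs(x - y) < 1e-9 for x, y in zip(a, b))  # identical results
```

The pruned version touches only the points a tree traversal would fail to prune, which is exactly where the dramatic runtime reductions come from.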

  15. Probes of large-scale structure in the universe

    NASA Technical Reports Server (NTRS)

    Suto, Yasushi; Gorski, Krzysztof; Juszkiewicz, Roman; Silk, Joseph

    1988-01-01

    A general formalism is developed which shows that the gravitational instability theory for the origin of the large-scale structure of the universe is now capable of critically confronting observational results on cosmic background radiation angular anisotropies, large-scale bulk motions, and large-scale clumpiness in the galaxy counts. The results indicate that presently advocated cosmological models will have considerable difficulty in simultaneously explaining the observational results.

  16. Improving the sensitivity of future GW observatories in the 1-10 Hz band: Newtonian and seismic noise

    NASA Astrophysics Data System (ADS)

    Beker, M. G.; Cella, G.; Desalvo, R.; Doets, M.; Grote, H.; Harms, J.; Hennes, E.; Mandic, V.; Rabeling, D. S.; van den Brand, J. F. J.; van Leeuwen, C. M.

    2011-02-01

    The next generation of gravitational wave interferometric detectors will likely be underground detectors, to extend the GW detection frequency band to frequencies below the Newtonian noise limit. Newtonian noise originates from the continuous motion of the Earth's crust driven by human activity, tidal stresses and seismic motion, and from mass density fluctuations in the atmosphere. It is calculated that on the Earth's surface, on a typical day, it will exceed the expected GW signals at frequencies below 10 Hz. The noise will decrease underground by an unknown amount. It is important to investigate and quantify this expected reduction and its effect on the sensitivity of future detectors, in order to plan further improvement strategies. We report on some of these aspects. Analytical models can be used in the simplest scenarios to get a better qualitative and semi-quantitative understanding. As more complete modeling can be done numerically, we also discuss some results obtained with a finite-element-based modeling tool; the method is verified by comparing its results with those of analytic calculations for surface detectors. A key point about noise models is their initial parameters and conditions, which require detailed information about seismic motion in a real scenario. We describe an effort, currently in progress, to characterize the seismic activity at the Homestake mine. This activity is specifically aimed at providing information and exploring the site as a possible candidate for an underground observatory. Although the only compelling reason to put the interferometer underground is to reduce the Newtonian noise, we expect that the more stable underground environment will have a more general positive impact on the sensitivity. We end this report with some considerations about seismic and suspension noise.

  17. HTS cables open the window for large-scale renewables

    NASA Astrophysics Data System (ADS)

    Geschiere, A.; Willén, D.; Piga, E.; Barendregt, P.

    2008-02-01

    In a realistic approach to future energy consumption, the effects of sustainable power sources and of growing welfare with increased use of electricity need to be considered. These factors lead to an increased transfer of electric energy over the networks. A dominant part of the energy need will come from expanded large-scale renewable sources. To use them efficiently across Europe, large energy transits between different countries are required. Bottlenecks in the existing infrastructure will be avoided by strengthening the network. For environmental reasons, more infrastructure will be built underground. Nuon is studying HTS technology as a component to address these challenges. This technology offers a tremendously large power transport capacity as well as the possibility to reduce short-circuit currents, making integration of renewables easier. Furthermore, power transport will be possible at lower voltage levels, giving the opportunity to upgrade the existing network while re-using it. This will result in large cost savings while meeting future energy challenges. In a 6-km backbone structure in Amsterdam, Nuon wants to install a 50 kV HTS Triax cable for a significant increase of the transport capacity, while developing its capabilities. Nevertheless, several barriers have to be overcome.

  18. Optimal Wind Energy Integration in Large-Scale Electric Grids

    NASA Astrophysics Data System (ADS)

    Albaijat, Mohammad H.

    profit for investors for renting their transmission capacity, and cheaper electricity for end users. We propose a hybrid method based on a heuristic and deterministic method to attain new transmission line additions and increase transmission capacity. Renewable energy resources (RES) have zero operating cost, which makes them very attractive for generation companies and market participants. In addition, RES have zero carbon emission, which helps relieve concerns about the environmental impact of electric generation resources' carbon emission. RES include wind, solar, hydro, biomass, and geothermal. By 2030, the expectation is that more than 30% of electricity in the U.S. will come from RES. One major contributor to RES generation will be wind energy resources (WES). Furthermore, WES will be an important component of the future generation portfolio. However, the nature of WES is that it experiences high intermittency and volatility. Because of the great expectation of high WES penetration and the nature of such resources, researchers focus on studying the effects of such resources on electric grid operation and its adequacy from different aspects. Additionally, current market operations of electric grids add another complication to consider while integrating RES (specifically WES). Mandates by market rules and long-term analysis of renewable penetration in large-scale electric grids have also been the focus of researchers in recent years. We advocate a method for studying the effects of high wind-resource penetration on large-scale electric grid operations. A PMU (phasor measurement unit) is a geographical positioning system (GPS) based device which provides immediate and precise measurements of the voltage angle in a high-voltage transmission system. PMUs can update the status of a transmission line and related measurements (e.g., voltage magnitude and voltage phase angle) more frequently. Every second, a PMU can provide 30 samples of measurements compared to traditional systems (e.g., supervisory control and

  19. Large-scale solar magnetic fields and H-alpha patterns

    NASA Technical Reports Server (NTRS)

    Mcintosh, P. S.

    1972-01-01

    Coronal and interplanetary magnetic fields computed from measurements of large-scale photospheric magnetic fields suffer from interruptions in day-to-day observations and the limitation of using only measurements made near the solar central meridian. Procedures were devised for inferring the lines of polarity reversal from H-alpha solar patrol photographs that map the same large-scale features found on Mt. Wilson magnetograms. These features may be monitored without interruption by combining observations from the global network of observatories associated with NOAA's Space Environment Services Center. The patterns of inferred magnetic fields may be followed accurately as far as 60 deg from central meridian. Such patterns will be used to improve predictions of coronal features during the next solar eclipse.

  20. Large-Scale Pattern Discovery in Music

    NASA Astrophysics Data System (ADS)

    Bertin-Mahieux, Thierry

    This work focuses on extracting patterns in musical data from very large collections. The problem is split in two parts. First, we build such a large collection, the Million Song Dataset, to provide researchers access to commercial-size datasets. Second, we use this collection to study cover song recognition which involves finding harmonic patterns from audio features. Regarding the Million Song Dataset, we detail how we built the original collection from an online API, and how we encouraged other organizations to participate in the project. The result is the largest research dataset with heterogeneous sources of data available to music technology researchers. We demonstrate some of its potential and discuss the impact it already has on the field. On cover song recognition, we must revisit the existing literature since there are no publicly available results on a dataset of more than a few thousand entries. We present two solutions to tackle the problem, one using a hashing method, and one using a higher-level feature computed from the chromagram (dubbed the 2DFTM). We further investigate the 2DFTM since it has potential to be a relevant representation for any task involving audio harmonic content. Finally, we discuss the future of the dataset and the hope of seeing more work making use of the different sources of data that are linked in the Million Song Dataset. Regarding cover songs, we explain how this might be a first step towards defining a harmonic manifold of music, a space where harmonic similarities between songs would be more apparent.
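The appeal of the 2DFTM as a harmonic feature rests on a basic Fourier fact: the magnitude of a 2-D DFT is unchanged by circular shifts of its input, so a chromagram patch and its pitch-rotated (transposed) or time-shifted version map to the same representation. A small sketch of that invariance, using a naive DFT on a toy patch (real chromagrams are 12 pitch classes by many frames):

```python
# Shift invariance of the 2-D DFT magnitude, the property that makes a
# feature like the 2DFTM robust to transposition and time offsets.
# Naive O(n^4) DFT on a toy 4x4 patch, for illustration only.

import cmath

def dft2_mag(M):
    """Magnitude of the 2-D DFT of a small real matrix."""
    rows, cols = len(M), len(M[0])
    out = []
    for u in range(rows):
        row = []
        for v in range(cols):
            s = sum(M[i][j] * cmath.exp(-2j * cmath.pi *
                                        (u * i / rows + v * j / cols))
                    for i in range(rows) for j in range(cols))
            row.append(abs(s))
        out.append(row)
    return out

def circshift(M, di, dj):
    """Circularly shift rows by di (pitch rotation) and columns by dj."""
    rows, cols = len(M), len(M[0])
    return [[M[(i - di) % rows][(j - dj) % cols] for j in range(cols)]
            for i in range(rows)]

patch = [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9, 1, 2], [3, 4, 5, 6]]
a, b = dft2_mag(patch), dft2_mag(circshift(patch, 1, 2))
assert all(abs(x - y) < 1e-9
           for ra, rb in zip(a, b) for x, y in zip(ra, rb))
```

Phase information (where the pattern sits) is discarded and only the pattern's structure is kept, which is precisely what cover-song matching needs.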

  1. Svetloe Radio Astronomical Observatory

    NASA Technical Reports Server (NTRS)

    Smolentsev, Sergey; Rahimov, Ismail

    2013-01-01

    This report summarizes information about the Svetloe Radio Astronomical Observatory activities in 2012. Last year, a number of changes took place in the observatory to improve some technical characteristics and to upgrade some units to their required status. The report provides an overview of current geodetic VLBI activities and gives an outlook for the future.

  2. Zelenchukskaya Radio Astronomical Observatory

    NASA Technical Reports Server (NTRS)

    Smolentsev, Sergey; Dyakov, Andrei

    2013-01-01

    This report summarizes information about Zelenchukskaya Radio Astronomical Observatory activities in 2012. Last year a number of changes took place in the observatory to improve some technical characteristics and to upgrade some units to the required status. The report provides an overview of current geodetic VLBI activities and gives an outlook for the future.

  3. INTERNATIONAL WORKSHOP ON LARGE-SCALE REFORESTATION: PROCEEDINGS

    EPA Science Inventory

    The purpose of the workshop was to identify major operational and ecological considerations needed to successfully conduct large-scale reforestation projects throughout the forested regions of the world. "Large-scale" for this workshop means projects where, by human effort, approx...

  4. Using Large-Scale Assessment Scores to Determine Student Grades

    ERIC Educational Resources Information Center

    Miller, Tess

    2013-01-01

    Many Canadian provinces provide guidelines for teachers to determine students' final grades by combining a percentage of students' scores from provincial large-scale assessments with their term scores. This practice is thought to hold students accountable by motivating them to put effort into completing the large-scale assessment, thereby…

  5. The Challenge of Large-Scale Literacy Improvement

    ERIC Educational Resources Information Center

    Levin, Ben

    2010-01-01

    This paper discusses the challenge of making large-scale improvements in literacy in schools across an entire education system. Despite growing interest and rhetoric, there are very few examples of sustained, large-scale change efforts around school-age literacy. The paper reviews 2 instances of such efforts, in England and Ontario. After…

  6. Large Scale Turbulent Structures in Supersonic Jets

    NASA Technical Reports Server (NTRS)

    Rao, Ram Mohan; Lundgren, Thomas S.

    1997-01-01

    Jet noise is a major concern in the design of commercial aircraft. Studies by various researchers suggest that aerodynamic noise is a major contributor to jet noise. Some of these studies indicate that most of the aerodynamic jet noise due to turbulent mixing occurs when there is a rapid variation in turbulent structure, i.e. rapidly growing or decaying vortices. The objective of this research was to simulate a compressible round jet to study the non-linear evolution of vortices and the resulting acoustic radiations, in particular to understand the effect of turbulence structure on the noise. An ideal technique to study this problem is Direct Numerical Simulations (DNS), because it provides precise control on the initial and boundary conditions that lead to the turbulent structures studied. It also provides complete 3-dimensional time-dependent data. Since the dynamics of a temporally evolving jet are not greatly different from those of a spatially evolving jet, a temporal jet problem was solved, using periodicity in the direction of the jet axis. This enables the application of Fourier spectral methods in the streamwise direction. Physically this means that turbulent structures in the jet are repeated in successive downstream cells instead of being gradually modified downstream into a jet plume. The DNS jet simulation helps us understand the various turbulent scales and mechanisms of turbulence generation in the evolution of a compressible round jet. These accurate flow solutions will be used in future research to estimate near-field acoustic radiation by computing the total outward flux across a surface and to determine how it is related to the evolution of the turbulent solutions. Furthermore, these simulations allow us to investigate the sensitivity of acoustic radiations to inlet/boundary conditions, with possible application to active noise suppression. In addition, the data generated can be used to compute various turbulence quantities such as mean

  8. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) sources of small-scale failures, and 3) reactivation. Only a few scientific publications have been published concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation of large-scale landslide distribution is also derived. The equation is validated by applying it to another area. For this new area, the area under the receiver operating curve of the landslide distribution probability is 0.699, and the distribution probability values explain > 65% of the existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
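The workflow described above can be sketched in a few lines: fit a logistic regression of landslide presence on geomorphological predictors in one area, then validate on an independent area via the area under the ROC curve. All data, predictor names and helper functions here are hypothetical, implemented with plain NumPy rather than the authors' actual pipeline:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_logistic(X, y, lr=0.1, steps=2000):
    """Plain gradient-descent logistic regression; returns weights and bias."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(steps):
        p = sigmoid(X @ w + b)
        w -= lr * (X.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

def auc(y_true, scores):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    n_pos = y_true.sum()
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical training area: predictors such as slope, relief and
# distance to fault, with landslide presence/absence labels.
rng = np.random.default_rng(42)
X_train = rng.normal(size=(500, 3))
y_train = ((X_train[:, 0] + 0.5 * X_train[:, 1]
            + rng.normal(scale=0.8, size=500)) > 0.5).astype(float)
w, b = fit_logistic(X_train, y_train)

# Validation on an independent area, as in the study:
X_new = rng.normal(size=(200, 3))
y_new = ((X_new[:, 0] + 0.5 * X_new[:, 1]
          + rng.normal(scale=0.8, size=200)) > 0.5).astype(float)
prob = sigmoid(X_new @ w + b)          # landslide distribution probability
validation_auc = auc(y_new, prob)
```

An AUC well above 0.5 on the independent area, as the paper reports (0.699), indicates the fitted equation transfers to regions with similar geology and geomorphology.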

  9. Importance-truncated large-scale shell model

    NASA Astrophysics Data System (ADS)

    Stumpf, Christina; Braun, Jonas; Roth, Robert

    2016-02-01

    We propose an importance-truncation scheme for the large-scale nuclear shell model that extends its range of applicability to larger valence spaces and midshell nuclei. It is based on a perturbative measure for the importance of individual basis states that acts as an additional truncation for the many-body model space in which the eigenvalue problem of the Hamiltonian is solved numerically. Through a posteriori extrapolations of all observables to vanishing importance threshold, the full shell-model results can be recovered. In addition to simple threshold extrapolations, we explore extrapolations based on the energy variance. We apply the importance-truncated shell model for the study of 56Ni in the pf valence space and of 60Zn and 64Ge in the pf g9/2 space. We demonstrate the efficiency and accuracy of the approach, which paves the way for future applications of valence-space interactions derived in ab initio approaches in larger valence spaces.
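The a posteriori threshold extrapolation can be illustrated with a toy example: compute an observable at several importance thresholds, fit a low-order polynomial in the threshold, and evaluate it at vanishing threshold. The numbers below are purely illustrative, not results from the paper:

```python
import numpy as np

# Hypothetical ground-state energies (MeV) computed at a sequence of
# importance thresholds kappa; the full-space result is approximated
# by extrapolating the trend to vanishing threshold (kappa -> 0).
kappa = np.array([5e-5, 4e-5, 3e-5, 2e-5, 1e-5])
energy = np.array([-203.1, -203.6, -204.0, -204.3, -204.5])  # illustrative

# Simple threshold extrapolation: low-order polynomial fit in kappa,
# evaluated at kappa = 0 (the paper also explores fits based on the
# energy variance instead of the raw threshold).
coeffs = np.polyfit(kappa, energy, deg=2)
e_extrapolated = np.polyval(coeffs, 0.0)
```

Because the truncated calculation is variational in character, the extrapolated energy lies below every computed point, approaching the full shell-model value from above as the threshold shrinks.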

  10. How large-scale subsidence affects stratocumulus transitions

    NASA Astrophysics Data System (ADS)

    van der Dussen, J. J.; de Roode, S. R.; Siebesma, A. P.

    2016-01-01

    Some climate modeling results suggest that the Hadley circulation might weaken in a future climate, causing a subsequent reduction in the large-scale subsidence velocity in the subtropics. In this study we analyze the cloud liquid water path (LWP) budget from large-eddy simulation (LES) results of three idealized stratocumulus transition cases, each with a different subsidence rate. As shown in previous studies a reduced subsidence is found to lead to a deeper stratocumulus-topped boundary layer, an enhanced cloud-top entrainment rate and a delay in the transition of stratocumulus clouds into shallow cumulus clouds during its equatorwards advection by the prevailing trade winds. The effect of a reduction of the subsidence rate can be summarized as follows. The initial deepening of the stratocumulus layer is partly counteracted by an enhanced absorption of solar radiation. After some hours the deepening of the boundary layer is accelerated by an enhancement of the entrainment rate. Because this is accompanied by a change in the cloud-base turbulent fluxes of moisture and heat, the net change in the LWP due to changes in the turbulent flux profiles is negligibly small.

  11. Assessing large-scale wildlife responses to human infrastructure development.

    PubMed

    Torres, Aurora; Jaeger, Jochen A G; Alonso, Juan Carlos

    2016-07-26

    Habitat loss and deterioration represent the main threats to wildlife species, and are closely linked to the expansion of roads and human settlements. Unfortunately, large-scale effects of these structures remain generally overlooked. Here, we analyzed the European transportation infrastructure network and found that 50% of the continent is within 1.5 km of transportation infrastructure. We present a method for assessing the impacts from infrastructure on wildlife, based on functional response curves describing density reductions in birds and mammals (e.g., road-effect zones), and apply it to Spain as a case study. The imprint of infrastructure extends over most of the country (55.5% in the case of birds and 97.9% for mammals), with moderate declines predicted for birds (22.6% of individuals) and severe declines predicted for mammals (46.6%). Despite certain limitations, we suggest the approach proposed is widely applicable to the evaluation of effects of planned infrastructure developments under multiple scenarios, and propose an internationally coordinated strategy to update and improve it in the future. PMID:27402749
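The functional-response idea can be sketched as follows: a hypothetical curve gives the fraction of local density remaining as a function of distance to the nearest infrastructure, and averaging over grid cells yields a predicted decline. The exponential form and all parameters are illustrative assumptions, not the curves fitted in the study:

```python
import numpy as np

def density_response(distance_km, effect_zone_km=1.5, max_reduction=0.5):
    """Hypothetical functional response curve: the fraction of a species'
    local density that remains at a given distance from the nearest
    transportation infrastructure. Reduction is largest at the road and
    decays exponentially across the road-effect zone. (Illustrative form
    only; the study fits response curves per species group.)
    """
    return 1.0 - max_reduction * np.exp(-3.0 * distance_km / effect_zone_km)

# Average predicted decline over a set of grid cells with known
# distances to infrastructure (distances are made up):
distances = np.array([0.1, 0.5, 1.0, 2.0, 5.0])   # km to nearest road
decline = 1.0 - density_response(distances).mean()
```

Applied over a national distance-to-infrastructure raster, this kind of curve yields the aggregate declines the study reports (22.6% of bird individuals, 46.6% of mammals for Spain).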

  12. Scalable pattern recognition for large-scale scientific data mining

    SciTech Connect

    Kamath, C.; Musick, R.

    1998-03-23

    Our ability to generate data far outstrips our ability to explore and understand it. The true value of this data lies not in its final size or complexity, but rather in our ability to exploit the data to achieve scientific goals. The data generated by programs such as ASCI have such a large scale that it is impractical to manually analyze, explore, and understand it. As a result, useful information is overlooked, and the potential benefits of increased computational and data gathering capabilities are only partially realized. The difficulties that will be faced by ASCI applications in the near future are foreshadowed by the challenges currently facing astrophysicists in making full use of the data they have collected over the years. For example, among other difficulties, astrophysicists have expressed concern that the sheer size of their data restricts them to looking at very small, narrow portions at any one time. This narrow focus has resulted in the loss of "serendipitous" discoveries which have been so vital to progress in the area in the past. To solve this problem, a new generation of computational tools and techniques is needed to help automate the exploration and management of large scientific data. This whitepaper proposes applying and extending ideas from the area of data mining, in particular pattern recognition, to improve the way in which scientists interact with large, multi-dimensional, time-varying data.

  13. Collaborative Large-scale Engineering Analysis Network for Environmental Research (CLEANER)Science Planning

    NASA Astrophysics Data System (ADS)

    Schnoor, J. L.; Minsker, B. S.; Haas, C. N.

    2005-12-01

    The Project Office of the Collaborative Large-scale Engineering Analysis Network for Environmental Research (CLEANER) was awarded a cooperative agreement from the National Science Foundation (NSF) and began operations on August 1, 2005. Since that time we have organized six standing committees and an executive committee with an advisory board. The first all-hands meeting of CLEANER took place at NSF and the National Center for Supercomputing Applications (NCSA) Access facility in Arlington, Virginia, in September. Among the initial tasks of CLEANER is to join with the Consortium of Universities for the Advancement of Hydrological Sciences Incorporated (CUAHSI) in developing a joint science plan for a national observatory for environmental research utilizing NSF Major Research Equipment and Facilities Construction (MREFC) funds slated for 2011. This presentation describes our initial thinking on the science plan and our vision for the national environmental observatory and cyberinfrastructure.

  14. The Virtual Solar Observatory - Status and Plans

    NASA Astrophysics Data System (ADS)

    Hill, F.

    2001-05-01

    The Virtual Solar Observatory (VSO) is a software environment for searching, obtaining and analyzing data from archives of solar data that are distributed at many different observatories around the world. This "observatory" is virtual since it exists only on the Internet, not as a physical structure. As a research tool, the VSO would enable a new field of correlative statistical solar physics in which large-scale comparative studies spanning many dimensions and data sources could be carried out. Several groups with solar archives have indicated their willingness to participate as a VSO component. These include NSO (KPVT, GONG, and SOLIS); NASA/GSFC SDAC; SOHO; Stanford (SOI/MDI, TON, WSO); Lockheed (TRACE); MSU (Yohkoh); UCLA (Mt. Wilson 150-ft Tower); USC (Mt. Wilson 60-ft Tower); BBSO/NJIT; Arcetri (ARTHEMIS); Meudon; HAO; and CSUN/SFO. The VSO will be implemented so that additional systems can be easily incorporated. The VSO technical concept includes the federation of distributed solar archives, an adaptive metadata thesaurus, a single unified intuitive GUI, context-based searches, and distributed computing. The underlying structure would most likely be constructed using platform-independent tools such as XML and JavaScript. There are several technical challenges facing the VSO development. Issues of security, bandwidth, metadata, and load balancing must be faced. While the VSO is currently in the concept phase, a number of funding opportunities are being pursued. The status of these proposals and plans for the future will be updated at the meeting.

  15. Food security through large scale investments in agriculture

    NASA Astrophysics Data System (ADS)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  16. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

    The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  17. Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.

    SciTech Connect

    Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken; Gentile, Ann C.; Lloyd, Levi; Minnich, Ronald G.; Vanderveen, Keith; Van Randwyk, Jamie A; Rudish, Don W.

    2008-10-01

    We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet." We describe the creation of a research platform that emulates many thousands of machines to be used for the study of large-scale internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.

  18. Mechanisation of large-scale agricultural fields in developing countries - a review.

    PubMed

    Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila

    2016-09-01

    Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate the technology adoption, cost of technology purchase, government policies, adequacy of technology and appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high cost of production, among others. Therefore this paper attempts to evaluate the application of present day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present day technology, its management and application to large-scale agricultural fields. This review also gives emphasis to a future outlook that will enable a gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural farm mechanisation for sustainable food production in Africa and Asia must be anchored on a coherent strategy based on the actual needs and priorities of the large-scale farmers. © 2016 Society of Chemical Industry. PMID:26940194

  19. Large scale suppression of scalar power on a spatial condensation

    NASA Astrophysics Data System (ADS)

    Kouwn, Seyen; Kwon, O.-Kab; Oh, Phillial

    2015-03-01

    We consider a deformed single-field inflation model in terms of three SO(3) symmetric moduli fields. We find that spatially linear solutions for the moduli fields induce a phase transition during the early stage of the inflation and the suppression of scalar power spectrum at large scales. This suppression can be an origin of anomalies for large-scale perturbation modes in the cosmological observation.

  20. Interpretation of large-scale deviations from the Hubble flow

    NASA Astrophysics Data System (ADS)

    Grinstein, B.; Politzer, H. David; Rey, S.-J.; Wise, Mark B.

    1987-03-01

    The theoretical expectation for large-scale streaming velocities relative to the Hubble flow is expressed in terms of statistical correlation functions. Only for objects that trace the mass would these velocities have a simple cosmological interpretation. If some biasing affects the objects' formation, then nonlinear gravitational evolution is essential to predicting the expected large-scale velocities, which also depend on the nature of the biasing.

  1. Radio frequency interference measurement in site testing programs for the future multi-wavelength observatory in Indonesia

    NASA Astrophysics Data System (ADS)

    Hidayat, T.; Dermawan, B.; Mahasena, P.; Munir, A.; Nurzaman, M. Z.; Jaelani, A. T.

    2015-09-01

    A new multi-wavelength astronomical observatory in Indonesia is currently under preparation. To pave the way for radio astronomical facilities at the planned observatory, we conduct a series of radio frequency interference (RFI) measurements as part of its site testing programs. The corresponding sites and instruments, as well as the measurement setup, must be selected, planned, and implemented accordingly. This work presents our preparatory setup and considers RFI measurements at meter and centimeter wavelengths (frequencies from 50 MHz up to 6 GHz). In this frequency range, it is relevant to adopt the Square Kilometre Array (SKA) Protocol as our measurement method. The first results, obtained using Mode 1 of the SKA Protocol, are used as a reference in this work. Preparation for Mode 2 is currently underway and its preliminary results are presented.

  2. Large-scale motions in a plane wall jet

    NASA Astrophysics Data System (ADS)

    Gnanamanickam, Ebenezer; Jonathan, Latim; Shibani, Bhatt

    2015-11-01

    The dynamic significance of large-scale motions in turbulent boundary layers has been the focus of several recent studies, primarily concerning canonical flows - zero pressure gradient boundary layers and flows within pipes and channels. This work presents an investigation into the large-scale motions in a boundary layer that is used as the prototypical flow field for flows with large-scale mixing and reactions, the plane wall jet. An experimental investigation is carried out in a plane wall jet facility designed to operate at friction Reynolds numbers Reτ > 1000, which allows for the development of a significant logarithmic region. The streamwise turbulent intensity across the boundary layer is decomposed into small-scale (less than one integral length-scale δ) and large-scale components. The small-scale energy has a peak in the near-wall region associated with the near-wall turbulent cycle, as in canonical boundary layers. However, the large-scale eddies dominate, carrying significantly higher energy than the small scales across almost the entire boundary layer, even at the low to moderate Reynolds numbers under consideration. The large scales also appear to amplitude- and frequency-modulate the smaller scales across the entire boundary layer.
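The scale decomposition described above can be sketched with a sharp Fourier cutoff at one integral length scale δ (one common choice; the study's exact filter is not specified here, and the signal below is synthetic):

```python
import numpy as np

def scale_decompose(u, dx, delta):
    """Split a streamwise velocity-fluctuation signal u(x) into
    large-scale (wavelengths > delta) and small-scale components
    using a sharp Fourier cutoff, with delta ~ one integral length.
    """
    n = u.size
    u_hat = np.fft.rfft(u)
    freqs = np.fft.rfftfreq(n, d=dx)
    wavelength = np.full(u_hat.size, np.inf)   # DC mode -> infinite wavelength
    wavelength[1:] = 1.0 / freqs[1:]
    large_hat = np.where(wavelength > delta, u_hat, 0.0)
    u_large = np.fft.irfft(large_hat, n=n)
    u_small = u - u_large                      # exact complement
    return u_large, u_small

# Synthetic signal: one large-scale and one small-scale sine wave.
x = np.linspace(0.0, 10.0, 1000, endpoint=False)
u = np.sin(2 * np.pi * x / 5.0) + 0.3 * np.sin(2 * np.pi * x / 0.2)
u_L, u_S = scale_decompose(u, dx=x[1] - x[0], delta=1.0)
```

The variances of `u_L` and `u_S` are the large- and small-scale contributions to the streamwise turbulent intensity; in the wall jet described above, the large-scale part dominates across most of the layer.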

  3. Large-scale CO2 storage — Is it feasible?

    NASA Astrophysics Data System (ADS)

    Johansen, H.

    2013-06-01

    CCS is generally expected to account for about 20% of the reduction of CO2 emissions to the atmosphere. This paper focuses on the technical aspects of CO2 storage, even if the CCS challenge is equally dependent upon finding viable international solutions to a wide range of economic, political and cultural issues. It has already been demonstrated that it is technically possible to store adequate amounts of CO2 in the subsurface (Sleipner, In Salah, Snøhvit). The large-scale storage challenge (several gigatons of CO2 per year) is more an issue of minimizing cost without compromising safety, and of making international regulations. The storage challenge may be split into 4 main parts: 1) finding reservoirs with adequate storage capacity, 2) making sure that the sealing capacity above the reservoir is sufficient, 3) building the infrastructure for transport, drilling and injection, and 4) setting up and performing the necessary monitoring activities. More than 150 years of worldwide experience from the production of oil and gas is an important source of competence for CO2 storage. The storage challenge is however different in three important aspects: 1) the storage activity results in pressure increase in the subsurface, 2) there is no production of fluids that gives important feedback on reservoir performance, and 3) the monitoring requirement will have to extend for a much longer time into the future than what is needed during oil and gas production. An important property of CO2 is that its behaviour in the subsurface is significantly different from that of oil and gas. CO2 in contact with water is reactive and corrosive, and may cause great damage to both man-made and natural materials if proper precautions are not taken. On the other hand, the long-term effect of most of these reactions is that a large amount of CO2 will become immobilized and permanently stored as solid carbonate minerals.
The reduced opportunity for direct monitoring of fluid samples close to the

  4. The large-scale landslide risk classification in catchment scale

    NASA Astrophysics Data System (ADS)

    Liu, Che-Hsin; Wu, Tingyeh; Chen, Lien-Kuang; Lin, Sheng-Chi

    2013-04-01

    The landslide disasters caused heavy casualties during Typhoon Morakot, 2009. This disaster is defined as a large-scale landslide due to the casualty numbers. This event also showed that the survey of large-scale landslide potential is so far insufficient, and therefore significant. The large-scale landslide potential analysis provides information about where attention should be focused, even though such areas are very difficult to distinguish. Accordingly, the authors investigated the methods used by different countries, such as Hong Kong, Italy, Japan and Switzerland, to clarify the assessment methodology. The objects include places susceptible to rock slide and dip slope failure, and the major landslide areas identified from historical records. Three different levels of scale are necessary from country to slopeland: basin, catchment, and slope scales. In total, ten spots were classified with high large-scale landslide potential at the basin scale. The authors therefore focused on the catchment scale and employ a risk matrix to classify the potential in this paper. The protected objects and the large-scale landslide susceptibility ratio are the two main indexes used to classify large-scale landslide risk. The protected objects are constructions and transportation facilities. The large-scale landslide susceptibility ratio is based on the data of major landslide areas and of dip slope and rock slide areas. In total, 1,040 catchments are considered and classified into three levels: high, medium, and low. The proportions of the high, medium, and low levels are 11%, 51%, and 38%, respectively. This result identifies the catchments with a high proportion of protected objects or high large-scale landslide susceptibility. The conclusion can serve as base material for the slopeland authorities when considering slopeland management and further investigation.
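The two-index risk matrix can be sketched as a small lookup table; the thresholds and matrix entries below are hypothetical, chosen only to reproduce the high/medium/low structure described, not the study's actual classification rules:

```python
def rate(value, low_cut=0.2, high_cut=0.6):
    """Bin a normalized index into 0 (low), 1 (medium) or 2 (high)."""
    if value >= high_cut:
        return 2
    if value >= low_cut:
        return 1
    return 0

# Rows: protected-object index; columns: susceptibility ratio index.
RISK_MATRIX = [
    ["low",    "low",    "medium"],
    ["low",    "medium", "high"],
    ["medium", "high",   "high"],
]

def catchment_risk(protected_index, susceptibility_ratio):
    """Combine the two normalized indexes via the risk matrix."""
    return RISK_MATRIX[rate(protected_index)][rate(susceptibility_ratio)]

# e.g. a catchment with many protected facilities on highly
# susceptible slopes is flagged as high risk:
level = catchment_risk(0.8, 0.7)
```

Applying such a rule to all 1,040 catchments, with thresholds calibrated to the data, would yield the three-level split reported above.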

  5. Large Scale Relationship between Aquatic Insect Traits and Climate

    PubMed Central

    Bhowmik, Avit Kumar; Schäfer, Ralf B.

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution pattern under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of the temperature preference grouping feature, which are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution pattern. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperatures showed the strongest response, while insects preferring cold and trichopterans preferring moderate temperatures showed the highest potential for changing distribution. We showed that seasonal radiation and moisture are the most influential bioclimatic aspects; changes in these aspects may therefore affect the most responsive traits and orders and drive a change in their spatial distribution pattern. Our findings support the development of trait-based metrics to predict and detect climate
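
    The core analysis, relating site-level trait composition to a bioclimatic index, can be sketched with a simple correlation. This is a toy illustration under stated assumptions: the data values, the trait (proportion of cold-preferring taxa), and the index (radiation) are invented for demonstration and are not from the study.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length numeric sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical site-level data: proportion of cold-preferring taxa
# versus a bioclimatic index (e.g. seasonal radiation).
cold_pref = [0.42, 0.35, 0.30, 0.22, 0.15]
radiation = [12.1, 13.4, 14.0, 15.2, 16.8]
r = pearson(cold_pref, radiation)  # strongly negative for this toy data
```

    A large negative correlation of this kind would mark the trait as responsive to that bioclimatic aspect; the study additionally accounts for spatial autocorrelation, which this sketch omits.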

  7. Wireless gigabit data telemetry for large-scale neural recording.

    PubMed

    Kuan, Yen-Cheng; Lo, Yi-Kai; Kim, Yanghyo; Chang, Mau-Chung Frank; Liu, Wentai

    2015-05-01

    Implantable wireless neural recording from a large ensemble of simultaneously acting neurons is a critical component to thoroughly investigate neural interactions and brain dynamics in freely moving animals. Recent research has shown the feasibility of simultaneously recording from hundreds of neurons and suggested that the ability to record a larger number of neurons results in better signal quality. This massive recording inevitably demands a large amount of data transfer. For example, recording 2000 neurons while keeping the signal fidelity (>12 bit, >40 kS/s per neuron) needs approximately a 1-Gb/s data link. Designing a wireless data telemetry system to support such (or higher) data rates while aiming to lower the power consumption of an implantable device imposes a grand challenge on the neuroscience community. In this paper, we present a wireless gigabit data telemetry for future large-scale neural recording interfaces. This telemetry comprises a pair of low-power gigabit transmitters and receivers operating at 60 GHz, and establishes a short-distance wireless link to transfer the massive amount of neural signals outward from the implanted device. The transmission distance of the received neural signal can be further extended by an external rendezvous wireless transceiver, which is less power- and heat-constrained since it is not in the immediate proximity of the cortex and its radiated signal is not seriously attenuated by the lossy tissue. The gigabit data link has been demonstrated to achieve a high data rate of 6 Gb/s with a bit-error rate of 10⁻¹² at a transmission distance of 6 mm, an applicable separation between transmitter and receiver. This high data rate is able to support thousands of recording channels while ensuring a low energy cost per bit of 2.08 pJ/b. PMID:25823050
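
    The abstract's link budget can be checked with back-of-envelope arithmetic using only the figures it reports: 2000 neurons at 12 bits and 40 kS/s each against the demonstrated 6-Gb/s, 2.08-pJ/b link.

```python
# Required raw data rate: neurons x bits/sample x samples/s.
neurons = 2000
bits_per_sample = 12
samples_per_sec = 40_000
required_bps = neurons * bits_per_sample * samples_per_sec  # 0.96 Gb/s

# Demonstrated link rate and resulting headroom.
link_bps = 6e9
headroom = link_bps / required_bps  # ~6.25x

# Power implied by the reported energy cost per bit at full link rate.
energy_per_bit = 2.08e-12          # J/b (2.08 pJ/b)
power_at_link_rate = link_bps * energy_per_bit  # ~12.5 mW
```

    The 0.96 Gb/s requirement matches the "approximately 1 Gb/s" figure, and the roughly sixfold headroom is what lets the link "support thousands of recording channels".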

  8. Designing large-scale conservation corridors for pattern and process.

    PubMed

    Rouget, Mathieu; Cowling, Richard M; Lombard, Amanda T; Knight, Andrew T; Kerley, Graham I H

    2006-04-01

    A major challenge for conservation assessments is to identify priority areas that incorporate biological patterns and processes. Because large-scale processes are mostly oriented along environmental gradients, we propose to accommodate them by designing regional-scale corridors to capture these gradients. Based on systematic conservation planning principles such as representation and persistence, we identified large tracts of untransformed land (i.e., conservation corridors) for conservation that would achieve biodiversity targets for pattern and process in the Subtropical Thicket Biome of South Africa. We combined least-cost path analysis with a target-driven algorithm to identify the best option for capturing key environmental gradients while considering biodiversity targets and conservation opportunities and constraints. We identified seven conservation corridors on the basis of subtropical thicket representation, habitat transformation and degradation, wildlife suitability, irreplaceability of vegetation types, protected area networks, and future land-use pressures. These conservation corridors covered 21.1% of the planning region (ranging from 600 to 5200 km²) and successfully achieved targets for biological processes and, to a lesser extent, for vegetation types. The corridors we identified are intended to promote the persistence of ecological processes (gradients and fixed processes) and fulfill half of the biodiversity pattern target. We compared the conservation corridors with a simplified corridor design consisting of a fixed-width buffer along major rivers. Conservation corridors outperformed river buffers in seven out of eight criteria. Our corridor design can provide a tool for quantifying trade-offs between various criteria (biodiversity pattern and process, implementation constraints and opportunities). A land-use management model was developed to facilitate implementation of conservation actions within these corridors. PMID:16903115
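
    The least-cost path step named above can be sketched with Dijkstra's algorithm on a resistance grid. This is a minimal stand-in, not the study's implementation: the grid values and the 4-neighbour move set are illustrative assumptions.

```python
import heapq

def least_cost_path(cost, start, goal):
    """Cheapest cumulative cost between two cells of a 2-D resistance grid.

    Dijkstra's algorithm with 4-neighbour moves; cell values represent
    resistance (e.g. habitat transformation), so lower totals mark
    better corridor routes.
    """
    rows, cols = len(cost), len(cost[0])
    best = {start: cost[start[0]][start[1]]}
    heap = [(best[start], start)]
    while heap:
        dist, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            return dist
        if dist > best[(r, c)]:
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = dist + cost[nr][nc]
                if nd < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nd
                    heapq.heappush(heap, (nd, (nr, nc)))
    return None
```

    In the target-driven setting described in the abstract, routes like this would then be screened against biodiversity targets and land-use constraints rather than accepted on cost alone.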

  9. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

    We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models in the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density that we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.
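
    The mechanism the abstract describes, a nonlinear relation turning a Gaussian field into a non-Gaussian one, can be illustrated with a toy calculation. This is not the paper's relativistic computation: the quadratic transform and the coupling `f` below are arbitrary illustrative choices, showing only that a local nonlinear map of Gaussian initial conditions acquires nonzero skewness.

```python
import random

random.seed(0)

# Toy model: start from a Gaussian variable zeta, apply a local
# quadratic transform delta = zeta + f*(zeta^2 - 1) (mean-subtracted),
# and measure the skewness of the result. f is an arbitrary coupling.
f = 0.5
zeta = [random.gauss(0.0, 1.0) for _ in range(100_000)]
delta = [z + f * (z * z - 1.0) for z in zeta]

n = len(delta)
mean = sum(delta) / n
var = sum((d - mean) ** 2 for d in delta) / n
skew = sum((d - mean) ** 3 for d in delta) / (n * var ** 1.5)
# Gaussian zeta has skewness ~0; the transformed field is clearly skewed.
```

    Analytically, this toy transform gives variance 1 + 2f² and third moment 6f + 8f³, so for f = 0.5 the skewness is about 2.2; the sample estimate agrees closely.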

  10. Unsaturated Hydraulic Conductivity for Evaporation in Large scale Heterogeneous Soils

    NASA Astrophysics Data System (ADS)

    Sun, D.; Zhu, J.

    2014-12-01

    In this study we aim to provide some practical guidelines on how the commonly used simple averaging schemes (arithmetic, geometric, or harmonic mean) perform in simulating evaporation in a large-scale heterogeneous landscape. Previous studies on hydraulic property upscaling, focusing on steady-state flux exchanges, illustrated that an effective hydraulic property is usually more difficult to define for evaporation. This study focuses on upscaling hydraulic properties of large-scale transient evaporation dynamics using the idea of the stream tube approach. Specifically, the two main objectives are: (1) to determine whether the three simple averaging schemes (i.e., arithmetic, geometric and harmonic means) of hydraulic parameters are appropriate for representing large-scale evaporation processes, and (2) to examine how the applicability of these simple averaging schemes depends on the time scale of evaporation processes in heterogeneous soils. Multiple realizations of local evaporation processes are carried out using the HYDRUS-1D computational code (Simunek et al., 1998). The three averaging schemes of soil hydraulic parameters are used to simulate the cumulative flux exchange, which is then compared with the large-scale average cumulative flux. The sensitivity of the relative errors to the time frame of evaporation processes is also discussed.
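
    The three averaging schemes compared in the study are standard and can be stated compactly. The conductivity values below are illustrative only; for any positive data the three means obey arithmetic ≥ geometric ≥ harmonic, which is why the choice of scheme matters when upscaling.

```python
import math

def arithmetic_mean(ks):
    return sum(ks) / len(ks)

def geometric_mean(ks):
    # exp of the mean log; defined for positive values only.
    return math.exp(sum(math.log(k) for k in ks) / len(ks))

def harmonic_mean(ks):
    return len(ks) / sum(1.0 / k for k in ks)

# Hypothetical saturated conductivities (cm/day) for a set of stream tubes.
Ks = [1.0, 10.0, 100.0]
means = (arithmetic_mean(Ks), geometric_mean(Ks), harmonic_mean(Ks))
```

    Python's `statistics` module also provides `geometric_mean` and `harmonic_mean` (since 3.8); the explicit forms are shown here for clarity.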