Science.gov

Sample records for mammogrid large-scale distributed

  1. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Boötes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  2. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) sources of small-scale failures, and 3) reactivation. Only a few scientific studies have been published concerning large-scale landslides in Nepal. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters and logistic regression, an equation of large-scale landslide distribution is also derived. The equation is validated by applying it to another area. For the new area, the area under the receiver operating characteristic (ROC) curve of the landslide distribution probability is 0.699, and a distribution probability value could explain > 65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
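The derivation and validation steps described above can be sketched in a few lines: a logistic model maps geomorphological predictors to a distribution probability, and the area under the ROC curve measures how well that probability separates landslide from non-landslide cells. The function names and all coefficients below are placeholders, not the paper's fitted values.

```python
import math

def landslide_probability(features, coefficients, intercept):
    """Logistic-regression form of a landslide distribution probability:
    p = 1 / (1 + exp(-(b0 + sum(b_i * x_i)))). Coefficients are illustrative
    placeholders, not the values derived in the study."""
    z = intercept + sum(b * x for b, x in zip(coefficients, features))
    return 1.0 / (1.0 + math.exp(-z))

def roc_auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the fraction of (landslide, non-landslide) pairs ranked correctly."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfectly separating probability map yields an AUC of 1.0; the study's value of 0.699 sits between random (0.5) and perfect separation.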

  3. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    NASA Astrophysics Data System (ADS)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  4. Large-scale mass distribution in the Illustris simulation

    NASA Astrophysics Data System (ADS)

    Haider, M.; Steinhauser, D.; Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Hernquist, L.

    2016-04-01

Observations at low redshifts thus far fail to account for all of the baryons expected in the Universe according to cosmological constraints. A large fraction of the baryons presumably resides in a thin and warm-hot medium between the galaxies, where they are difficult to observe due to their low densities and high temperatures. Cosmological simulations of structure formation can be used to verify this picture and provide quantitative predictions for the distribution of mass in different large-scale structure components. Here we study the distribution of baryons and dark matter at different epochs using data from the Illustris simulation. We identify regions of different dark matter density with the primary constituents of large-scale structure, allowing us to measure mass and volume of haloes, filaments and voids. At redshift zero, we find that 49 per cent of the dark matter and 23 per cent of the baryons are within haloes more massive than the resolution limit of 2 × 10⁸ M⊙. The filaments of the cosmic web host a further 45 per cent of the dark matter and 46 per cent of the baryons. The remaining 31 per cent of the baryons reside in voids. The majority of these baryons have been transported there through active galactic nuclei feedback. We note that the feedback model of Illustris is too strong for heavy haloes, therefore it is likely that we are overestimating this amount. Categorizing the baryons according to their density and temperature, we find that 17.8 per cent of them are in a condensed state, 21.6 per cent are present as cold, diffuse gas, and 53.9 per cent are found in the state of a warm-hot intergalactic medium.
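The density-temperature categorization described above can be illustrated with a minimal classifier. The cut values below (an overdensity of 1000 and temperatures of 10⁵ K and 10⁷ K) are conventional thresholds from the literature, assumed here for illustration; the exact boundaries used in the Illustris analysis may differ.

```python
def classify_baryon(overdensity, temperature_k):
    """Assign a gas element to a baryonic phase from its density and
    temperature. Thresholds are conventional illustrative cuts, not
    necessarily those of the Illustris analysis."""
    if temperature_k < 1e5:
        # cold gas: dense regions are 'condensed', the rest diffuse
        return "condensed" if overdensity > 1000 else "diffuse"
    if temperature_k < 1e7:
        return "warm-hot intergalactic medium"
    return "hot halo gas"
```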

  5. Distribution of entanglement in large-scale quantum networks

    NASA Astrophysics Data System (ADS)

    Perseguers, S.; Lapeyre, G. J., Jr.; Cavalcanti, D.; Lewenstein, M.; Acín, A.

    2013-09-01

    The concentration and distribution of quantum entanglement is an essential ingredient in emerging quantum information technologies. Much theoretical and experimental effort has been expended in understanding how to distribute entanglement in one-dimensional networks. However, as experimental techniques in quantum communication develop, protocols for multi-dimensional systems become essential. Here, we focus on recent theoretical developments in protocols for distributing entanglement in regular and complex networks, with particular attention to percolation theory and network-based error correction.
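Entanglement percolation, one of the approaches highlighted above, treats each successfully entangled link as an open bond; long-distance entanglement distribution becomes possible once open bonds percolate across the network. A toy sketch on a square lattice using a union-find structure (lattice size and probability are illustrative parameters):

```python
import random

def has_spanning_path(n, p, seed=0):
    """Bond percolation on an n x n lattice: each edge is 'open' (a usable
    entangled link) with probability p. Returns True if an open path joins
    the left and right boundaries, a toy proxy for distributing entanglement
    across a 2-D network."""
    rng = random.Random(seed)
    parent = list(range(n * n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for r in range(n):
        for c in range(n):
            if c + 1 < n and rng.random() < p:   # horizontal bond
                union(r * n + c, r * n + c + 1)
            if r + 1 < n and rng.random() < p:   # vertical bond
                union(r * n + c, (r + 1) * n + c)
    left = {find(r * n) for r in range(n)}
    right = {find(r * n + n - 1) for r in range(n)}
    return bool(left & right)
```

On the square lattice, spanning paths appear with high probability once p exceeds the bond-percolation threshold of 0.5.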

  6. Secure Large-Scale Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Dan (Technical Monitor)

    2001-01-01

To fully conduct research that will support the far-term concepts, technologies and methods required to improve the safety of air transportation, a simulation environment of the requisite degree of fidelity must first be in place. The Virtual National Airspace Simulation (VNAS) will provide the underlying infrastructure necessary for such a simulation system. Aerospace-specific knowledge management services, such as intelligent data-integration middleware, will support the management of information associated with this complex and critically important operational environment. This simulation environment, in conjunction with a distributed network of supercomputers and high-speed network connections to aircraft and to Federal Aviation Administration (FAA), airline and other data sources, will provide the capability to continuously monitor and measure operational performance against expected performance. The VNAS will also provide the tools to use this performance baseline to obtain a perspective of what is happening today and of the potential impact of proposed changes before they are introduced into the system.

  7. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    ERIC Educational Resources Information Center

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  8. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    PubMed Central

    He, Hui; Fan, Guotao; Ye, Jianwei; Zhang, Weizhe

    2013-01-01

Research on early warning systems for large-scale network security incidents is of great significance: it can improve a network system's emergency response capabilities, mitigate the damage caused by cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized based on a divide-and-conquer approach. First, the topology of the large-scale network is divided into small-scale networks by the MLkP/CR algorithm. Second, a subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a complete topology by an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies. PMID:24191145
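The first stage of the algorithm, dividing the large-scale topology into small-scale networks, can be sketched as follows. The greedy BFS partitioner below is a simplified stand-in for the MLkP/CR algorithm named in the abstract; each resulting partition would then be laid out and positioned by the subsequent stages.

```python
from collections import deque

def partition_graph(adjacency, max_size):
    """Greedily split a graph into partitions of at most max_size nodes by
    breadth-first search. A simplified stand-in for the MLkP/CR partitioning
    step; adjacency maps each node to a list of neighbours."""
    unvisited = set(adjacency)
    partitions = []
    while unvisited:
        start = min(unvisited)  # deterministic seed for the next partition
        unvisited.discard(start)
        part, queue = [], deque([start])
        while queue and len(part) < max_size:
            node = queue.popleft()
            part.append(node)
            for nb in adjacency[node]:
                if nb in unvisited:
                    unvisited.discard(nb)
                    queue.append(nb)
        unvisited.update(queue)  # enqueued but unplaced nodes go back
        partitions.append(part)
    return partitions
```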

  10. Large-scale geographic variation in distribution and abundance of Australian deep-water kelp forests.

    PubMed

    Marzinelli, Ezequiel M; Williams, Stefan B; Babcock, Russell C; Barrett, Neville S; Johnson, Craig R; Jordan, Alan; Kendrick, Gary A; Pizarro, Oscar R; Smale, Dan A; Steinberg, Peter D

    2015-01-01

Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution along the continent of these kelp forests, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia's Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp covers at multiple spatial scales (10-100 m to 100-1,000 km) and depths (15-60 m) across several regions ca 2-6° latitude apart along the East and West coast of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. Maximum depth of kelp occurrence was 40-50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves. PMID:25693066

  11. Large-Scale Ichthyoplankton and Water Mass Distribution along the South Brazil Shelf

    PubMed Central

    de Macedo-Soares, Luis Carlos Pinto; Garcia, Carlos Alberto Eiras; Freire, Andrea Santarosa; Muelbert, José Henrique

    2014-01-01

    Ichthyoplankton is an essential component of pelagic ecosystems, and environmental factors play an important role in determining its distribution. We have investigated simultaneous latitudinal and cross-shelf gradients in ichthyoplankton abundance to test the hypothesis that the large-scale distribution of fish larvae in the South Brazil Shelf is associated with water mass composition. Vertical plankton tows were collected between 21°27′ and 34°51′S at 107 stations, in austral late spring and early summer seasons. Samples were taken with a conical-cylindrical plankton net from the depth of chlorophyll maxima to the surface in deep stations, or from 10 m from the bottom to the surface in shallow waters. Salinity and temperature were obtained with a CTD/rosette system, which provided seawater for chlorophyll-a and nutrient concentrations. The influence of water mass on larval fish species was studied using Indicator Species Analysis, whereas environmental effects on the distribution of larval fish species were analyzed by Distance-based Redundancy Analysis. Larval fish species were associated with specific water masses: in the north, Sardinella brasiliensis was found in Shelf Water; whereas in the south, Engraulis anchoita inhabited the Plata Plume Water. At the slope, Tropical Water was characterized by the bristlemouth Cyclothone acclinidens. The concurrent analysis showed the importance of both cross-shelf and latitudinal gradients on the large-scale distribution of larval fish species. Our findings reveal that ichthyoplankton composition and large-scale spatial distribution are determined by water mass composition in both latitudinal and cross-shelf gradients. PMID:24614798

  13. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of millions of computers on the Internet and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language like JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Users can easily enable their websites so that visitors can volunteer their computing resources to help run advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is used to manage data connections and the task queue for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
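The work-distribution pattern described above can be sketched as a minimal coordinator. The class and method names here are illustrative assumptions; the actual platform manages its queue in a relational database and runs the work units in volunteers' browsers.

```python
import queue

class VolunteerCoordinator:
    """Minimal work-queue sketch of the volunteer-computing pattern: a model
    run is split into small spatial work units, handed to whichever node asks
    next, and re-queued when a volatile node disconnects."""

    def __init__(self, work_units):
        self.pending = queue.Queue()
        for unit in work_units:
            self.pending.put(unit)
        self.results = {}

    def checkout(self):
        """Hand the next work unit to a volunteer node, or None if drained."""
        return None if self.pending.empty() else self.pending.get()

    def submit(self, unit_id, result):
        """Record a finished unit's result."""
        self.results[unit_id] = result

    def requeue(self, unit):
        """Return a unit checked out by a volunteer that left mid-run."""
        self.pending.put(unit)
```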

  14. Multi-agent based control of large-scale complex systems employing distributed dynamic inference engine

    NASA Astrophysics Data System (ADS)

    Zhang, Daili

    Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system level requirements: robustness, flexibility, reusability, and scalability. Corresponding to the four system level requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method as an implementation of distributed intelligent control has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent to agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with the focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings. 
First, it decomposes a complex system hierarchically; second, it combines the components in the same level as a module, and then designs common interfaces for all of the components in the same module; third, replications

  15. Distribution of large-scale depositionally related biostratigraphic units, National Petroleum Reserve in Alaska

    SciTech Connect

    Mickey, M.B.; Haga, H.

    1989-01-01

    The stratigraphic and geographic distribution of 6 biostratigraphic units has been inferred on the basis of data from 18 wells in the National Petroleum Reserve in Alaska (NPRA). These large-scale biostratigraphic units represent depositionally related packages of sediment. Therefore, the units should correspond to the major recognized seismic sequence intervals. Seismic stratigraphy is the appropriate approach to refining interpretations of subsurface geology on the North Slope. Properly applied biostratigraphy is essential to this approach. The authors attempt here to show relations among current results from various disciplines, recognizing that revisions will be needed as work progresses.

  16. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    PubMed

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach with a speedup that scales quadratically with the number of partitions. The D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain. PMID:27140879
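The decompose/solve/stitch structure of distributed reconstruction can be illustrated with a 1-D toy: each partition integrates its own slope measurements (the local solve), and partitions are chained by matching the phase value at shared boundaries. D-SABRE itself performs 2-D local solves with multivariate splines; only the partitioned structure is sketched here.

```python
def reconstruct_distributed(slopes, n_partitions, dx=1.0):
    """1-D toy of distributed wavefront reconstruction: integrate slopes
    within each partition, then align partitions at their boundaries.
    Not D-SABRE itself, just its decompose/solve/stitch skeleton."""
    n = len(slopes)
    size = n // n_partitions
    phases, offset = [0.0], 0.0
    for k in range(n_partitions):
        end = (k + 1) * size if k < n_partitions - 1 else n
        local = slopes[k * size:end]
        phi = offset
        for s in local:               # local integration in this partition
            phi += s * dx
            phases.append(phi)
        offset = phi                  # boundary matching to the next one
    return phases
```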

  17. Upper limit on periodicity in the three-dimensional large-scale distribution of matter

    NASA Technical Reports Server (NTRS)

    Tytler, David; Sandoval, John; Fan, Xiao-Ming

    1993-01-01

    A search for large-scale periodicity in the 3D distribution of 268 Mg II QSO absorption systems which are distributed over 60 percent of the sky, at redshifts 0.1-2.0 is presented. The scalar 3D comoving separations of all pairs of absorption systems are calculated, and peaks in the power spectrum of the distribution of those separations are searched for. The present 95-percent confidence upper limit on the amplitude of a possible periodic fluctuation in the density of galaxies is between one-fourth and three-fourths of the amplitude implied by the data of Broadhurst et al. (1990), depending on the extent to which the wavelength varies and the phase of the signal drifts down lines of sight. A description is presented of how QSO absorption systems sample the 3D population of absorbers and how 3D positions can be represented by their scalar separations.
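The search procedure described above, computing the scalar separations of all pairs of absorption systems and looking for peaks in the power spectrum of their distribution, can be sketched as follows (function names are illustrative):

```python
import math
from itertools import combinations

def pair_separations(positions):
    """Scalar separations of all pairs of 3-D comoving positions."""
    return [math.dist(a, b) for a, b in combinations(positions, 2)]

def periodogram_power(separations, wavelength):
    """Power of the separation distribution at one trial wavelength; a sharp
    peak across trial wavelengths would signal the kind of large-scale
    periodicity the survey constrains."""
    c = sum(math.cos(2 * math.pi * s / wavelength) for s in separations)
    s2 = sum(math.sin(2 * math.pi * s / wavelength) for s in separations)
    return (c * c + s2 * s2) / len(separations)
```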

  18. iTVP: large-scale content distribution for live and on-demand video services

    NASA Astrophysics Data System (ADS)

    Kusmierek, Ewa; Czyrnek, Miroslaw; Mazurek, Cezary; Stroinski, Maciej

    2007-01-01

iTVP is a system built for IP-based delivery of live TV programming, video-on-demand and audio-on-demand with interactive access over IP networks. It has a country-wide range and is designed to provide service to a high number of concurrent users. The iTVP prototype contains the backbone of a two-level hierarchical system designed for distribution of multimedia content from a content provider to end users. In this paper we present experience gained during a few months of the prototype's operation. We analyze the efficiency of the iTVP content distribution system and resource usage at various levels of the hierarchy. We also characterize content access patterns and their influence on system performance, as well as the quality experienced by users and user behavior. In our investigation, scalability is one of the most important aspects of the system performance evaluation. Although the range of the prototype operation is limited, as far as the number of users and the content repository are concerned, we believe that data collected from such a large-scale operational system provide valuable insight into the efficiency of a CDN-type solution to large-scale streaming services. We find that the system exhibits good performance and low resource usage.

  19. Large-Scale Distributed Computational Fluid Dynamics on the Information Power Grid Using Globus

    NASA Technical Reports Server (NTRS)

    Barnard, Stephen; Biswas, Rupak; Saini, Subhash; VanderWijngaart, Robertus; Yarrow, Maurice; Zechtzer, Lou; Foster, Ian; Larsson, Olle

    1999-01-01

This paper describes an experiment in which a large-scale scientific application developed for tightly-coupled parallel machines is adapted to the distributed execution environment of the Information Power Grid (IPG). A brief overview of the IPG and a description of the computational fluid dynamics (CFD) algorithm are given. The Globus metacomputing toolkit is used as the enabling device for the geographically-distributed computation. Modifications related to latency hiding and load balancing were required for an efficient implementation of the CFD application in the IPG environment. Performance results on a pair of SGI Origin 2000 machines indicate that real scientific applications can be effectively implemented on the IPG; however, a significant amount of continued effort is required to make such an environment useful and accessible to scientists and engineers.

  20. The impact of the stratospheric ozone distribution on large-scale tropospheric systems over South America

    NASA Astrophysics Data System (ADS)

    Da Silva, L. A.; Vieira, L. A.; Prestes, A.; Pacini, A. A.; Rigozo, N. R.

    2013-12-01

Most of the large-scale changes of the climate can be attributed to the cumulative impact of human activities since the beginning of the industrial revolution. However, the impact of natural drivers on the present climate change is still under debate, especially on the regional scale. These regional changes over South America can potentially affect large vulnerable populations in the near future. Here, we show that the distribution of stratospheric ozone can affect climate patterns over South America and adjoining oceans. The impact of the stratospheric ozone distribution was evaluated employing the Global Atmospheric-Ocean Model developed by the Goddard Institute for Space Studies (GISS Model E). We conducted two numerical experiments. In the first experiment we used a realistic distribution of the stratospheric ozone, while in the second experiment we employed a uniform longitudinal distribution. We have integrated each model over 60 years. We find that the distribution of stratospheric ozone has a strong influence on the Intertropical Convergence Zone (ITCZ) and the South Atlantic Convergence Zone (SACZ). However, the Upper Tropospheric Cyclonic Vortex (UTCV) is not affected by the ozone distribution.

  1. Shared and Distributed Memory Parallel Security Analysis of Large-Scale Source Code and Binary Applications

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2007-08-30

Many forms of security analysis on large-scale applications can be substantially automated, but their size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.
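The rule-evaluation parallelism described above can be sketched with a pool of workers applying a set of checkers across many files. The textual rules below are toy stand-ins for Compass checkers, which analyze program structure through the ROSE infrastructure rather than matching strings; the thread pool stands in for the paper's shared- and distributed-memory parallelization.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy textual rules; real Compass checkers operate on program structure.
CHECKERS = {
    "no-gets": lambda src: "gets(" in src,
    "no-strcpy": lambda src: "strcpy(" in src,
}

def check_file(item):
    """Evaluate every rule on one source file; return its violations."""
    name, source = item
    return name, [rule for rule, violates in CHECKERS.items()
                  if violates(source)]

def scan(sources, workers=4):
    """Run all checkers over all files in parallel and collect a report."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(check_file, sources.items()))
```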

  2. Spatially-Explicit Estimation of Geographical Representation in Large-Scale Species Distribution Datasets

    PubMed Central

    Kalwij, Jesse M.; Robertson, Mark P.; Ronk, Argo; Zobel, Martin; Pärtel, Meelis

    2014-01-01

    Much ecological research relies on existing multispecies distribution datasets. Such datasets, however, can vary considerably in quality, extent, resolution or taxonomic coverage. We provide a framework for a spatially-explicit evaluation of geographical representation within large-scale species distribution datasets, using the comparison of an occurrence atlas with a range atlas dataset as a working example. Specifically, we compared occurrence maps for 3773 taxa from the widely-used Atlas Florae Europaeae (AFE) with digitised range maps for 2049 taxa of the lesser-known Atlas of North European Vascular Plants. We calculated the level of agreement at a 50-km spatial resolution using average latitudinal and longitudinal species range, and area of occupancy. Agreement in species distribution was calculated and mapped using Jaccard similarity index and a reduced major axis (RMA) regression analysis of species richness between the entire atlases (5221 taxa in total) and between co-occurring species (601 taxa). We found no difference in distribution ranges or in the area of occupancy frequency distribution, indicating that atlases were sufficiently overlapping for a valid comparison. The similarity index map showed high levels of agreement for central, western, and northern Europe. The RMA regression confirmed that geographical representation of AFE was low in areas with a sparse data recording history (e.g., Russia, Belarus and the Ukraine). For co-occurring species in south-eastern Europe, however, the Atlas of North European Vascular Plants showed remarkably higher richness estimations. Geographical representation of atlas data can be much more heterogeneous than often assumed. Level of agreement between datasets can be used to evaluate geographical representation within datasets. Merging atlases into a single dataset is worthwhile in spite of methodological differences, and helps to fill gaps in our knowledge of species distribution ranges. Species distribution
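The cell-wise agreement measure used above is the Jaccard similarity index. A minimal implementation over sets of occupied 50-km grid cells for one taxon (the empty-set convention is an assumption for illustration):

```python
def jaccard(cells_a, cells_b):
    """Jaccard similarity |A intersect B| / |A union B| between two sets of
    occupied grid cells; two empty sets are treated as perfectly agreeing."""
    a, b = set(cells_a), set(cells_b)
    return len(a & b) / len(a | b) if (a or b) else 1.0
```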

  3. A practical large scale/high speed data distribution system using 8 mm libraries

    NASA Technical Reports Server (NTRS)

    Howard, Kevin

    1993-01-01

    Eight mm tape libraries are known primarily for their small size, large storage capacity, and low cost. However, many applications require an additional attribute which, heretofore, has been lacking -- high transfer rate. Transfer rate is particularly important in a large scale data distribution environment -- an environment in which 8 mm tape should play a very important role. Data distribution is a natural application for 8 mm for several reasons: most large laboratories have access to 8 mm tape drives, 8 mm tapes are upwardly compatible, 8 mm media are very inexpensive, 8 mm media are light weight (important for shipping purposes), and 8 mm media densely pack data (5 gigabytes now and 15 gigabytes on the horizon). If the transfer rate issue were resolved, 8 mm could offer a good solution to the data distribution problem. To that end Exabyte has analyzed four ways to increase its transfer rate: native drive transfer rate increases, data compression at the drive level, tape striping, and homogeneous drive utilization. Exabyte is actively pursuing native drive transfer rate increases and drive level data compression. However, for non-transmitted bulk data applications (which include data distribution) the other two methods (tape striping and homogeneous drive utilization) hold promise.
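The tape-striping idea above is essentially arithmetic: splitting a write across N drives multiplies the native transfer rate by up to N. A back-of-the-envelope sketch (all rates and the efficiency factor are illustrative assumptions, not Exabyte specifications):

```python
# Toy model of tape striping: data are split across several drives writing in
# parallel, so the aggregate rate approaches N times the native drive rate
# (an efficiency factor stands in for controller and synchronization overhead).

def striped_transfer_rate(native_rate_mb_s: float, num_drives: int,
                          efficiency: float = 1.0) -> float:
    """Aggregate MB/s for a stripe set of identical drives."""
    return native_rate_mb_s * num_drives * efficiency

def distribution_time_hours(dataset_gb: float, rate_mb_s: float) -> float:
    """Hours needed to write a dataset at the given aggregate rate."""
    return dataset_gb * 1024 / rate_mb_s / 3600

# Example: a 5 GB tape image written by one hypothetical 0.5 MB/s drive
# versus a 4-drive stripe of the same drives.
single = distribution_time_hours(5, striped_transfer_rate(0.5, 1))
striped = distribution_time_hours(5, striped_transfer_rate(0.5, 4))
```

With these assumed numbers the stripe cuts the write time by the stripe width, which is the attraction of the approach for bulk, non-transmitted data.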

  4. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events are generated by system components during their execution or interaction with external objects (e.g., users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and may be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable, high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and to disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and performance of the Interactive Remote Instruction (IRI) system, a large-scale distributed system for collaborative distance learning. The filtering mechanism is an intrinsic component integrated into the architecture.
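The core of subscription-based event filtering can be sketched in a few lines. This is an illustrative toy, not the paper's architecture or API: monitors register predicates, and only events matching at least one subscription are forwarded, which is what reduces the event traffic flow.

```python
# Minimal sketch of subscription-based event filtering (all names illustrative):
# sinks subscribe with a predicate; publish() forwards an event only to the
# sinks whose predicate matches, dropping uninteresting traffic at the filter.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Event:
    source: str
    kind: str
    payload: dict

@dataclass
class EventFilter:
    subscriptions: list = field(default_factory=list)

    def subscribe(self, predicate: Callable[[Event], bool],
                  sink: Callable[[Event], None]) -> None:
        self.subscriptions.append((predicate, sink))

    def publish(self, event: Event) -> int:
        """Forward the event to matching sinks; return the delivery count."""
        delivered = 0
        for predicate, sink in self.subscriptions:
            if predicate(event):
                sink(event)
                delivered += 1
        return delivered

# A debugging tool subscribes only to error events from one component:
log = []
f = EventFilter()
f.subscribe(lambda e: e.kind == "error" and e.source == "nodeA", log.append)
f.publish(Event("nodeA", "error", {}))   # matches, delivered to the log
f.publish(Event("nodeB", "error", {}))   # filtered out
```

Distributing many such filters near the event sources, rather than one central one, is what keeps the monitoring computation from becoming a bottleneck.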

  5. Distributed computing as a virtual supercomputer: Tools to run and manage large-scale BOINC simulations

    NASA Astrophysics Data System (ADS)

    Giorgino, Toni; Harvey, M. J.; de Fabritiis, Gianni

    2010-08-01

    Distributed computing (DC) projects tackle large computational problems by exploiting the donated processing power of thousands of volunteered computers, connected through the Internet. To efficiently employ the computational resources of one of the world's largest DC efforts, GPUGRID, the project scientists require tools that handle hundreds of thousands of tasks which run asynchronously and generate gigabytes of data every day. We describe RBoinc, an interface that allows computational scientists to embed the DC methodology into the daily work-flow of high-throughput experiments. By extending the Berkeley Open Infrastructure for Network Computing (BOINC), the leading open-source middleware for current DC projects, with mechanisms to submit and manage large-scale distributed computations from individual workstations, RBoinc turns distributed grids into cost-effective virtual resources that can be employed by researchers in work-flows similar to conventional supercomputers. The GPUGRID project is currently using RBoinc for all of its in silico experiments based on molecular dynamics methods, including the determination of binding free energies and free energy profiles in all-atom models of biomolecules.

  6. Distributed weighted least-squares estimation with fast convergence for large-scale systems☆

    PubMed Central

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm is maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performance of the proposed methods. PMID:25641976
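The setting above can be illustrated with a much simpler iteration than the paper's algorithm. The sketch below is a plain Richardson (gradient) iteration on the global weighted least-squares normal equations, in which each sub-system contributes only its local term A_i^T W_i (b_i − A_i x); the fixed step `alpha` loosely plays the role of the scaling parameter mentioned in the abstract. All names and dimensions are illustrative assumptions.

```python
# Illustrative sketch (not the paper's algorithm): sub-system i holds a local
# linear measurement b_i = A_i x + noise with weight matrix W_i. The global WLS
# estimate solves (sum_i A_i^T W_i A_i) x = sum_i A_i^T W_i b_i; a Richardson
# iteration needs only each node's local gradient term, summed over the network.

import numpy as np

def distributed_wls(A_list, W_list, b_list, alpha, iters=1000):
    n = A_list[0].shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        # each sub-system computes its local term; the network sums them
        grad = sum(A.T @ W @ (b - A @ x)
                   for A, W, b in zip(A_list, W_list, b_list))
        x = x + alpha * grad
    return x

# Four sub-systems, two unknown parameters, noiseless data for illustration.
rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0])
A_list = [rng.standard_normal((3, 2)) for _ in range(4)]
W_list = [np.eye(3) for _ in range(4)]
b_list = [A @ x_true for A in A_list]
x_hat = distributed_wls(A_list, W_list, b_list, alpha=0.02)
```

The iteration converges when `alpha` is below 2 divided by the largest eigenvalue of the summed normal matrix, which is why the paper's optimized scaling and preconditioning matter for the convergence rate.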

  7. Large scale patterns of abundance and distribution of parasites in Mexican bumblebees.

    PubMed

    Gallot-Lavallée, Marie; Schmid-Hempel, Regula; Vandame, Rémy; Vergara, Carlos H; Schmid-Hempel, Paul

    2016-01-01

    Bumblebees are highly valued for their pollination services in natural ecosystems as well as for agricultural crops. These precious pollinators are known to be declining worldwide, and one major factor contributing to this decline is infection by parasites. Knowledge about parasites in wild bumblebee populations is thus of paramount importance for conservation purposes. We here report the geographical distribution of Crithidia and Nosema, two common parasites of bumblebees, in a yet poorly investigated country: Mexico. Based on sequence divergence of the Cytochrome b and glycosomal glyceraldehyde phosphate dehydrogenase (gGAPDH) genes, we discovered the presence of a new Crithidia species, which is mainly distributed in the southern half of the country. It is placed by Bayesian inference as a sister species to C. bombi. We suggest the name Crithidia mexicana for this newly discovered organism. A population of C. expoeki was encountered concentrated on the flanks of the dormant volcanic mountain, Iztaccihuatl, and microsatellite data showed evidence of a bottleneck in this population. This study is the first to provide a large-scale insight into the health status of endemic bumblebees in Mexico, based on a large sample size (n=3,285 bees examined) over a variety of host species and habitats. PMID:26678506

  8. Analysis of porosity distribution of large-scale porous media and their reconstruction by Langevin equation.

    PubMed

    Jafari, G Reza; Sahimi, Muhammad; Rasaei, M Reza; Tabar, M Reza Rahimi

    2011-02-01

    Several methods have been developed in the past for analyzing the porosity and other types of well logs for large-scale porous media, such as oil reservoirs, as well as their permeability distributions. We developed a method for analyzing the porosity logs ϕ(h) (where h is the depth) and similar data that are often nonstationary stochastic series. In this method one first generates a new stationary series based on the original data, and then analyzes the resulting series. It is shown that the series based on the successive increments of the log y(h)=ϕ(h+δh)-ϕ(h) is a stationary and Markov process, characterized by a Markov length scale h(M). The coefficients of the Kramers-Moyal expansion for the conditional probability density function (PDF) P(y,h|y(0),h(0)) are then computed. The resulting PDFs satisfy a Fokker-Planck (FP) equation, which is equivalent to a Langevin equation for y(h) that provides probabilistic predictions for the porosity logs. We also show that the Hurst exponent H of the self-affine distributions, which have been used in the past to describe the porosity logs, is directly linked to the drift and diffusion coefficients that we compute for the FP equation. Also computed are the level-crossing probabilities that provide insight into identifying the high or low values of the porosity beyond the depth interval in which the data have been measured. PMID:21405908
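The first analysis step described above, turning a nonstationary log into a stationary increment series, is easy to demonstrate. The sketch below uses synthetic data (a random walk standing in for a porosity log); it illustrates only the construction y(h) = ϕ(h+δh) − ϕ(h), not the paper's Kramers-Moyal or Fokker-Planck machinery.

```python
# Sketch of the increment-series construction: a random walk drifts (it is
# nonstationary), but its successive increments have stable mean and variance,
# which is the stationarity the analysis above relies on.

import numpy as np

def increment_series(phi: np.ndarray, step: int = 1) -> np.ndarray:
    """Successive increments of a log sampled on a uniform depth grid."""
    return phi[step:] - phi[:-step]

# Synthetic nonstationary 'porosity log': cumulative sum of unit-variance noise.
rng = np.random.default_rng(1)
phi = np.cumsum(rng.standard_normal(10_000))
y = increment_series(phi)

# phi wanders far from zero, while y stays near mean 0 with variance ~1.
```

For a real log one would go on to estimate the Markov length scale and the drift and diffusion coefficients of the Fokker-Planck equation from conditional statistics of y.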

  9. Polarimetric consequences of large-scale structure in the distribution of galaxies and quasars

    NASA Astrophysics Data System (ADS)

    Silant'ev, N. A.; Gnedin, Y. N.; Piotrovich, M. Yu.; Natsvlishvili, T. M.; Buliga, S. D.

    2010-12-01

    The problem of inhomogeneities in the distribution of galaxies and quasars over cosmological distances (cell structure) has been discussed in many papers. Here, in particular, we wish to draw attention to the polarimetric consequences of this structure. We discuss in detail the possibility of a large-scale rotation of the mean position angle of the observed polarization over the scale of the cellular structure. We mainly consider rotation mechanisms associated with polarized radiation from magnetized accretion disks near quasars and black holes. In that case the possible correlation of magnetic fields on cosmological scales will show up as a rotation of the mean position angle ranging from 0 to 45 degrees. Correlations in nonspherical formations of galaxies and quasars over cosmological distances also lead to rotation in the mean position angle over these distances. In principle, these two rotation mechanisms can, together, produce an arbitrary rotation of the mean position angle over distances corresponding to the inhomogeneous structure in the distribution of galaxies and quasars.

  10. Tropospheric aerosols: size-differentiated chemistry and large-scale spatial distributions.

    PubMed

    Hidy, George M; Mohnen, Volker; Blanchard, Charles L

    2013-04-01

    Worldwide interest in atmospheric aerosols has emerged since the late 20th century as a part of concerns for air pollution and radiative forcing of the earth's climate. The use of aircraft and balloons for sampling and the use of remote sensing have dramatically expanded knowledge about tropospheric aerosols. Our survey gives an overview of contemporary tropospheric aerosol chemistry based mainly on in situ measurements. It focuses on fine particles less than 1–2.5 μm in diameter. The physical properties of particles by region and altitude are exemplified by particle size distributions, total number and volume concentration, and optical parameters such as extinction coefficient and aerosol optical depth. Particle chemical characterization is size dependent, differentiated by ubiquitous sulfate and by carbon, partially from anthropogenic activity. Large-scale particle distributions extend to intra- and intercontinental proportions, involving plumes from population centers as well as natural disturbances such as dust storms and vegetation fires. In the marine environment, sea salt adds an important component to aerosols. Generally, aerosol components, most of whose sources are at the earth's surface, tend to dilute and decrease in concentration with height, but often show different (layered) profiles depending on meteorological conditions. Key microscopic processes include new particle formation aloft and cloud interactions, both cloud initiation and cloud evaporation. Measurement campaigns aloft are short term, giving snapshots of inherently transient phenomena in the troposphere. Nevertheless, these data, combined with long-term data at the surface and optical depth and transmission observations, yield a unique picture of global tropospheric particle chemistry. PMID:23687724

  11. Vertical Distributions of Sulfur Species Simulated by Large Scale Atmospheric Models in COSAM: Comparison with Observations

    SciTech Connect

    Lohmann, U.; Leaitch, W. R.; Barrie, Leonard A.; Law, K.; Yi, Y.; Bergmann, D.; Bridgeman, C.; Chin, M.; Christensen, J.; Easter, Richard C.; Feichter, J.; Jeuken, A.; Kjellstrom, E.; Koch, D.; Land, C.; Rasch, P.; Roelofs, G.-J.

    2001-11-01

    A comparison of large-scale models simulating atmospheric sulfate aerosols (COSAM) was conducted to increase our understanding of global distributions of sulfate aerosols and precursors. Earlier model comparisons focused on wet deposition measurements and sulfate aerosol concentrations in source regions at the surface. They found that different models simulated the observed sulfate surface concentrations mostly within a factor of two, but that the simulated column burdens and vertical profiles were very different amongst different models. In the COSAM exercise, one aspect is the comparison of sulfate aerosol and precursor gases above the surface. Vertical profiles of SO2, SO42-, oxidants and cloud properties were measured by aircraft during the North Atlantic Regional Experiment (NARE) in August/September 1993 off the coast of Nova Scotia and during the Second Eulerian Model Evaluation Field Study (EMEFSII) in central Ontario in March/April 1990. While no single model stands out as being best or worst, the general tendency is that those models simulating the full oxidant chemistry tend to agree best with observations, although differences in transport and treatment of clouds are important as well.

  12. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    NASA Astrophysics Data System (ADS)

    Garonne, V.; Vigne, R.; Stewart, G.; Barisits, M.; Beermann, T.; Lassnig, M.; Serfon, C.; Goossens, L.; Nairz, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation Distributed Data Management (DDM) system, benefiting from recent advances in cloud and "Big Data" computing to address the scaling requirements of HEP experiments. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large-scale data management capabilities, with more than 140 petabytes spread worldwide across 130 sites and access by more than 1,000 active users. However, DQ2 is reaching its limits in terms of scalability: it requires a large number of support staff to operate and is hard to extend with new technologies. Rucio addresses these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization and representation and a model of how to manage central group and user activities. The Rucio design, and the technology it employs, is described, looking specifically at its RESTful architecture and the various software components it uses. We also show the performance of the system.

  13. Communication interval selection in distributed heterogeneous simulation of large-scale dynamical systems

    NASA Astrophysics Data System (ADS)

    Lucas, Charles E.; Walters, Eric A.; Jatskevich, Juri; Wasynczuk, Oleg; Lamm, Peter T.

    2003-09-01

    In this paper, a new technique useful for the numerical simulation of large-scale systems is presented. This approach enables the overall system simulation to be formed by the dynamic interconnection of the various interdependent simulations, each representing a specific component or subsystem such as control, electrical, mechanical, hydraulic, or thermal. Each simulation may be developed separately using possibly different commercial-off-the-shelf simulation programs thereby allowing the most suitable language or tool to be used based on the design/analysis needs. These subsystems communicate the required interface variables at specific time intervals. A discussion concerning the selection of appropriate communication intervals is presented herein. For the purpose of demonstration, this technique is applied to a detailed simulation of a representative aircraft power system, such as that found on the Joint Strike Fighter (JSF). This system is comprised of ten component models each developed using MATLAB/Simulink, EASY5, or ACSL. When the ten component simulations were distributed across just four personal computers (PCs), a greater than 15-fold improvement in simulation speed (compared to the single-computer implementation) was achieved.
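The communication-interval mechanism described above can be shown with a toy co-simulation: two sub-systems integrate privately with their own internal step and exchange interface variables only at the communication boundaries. The two first-order models below are illustrative stand-ins, not the aircraft power system components.

```python
# Toy fixed-interval co-simulation: each sub-system sees a *held* copy of the
# other's state between exchanges, so the communication interval t_comm trades
# accuracy (smaller is closer to the monolithic simulation) against messaging
# cost (larger means fewer exchanges).

def simulate_coupled(t_end=1.0, t_comm=0.1, dt=0.001):
    x1, x2 = 1.0, 0.0          # states of the two sub-system simulations
    u12, u21 = x1, x2          # interface variables, frozen between exchanges
    t = 0.0
    while t < t_end:
        # each sub-system integrates privately over one communication interval
        t_next = min(t + t_comm, t_end)
        while t < t_next:
            x1 += dt * (-x1 + u21)   # sub-system 1 uses the held copy of x2
            x2 += dt * (-x2 + u12)   # sub-system 2 uses the held copy of x1
            t += dt
        u12, u21 = x1, x2            # exchange at the communication boundary
    return x1, x2

x1, x2 = simulate_coupled()
```

Because the inner loops are independent between exchanges, each sub-system can run on its own machine, which is the source of the multi-PC speedup reported in the abstract.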

  14. Large-scale spatial distribution patterns of gastropod assemblages in rocky shores.

    PubMed

    Miloslavich, Patricia; Cruz-Motta, Juan José; Klein, Eduardo; Iken, Katrin; Weinberger, Vanessa; Konar, Brenda; Trott, Tom; Pohle, Gerhard; Bigatti, Gregorio; Benedetti-Cecchi, Lisandro; Shirayama, Yoshihisa; Mead, Angela; Palomo, Gabriela; Ortiz, Manuel; Gobin, Judith; Sardi, Adriana; Díaz, Juan Manuel; Knowlton, Ann; Wong, Melisa; Peralta, Ana C

    2013-01-01

    Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal pattern of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs) followed by the Trochidae and the Columbellidae (6 LMEs). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results will generate more work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages. PMID

  15. Large-Scale Spatial Distribution Patterns of Echinoderms in Nearshore Rocky Habitats

    PubMed Central

    Iken, Katrin; Konar, Brenda; Benedetti-Cecchi, Lisandro; Cruz-Motta, Juan José; Knowlton, Ann; Pohle, Gerhard; Mead, Angela; Miloslavich, Patricia; Wong, Melisa; Trott, Thomas; Mieszkowska, Nova; Riosmena-Rodriguez, Rafael; Airoldi, Laura; Kimani, Edward; Shirayama, Yoshihisa; Fraschetti, Simonetta; Ortiz-Touzet, Manuel; Silva, Angelica

    2010-01-01

    This study examined echinoderm assemblages from nearshore rocky habitats for large-scale distribution patterns with specific emphasis on identifying latitudinal trends and large regional hotspots. Echinoderms were sampled from 76 globally-distributed sites within 12 ecoregions, following the standardized sampling protocol of the Census of Marine Life NaGISA project (www.nagisa.coml.org). Sample-based species richness was overall low (<1–5 species per site), with a total of 32 asteroid, 18 echinoid, 21 ophiuroid, and 15 holothuroid species. Abundance and species richness in intertidal assemblages sampled with visual methods (organisms >2 cm in 1 m2 quadrats) was highest in the Caribbean ecoregions and echinoids dominated these assemblages with an average of 5 ind m−2. In contrast, intertidal echinoderm assemblages collected from clearings of 0.0625 m2 quadrats had the highest abundance and richness in the Northeast Pacific ecoregions where asteroids and holothurians dominated with an average of 14 ind 0.0625 m−2. Distinct latitudinal trends existed for abundance and richness in intertidal assemblages with declines from peaks at high northern latitudes. No latitudinal trends were found for subtidal echinoderm assemblages with either sampling technique. Latitudinal gradients appear to be superseded by regional diversity hotspots. In these hotspots echinoderm assemblages may be driven by local and regional processes, such as overall productivity and evolutionary history. We also tested a set of 14 environmental variables (six natural and eight anthropogenic) as potential drivers of echinoderm assemblages by ecoregions. The natural variables of salinity, sea-surface temperature, chlorophyll a, and primary productivity were strongly correlated with echinoderm assemblages; the anthropogenic variables of inorganic pollution and nutrient contamination also contributed to correlations. Our results indicate that nearshore echinoderm assemblages appear to be shaped by a

  16. Large-Scale Spatial Distribution Patterns of Gastropod Assemblages in Rocky Shores

    PubMed Central

    Miloslavich, Patricia; Cruz-Motta, Juan José; Klein, Eduardo; Iken, Katrin; Weinberger, Vanessa; Konar, Brenda; Trott, Tom; Pohle, Gerhard; Bigatti, Gregorio; Benedetti-Cecchi, Lisandro; Shirayama, Yoshihisa; Mead, Angela; Palomo, Gabriela; Ortiz, Manuel; Gobin, Judith; Sardi, Adriana; Díaz, Juan Manuel; Knowlton, Ann; Wong, Melisa; Peralta, Ana C.

    2013-01-01

    Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal pattern of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs) followed by the Trochidae and the Columbellidae (6 LMEs). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results will generate more work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages. PMID

  17. Large-scale spatial distribution patterns of echinoderms in nearshore rocky habitats.

    PubMed

    Iken, Katrin; Konar, Brenda; Benedetti-Cecchi, Lisandro; Cruz-Motta, Juan José; Knowlton, Ann; Pohle, Gerhard; Mead, Angela; Miloslavich, Patricia; Wong, Melisa; Trott, Thomas; Mieszkowska, Nova; Riosmena-Rodriguez, Rafael; Airoldi, Laura; Kimani, Edward; Shirayama, Yoshihisa; Fraschetti, Simonetta; Ortiz-Touzet, Manuel; Silva, Angelica

    2010-01-01

    This study examined echinoderm assemblages from nearshore rocky habitats for large-scale distribution patterns with specific emphasis on identifying latitudinal trends and large regional hotspots. Echinoderms were sampled from 76 globally-distributed sites within 12 ecoregions, following the standardized sampling protocol of the Census of Marine Life NaGISA project (www.nagisa.coml.org). Sample-based species richness was overall low (<1-5 species per site), with a total of 32 asteroid, 18 echinoid, 21 ophiuroid, and 15 holothuroid species. Abundance and species richness in intertidal assemblages sampled with visual methods (organisms >2 cm in 1 m(2) quadrats) was highest in the Caribbean ecoregions and echinoids dominated these assemblages with an average of 5 ind m(-2). In contrast, intertidal echinoderm assemblages collected from clearings of 0.0625 m(2) quadrats had the highest abundance and richness in the Northeast Pacific ecoregions where asteroids and holothurians dominated with an average of 14 ind 0.0625 m(-2). Distinct latitudinal trends existed for abundance and richness in intertidal assemblages with declines from peaks at high northern latitudes. No latitudinal trends were found for subtidal echinoderm assemblages with either sampling technique. Latitudinal gradients appear to be superseded by regional diversity hotspots. In these hotspots echinoderm assemblages may be driven by local and regional processes, such as overall productivity and evolutionary history. We also tested a set of 14 environmental variables (six natural and eight anthropogenic) as potential drivers of echinoderm assemblages by ecoregions. The natural variables of salinity, sea-surface temperature, chlorophyll a, and primary productivity were strongly correlated with echinoderm assemblages; the anthropogenic variables of inorganic pollution and nutrient contamination also contributed to correlations. Our results indicate that nearshore echinoderm assemblages appear to be shaped by

  18. Design and implementation of a distributed large-scale spatial database system based on J2EE

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Chen, Nengcheng; Zhu, Xinyan; Zhang, Xia

    2003-03-01

    With the increasing maturity of distributed object technology, CORBA, .NET and EJB are widely used in the traditional IT field. However, the theory and practice of distributed spatial databases need further improvement, owing to the contradictions between large-scale spatial data and limited network bandwidth, and between transitory sessions and long transaction processing. The differences among, and trends of, CORBA, .NET and EJB are discussed in detail; afterwards, the concept, architecture and characteristics of a distributed large-scale seamless spatial database system based on J2EE are presented, comprising a GIS client application, web server, GIS application server and spatial data server. Moreover, the design and implementation of the components are explained: the GIS client application based on JavaBeans, the GIS engine based on servlets, and the GIS application server based on GIS Enterprise JavaBeans (containing session beans and entity beans). Experiments on the relation between spatial data volume and response time under different conditions were also conducted, which prove that a distributed spatial database system based on J2EE can be used to manage, distribute and share large-scale spatial data on the Internet. Lastly, a distributed large-scale seamless image database based on the Internet is presented.

  19. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…

  20. Conducting the NLM/AHCPR Large Scale Vocabulary Test: a distributed Internet-based experiment.

    PubMed Central

    McCray, A. T.; Cheh, M. L.; Bangalore, A. K.; Rafei, K.; Razi, A. M.; Divita, G.; Stavri, P. Z.

    1997-01-01

    The Large Scale Vocabulary Test, sponsored by the National Library of Medicine (NLM) and the Agency for Health Care Policy and Research (AHCPR), was conducted to determine the extent to which a combination of existing health-related terminologies cover vocabulary needed in health care information systems. The test was conducted over the Internet using a sophisticated World Wide Web interface with over 60 participants and over 40,000 terms submitted. This paper discusses the issues encountered in the design and execution of the experiment, including the design of the interface and the issues of recruitment, training, and guidance of remote participants over the Internet. Test data are currently undergoing expert review. Upon completion of the expert review, the results of the test will be fully reported. PMID:9357688

  1. Detectability of large-scale power suppression in the galaxy distribution

    NASA Astrophysics Data System (ADS)

    Gibelyou, Cameron; Huterer, Dragan; Fang, Wenjuan

    2010-12-01

    Suppression in primordial power on the Universe's largest observable scales has been invoked as a possible explanation for large-angle observations in the cosmic microwave background, and is allowed or predicted by some inflationary models. Here we investigate the extent to which such a suppression could be confirmed by the upcoming large-volume redshift surveys. For definiteness, we study a simple parametric model of suppression that improves the fit of the vanilla ΛCDM model to the angular correlation function measured by WMAP in cut-sky maps, and at the same time improves the fit to the angular power spectrum inferred from the maximum likelihood analysis presented by the WMAP team. We find that the missing power at large scales, favored by WMAP observations within the context of this model, will be difficult but not impossible to rule out with a large-volume (~100 Gpc³) galaxy redshift survey. A key requirement for success in ruling out power suppression will be having redshifts of most galaxies detected in the imaging survey.
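The abstract does not spell out its parametric model, so the sketch below uses a generic exponential large-scale cutoff purely for illustration: power is suppressed for wavenumbers well below a cutoff scale k_c and recovers the fiducial spectrum at small scales. The functional form, the cutoff value, and the power-law fiducial spectrum are all assumptions, not the paper's parametrization.

```python
# Illustrative large-scale power suppression: multiply a fiducial spectrum by
# a factor that vanishes for k << k_c and tends to 1 for k >> k_c.

import numpy as np

def suppressed_power(k, P_fiducial, k_c, alpha=2.0):
    """Fiducial spectrum times an exponential large-scale suppression factor."""
    return P_fiducial(k) * (1.0 - np.exp(-(k / k_c) ** alpha))

# Fiducial power-law spectrum with spectral index ~0.96 (amplitude arbitrary).
P0 = lambda k: k ** 0.96

k = np.logspace(-4, 0, 5)            # wavenumbers in arbitrary units
Pk = suppressed_power(k, P0, k_c=1e-3)
# Pk is strongly suppressed at the smallest k and matches P0 at the largest.
```

Detecting such a cutoff in a galaxy survey amounts to measuring the spectrum precisely at the few largest-scale modes a ~100 Gpc³ volume contains, which is why the constraint is marginal even for large surveys.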

  2. Large-Scale Genetic Structuring of a Widely Distributed Carnivore - The Eurasian Lynx (Lynx lynx)

    PubMed Central

    Rueness, Eli K.; Naidenko, Sergei; Trosvik, Pål; Stenseth, Nils Chr.

    2014-01-01

    Over the last decades the phylogeography and genetic structure of a multitude of species inhabiting Europe and North America have been described. The flora and fauna of the vast landmasses of north-eastern Eurasia are still largely unexplored in this respect. The Eurasian lynx is a large felid that is relatively abundant over much of the Russian sub-continent and the adjoining countries. Analyzing 148 museum specimens collected throughout its range over the last 150 years, we have described the large-scale genetic structuring in this highly mobile species. We investigated the spatial genetic patterns using mitochondrial DNA sequences (D-loop and cytochrome b) and 11 microsatellite loci, and describe three phylogenetic clades and a clear structuring along an east-west gradient. The most likely scenario is that the contemporary Eurasian lynx populations originated in central Asia and that parts of Europe were inhabited by lynx during the Pleistocene. After the Last Glacial Maximum (LGM), range expansions led to the colonization of north-western Siberia and Scandinavia from the Caucasus, and of north-eastern Siberia from a refugium further east. No evidence of a Beringian refugium could be detected in our data. We observed restricted gene flow and suggest that future studies of the Eurasian lynx explore to what extent the contemporary population structure may be explained by ecological variables. PMID:24695745

  3. Advancing a distributed multi-scale computing framework for large-scale high-throughput discovery in materials science

    NASA Astrophysics Data System (ADS)

    Knap, J.; Spear, C. E.; Borodin, O.; Leiter, K. W.

    2015-10-01

    We describe the development of a large-scale high-throughput application for discovery in materials science. Our point of departure is a computational framework for distributed multi-scale computation. We augment the original framework with a specialized module whose role is to route evaluation requests needed by the high-throughput application to a collection of available computational resources. We evaluate the feasibility and performance of the resulting high-throughput computational framework by carrying out a high-throughput study of battery solvents. Our results indicate that distributed multi-scale computing, by virtue of its adaptive nature, is particularly well-suited for building high-throughput applications.

  4. A High-Level Framework for Distributed Processing of Large-Scale Graphs

    NASA Astrophysics Data System (ADS)

    Krepska, Elzbieta; Kielmann, Thilo; Fokkink, Wan; Bal, Henri

    Distributed processing of real-world graphs is challenging due to their size and the inherent irregular structure of graph computations. We present hipg, a distributed framework that facilitates high-level programming of parallel graph algorithms by expressing them as a hierarchy of distributed computations executed independently and managed by the user. hipg programs are in general short and elegant; they achieve good portability, memory utilization and performance.

  5. Tail-scope: Using friends to estimate heavy tails of degree distributions in large-scale complex networks.

    PubMed

    Eom, Young-Ho; Jo, Hang-Hyun

    2015-01-01

    Many complex networks in natural and social phenomena have often been characterized by heavy-tailed degree distributions. However, due to the rapidly growing size of network data and privacy concerns about using such data, it is becoming more difficult to analyze complete data sets. Thus, it is crucial to devise effective and efficient methods for estimating the heavy tails of degree distributions in large-scale networks using only local information from a small fraction of sampled nodes. Here we propose a tail-scope method based on the local observational bias of the friendship paradox. We show that the tail-scope method outperforms uniform node sampling for estimating the heavy tails of degree distributions, while the opposite tendency is observed in the range of small degrees. To take advantage of both sampling methods, we devise a hybrid method that successfully recovers the whole range of degree distributions. Our tail-scope method shows how structural heterogeneities of large-scale complex networks can be used to effectively reveal the network structure with only limited local information. PMID:25959097
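    The friendship-paradox bias that the tail-scope method exploits can be sketched as follows (illustrative Python, not the authors' code; the graph generator, seeds, and sample sizes are assumptions): the degree of a random *friend* of a random node is biased toward high degrees, so friend samples probe the heavy tail better than uniform node samples.

```python
# Illustrative sketch of friendship-paradox sampling on a heavy-tailed graph.
import random

def build_pa_graph(n=2000, m=3, seed=42):
    """Grow a preferential-attachment-style graph as adjacency sets."""
    rng = random.Random(seed)
    adj = {i: set() for i in range(n)}
    pool = [0, 1, 2]  # nodes repeated roughly in proportion to their degree
    for v in range(m, n):
        targets = {rng.choice(pool) for _ in range(m)}
        for t in targets:
            adj[v].add(t)
            adj[t].add(v)
            pool.extend([t, v])
    return adj

def mean_degree_uniform(adj, k, rng):
    """Average degree over k uniformly sampled nodes."""
    return sum(len(adj[v]) for v in rng.sample(sorted(adj), k)) / k

def mean_degree_friend(adj, k, rng):
    """Average degree of a random neighbour of a random node."""
    nodes = sorted(adj)
    total, got = 0, 0
    while got < k:
        v = rng.choice(nodes)
        if adj[v]:  # skip the rare isolated node
            total += len(adj[rng.choice(sorted(adj[v]))])
            got += 1
    return total / k

adj = build_pa_graph()
rng = random.Random(0)
uni = mean_degree_uniform(adj, 200, rng)
fri = mean_degree_friend(adj, 200, rng)
print(fri > uni)  # the friendship paradox: friends have more friends
```

With fixed seeds the friend sample's mean degree comfortably exceeds the uniform sample's, which is the bias the tail-scope estimator turns to its advantage.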

  6. DC-DC Converter Topology Assessment for Large Scale Distributed Photovoltaic Plant Architectures

    SciTech Connect

    Agamy, Mohammed S; Harfman-Todorovic, Maja; Elasser, Ahmed; Sabate, Juan A; Steigerwald, Robert L; Jiang, Yan; Essakiappan, Somasundaram

    2011-07-01

    Distributed photovoltaic (PV) plant architectures are emerging as a replacement for the classical central inverter based systems. However, power converters of smaller ratings may have a negative impact on system efficiency, reliability and cost. Therefore, it is necessary to design converters with very high efficiency and simpler topologies in order not to offset the benefits gained by using distributed PV systems. In this paper an evaluation of the selection criteria for dc-dc converters for distributed PV systems is performed; this evaluation includes efficiency, simplicity of design, reliability and cost. Based on this evaluation, recommendations can be made as to which class of converters is best fit for this application.

  7. Probing large scale homogeneity and periodicity in the LRG distribution using Shannon entropy

    NASA Astrophysics Data System (ADS)

    Pandey, Biswajit; Sarkar, Suman

    2016-05-01

    We quantify the degree of inhomogeneity in the Luminous Red Galaxy (LRG) distribution from the SDSS DR7 as a function of length scale by measuring the Shannon entropy in independent, regular cubic voxels of increasing grid sizes. We also analyze the data by carrying out measurements in overlapping spheres and find that this suppresses inhomogeneities by a factor of 5 to 10 on different length scales. Despite the differences observed in the degree of inhomogeneity, both methods show a decrease in inhomogeneity with increasing length scale that eventually settles to a plateau at ~150 h⁻¹ Mpc. Considering the minuscule values of inhomogeneity at the plateau and their expected variations, we conclude that the LRG distribution becomes homogeneous at 150 h⁻¹ Mpc and beyond. We also use the Kullback-Leibler divergence as an alternative measure of inhomogeneity, which reaffirms our findings. We show that the method presented here can effectively capture the inhomogeneity of a truly inhomogeneous distribution at all length scales. We analyze a set of Monte Carlo simulations with certain periodicities in their spatial distributions and find periodic variations in their inhomogeneity, which helps us identify the underlying regularities present in such distributions and quantify the scale of their periodicity. We do not find any underlying regularities in the LRG distribution within the length scales probed.
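    The counts-in-cells entropy measure can be sketched in miniature (illustrative Python, not the authors' code; grid size and point counts are arbitrary): the Shannon entropy of voxel counts reaches its maximum, log2 of the number of voxels, for a perfectly homogeneous distribution, so the deficit 1 - H/H_max serves as a simple inhomogeneity measure.

```python
# Counts-in-cells inhomogeneity: entropy deficit of voxel occupation counts.
import math
import random

def inhomogeneity(points, grid=4):
    """Return 1 - H/H_max for counts-in-cells on a grid**3 partition of [0,1)^3."""
    counts = {}
    for x, y, z in points:
        key = (int(x * grid), int(y * grid), int(z * grid))
        counts[key] = counts.get(key, 0) + 1
    total = sum(counts.values())
    h = -sum(c / total * math.log2(c / total) for c in counts.values())
    return 1.0 - h / math.log2(grid ** 3)

rng = random.Random(1)
uniform_pts = [(rng.random(), rng.random(), rng.random()) for _ in range(20000)]
# All points crammed into one corner of the box: maximally clustered.
clustered_pts = [(0.1 * rng.random(), 0.1 * rng.random(), 0.1 * rng.random())
                 for _ in range(20000)]
print(inhomogeneity(uniform_pts) < inhomogeneity(clustered_pts))
```

The uniform sample's deficit is close to zero while the single-voxel clustered sample's is exactly one, mirroring how the measure plateaus once a distribution becomes homogeneous on the voxel scale.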

  9. Dark matter and formation of large scale structure in the universe - The test by distribution of quasars

    NASA Astrophysics Data System (ADS)

    Fang, L.; Chu, Y.; Zhu, X.

    1985-05-01

    According to the scenario developed in the previous paper on the formation of large-scale structure in the universe, it would be expected that: (1) the distribution of quasars should differ from that of galaxies, because it has no strong inhomogeneity on scales of 10-100 Mpc; and (2) the distributions of quasars with z > 2 and z < 2 should differ from each other, because of the absence of large-scale structure in the former but its presence in the latter. Various analyses of the quasar distribution are consistent with these predictions. In particular, the nearest-neighbor test for the complete quasar sample of Savage and Bolton (1979) clearly shows that the distribution of z > 2 quasars is rather homogeneous, while the z < 2 quasars have a tendency to cluster.
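    A nearest-neighbor test of this kind can be sketched in miniature (an illustrative 2-D version with synthetic points; not Savage and Bolton's exact procedure): a clustered sample has a markedly smaller mean nearest-neighbor distance than a homogeneous sample of the same size in the same box.

```python
# Toy nearest-neighbour test: clustered vs. homogeneous point sets.
import math
import random

def mean_nn_distance(points):
    """Mean distance from each point to its nearest neighbour (O(n^2) scan)."""
    total = 0.0
    for i, p in enumerate(points):
        total += min(math.dist(p, q) for j, q in enumerate(points) if j != i)
    return total / len(points)

rng = random.Random(7)
homogeneous = [(rng.random(), rng.random()) for _ in range(200)]
# Clustered sample: 200 points bunched tightly around 5 random centres.
centres = [(rng.random(), rng.random()) for _ in range(5)]
clustered = []
for _ in range(200):
    cx, cy = centres[rng.randrange(5)]
    clustered.append((cx + rng.gauss(0, 0.01), cy + rng.gauss(0, 0.01)))

print(mean_nn_distance(clustered) < mean_nn_distance(homogeneous))
```

In the paper's terms, a homogeneous high-redshift quasar sample would behave like the first set, a clustered low-redshift sample like the second.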

  10. Empirical distributions of F(ST) from large-scale human polymorphism data.

    PubMed

    Elhaik, Eran

    2012-01-01

    Studies of the apportionment of human genetic variation have long established that most human variation is within population groups and that the additional variation between population groups is small but greatest when comparing different continental populations. These studies often used Wright's F(ST), which apportions the standardized variance in allele frequencies within and between population groups. Because local adaptations increase population differentiation, high F(ST) may be found at closely linked loci under selection and used to identify genes undergoing directional or heterotic selection. We re-examined these processes using HapMap data. We analyzed 3 million SNPs in 602 samples from eight worldwide populations and a consensus subset of 1 million SNPs found in all populations. We identified four major features of the data. First, a hierarchical F(ST) analysis showed that only a small fraction (12%) of the total genetic variation is distributed between continental populations, and even less (1%) is found between intra-continental populations. Second, the global F(ST) distribution closely follows an exponential distribution. Third, although the overall F(ST) distribution is similarly shaped (inverse J), F(ST) distributions vary markedly by allele frequency when divided into non-overlapping groups by allele frequency range. Because the mean allele frequency is a crude indicator of allele age, these distributions mark the time-dependent change in genetic differentiation. Finally, the change in mean F(ST) of these groups is linear in allele frequency. These results suggest that investigating the extremes of the F(ST) distribution for each allele frequency group is more efficient for detecting selection. Consequently, we demonstrate that such extreme SNPs are more clustered along the chromosomes than expected from linkage disequilibrium for each allele frequency group. These genomic regions are therefore likely candidates for natural selection.
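    Wright's F(ST) can be sketched for a single biallelic SNP in its minimal textbook heterozygosity form (illustrative only; large-scale studies such as this one use more elaborate estimators, and the function name here is an assumption): the proportional reduction of heterozygosity within subpopulations relative to the pooled total.

```python
# Textbook F_ST = (H_T - H_S) / H_T for one biallelic SNP.
def fst(subpop_freqs):
    """Compute F_ST from per-subpopulation allele frequencies (equal weights)."""
    k = len(subpop_freqs)
    p_bar = sum(subpop_freqs) / k                              # pooled frequency
    h_t = 2.0 * p_bar * (1.0 - p_bar)                          # total heterozygosity
    h_s = sum(2.0 * p * (1.0 - p) for p in subpop_freqs) / k   # mean within-pop
    return 0.0 if h_t == 0.0 else (h_t - h_s) / h_t

print(round(fst([0.2, 0.8]), 2))  # strongly differentiated subpopulations: 0.36
print(fst([0.5, 0.5]))            # identical subpopulations: 0.0
```

Loci with F(ST) in the extreme upper tail of the per-allele-frequency distribution are the candidates for selection discussed in the abstract.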

  11. Analysis of large-scale distributed knowledge sources via autonomous cooperative graph mining

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Ortiz, Andres; Yan, Xifeng

    2014-05-01

    In this paper, we present a model for processing distributed relational data across multiple autonomous heterogeneous computing resources in environments with limited control, resource failures, and communication bottlenecks. Our model exploits dependencies in the data to enable collaborative distributed querying in noisy data. The collaboration policy for computational resources is efficiently constructed from the belief propagation algorithm. To scale to large data sizes, we employ a combination of priority-based filtering, incremental processing, and communication compression techniques. Our solution achieved high accuracy of analysis results and orders of magnitude improvements in computation time compared to the centralized graph matching solution.

  12. Distribution of large-scale contractional tectonic landforms on Mercury: Implications for the origin of global stresses

    NASA Astrophysics Data System (ADS)

    Watters, Thomas R.; Selvans, Michelle M.; Banks, Maria E.; Hauck, Steven A.; Becker, Kris J.; Robinson, Mark S.

    2015-05-01

    The surface of Mercury is dominated by contractional tectonic landforms that are evidence of global-scale crustal deformation. Using MESSENGER orbital high-incidence angle imaging and topographic data, large-scale lobate thrust fault scarps have been mapped globally. The spatial distribution and areal density of the contractional landforms are not uniform; concentrations occur in longitudinal bands and between the north and south hemispheres. Their orientations are generally north-south at low latitude to midlatitude and east-west at high latitudes. The spatial distribution and distribution of orientations of these large-scale contractional features suggest that planet-wide contraction due to interior cooling cannot be the sole source of global stresses. The nonrandom orientations are best explained by a combination of stresses from global contraction and tidal despinning combined with an equator-to-pole variation in lithospheric thickness, while the nonuniform areal density of the contractional features may indicate the influence of mantle downwelling or heterogeneities in lithospheric strength.

  13. Combining local- and large-scale models to predict the distributions of invasive plant species.

    PubMed

    Jones, Chad C; Acker, Steven A; Halpern, Charles B

    2010-03-01

    Habitat distribution models are increasingly used to predict the potential distributions of invasive species and to inform monitoring. However, these models assume that species are in equilibrium with the environment, which is clearly not true for most invasive species. Although this assumption is frequently acknowledged, solutions have not been adequately addressed. There are several potential methods for improving habitat distribution models. Models that require only presence data may be more effective for invasive species, but this has rarely been tested. In addition, combining modeling types to form "ensemble" models may improve the accuracy of predictions. However, even with these improvements, models developed for recently invaded areas are greatly influenced by the current distributions of species and thus reflect near- rather than long-term potential for invasion. Larger-scale models from species' native and invaded ranges may better reflect long-term invasion potential, but they lack finer-scale resolution. We compared logistic regression (which uses presence/absence data) and two presence-only methods for modeling the potential distributions of three invasive plant species on the Olympic Peninsula in Washington, USA. We then combined the three methods to create ensemble models. We also developed climate envelope models for the same species based on larger-scale distributions and combined models from multiple scales to create an index of near- and long-term invasion risk to inform monitoring in Olympic National Park (ONP). Neither presence-only nor ensemble models were more accurate than logistic regression for any of the species. Larger-scale models predicted much greater areas at risk of invasion. Our index of near- and long-term invasion risk indicates that < 4% of ONP is at high near-term risk of invasion while 67-99% of the Park is at moderate or high long-term risk of invasion. We demonstrate how modeling results can be used to guide the

  14. Northern dwarf and low surface brightness galaxies. IV - The large-scale space distribution

    NASA Technical Reports Server (NTRS)

    Thuan, Trinh X.; Alimi, Jean-Michel; Gott, J. Richard, III; Schneider, Stephen E.

    1991-01-01

    Results are reported from a statistical analysis of published observational data on a sample of 860 northern dwarf and low-surface-brightness (D/LSB) galaxies with delta = 0 deg or greater and b between -40 and 40 deg, selected from the Uppsala General Catalogue of Galaxies (Nilson et al., 1973). The results are presented in extensive redshift/space maps, histograms, graphs and tables and characterized in detail. It is shown that the distribution of D/LSB galaxies closely resembles that of bright galaxies, apparently ruling out biased-star-formation models that predict a uniform distribution of D/LSBs. Although bright galaxies outside clusters are somewhat more clustered than the H I-rich D/LSBs, the latter's pairwise peculiar velocity (460 ± 50 km/s) is similar to that of the former.

  15. Anthropogenic aerosols and the distribution of past large-scale precipitation change

    NASA Astrophysics Data System (ADS)

    Wang, Chien

    2015-12-01

    The climate response of precipitation to the effects of anthropogenic aerosols is a critical but not yet fully understood aspect of climate science. Results from selected models that participated in the Coupled Model Intercomparison Project Phase 5, together with data from the Twentieth Century Reanalysis Project, suggest that, throughout the tropics and also in the extratropical Northern Hemisphere, aerosols largely dominated the distribution of precipitation changes relative to the preindustrial era during the second half of the last century. Aerosol-induced cooling has offset some of the warming caused by greenhouse gases from the tropics to the Arctic and thus formed the gradients of surface temperature anomaly that enable the revealed precipitation change patterns to occur. Improved representation of aerosol-cloud interaction has been demonstrated to be the key factor enabling models to reproduce distributions of past precipitation change consistent with the reanalysis data.

  16. Large scale patterns in vertical distribution and behaviour of mesopelagic scattering layers.

    PubMed

    Klevjer, T A; Irigoien, X; Røstad, A; Fraile-Nuez, E; Benítez-Barrios, V M; Kaartvedt, S

    2016-01-01

    Recent studies suggest that previous estimates of mesopelagic biomasses are severely biased, with the new, higher estimates underlining the need to unveil behaviourally mediated coupling between shallow and deep ocean habitats. We analysed vertical distribution and diel vertical migration (DVM) of mesopelagic acoustic scattering layers (SLs) recorded at 38 kHz across oceanographic regimes encountered during the circumglobal Malaspina expedition. Mesopelagic SLs were observed in all areas covered, but vertical distributions and DVM patterns varied markedly. The distribution of mesopelagic backscatter was deepest in the southern Indian Ocean (weighted mean daytime depth: WMD 590 m) and shallowest at the oxygen minimum zone in the eastern Pacific (WMD 350 m). DVM was evident in all areas covered, on average ~50% of mesopelagic backscatter made daily excursions from mesopelagic depths to shallow waters. There were marked differences in migrating proportions between the regions, ranging from ~20% in the Indian Ocean to ~90% in the Eastern Pacific. Overall the data suggest strong spatial gradients in mesopelagic DVM patterns, with implied ecological and biogeochemical consequences. Our results suggest that parts of this spatial variability can be explained by horizontal patterns in physical-chemical properties of water masses, such as oxygen, temperature and turbidity. PMID:26813333
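    The two summary statistics quoted above, weighted mean depth (WMD) and the migrating proportion, can be illustrated with a toy daytime/night-time backscatter profile (hypothetical numbers and a deliberately crude estimator; the paper's actual computation may differ):

```python
# Toy WMD and migrating-proportion calculation from a binned backscatter profile.
def weighted_mean_depth(depths, backscatter):
    """Backscatter-weighted mean depth of a profile."""
    return sum(d * s for d, s in zip(depths, backscatter)) / sum(backscatter)

depths = [100, 300, 500, 700]   # depth-bin centres (m)
day    = [1.0, 2.0, 6.0, 1.0]   # hypothetical daytime backscatter per bin
night  = [5.0, 3.0, 1.0, 1.0]   # hypothetical night-time backscatter per bin

wmd_day = weighted_mean_depth(depths, day)
# Backscatter gained above 400 m at night, relative to the daytime deep
# (>400 m) backscatter: the fraction of the deep community that migrated up.
migrating = ((night[0] + night[1]) - (day[0] + day[1])) / (day[2] + day[3])
print(wmd_day, round(migrating, 2))  # 440.0 0.71
```

Regional contrasts such as the ~20% (Indian Ocean) versus ~90% (eastern Pacific) migrating proportions correspond to changes in this ratio across profiles.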

  18. Large-Scale Merging of Histograms using Distributed In-Memory Computing

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Ganis, Gerardo

    2015-12-01

    Most high-energy physics analysis jobs are embarrassingly parallel except for the final merging of the output objects, which are typically histograms. Currently, the merging of output histograms scales badly: the running time of distributed merging depends not only on the overall number of bins but also on the number of partial histogram output files. That means that while the time to analyze data decreases linearly with the number of worker nodes, the time to merge the histograms actually increases with the number of worker nodes. On the grid, merging jobs that take a few hours are not unusual. To improve the situation, we present a distributed and decentralized merging algorithm whose running time is independent of the number of worker nodes. We exploit the full bisection bandwidth of local networks and keep all intermediate results in memory. We present benchmarks from an implementation using the Parallel ROOT Facility (PROOF) and RAMCloud, a distributed key-value store that keeps all data in DRAM.
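    The scaling argument can be sketched with a minimal pairwise tree merge (plain Python Counters standing in for ROOT histograms; this is an assumed sketch, not the PROOF/RAMCloud implementation): the number of merge rounds on the critical path grows as log2 of the number of partial outputs rather than linearly.

```python
# Log-depth pairwise merging of partial histograms.
from collections import Counter

def merge_round(histos):
    """One round: merge histograms pairwise, n inputs -> ceil(n/2) outputs."""
    out = []
    for i in range(0, len(histos), 2):
        merged = Counter()
        for h in histos[i:i + 2]:
            merged.update(h)  # per-bin addition
        out.append(merged)
    return out

def tree_merge(histos):
    """Repeat pairwise rounds until one histogram remains."""
    rounds = 0
    while len(histos) > 1:
        histos = merge_round(histos)
        rounds += 1
    return histos[0], rounds

partials = [Counter({"bin0": i, "bin1": 1}) for i in range(8)]
total, rounds = tree_merge(partials)
print(total["bin0"], total["bin1"], rounds)  # 28 8 3
```

Eight partial histograms are reduced in three rounds; in a distributed setting each round's merges run in parallel on different nodes, which is what keeps the wall-clock merge time roughly independent of the worker count.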

  19. Exploring the Potential of Large Scale Distributed Modeling of Snow Accumulation and Melt on GPUs

    NASA Astrophysics Data System (ADS)

    Bisht, G.; Kumar, M.

    2010-12-01

    Water from snow melt is a critical resource in watersheds of the western US, Canada, and other similar regions of the world. The distribution of snow and melt-water controls the temporal and spatial distributions of soil moisture, evapo-transpiration (ET), recharge, stream-aquifer interaction and other hydrologic processes within the watershed. It also influences the quantity and timing of water availability in downstream areas. In spite of the serious impacts on water resources at multiple scales, the knowledge base for prediction of snow accumulation and melt in mountainous watersheds is notably weak. Physics-based, distributed snow models such as UEB, SNTHERM, SHAW and ISNOBAL have positioned themselves as appropriate tools for understanding snow-process interactions and predicting melt, and have been applied in numerous watersheds with varying degrees of success. In spite of significant advances in hardware speed and programming efficiency, the application of the above-mentioned snow models has mostly been limited to small watersheds. Application of these models at finer spatio-temporal resolution, in large domains, and for longer time periods, to address problems such as quantifying the response of snow-dominated watersheds to climate change scenarios, is restricted by the large computational cost involved. Additionally, the computational requirements of current-generation snow models are expected to rise as improved snow-depth characterization and a tighter coupling with hydrologic processes are incorporated. This poses a considerable challenge to their application in feasible time. We suggest alleviating this problem by taking advantage of high performance computing (HPC) systems based on Graphics Processing Unit (GPU) processors. High performance GPUs work like SIMD processors, but can take advantage of a larger number of cores, thus providing higher throughput. As of June 2010, the second fastest supercomputer in the world uses NVidia Tesla

  20. Large-Scale CORBA-Distributed Software Framework for NIF Controls

    SciTech Connect

    Carey, R W; Fong, K W; Sanchez, R J; Tappero, J D; Woodruff, J P

    2001-10-16

    The Integrated Computer Control System (ICCS) is based on a scalable software framework that is distributed over some 325 computers throughout the NIF facility. The framework provides templates and services at multiple levels of abstraction for the construction of software applications that communicate via CORBA (Common Object Request Broker Architecture). Various forms of object-oriented software design patterns are implemented as templates to be extended by application software. Developers extend the framework base classes to model the numerous physical control points, thereby sharing the functionality defined by the base classes. About 56,000 software objects, each individually addressable through CORBA, are to be created in the complete ICCS. Most objects have a persistent state that is initialized at system start-up and stored in a database. Additional framework services are provided by centralized server programs that implement events, alerts, reservations, message logging, database/file persistence, name services, and process management. The ICCS software framework approach allows for efficient construction of a software system that supports a large number of distributed control points representing a complex control application.

  1. Large-scale distribution and production of bacterioplankton in the Adriatic Sea

    NASA Astrophysics Data System (ADS)

    Gallina, Alessandra A.; Celussi, Mauro; Del Negro, Paola

    2011-08-01

    Two oceanographic cruises encompassing the whole Adriatic Sea were carried out during February and October 2008. Selected stations were sampled at several depths to determine total prokaryote and picocyanobacteria abundance using epifluorescence microscopy, and to estimate prokaryotic carbon production by 3H-leucine incorporation. Biological data were related to physical parameters including temperature, salinity and fluorescence, and an attempt was made to associate bacterial dynamics with water mass characteristics. In both seasons, prokaryotic distribution and production showed a decreasing latitudinal gradient, likely dependent on riverine inputs, as highlighted by a strong negative correlation with salinity (P < 0.001). A vertical gradient, with higher cell numbers at the surface and lower values in the bottom layer, was also always detected. In the southern basin in February, however, picocyanobacteria were also retrieved in deep waters, probably linked to higher nutrient loads carried by the Levantine Intermediate Waters and/or the deep-water ventilation known to occur in this area. From an oceanographic point of view, we sampled within four different water types, but no relationship between these water types and bacterioplankton abundances was found. The present work contributes to a more holistic overview of prokaryotic distribution and production in the Adriatic Sea, on both spatial and temporal scales.

  2. Constraining Neutrino mass using the large scale HI distribution in the Post-reionization epoch

    NASA Astrophysics Data System (ADS)

    Pal, Ashis Kumar; Guha Sarkar, Tapomoy

    2016-04-01

    The neutral intergalactic medium in the post-reionization epoch allows us to study cosmological structure formation through observations of the redshifted 21 cm signal and the Lyman-alpha forest. We investigate the possibility of measuring the total neutrino mass, which suppresses power in the matter power spectrum, through its imprint on the cross-correlation power spectrum of the 21 cm signal and the Lyman-alpha forest. We consider a radio-interferometric measurement of the 21 cm signal with an SKA1-mid-like radio telescope and a BOSS-like Lyman-alpha forest survey. A Fisher matrix analysis shows that at the fiducial redshift z = 2.5, a 10,000 hr 21-cm observation distributed equally over 25 radio pointings, combined with a Lyman-alpha forest survey with 30 quasar lines of sight in 1 deg², allows us to measure Ων at the 3.25% level. A total of 25,000 hr of radio-interferometric observation distributed equally over 25 radio pointings and a Lyman-alpha survey with n̄ = 60 deg⁻² would allow Ων to be measured at the 2.26% level. This corresponds to an idealized measurement of ∑mν at a precision of (100 ± 2.26) meV and of fν = Ων/Ωm at the 2.49% level.

  3. Cost-Efficient and Multi-Functional Secure Aggregation in Large Scale Distributed Application

    PubMed Central

    Zhang, Ping; Li, Wenjun; Sun, Hua

    2016-01-01

    Secure aggregation is an essential component of modern distributed applications and data mining platforms. Aggregated statistical results are typically adopted in constructing a data cube for data analysis at multiple abstraction levels in data warehouse platforms. Generating different types of statistical results efficiently at the same time (referred to as multi-functional support) is a fundamental requirement in practice. However, most existing schemes support a very limited number of statistics. Securely obtaining typical statistical results simultaneously in a distributed system, without recovering the original data, is still an open problem. In this paper, we present SEDAR, a SEcure Data Aggregation scheme under the Range segmentation model. The range segmentation model is proposed to reduce communication cost by capturing the data characteristics; different ranges use different aggregation strategies. For raw data in the dominant range, SEDAR encodes them into well-defined vectors to provide value preservation and order preservation, and thus provides the basis for multi-functional aggregation. A homomorphic encryption scheme is used to achieve data privacy. We also present two enhanced versions: the first is a Random-based SEDAR (REDAR), and the second is a Compression-based SEDAR (CEDAR). Both can significantly reduce communication cost, at the trade-off of lower security and lower accuracy, respectively. Experimental evaluations, based on six different scenes of real data, show that all of them perform excellently on cost and accuracy. PMID:27551747

  4. Large-scale distribution of hybridogenetic lineages in a Spanish desert ant

    PubMed Central

    Darras, Hugo; Leniaud, Laurianne; Aron, Serge

    2014-01-01

    Recently, a unique case of hybridogenesis at the social level was reported in local populations of Cataglyphis desert ants. Queens mate with males originating from a different genetic lineage than their own to produce hybrid workers, but they use parthenogenesis for the production of reproductive offspring (males and females). As a result, non-reproductive workers are all inter-lineage hybrids, whereas the sexual line is purely maternal. Here, we show that this unorthodox reproductive system occurs in all populations of the ant Cataglyphis hispanica. Remarkably, workers are hybrids of the same two genetic lineages along a 400 km transect crossing the whole distribution range of the species. These results indicate that social hybridogenesis in C. hispanica allows the maintenance of two highly divergent genetic lineages over time and across a large geographical scale, despite their constant hybridization. The widespread occurrence of social hybridogenesis in C. hispanica suggests that this reproductive strategy has been evolutionarily conserved over a long period. PMID:24225458

  5. Cost-Efficient and Multi-Functional Secure Aggregation in Large Scale Distributed Application.

    PubMed

    Zhang, Ping; Li, Wenjun; Sun, Hua

    2016-01-01

    Secure aggregation is an essential component of modern distributed applications and data mining platforms. Aggregated statistical results are typically used to construct a data cube for data analysis at multiple abstraction levels in data warehouse platforms. Generating different types of statistical results efficiently at the same time (referred to as multi-functional support) is a fundamental requirement in practice. However, most existing schemes support a very limited number of statistics. Securely obtaining typical statistical results simultaneously in a distributed system, without recovering the original data, is still an open problem. In this paper, we present SEDAR, a SEcure Data Aggregation scheme under the Range segmentation model. The range segmentation model is proposed to reduce communication cost by capturing the data characteristics; each range uses a different aggregation strategy. For raw data in the dominant range, SEDAR encodes them into well-defined vectors that preserve both value and order, and thus provides the basis for multi-functional aggregation. A homomorphic encryption scheme is used to achieve data privacy. We also present two enhanced versions: a Random-based SEDAR (REDAR) and a Compression-based SEDAR (CEDAR). Both can significantly reduce communication cost, at the cost of lower security and lower accuracy, respectively. Experimental evaluations on six different real-data scenarios show that all of them achieve excellent cost and accuracy performance. PMID:27551747
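    The vector-encoding idea above can be illustrated with a toy sketch (a hypothetical simplification, not the paper's actual scheme): each reading in the dominant range is encoded as a one-hot bucket vector, the aggregator only ever sums vectors (the operation an additively homomorphic cipher would perform on ciphertexts), and several statistics are read off the single aggregated vector. Bucket count and range bounds are illustrative.

```python
# Toy sketch of range-segmented vector encoding for multi-functional
# aggregation. The homomorphic encryption layer is omitted for brevity;
# a real deployment would sum ciphertexts instead of plaintext vectors.

def encode(value, lo, hi, buckets):
    """Encode a value in the dominant range [lo, hi) as a one-hot
    bucket vector; summing such vectors is the only operation the
    aggregator needs."""
    width = (hi - lo) / buckets
    idx = min(int((value - lo) / width), buckets - 1)
    vec = [0] * buckets
    vec[idx] = 1
    return vec

def aggregate(vectors):
    """Component-wise sum -- the homomorphic operation."""
    return [sum(col) for col in zip(*vectors)]

def stats(agg, lo, hi):
    """Derive several statistics from one aggregated vector."""
    buckets = len(agg)
    width = (hi - lo) / buckets
    centers = [lo + (i + 0.5) * width for i in range(buckets)]
    count = sum(agg)
    total = sum(c * n for c, n in zip(centers, agg))
    nonzero = [i for i, n in enumerate(agg) if n]
    return {
        "count": count,
        "mean": total / count,
        "min_bucket": centers[nonzero[0]],   # order preservation
        "max_bucket": centers[nonzero[-1]],
    }

readings = [12.0, 17.5, 44.0, 23.3, 38.1]
agg = aggregate([encode(v, 0.0, 50.0, 10) for v in readings])
summary = stats(agg, 0.0, 50.0)
```

    Note that a single aggregated vector yields count, approximate mean, and bucketed min/max at once, which is the sense in which the encoding is multi-functional.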

  6. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex, Hydrogeologic Systems

    NASA Astrophysics Data System (ADS)

    Wolfsberg, A.; Kang, Q.; Li, C.; Ruskauff, G.; Bhark, E.; Freeman, E.; Prothro, L.; Drellack, S.

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The
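    The Monte Carlo sensitivity step described above can be sketched generically. The toy one-dimensional travel-time model and the parameter ranges below are illustrative assumptions, not the UGTA flow and transport models: sample the uncertain parameters, push each sample through the model, and summarize the spread of the prediction.

```python
import random, statistics

# Toy Monte Carlo parameter-uncertainty propagation: hydraulic
# conductivity K (log-uniform) and retardation factor R (uniform)
# feed a trivial travel-time model t = L * R / v with seepage
# velocity v = K * i / n. All values are illustrative.

random.seed(42)
L_m, gradient, porosity = 5000.0, 0.002, 0.3

times = []
for _ in range(10000):
    K = 10 ** random.uniform(-6, -4)      # m/s, log-uniform sample
    R = random.uniform(1.0, 10.0)         # retardation factor
    v = K * gradient / porosity           # seepage velocity, m/s
    times.append(L_m * R / v / 3.15e7)    # arrival time in years

median_yr = statistics.median(times)
q = statistics.quantiles(times, n=100)
p5, p95 = q[4], q[94]                     # 5th / 95th percentiles
```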

  7. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    SciTech Connect

    Sig Drellack, Lance Prothro

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  8. Investigation of Homogeneity and Matter Distribution on Large Scales Using Large Quasar Groups

    NASA Astrophysics Data System (ADS)

    Li, Ming-Hua

    2015-12-01

    We use 20 large quasar group (LQG) samples in Park et al. (2015) to investigate the homogeneity of the 0.3 ≲ z ≲ 1.6 Universe (z denotes the redshift). For comparison, we also employ the 12 LQG samples at 0.5 ≲ z ≲ 2 in Komberg et al. (1996) in the analysis. We calculate the bias factor b and the two-point correlation function ξ_LQG for such groups for three different density profiles of the LQG dark matter halos, i.e. the isothermal profile, the Navarro-Frenk-White (NFW) profile, and the (gravitational) lensing profile. We consider the ΛCDM (cold dark matter plus a cosmological constant Λ) underlying matter power spectrum with Ω_m = 0.28, Ω_Λ = 0.72, and the Hubble constant H_0 = 100 h km s^-1 Mpc^-1 with h = 0.72. Dividing the samples into three redshift bins, we find that the LQGs with higher redshift are more biased and correlated than those with lower redshift. The homogeneity scale R_H of the LQG distribution is also deduced from theory. It is defined as the comoving radius of the sphere inside which the number of LQGs N(<r) is proportional to r^3 within 1%, or equivalently above which the correlation dimension of the sample D_2 is within 1% of D_2 = 3. For Park et al.'s samples and the NFW dark matter halo profile, the homogeneity scales of the LQG distribution are R_H ≃ 247 h^-1 Mpc for 0.2 < z ≤ 0.6, R_H ≃ 360 h^-1 Mpc for 0.6 < z ≤ 1.2, and R_H ≃ 480 h^-1 Mpc for 1.2 < z ≲ 1.6. The maximum extent of the LQG samples is beyond R_H in each bin, showing that the LQG samples are not homogeneously distributed on such scales, i.e. a length range of ~500 h^-1 Mpc and a mass scale of ~10^14 M_⊙. The possibilities of a top-down structure formation process as predicted by hot/warm dark matter (WDM) scenarios, and of redshift evolution of the bias factor b and correlation amplitude ξ_LQG of the LQGs as a consequence of cosmic expansion, are both discussed. Different results were obtained based on the LQG sample in Komberg et al. (1996
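    The counts-in-spheres definition of the homogeneity scale used above can be demonstrated on synthetic data (a uniform random catalogue standing in for the LQG samples; box size and point count are arbitrary assumptions): for a homogeneous distribution N(<r) ∝ r^3, so the correlation dimension D2 = d ln N / d ln r approaches 3.

```python
import numpy as np

# Counts-in-spheres on a synthetic homogeneous catalogue.
rng = np.random.default_rng(0)
box = 1000.0                                   # toy comoving box, Mpc/h
points = rng.uniform(0.0, box, size=(2000, 3))

# Use only centers in the inner cube so spheres of radius <= 300
# stay entirely inside the box (avoids edge-effect bias).
inner = np.all((points > 300.0) & (points < box - 300.0), axis=1)
centers = points[inner]

radii = np.linspace(100.0, 300.0, 20)
dists = np.linalg.norm(points[None, :, :] - centers[:, None, :], axis=2)
N = np.array([(dists < r).sum(axis=1).mean() for r in radii])

# Correlation dimension D2 = d ln N / d ln r; homogeneity => D2 -> 3.
D2 = np.gradient(np.log(N), np.log(radii))
```

    On a real catalogue the homogeneity scale R_H would be read off as the radius above which D2 stays within 1% of 3.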

  9. The large-scale surface brightness distribution of the x ray background

    NASA Technical Reports Server (NTRS)

    Mushotzky, Richard

    1991-01-01

    The x-ray background (XRB) and the microwave background are the dominant isotropic radiation fields available for measurement. Extensive work has gone into determining the physical origin of the XRB: whether it is due to a superposition of numerous faint well-known sources such as active galaxies, an early unidentified population of AGN at high redshift, a new population of objects, truly diffuse processes, or a superposition of these. However, while of great intrinsic interest, these studies were not aimed at using the XRB to provide the cosmological information that has been gleaned from the microwave background. An alternate approach is presented, which uses the available information on the large-scale (greater than 5 deg) distribution of the sky flux to see whether the XRB can provide such constraints.

  10. Interoperable mesh components for large-scale, distributed-memory simulations

    NASA Astrophysics Data System (ADS)

    Devine, K.; Diachin, L.; Kraftcheck, J.; Jansen, K. E.; Leung, V.; Luo, X.; Miller, M.; Ollivier-Gooch, C.; Ovcharenko, A.; Sahni, O.; Shephard, M. S.; Tautges, T.; Xie, T.; Zhou, M.

    2009-07-01

    SciDAC applications have a demonstrated need for advanced software tools to manage the complexities associated with sophisticated geometry, mesh, and field manipulation tasks, particularly as computer architectures move toward the petascale. In this paper, we describe a software component - an abstract data model and programming interface - designed to provide support for parallel unstructured mesh operations. We describe key issues that must be addressed to successfully provide high-performance, distributed-memory unstructured mesh services and highlight some recent research accomplishments in developing new load balancing and MPI-based communication libraries appropriate for leadership class computing. Finally, we give examples of the use of parallel adaptive mesh modification in two SciDAC applications.

  11. Large-scale distributed deformation controlled topography along the western Africa-Eurasia limit: Tectonic constraints

    NASA Astrophysics Data System (ADS)

    de Vicente, G.; Vegas, R.

    2009-09-01

    In the interior of the Iberian Peninsula, the main geomorphic features, mountain ranges and basins, seem to be arranged along several directions whose origin can be related to the N-S plate convergence that occurred along the Cantabro-Pyrenean border during the Eocene-Lower Miocene time span. The Iberian Variscan basement accommodated part of this plate convergence in three E-W trending crustal folds as well as in the reactivation of two left-lateral NNE-SSW strike-slip belts. The rest of the convergence was accommodated through the inversion of the Iberian Mesozoic Rift to form the Iberian Chain. This inversion gave rise to a process of oblique crustal shortening involving the development of two right-lateral NW-SE shear zones. Crustal folds, strike-slip corridors and one inverted rift compose a tectonic mechanism of pure shear in which the shortening is resolved vertically by the development of mountain ranges and related sedimentary basins. This model can be extended to NW Africa, up to the Atlasic System, where N-S plate convergence also seems to be accommodated in several basement uplifts, Anti-Atlas and Meseta, and through the inversion of two Mesozoic rifts, High and Middle Atlas. In this tectonic situation, the microcontinent Iberia remained firmly attached to Africa during most of the Tertiary, in such a way that N-S compressive stresses could be transmitted from the collision at the Pyrenean boundary. This tectonic scenario implies that most of the Tertiary Eurasia-Africa convergence was accommodated not along the Iberia-Africa interface, but at the Pyrenean plate boundary. A broad zone of distributed deformation resulted from the transmission of compressive stresses from the collision at the Pyrenean border. This distributed, intraplate deformation can be readily related to the topographic pattern of the Africa-Eurasia interface at the longitude of the Iberian Peninsula. Shortening in the Rif-Betics external zones - and their related topographic

  12. Distribution of circular proteins in plants: large-scale mapping of cyclotides in the Violaceae

    PubMed Central

    Burman, Robert; Yeshak, Mariamawit Y.; Larsson, Sonny; Craik, David J.; Rosengren, K. Johan; Göransson, Ulf

    2015-01-01

    During the last decade there has been increasing interest in small circular proteins found in plants of the violet family (Violaceae). These so-called cyclotides consist of a circular chain of approximately 30 amino acids, including six cysteines forming three disulfide bonds, arranged in a cyclic cystine knot (CCK) motif. In this study we map the occurrence and distribution of cyclotides throughout the Violaceae. Plant material was obtained from herbarium sheets containing samples up to 200 years of age. Even the oldest specimens contained cyclotides in the preserved leaves, with no degradation products observable, confirming their place as one of the most stable proteins in nature. Over 200 samples covering 17 of the 23–31 genera in Violaceae were analyzed, and cyclotides were positively identified in 150 species. Each species contained a unique set of between one and 25 cyclotides, with many exclusive to individual plant species. We estimate the number of different cyclotides in the Violaceae to be 5000–25,000, and propose that cyclotides are ubiquitous among all Violaceae species. Twelve new cyclotides from six phylogenetically dispersed genera were sequenced. Furthermore, the first glycosylated derivatives of cyclotides were identified and characterized, further increasing the diversity and complexity of this unique protein family. PMID:26579135

  13. Large scale distribution of bacterial communities in the upper Paraná River floodplain

    PubMed Central

    Chiaramonte, Josiane Barros; Roberto, Maria do Carmo; Pagioro, Thomaz Aurélio

    2014-01-01

    Bacterial communities play a central role in nutrient cycling in aquatic habitats, so it is important to analyze how these communities are distributed across different locations. Thirty-six sites in the upper Paraná River floodplain were surveyed to determine the influence of environmental variables on bacterial community composition. The sites are classified as rivers, channels, and floodplain lakes connected or unconnected to the main river channel. The bacterial community structure was analyzed by the fluorescent in situ hybridization (FISH) technique, based on the frequency of the main domains Bacteria and Archaea, subdivisions of the phylum Proteobacteria (Alpha-proteobacteria, Beta-proteobacteria, Gamma-proteobacteria), and the Cytophaga-Flavobacterium cluster. The bacterial community differed in density and in the frequency of the studied groups, and these differences tracked distinct characteristics of the three main rivers of the floodplain as well as the classification of the environments found there. We conclude that dissimilarities in bacterial community structure are related to environmental heterogeneity; the limnological variables that best predicted bacterial communities in the upper Paraná River floodplain were total and ammoniacal nitrogen, orthophosphate, and chlorophyll-a. PMID:25763022

  14. Response Time Distributions in Rapid Chess: A Large-Scale Decision Making Experiment

    PubMed Central

    Sigman, Mariano; Etchemendy, Pablo; Slezak, Diego Fernández; Cecchi, Guillermo A.

    2010-01-01

    Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players consecutively choose around 40 moves within a finite time budget. The goodness of each choice can be determined quantitatively, since current chess algorithms precisely estimate the value of a position. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times (RTs) and position values in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game; (2) RTs of successive moves are highly correlated, both for intra- and inter-player moves. These findings have theoretical implications, since they contradict two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which remains an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining time and position evaluation. PMID:21031032
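    The two observables reported above can be reproduced on synthetic data. The generating process below (log-normal RTs whose logarithm follows an AR(1) process) is an assumption chosen for illustration; the paper only states that RTs are long-tailed and serially correlated.

```python
import random, math, statistics

# Synthetic RT series: log-RT follows an AR(1) process, so the RTs
# are log-normal (long-tailed) and successive RTs are correlated.
random.seed(1)
rts, x = [], 0.0
for _ in range(5000):
    x = 0.7 * x + random.gauss(0.0, 1.0)   # AR(1) with coefficient 0.7
    rts.append(math.exp(x))

# Long tail: the mean exceeds the median for right-skewed data.
mean, median = statistics.fmean(rts), statistics.median(rts)

def lag1_corr(xs):
    """Pearson correlation between successive elements."""
    a, b = xs[:-1], xs[1:]
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b))
    va = math.sqrt(sum((u - ma) ** 2 for u in a))
    vb = math.sqrt(sum((v - mb) ** 2 for v in b))
    return cov / (va * vb)

r = lag1_corr([math.log(t) for t in rts])   # serial correlation of log-RT
```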

  15. Reconstruction of air-shower parameters for large-scale radio detectors using the lateral distribution

    NASA Astrophysics Data System (ADS)

    Kostunin, D.; Bezyazeekov, P. A.; Hiller, R.; Schröder, F. G.; Lenok, V.; Levinson, E.

    2016-02-01

    We investigate features of the lateral distribution function (LDF) of the radio signal emitted by cosmic-ray air showers with primary energies Epr > 0.1 EeV, and its connection to air-shower parameters such as energy and shower maximum, using CoREAS simulations made for the configuration of the Tunka-Rex antenna array. Taking into account all significant contributions to the total radio emission, such as the geomagnetic effect, the charge excess, and atmospheric refraction, we parameterize the radio LDF. This parameterization is two-dimensional and has several free parameters. The large number of free parameters is not suitable for sparse-array experiments operating at low signal-to-noise ratios (SNR). Thus, exploiting symmetries, we decrease the number of free parameters based on the shower geometry and reduce the LDF to a simple one-dimensional function. The remaining parameters can be fit with a small number of points, i.e., with signals from as few as three antennas above detection threshold. Finally, we present a method for the reconstruction of air-shower parameters, in particular energy and Xmax (shower maximum), which can be reconstructed with a theoretical accuracy of better than 15% and 30 g/cm2, respectively.
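    A fit of a one-dimensional LDF from three antennas can be sketched as follows. The exponential-quadratic functional form, the reference distance, and the parameter values below are assumptions for illustration only; fitting the log-amplitude turns the problem into linear least squares, which is why three antennas suffice for three parameters.

```python
import numpy as np

# Assumed one-dimensional LDF: eps(r) = eps0 * exp(a1*d + a2*d^2),
# with d = r - r0. Fitting ln(eps) is a linear problem in
# (ln eps0, a1, a2), solvable exactly from three antennas.
r0 = 120.0                       # reference distance, metres (assumed)
true = (50.0, -8e-3, -2e-5)      # eps0, a1, a2 (toy values)

def ldf(r, eps0, a1, a2):
    d = r - r0
    return eps0 * np.exp(a1 * d + a2 * d * d)

r = np.array([80.0, 150.0, 240.0])    # three antennas above threshold
eps = ldf(r, *true)                   # noiseless amplitudes

d = r - r0
A = np.column_stack([np.ones_like(d), d, d * d])
coef, *_ = np.linalg.lstsq(A, np.log(eps), rcond=None)
eps0_fit, a1_fit, a2_fit = np.exp(coef[0]), coef[1], coef[2]
```

    With noiseless input the three parameters are recovered exactly; real reconstructions fold in noise and use the fitted amplitude and slope as proxies for energy and Xmax.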

  16. Electron Transport and Related Nonequilibrium Distribution Functions in Large Scale ICF Plasma

    NASA Astrophysics Data System (ADS)

    Rozmus, W.; Chapman, T.; Brantov, A. V.; Winjum, B.; Berger, R.; Brunner, S.; Bychenkov, V. Yu.; Tableman, A.

    2014-10-01

    Using the Vlasov-Fokker-Planck (VFP) code OSHUN and higher-order perturbative solutions to the VFP equation, we have studied electron distribution functions (EDF) in inhomogeneous, hot hohlraum plasmas relevant to current ICF experiments. For these inhomogeneous ICF plasmas, characterized by temperature and density gradients consistent with the high-flux model [M. D. Rosen et al., HEDP 7, 180 (2011)], nonequilibrium EDF often display unphysical properties related to first and second order derivatives at larger velocities. These EDF strongly modify the linear plasma response, including Landau damping of Langmuir waves, electrostatic fluctuation levels, and instability gain coefficients. We have found that Langmuir waves propagating in the direction of the heat flow have increased Landau damping compared to damping calculated from a Maxwellian EDF, while Langmuir waves propagating in the direction of the temperature gradients are far less damped than calculated from the Maxwellian EDF. These effects are discussed in the context of stimulated Raman scattering, the Langmuir decay instability, and Thomson scattering experiments.

  17. An incremental and distributed inference method for large-scale ontologies based on MapReduce paradigm.

    PubMed

    Liu, Bo; Huang, Keman; Li, Jianqiang; Zhou, MengChu

    2015-01-01

    With the coming deluge of semantic data, the fast growth of ontology bases has brought significant challenges in performing efficient and scalable reasoning. Traditional centralized reasoning methods are not sufficient to process large ontologies. Distributed reasoning methods are thus required to improve the scalability and performance of inference. This paper proposes an incremental and distributed inference method for large-scale ontologies using MapReduce, which realizes high-performance reasoning and runtime searching, especially for incremental knowledge bases. By constructing a transfer inference forest and effective assertional triples, storage is largely reduced and the reasoning process is simplified and accelerated. Finally, a prototype system is implemented on a Hadoop framework, and the experimental results validate the usability and effectiveness of the proposed approach. PMID:24816632
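    The kind of rule a MapReduce reasoner distributes can be shown with one in-process map/reduce round over RDFS subClassOf transitivity ((A sub B) and (B sub C) implies (A sub C)). This is a generic illustration of the paradigm, not the paper's transfer-inference-forest encoding; real systems shard the map output across a Hadoop cluster.

```python
from collections import defaultdict

# One map/reduce round of RDFS subClassOf transitive inference.

def map_phase(triples):
    # Key each subClassOf triple by both object and subject so
    # joinable pairs meet at the same reducer key.
    for s, p, o in triples:
        if p == "subClassOf":
            yield (o, ("left", s))    # s sub o: s is left of key o
            yield (s, ("right", o))   # s sub o: o is right of key s

def reduce_phase(keyed):
    groups = defaultdict(list)
    for k, v in keyed:
        groups[k].append(v)
    for mid, vals in groups.items():
        lefts = [s for tag, s in vals if tag == "left"]
        rights = [o for tag, o in vals if tag == "right"]
        for s in lefts:               # join: s sub mid, mid sub o
            for o in rights:
                if s != o:
                    yield (s, "subClassOf", o)

triples = {("Cat", "subClassOf", "Mammal"),
           ("Mammal", "subClassOf", "Animal")}
inferred = set(reduce_phase(map_phase(triples))) - triples
```

    Repeating the round until no new triples appear yields the full transitive closure; an incremental reasoner re-runs it only on triples touched by an update.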

  18. Large Scale Computing

    NASA Astrophysics Data System (ADS)

    Capiluppi, Paolo

    2005-04-01

    Large Scale Computing is acquiring an important role in the field of data analysis and treatment for many sciences and also for some social activities. The present paper discusses the characteristics of computing when it becomes "Large Scale" and the current state of the art for particular applications needing such large, distributed resources and organization. High Energy Particle Physics (HEP) experiments are discussed in this respect; in particular, the Large Hadron Collider (LHC) experiments are analyzed. The Computing Models of the LHC experiments represent the current prototype implementation of Large Scale Computing and indicate the level of maturity of the possible deployment solutions. Some of the most recent results from tests of the performance and functionality of the LHC experiments' computing are discussed.

  19. Nonuniform steam generator U-tube flow distribution during natural circulation tests in ROSA-IV large scale test facility

    SciTech Connect

    Kukita, Y.; Nakamura, H.; Tasaka, K. ); Chauliac, C. )

    1988-08-01

    Natural circulation experiments were conducted in a large-scale (1/48 scale by volume) full-height simulator of a Westinghouse-type pressurized water reactor. This facility has two steam generators, each containing 141 full-size U-tubes of 9 different heights. Transition of the natural circulation mode was observed in the experiments as the primary-side mass inventory was decreased. Three major circulation modes were observed: single-phase liquid natural circulation, two-phase natural circulation, and reflux condensation. For all these circulation modes, and during the transitions between them, the mass flow distribution among the steam generator U-tubes was significantly nonuniform. The longer U-tubes showed reversed flow at higher primary-side mass inventories and also tended to empty earlier than the shorter U-tubes as the primary-side mass inventory was decreased.

  20. Assessing Impact of Large-Scale Distributed Residential HVAC Control Optimization on Electricity Grid Operation and Renewable Energy Integration

    NASA Astrophysics Data System (ADS)

    Corbin, Charles D.

    Demand management is an important component of the emerging Smart Grid, and a potential solution to the supply-demand imbalance occurring increasingly as intermittent renewable electricity is added to the generation mix. Model predictive control (MPC) has shown great promise for controlling HVAC demand in commercial buildings, making it an ideal solution to this problem. MPC is believed to hold similar promise for residential applications, yet very few examples exist in the literature despite a growing interest in residential demand management. This work explores the potential for residential buildings to shape electric demand at the distribution feeder level in order to reduce peak demand, reduce system ramping, and increase load factor using detailed sub-hourly simulations of thousands of buildings coupled to distribution power flow software. More generally, this work develops a methodology for the directed optimization of residential HVAC operation using a distributed but directed MPC scheme that can be applied to today's programmable thermostat technologies to address the increasing variability in electric supply and demand. Case studies incorporating varying levels of renewable energy generation demonstrate the approach and highlight important considerations for large-scale residential model predictive control.

  1. A Capacity Design Method of Distributed Battery Storage for Controlling Power Variation with Large-Scale Photovoltaic Sources in Distribution Network

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yasuhiro; Sawa, Toshiyuki; Gunji, Keiko; Yamazaki, Jun; Watanabe, Masahiro

    A design method for distributed battery storage capacity has been developed for evaluating the advantage of battery storage for demand-supply imbalance control in distribution systems to which large-scale home photovoltaic power sources are connected. The proposed method is based on a linear storage-capacity minimization model with design-basis demand load and photovoltaic output time series, subject to battery management constraints. The design method has been experimentally applied to a sample distribution system with substation storage and terminal-area storage. The numerical results show that the developed method successfully clarifies the charge-discharge control and stored power variation, satisfies the peak-cut requirement, and pinpoints the minimum distributed storage capacity.
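    A back-of-the-envelope version of the capacity-minimization idea can be sketched as follows. The paper formulates a linear program; the sketch below uses the closed-form special case (an assumption) where the battery must absorb the entire demand/PV imbalance around a flat import target, so the minimum usable capacity equals the swing of the cumulative net-load series. All load and PV values are toy numbers.

```python
# Minimum battery capacity to flatten the net load of a feeder with
# rooftop PV: run the state-of-charge forward and take its swing.

demand = [3.0, 3.5, 4.0, 5.0, 6.5, 6.0, 5.0, 4.0]   # kW, hourly (toy)
pv     = [0.0, 1.0, 4.0, 7.0, 7.5, 5.0, 1.5, 0.0]   # kW, hourly (toy)

# Flat grid import that balances total energy over the horizon.
target = sum(d - p for d, p in zip(demand, pv)) / len(demand)

soc, trace = 0.0, []
for d, p in zip(demand, pv):
    # Battery charges when PV surplus pushes net load below target,
    # discharges when net load exceeds target.
    soc += target - (d - p)
    trace.append(soc)

# Usable capacity must cover the full excursion of the state of charge.
min_capacity_kwh = max(trace) - min(trace)
```

    A full design, as in the paper, would add charge/discharge rate limits and efficiency losses as linear constraints and minimize capacity subject to them.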

  2. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    PubMed

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-01-01

    The last decade has witnessed increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, with the feedback loops closed over wireless networks. The contribution of this paper is the design and analysis of event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems, that is, systems consisting of physically interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and from degraded modes of operation of the modules and their interconnections. In addition to the theoretical findings, including a rigorous stability and boundedness analysis of the closed-loop dynamical system and a characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches. PMID:27537894
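    The event-triggering mechanism itself can be sketched in a few lines: the state measurement is re-transmitted over the network only when it drifts from the last transmitted value by more than a threshold, trading a small performance loss for far fewer transmissions. The scalar plant, gains, and threshold below are illustrative assumptions, not the paper's architecture.

```python
# Event-triggered state feedback for a toy unstable scalar plant
# x' = a*x + b*u, stabilized with u = -k * x_sent, where x_sent is
# the last state value transmitted over the (simulated) network.

a, b, k = 0.5, 1.0, 1.2
dt, threshold = 0.01, 0.05

x, x_sent, events = 1.0, 1.0, 0
for _ in range(1000):
    if abs(x - x_sent) > threshold:   # event-triggering condition
        x_sent = x                    # transmit a fresh measurement
        events += 1
    u = -k * x_sent                   # control uses stale state
    x += dt * (a * x + b * u)         # forward-Euler plant step

transmissions_saved = 1000 - events   # vs. transmitting every step
```

    The state converges to a small neighborhood of the origin whose size scales with the threshold, which is the qualitative trade-off the paper quantifies.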

  3. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information-based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users. The first is the scientist / design engineer, whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments) and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  4. Large-scale determinants of intestinal schistosomiasis and intermediate host snail distribution across Africa: does climate matter?

    PubMed

    Stensgaard, Anna-Sofie; Utzinger, Jürg; Vounatsou, Penelope; Hürlimann, Eveline; Schur, Nadine; Saarnak, Christopher F L; Simoonga, Christopher; Mubita, Patricia; Kabatereine, Narcis B; Tchuem Tchuenté, Louis-Albert; Rahbek, Carsten; Kristensen, Thomas K

    2013-11-01

    The geographical ranges of most species, including many infectious disease agents and their vectors and intermediate hosts, are assumed to be constrained by climatic tolerances, mainly temperature. It has been suggested that global warming will cause an expansion of the areas potentially suitable for infectious disease transmission. However, the transmission of infectious diseases is governed by a myriad of ecological, economic, evolutionary and social factors. Hence, a deeper understanding of the total disease system (pathogens, vectors and hosts) and its drivers is important for predicting responses to climate change. Here, we combine a growing degree day model for Schistosoma mansoni with species distribution models for the intermediate host snail (Biomphalaria spp.) to investigate large-scale environmental determinants of the distribution of the African S. mansoni-Biomphalaria system and potential impacts of climatic changes. Snail species distribution models included several combinations of climatic and habitat-related predictors; the latter divided into "natural" and "human-impacted" habitat variables to measure anthropogenic influence. The predictive performance of the combined snail-parasite model was evaluated against a comprehensive compilation of historical S. mansoni parasitological survey records, and then examined for two climate change scenarios of increasing severity for 2080. Future projections indicate that while the potential S. mansoni transmission area expands, the snail ranges are more likely to contract and/or move into cooler areas in the south and east. Importantly, we also note that even though climate per se matters, the impact of humans on habitat plays a crucial role in determining the distribution of the intermediate host snails in Africa. Thus, a future contraction in the geographical range size of the intermediate host snails caused by climatic changes does not necessarily translate into a decrease or zero-sum change in human
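    The growing degree day component mentioned above is simple enough to state directly: accumulate daily mean temperature in excess of a development threshold. The 16 °C base temperature and the temperature series below are assumptions for illustration, not the paper's calibrated values.

```python
# Growing degree day (GDD) accumulation: each day contributes the
# amount by which its mean temperature exceeds the base temperature.

def growing_degree_days(daily_mean_temps, base_temp=16.0):
    return sum(max(0.0, t - base_temp) for t in daily_mean_temps)

temps = [14.0, 18.5, 21.0, 25.0, 15.5, 30.0]   # daily means, deg C (toy)
gdd = growing_degree_days(temps)
```

    In a transmission model, a site is considered thermally suitable when its accumulated GDD over a season exceeds the parasite's development requirement.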

  5. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems.
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  6. Rainfall hotspots over the southern tropical Andes: Spatial distribution, rainfall intensity, and relations with large-scale atmospheric circulation

    NASA Astrophysics Data System (ADS)

    Espinoza, Jhan Carlo; Chavez, Steven; Ronchail, Josyane; Junquas, Clémentine; Takahashi, Ken; Lavado, Waldo

    2015-05-01

    The Andes/Amazon transition is among the rainiest regions of the world, and the interactions between large-scale circulation and the topography that determine its complex rainfall distribution remain poorly known. This work provides an in-depth analysis of the spatial distribution, variability, and intensity of rainfall in the southern Andes/Amazon transition at seasonal and intraseasonal time scales. The analysis is based on comprehensive daily rainfall data sets from meteorological stations in Peru and Bolivia. We compare our results with high-resolution TRMM-PR 2A25 rainfall estimates. Hotspot regions are identified at low elevations in the Andean foothills (400-700 masl) and in windward conditions at Quincemil and Chipiriri, where more than 4000 mm of rainfall per year is recorded. Orographic effects and exposure to easterly winds produce a strong annual rainfall gradient between the lowlands and the Andes that can reach 190 mm/km. Although TRMM-PR reproduces the spatial distribution satisfactorily, it underestimates rainfall by 35% in the hotspot regions. In the Peruvian hotspot, exceptional rainfall occurs during the austral dry season (around 1000 mm in June-July-August; JJA), but not in the Bolivian hotspot. The direction of the low-level winds over the Andean foothills partly explains this difference in the seasonal rainfall cycle. At intraseasonal scales in JJA, we found that, during northerly wind regimes, positive rainfall anomalies predominate over the lowland and the eastern flank of the Andes, whereas less rain falls at higher altitudes. On the other hand, during southerly regimes, rainfall anomalies are negative in the hotspot regions. The influence of cross-equatorial winds is particularly clear below 2000 masl.

  7. Global direct pressures on biodiversity by large-scale metal mining: Spatial distribution and implications for conservation.

    PubMed

    Murguía, Diego I; Bringezu, Stefan; Schaldach, Rüdiger

    2016-09-15

    Biodiversity loss is widely recognized as a serious global environmental change process. While large-scale metal mining activities are not among the top drivers of such change, these operations exert or may intensify pressures on biodiversity by adversely changing habitats, directly and indirectly, at local and regional scales. So far, analyses of the global spatial dynamics of mining and its burden on biodiversity have focused on the overlap between mines and protected areas or areas of high value for conservation. However, it is less clear how operating metal mines are globally exerting pressure on zones of different biodiversity richness; a similar gap exists for unmined but known mineral deposits. By using vascular plants' diversity as a proxy to quantify overall biodiversity, this study provides a first examination of the global spatial distribution of mines and deposits for five key metals across different biodiversity zones. The results indicate that mines and deposits are not randomly distributed, but concentrated within intermediate and high diversity zones, especially for bauxite and silver. In contrast, iron, gold, and copper mines and deposits are closer to a proportional distribution while showing a high concentration in the intermediate biodiversity zone. Considering the five metals together, 63% and 61% of available mines and deposits, respectively, are located in intermediate diversity zones, comprising 52% of the global terrestrial surface. 23% of mines and 20% of ore deposits are located in areas of high plant diversity, covering 17% of the land. 13% of mines and 19% of deposits are in areas of low plant diversity, comprising 31% of the land surface. Thus, there seems to be potential for opening new mines in areas of low biodiversity in the future. PMID:27262340

  9. Robust scalable stabilisability conditions for large-scale heterogeneous multi-agent systems with uncertain nonlinear interactions: towards a distributed computing architecture

    NASA Astrophysics Data System (ADS)

    Manfredi, Sabato

    2016-06-01

    Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology and environmental monitoring to sensor networks and power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and they require increasingly computationally demanding methods for analysis and control design as the network size and node/interaction complexity grow. It is therefore a challenging problem to find scalable computational methods for the distributed control design of large-scale networks. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (briefly, MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved using MATLAB toolboxes. Stabilisability of each node dynamic is a sufficient assumption for designing a globally stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the applicative scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in computational requirement in the case of weakly heterogeneous MASs, which is a common scenario in real applications where the network nodes and links are affected by parameter uncertainties.
    One of the main advantages of the proposed approach is that it allows moving from a centralised towards a distributed computing architecture, so that the expensive computational workload spent to solve LMIs may be shared among processors located at the networked nodes, thus increasing the scalability of the approach than the network
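    The paper's LMI conditions require a semidefinite-programming solver, but the Lyapunov machinery underneath them can be sketched with plain linear algebra. The following is a generic stability check for a single node dynamic, not the paper's distributed conditions: it solves the Lyapunov equation AᵀP + PA = -I via Kronecker-product vectorization and tests whether P is positive definite.

```python
import numpy as np

def lyapunov_stable(A, tol=1e-9):
    """Check asymptotic stability of x' = A x by solving
    A^T P + P A = -I for P and testing P > 0.
    Uses vec(A^T P + P A) = (I (x) A^T + A^T (x) I) vec(P)."""
    n = A.shape[0]
    I = np.eye(n)
    M = np.kron(I, A.T) + np.kron(A.T, I)
    vecP = np.linalg.solve(M, -I.flatten())
    P = vecP.reshape(n, n)
    P = 0.5 * (P + P.T)  # symmetrize against round-off
    return bool(np.all(np.linalg.eigvalsh(P) > tol))
```

An LMI formulation generalizes exactly this test: instead of solving the equality for a fixed right-hand side, the solver searches over all P ≻ 0 satisfying a matrix inequality.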

  10. Spatial distribution of large-scale solar magnetic fields and their relation to the interplanetary magnetic field

    NASA Technical Reports Server (NTRS)

    Levine, R. H.

    1979-01-01

    The spatial organization of the observed photospheric magnetic field as well as its relation to the polarity of the IMF have been studied using high resolution magnetograms from the Kitt Peak National Observatory. Systematic patterns in the large scale field are due to contributions from both concentrated flux and more diffuse flux. The polarity of the photospheric field, determined on various spatial scales, correlates with the polarity of the IMF. Analyses based on several spatial scales in the photosphere suggest that new flux in the interplanetary medium is often due to relatively small photospheric features which appear in the photosphere up to one month before they are manifest at the earth.

  11. Control of tectonic setting and large-scale faults on the basin-scale distribution of deformation bands in porous sandstone (Provence, France)

    NASA Astrophysics Data System (ADS)

    Ballas, G.; Soliva, R.; Benedicto, A.; Sizun, J.

    2013-12-01

    From outcrops located in Provence (South-East France), we describe the distribution, microstructures, and petrophysical properties of deformation band networks related to different tectonic events. In the contractional setting, pervasively distributed networks of reverse-sense compactional-shear bands are observed in all the folded sand units of the foreland, whereas localized networks of clustered reverse-sense shear bands are only observed close to a large-scale thrust. In the extensional setting, networks of clustered normal-sense shear bands are generally observed adjacent to large-scale faults, although a few randomly distributed bands are also observed between these faults. Normal-sense cataclastic faults are also observed restricted to sand units, suggesting that faults can initiate in the sands in extension, which is not observed in contraction. Shear bands and faults show cataclastic microstructures of low permeability, whereas compactional-shear bands show crush-microbreccia or protocataclastic microstructures of moderate permeability. This basin-scale analysis underlines the major role of the tectonic setting (thrust-fault versus normal-fault Andersonian stress regime) and the influence of inherited large-scale faults on the formation of low-permeability shear bands. We also provide a geometrical analysis of the band network properties (spacing, thickness, shear/compaction ratio, degree of cataclasis, petrophysical properties) with respect to the host sand granulometry. This analysis suggests that granulometry, although less important than the tectonic setting and the presence of large-scale faults, nevertheless has a non-negligible effect on band network geometry.

  12. Impact of Distribution-Connected Large-Scale Wind Turbines on Transmission System Stability during Large Disturbances: Preprint

    SciTech Connect

    Zhang, Y.; Allen, A.; Hodge, B. M.

    2014-02-01

    This work examines the dynamic impacts of distributed utility-scale wind power on both the distribution system and the transmission system during contingency events. It is a first step toward investigating the impact of high penetrations of distribution-connected wind power on both distribution and transmission stability.

  13. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  14. An Efficient Framework for Large Scale Multimedia Content Distribution in P2P Network: I2NC.

    PubMed

    Anandaraj, M; Ganeshkumar, P; Vijayakumar, K P; Selvaraj, K

    2015-01-01

    Network coding (NC) makes content distribution more effective and easier in P2P content distribution networks and reduces the burden on the original seeder. It generalizes traditional network routing by allowing intermediate nodes to generate new coded packets by combining received packets. The randomization introduced by network coding makes all packets equally important and resolves the problem of locating the rarest block. Further, it reduces traffic in the network. In this paper, we analyze the performance of traditional network coding in a P2P content distribution network using a mathematical model, and we show that traffic reduction is not fully achieved in a P2P network using traditional network coding. This happens because of the redundant transmission of non-innovative information blocks among the peers in the network. Hence, we propose a new framework, called I2NC (intelligent-peer selection and incremental-network coding), to eliminate the unnecessary flooding of non-innovative coded packets and thereby further improve the performance of network coding in P2P content distribution. A comparative study and analysis of the proposed system is made through various related implementations; the results show that traffic is reduced by 10-15% and that the average and maximum download times improve as the original seeder's workload is reduced. PMID:26605375
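    The redundancy that I2NC targets is the forwarding of non-innovative coded packets: packets whose coding vectors are linearly dependent on what a peer already holds. A hypothetical sketch of that innovativeness test over a small prime field (deployed systems typically work over GF(2^8); this is not the authors' implementation):

```python
def is_innovative(basis, coding_vector, p=257):
    """Return True iff coding_vector (entries over GF(p)) is linearly
    independent of the rows accumulated in `basis`, a dict mapping
    pivot column -> reduced row. Innovative vectors are added to the
    basis; non-innovative ones reduce to zero and would be discarded."""
    v = list(coding_vector)
    for pivot_col, row in basis.items():
        if v[pivot_col] % p:
            # eliminate this component using the stored pivot row
            factor = v[pivot_col] * pow(row[pivot_col], -1, p) % p
            v = [(a - factor * b) % p for a, b in zip(v, row)]
    for j, a in enumerate(v):
        if a % p:
            basis[j] = v  # store reduced row with pivot at column j
            return True
    return False
```

A peer would run this check on each received coded packet and forward only the innovative ones, which is the traffic reduction the framework aims at.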

  16. Modelling the role of marine particle on large scale 231Pa, 230Th, Iron and Aluminium distributions

    NASA Astrophysics Data System (ADS)

    Dutay, J.-C.; Tagliabue, A.; Kriest, I.; van Hulten, M. M. P.

    2015-04-01

    The distribution of trace elements in the ocean is governed by the combined effects of various processes and by exchanges with external sources. Modelling these represents an opportunity to better understand and quantify the mechanisms that regulate the oceanic tracer cycles. Observations collected during the GEOTRACES program provide an opportunity to improve our knowledge regarding the processes that should be considered in biogeochemical models to adequately represent the distributions of trace elements in the ocean. Here we present a synthesis of the state of the art for simulating selected trace elements in biogeochemical models: protactinium, thorium, iron, and aluminium. In this contribution we pay particular attention to the role of particles in the cycling of these tracers and how they may provide additional constraints on the transfer of matter in the ocean.

  17. Reconstruction of the One-Point Distribution of Convergence from Weak Lensing by Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Zhang, Tong-Jie; Pen, Ue-Li

    2005-12-01

    Weak-lensing measurements are starting to provide statistical maps of the distribution of matter in the universe that are increasingly precise and complementary to cosmic microwave background maps. The probability distribution function (PDF) provides a powerful tool to test non-Gaussian features in the convergence field and to discriminate the different cosmological models. In this paper, we present a new PDF space Wiener filter approach to reconstruct the probability density function of the convergence from the noisy convergence field. We find that for parameters comparable to the CFHT Legacy Survey, the averaged PDF of the convergence in a 3° field can be reconstructed with an uncertainty of about 10%, although the pointwise PDF is noise dominated.
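    The paper's filter operates in PDF space; the textbook Fourier-space Wiener filter that it generalizes can be sketched as follows, weighting each mode by S/(S+N) to suppress noise-dominated modes (a minimal illustration, not the authors' PDF-space construction):

```python
import numpy as np

def wiener_filter(noisy, signal_power, noise_power):
    """Classic Wiener filter: multiply each Fourier mode of the noisy
    field by S/(S+N), the minimum-variance weight when signal and
    noise power spectra are known. Powers may be scalars or arrays
    matching the rfft length."""
    d = np.fft.rfft(noisy)
    w = signal_power / (signal_power + noise_power)
    return np.fft.irfft(w * d, n=len(noisy))
```

With zero noise power the filter is the identity; as the noise power grows, the reconstruction is damped toward zero, which is the same bias-variance trade-off the PDF-space version makes per convergence bin.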

  18. Prototyping a large-scale distributed system for the Great Observatories era - NASA Astrophysics Data System (ADS)

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1990-01-01

    The NASA Astrophysics Data System (ADS) is a distributed information system intended to support research in the Great Observatories era, to simplify access to data, and to enable simultaneous analyses of multispectral data sets. Here, the user agent and interface, its functions, and the system components are examined, and the system architecture and infrastructure are addressed. The present status of the system and related future activities are then discussed.

  19. Study of the Large-Scale Distribution of Gamma-Ray Burst Sources by the Method of Pairwise Distances

    NASA Astrophysics Data System (ADS)

    Gerasim, R. V.; Orlov, V. V.; Raikov, A. A.

    2015-06-01

    The method of pairwise distances developed earlier by the authors is used to study the spatial distribution of 352 sources of gamma-ray bursts with measured redshifts. Three cosmological models are considered: a model with a Euclidean metric, the "tired light" model, and the standard ΛCDM model. It is found that this set has fractal features and may be multifractal. The fractal dimensionalities are estimated.
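    A minimal example of extracting a fractal (correlation) dimension from pairwise distances, in the spirit of Grassberger-Procaccia rather than the authors' specific statistic: the normalized pair count C(r) scales as r^D2, so D2 follows from counts at two radii.

```python
import numpy as np

def correlation_dimension(points, r1, r2):
    """Estimate the correlation dimension D2 from the scaling
    C(r) ~ r^D2 of the fraction of pairs closer than r,
    evaluated at two radii r1 < r2."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    dists = d[np.triu_indices(len(pts), k=1)]  # distinct pairs only
    c1 = np.mean(dists < r1)
    c2 = np.mean(dists < r2)
    return np.log(c2 / c1) / np.log(r2 / r1)
```

For a genuinely fractal point set, the estimate varies little with the choice of radii; for a multifractal, different radius ranges (and different moments of the pair counts) yield different effective dimensions.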

  20. ‘Oorja’ in India: Assessing a large-scale commercial distribution of advanced biomass stoves to households

    PubMed Central

    Thurber, Mark C.; Phadke, Himani; Nagavarapu, Sriniketh; Shrimali, Gireesh; Zerriffi, Hisham

    2015-01-01

    Replacing traditional stoves with advanced alternatives that burn more cleanly has the potential to ameliorate major health problems associated with indoor air pollution in developing countries. With a few exceptions, large government and charitable programs to distribute advanced stoves have not had the desired impact. Commercially based distributions that seek cost recovery and even profits might plausibly do better, both because they encourage distributors to supply and promote products that people want and because they are based around properly incentivized supply chains that could be more scalable, sustainable, and replicable. The sale in India of over 400,000 “Oorja” stoves to households from 2006 onwards represents the largest commercially based distribution of a gasification-type advanced biomass stove. BP's Emerging Consumer Markets (ECM) division and then successor company First Energy sold this stove and the pelletized biomass fuel on which it operates. We assess the success of this effort and the role its commercial aspect played in outcomes using a survey of 998 households in areas of Maharashtra and Karnataka where the stove was sold, as well as detailed interviews with BP and First Energy staff. Statistical models based on these data indicate that Oorja purchase rates were significantly influenced by the intensity of Oorja marketing in a region as well as by the pre-existing stove mix among households. The highest rate of adoption came from LPG-using households, for which Oorja's pelletized biomass fuel reduced costs. Smoke- and health-related messages from Oorja marketing did not significantly influence the purchase decision, although they did appear to affect household perceptions about smoke. By the time of our survey, only 9% of households that purchased Oorja were still using the stove, the result in large part of difficulties First Energy encountered in developing a viable supply chain around low-cost procurement of “agricultural waste” to

  1. Constraining neutrino mass using the large-scale H I distribution in the post-reionization epoch

    NASA Astrophysics Data System (ADS)

    Pal, Ashis Kumar; Guha Sarkar, Tapomoy

    2016-07-01

    The neutral intergalactic medium in the post-reionization epoch allows us to study cosmological structure formation through observations of the redshifted 21 cm signal and the Lyman α forest. We investigate the possibility of measuring the total neutrino mass through the suppression of power in the matter power spectrum, via its imprint on the cross-correlation power spectrum of the 21 cm signal and the Lyman α forest. We consider a radio-interferometric measurement of the 21 cm signal with a SKA1-mid-like radio telescope and a BOSS-like Lyman α forest survey. A Fisher matrix analysis shows that at the fiducial redshift z = 2.5, a 10 000 h 21 cm observation distributed equally over 25 radio pointings and a Lyman α forest survey with 30 quasar lines of sight per 1 deg² allow us to measure Ων at the 3.25 per cent level. A total of 25 000 h of radio-interferometric observation distributed equally over 25 radio pointings and a Lyman α survey with a quasar density of n̄ = 60 deg⁻² will allow Ων to be measured at the 2.26 per cent level. This corresponds to an idealized measurement of ∑mν at a precision of (100 ± 2.26) meV and of fν = Ων/Ωm at the 2.49 per cent level.
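    A Fisher-matrix forecast of the kind quoted above reduces to a small linear-algebra computation once the derivatives of the observable with respect to the parameters are in hand. A generic sketch (the model derivatives and covariance below are placeholders, not the 21 cm-Lyman α cross-power spectra of the paper):

```python
import numpy as np

def fisher_matrix(derivs, cov_inv):
    """Gaussian Fisher matrix F_ij = dm/dtheta_i . C^-1 . dm/dtheta_j
    for a model data vector m(theta). `derivs` has shape
    (n_params, n_data); `cov_inv` is the inverse data covariance."""
    derivs = np.asarray(derivs, dtype=float)
    return derivs @ cov_inv @ derivs.T

def forecast_sigma(derivs, cov_inv):
    """Marginalized 1-sigma forecast errors: sqrt of the diagonal
    of the inverse Fisher matrix."""
    return np.sqrt(np.diag(np.linalg.inv(fisher_matrix(derivs, cov_inv))))
```

Quoted percent-level constraints on Ων correspond to forecast_sigma divided by the fiducial parameter value.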

  2. The importance of large scale sea ice drift and ice type distribution on ice extent in the Weddell Sea

    NASA Astrophysics Data System (ADS)

    Schwegmann, S.; Haas, C.; Timmermann, R.; Gerdes, R.; Lemke, P.

    2009-12-01

    In austral winter, large parts of the Antarctic seas are covered by sea ice, which modifies the exchange of heat, mass, and momentum between ocean and atmosphere. Knowledge of ice extent and its variability is necessary for an adequate simulation of those fluxes and thus for climate modelling. The goal of this study is to observe interannual and seasonal variations in ice extent and their underlying causes. Variability is analysed using monthly means of microwave and scatterometer satellite data. The results are correlated with ice drift variations calculated from a Finite Element Sea ice-Ocean Model (FESOM) and with satellite-derived sea ice drift products to determine the dependency of ice extent on sea ice drift. An additional cause of changing ice extent could be variability in the ice type distribution, i.e. the contribution of first- and second-year ice to the total ice-covered area. These ice types are determined on monthly time scales from scatterometer satellite data. Ice class distribution and sea ice drift variability are compared with the characteristics and variability of the Southern Annular Mode (SAM) to evaluate the relative importance of different sea ice parameters in shaping Weddell Sea ice extent and its variability.

  3. Galaxy clustering on large scales.

    PubMed

    Efstathiou, G

    1993-06-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h⁻¹ Mpc, where the Hubble constant H₀ = 100h km s⁻¹ Mpc⁻¹; 1 pc = 3.09 × 10¹⁶ m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400
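    The two-point correlation functions quoted throughout can be estimated, in their simplest form, with the natural (Peebles-Hauser) estimator DD/RR - 1; survey analyses normally prefer Landy-Szalay, so this is only a textbook illustration:

```python
import numpy as np

def pair_fraction(points, r_lo, r_hi):
    """Fraction of distinct point pairs with separation in [r_lo, r_hi)."""
    p = np.asarray(points, dtype=float)
    d = np.linalg.norm(p[:, None, :] - p[None, :, :], axis=-1)
    d = d[np.triu_indices(len(p), k=1)]
    return np.mean((d >= r_lo) & (d < r_hi))

def xi_natural(data, randoms, r_lo, r_hi):
    """Natural estimator xi = DD/RR - 1, where DD and RR are the
    normalized pair fractions of the data catalogue and of an
    unclustered random catalogue filling the same volume."""
    return pair_fraction(data, r_lo, r_hi) / pair_fraction(randoms, r_lo, r_hi) - 1.0
```

By construction, a data set statistically identical to the random catalogue yields xi = 0; clustering in excess of random gives xi > 0 in that separation bin.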

  4. Development of a low-power, low-cost front end electronics module for large scale distributed neutrino detectors

    SciTech Connect

    Beatty, James J.; Kass, Richard D.

    2008-03-08

    A number of concepts have been presented for distributed neutrino detectors formed of large numbers of autonomous detectors. Examples include the Antarctic Ross Ice Shelf Antenna Neutrino Array (ARIANNA) [Barwick 2006], as well as proposed radio extensions to the IceCube detector at South Pole Station such as AURA and IceRay [Besson 2008]. We have focused on key enabling technical developments required by this class of experiments. The radio Cherenkov signal, generated by the Askaryan mechanism [Askaryan 1962, 1965], is impulsive and coherent up to above 1 GHz. In the frequency domain, the impulsive character of the emission results in a simultaneous increase of the power detected in multiple frequency bands. This multiband triggering approach has proven fruitful, especially as anthropogenic interference often results from narrowband communications signals. A typical distributed experiment of this type consists of a station responsible for the readout of a cluster of antennas either near the surface of the ice or deployed in boreholes. Each antenna is instrumented with a broadband low-noise amplifier, followed by an array of filters to facilitate multi-band coincidence trigger schemes at the antenna level. The power in each band is detected at the output of each band filter, using either square-law diode detectors or log-power detectors developed for the cellular telephone market. The use of multiple antennas per station allows a local coincidence among antennas to be used as the next stage of the trigger. Station triggers can then be combined into an array trigger by comparing timestamps of triggers among stations and identifying space-time clusters of station triggers. Data from each station are buffered and can be requested from the individual stations when a multi-station coincidence occurs. This approach has been successfully used in distributed experiments such as the Pierre Auger Observatory [Abraham et al. 2004]. We identified the filters as being especially
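    The trigger hierarchy described above (multiband trigger per antenna, local coincidence among a station's antennas, array-level timestamp clustering) can be sketched as follows; all thresholds and window lengths are illustrative, not the experiment's settings:

```python
def antenna_trigger(band_powers, thresholds, n_bands_required=3):
    """Multiband trigger: an antenna fires when at least
    n_bands_required frequency bands simultaneously exceed their
    power thresholds, suppressing narrowband interference."""
    hits = sum(p > t for p, t in zip(band_powers, thresholds))
    return hits >= n_bands_required

def station_trigger(antenna_fired, n_antennas_required=2):
    """Local coincidence among a station's antennas."""
    return sum(antenna_fired) >= n_antennas_required

def array_trigger(station_times, window_ns=1000, n_stations_required=3):
    """Array-level trigger: at least n_stations_required station
    timestamps clustered within a sliding time window."""
    times = sorted(station_times)
    for i in range(len(times)):
        j = i
        while j < len(times) and times[j] - times[i] <= window_ns:
            j += 1
        if j - i >= n_stations_required:
            return True
    return False
```

An impulsive Askaryan signal raises power in many bands at once and so passes the antenna trigger, while a narrowband transmitter raises only one band and is rejected.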

  5. Determining organic carbon distributions in soil particle size fractions as a precondition of lateral carbon transport modeling at large scales

    NASA Astrophysics Data System (ADS)

    Schindewolf, Marcus; Seher, Wiebke; Pfeffer, Eduard; Schultze, Nico; Amorim, Ricardo S. S.; Schmidt, Jürgen

    2016-04-01

    The erosional transport of organic carbon has an effect on the global carbon budget; however, it is uncertain whether erosion is a sink or a source of atmospheric carbon. Continuous erosion leads to a massive loss of topsoil, including the organic carbon historically accumulated in the soil humus fraction. The colluvial organic carbon can be protected from further degradation depending on the depth of the colluvial cover and local decomposition conditions. Another part of the eroded soil and organic carbon enters surface water bodies and may be transported over long distances. The selective nature of soil erosion results in the preferential transport of fine particles, while larger, less carbon-rich particles remain on site. Consequently, organic carbon is enriched in the eroded sediment compared with the source soil. As a precondition of process-based lateral carbon flux modeling, the carbon distribution across soil particle size fractions has to be known. In this regard, the present study determines organic carbon contents of soil particle size separates by a combined sieve-sedimentation method for different tropical and temperate soils. Our results suggest strong influences of parent material and climatic conditions on the carbon distribution across soil particle separates. Applying these results in erosion modeling, a test slope was simulated with the EROSION 2D simulation software, covering land use and soil management scenarios for different rainfall events. These simulations allow first insights into carbon loss and depletion in sediment delivery areas as well as carbon gains and enrichment in deposition areas at the landscape scale, and they can be used as a step forward in landscape-scale carbon redistribution modeling.

  6. Coronal mass ejection rate and the evolution of the large-scale K-coronal density distribution

    SciTech Connect

    Sime, D.G.

    1989-01-01

    Recently reported occurrence rates of coronal mass ejections (CMEs) are compared with the time scale for the long-term evolution of the global white-light coronal density distribution. This time scale is estimated from the synoptic observations of the corona made from Mauna Loa, Hawaii, by a series of K-coronameters. The data span a period of more than 20 years and show evolution rates which vary with time roughly in phase with the solar activity cycle. However, there are detailed differences between the sunspot number curve and the long-term behavior of this quantity. When the occurrence rates of CMEs observed from orbiting coronagraphs, available mainly during the descending phase of the activity cycle, are compared with this evolution time, it is found that the two quantities are inversely proportional. From energy considerations, it is unlikely that there is a causal relationship between CMEs and this coronal evolution. Rather, the result indicates that the processes which lead to the global evolution are intimately related to those which give rise to CMEs, a hypothesis consistent with current theories that CMEs arise from preexisting magnetic structures which become stressed by the global magnetic field rearrangement to the point of instability. copyright American Geophysical Union 1989

  7. LARGE-SCALE DISTRIBUTION OF ARRIVAL DIRECTIONS OF COSMIC RAYS DETECTED ABOVE 10{sup 18} eV AT THE PIERRE AUGER OBSERVATORY

    SciTech Connect

    Abreu, P.; Andringa, S.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Alves Batista, R.; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Antici'c, T.; Arganda, E.; Collaboration: Pierre Auger Collaboration; and others

    2012-12-15

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10{sup 18} eV at the Pierre Auger Observatory is presented. This search is performed as a function of both declination and right ascension in several energy ranges above 10{sup 18} eV, and reported in terms of dipolar and quadrupolar coefficients. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Assuming that any cosmic-ray anisotropy is dominated by dipole and quadrupole moments in this energy range, upper limits on their amplitudes are derived. These upper limits allow us to test the origin of cosmic rays above 10{sup 18} eV from stationary Galactic sources densely distributed in the Galactic disk and predominantly emitting light particles in all directions.
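The dipolar modulation in right ascension that such searches constrain is conventionally estimated with a first-harmonic (Rayleigh) analysis. Below is a minimal sketch of that estimator, not the Auger analysis chain itself; the isotropic toy sky is invented for illustration:

```python
import math
import random

def first_harmonic(alphas):
    """First-harmonic (Rayleigh) analysis of arrival right ascensions.

    Returns (r, phi): amplitude and phase (radians) of the best-fit
    dipolar modulation 1 + r*cos(alpha - phi) in right ascension.
    """
    n = len(alphas)
    a = 2.0 / n * sum(math.cos(x) for x in alphas)
    b = 2.0 / n * sum(math.sin(x) for x in alphas)
    return math.hypot(a, b), math.atan2(b, a)

# Toy isotropic sky: the recovered amplitude should be close to zero,
# of order sqrt(pi / N) for N events.
rng = random.Random(42)
sky = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(50000)]
r, phi = first_harmonic(sky)
```

For an isotropic sample the amplitude fluctuates at the level of sqrt(pi/N), which is why upper limits tighten only slowly with exposure.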

  8. The nature, origins and distribution of ash aggregates in a large-scale wet eruption deposit: Oruanui, New Zealand

    NASA Astrophysics Data System (ADS)

    Van Eaton, Alexa R.; Wilson, Colin J. N.

    2013-01-01

    This study documents the processes and products of volcanic ash aggregation in phreatomagmatic phases of the 25.4 ka Oruanui supereruption from Taupo volcano, New Zealand. Detailed textural and stratigraphic relationships of aggregates are examined in six of the ten erupted units, which range from relatively dry styles of eruption and deposition (units 2, 5) to mixed (units 6, 7, 8) and dominantly wet (unit 3). Aggregate structures and grain size distributions shift abruptly over vertical scales of cm to dm, providing diagnostic features to identify deposits emplaced primarily as vertical fallout or pyroclastic density currents (PDCs). The six categories of ash aggregates documented here are used to infer distinct volcanic and meteorological interactions in the eruption cloud related to dispersal characteristics and mode of emplacement. Our field observations support the notion of Brown et al. (2010, Origin of accretionary lapilli within ground-hugging density currents: evidence from pyroclastic couplets on Tenerife. Geol. Soc. Am. Bull. 122, 305-320) that deposits bearing matrix-supported accretionary lapilli with concentric internal structure and abundant rim fragments are associated with emplacement of PDCs. However, on the basis of grain size distributions and field relationships, it is inferred that these types of ash aggregates formed their ultrafine ash (dominantly < 10 μm) outer layers in the buoyant plumes of fine ash lofted from PDCs, rather than during lateral transport in ground-hugging density currents. The propagation of voluminous PDCs beneath an overriding buoyant cloud - whether coignimbrite or vent-derived in origin - is proposed to generate the observed, concentrically structured accretionary lapilli by producing multiple updrafts of convectively unstable, ash-laden air. The apparent coarsening of mean grain size with distance from source, which is observed in aggregate-bearing fall facies, reflects a combination of multi-level plume transport

9. Large-Scale Distributions of Tropospheric Nitric, Formic, and Acetic Acids Over the Western Pacific Basin During Wintertime

    NASA Technical Reports Server (NTRS)

    Talbot, R. W.; Dibb, J. E.; Lefer, B. L.; Scheuer, E. M.; Bradshaw, J. D.; Sandholm, S. T.; Smyth, S.; Blake, D. R.; Blake, N. J.; Sachse, G. W.; Collins, J. E.; Gregory, G. L.

    1997-01-01

We report here measurements of the acidic gases nitric (HNO3), formic (HCOOH), and acetic (CH3COOH) over the western Pacific basin during the February-March 1994 Pacific Exploratory Mission-West (PEM-West B). These data were obtained aboard the NASA DC-8 research aircraft as it flew missions in the altitude range of 0.3 - 12.5 km over equatorial regions near Guam and then further westward encompassing the entire Pacific Rim arc. Aged marine air over the equatorial Pacific generally exhibited mixing ratios of acidic gases less than 100 parts per trillion by volume (pptv). Near the Asian continent, discrete plumes encountered below 6 km altitude contained up to 8 parts per billion by volume (ppbv) HNO3 and 10 ppbv HCOOH and CH3COOH. Overall there was a general correlation between mixing ratios of acidic gases with those of CO, C2H2, and C2Cl4, indicative of emissions from combustion and industrial sources. The latitudinal distributions of HNO3 and CO showed that the largest mixing ratios were centered around 15 deg N, while HCOOH, CH3COOH, and C2Cl4 peaked at 25 deg N. The mixing ratios of HCOOH and CH3COOH were highly correlated (r(sup 2) = 0.87) below 6 km altitude, with a slope (0.89) characteristic of the nongrowing season at midlatitudes in the northern hemisphere. Above 6 km altitude, HCOOH and CH3COOH were marginally correlated (r(sup 2) = 0.50), and plumes well defined by CO, C2H2, and C2Cl4 were depleted in acidic gases, most likely due to scavenging during vertical transport of air masses through convective cloud systems over the Asian continent. In stratospheric air masses, HNO3 mixing ratios were several parts per billion by volume (ppbv), yielding relationships with O3 and N2O consistent with those previously reported for NO(y).
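The reported slope and r(sup 2) between HCOOH and CH3COOH are ordinary least-squares quantities. A small sketch of how such a slope and coefficient of determination are computed; the mixing-ratio values below are invented, not PEM-West B data:

```python
def slope_and_r2(x, y):
    """Least-squares slope of y on x and the Pearson r**2."""
    n = len(x)
    mx = sum(x) / n
    my = sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    syy = sum((yi - my) ** 2 for yi in y)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    return sxy / sxx, sxy ** 2 / (sxx * syy)

# Hypothetical HCOOH (x) and CH3COOH (y) mixing ratios in pptv
hcooh = [120.0, 340.0, 560.0, 910.0, 1500.0]
ch3cooh = [0.89 * v for v in hcooh]   # perfectly correlated toy data
slope, r2 = slope_and_r2(hcooh, ch3cooh)
```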

  10. Large-scale copy number variants (CNVs): Distribution in normal subjects and FISH/real-time qPCR analysis

    PubMed Central

    Qiao, Ying; Liu, Xudong; Harvard, Chansonette; Nolin, Sarah L; Brown, W Ted; Koochek, Maryam; Holden, Jeanette JA; Lewis, ME Suzanne; Rajcan-Separovic, Evica

    2007-01-01

    Background Genomic copy number variants (CNVs) involving >1 kb of DNA have recently been found to be widely distributed throughout the human genome. They represent a newly recognized form of DNA variation in normal populations, discovered through screening of the human genome using high-throughput and high resolution methods such as array comparative genomic hybridization (array-CGH). In order to understand their potential significance and to facilitate interpretation of array-CGH findings in constitutional disorders and cancers, we studied 27 normal individuals (9 Caucasian; 9 African American; 9 Hispanic) using commercially available 1 Mb resolution BAC array (Spectral Genomics). A selection of CNVs was further analyzed by FISH and real-time quantitative PCR (RT-qPCR). Results A total of 42 different CNVs were detected in 27 normal subjects. Sixteen (38%) were not previously reported. Thirteen of the 42 CNVs (31%) contained 28 genes listed in OMIM. FISH analysis of 6 CNVs (4 previously reported and 2 novel CNVs) in normal subjects resulted in the confirmation of copy number changes for 1 of 2 novel CNVs and 2 of 4 known CNVs. Three CNVs tested by FISH were further validated by RT-qPCR and comparable data were obtained. This included the lack of copy number change by both RT-qPCR and FISH for clone RP11-100C24, one of the most common known copy number variants, as well as confirmation of deletions for clones RP11-89M16 and RP5-1011O17. Conclusion We have described 16 novel CNVs in 27 individuals. Further study of a small selection of CNVs indicated concordant and discordant array vs. FISH/RT-qPCR results. Although a large number of CNVs has been reported to date, quantification using independent methods and detailed cellular and/or molecular assessment has been performed on a very small number of CNVs. This information is, however, very much needed as it is currently common practice to consider CNVs reported in normal subjects as benign changes when detected in
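Real-time qPCR validation of copy number changes like those above is commonly done with the comparative 2^(-ddCt) method, in which a target locus is measured against a reference (two-copy) locus in both the test sample and a normal calibrator. A minimal sketch; the Ct values are invented, not data from this study:

```python
def copy_ratio_ddct(ct_target_test, ct_ref_test, ct_target_calib, ct_ref_calib):
    """Relative copy number by the comparative 2**(-ddCt) method.

    Ct values for a target locus and a reference (two-copy) locus are
    measured in the test sample and in a normal calibrator sample.
    A ratio near 1.0 means no copy number change; near 0.5, a
    heterozygous deletion; near 1.5, a single-copy gain.
    """
    ddct = (ct_target_test - ct_ref_test) - (ct_target_calib - ct_ref_calib)
    return 2.0 ** (-ddct)

# A heterozygous deletion shifts the target Ct up by ~1 cycle
# relative to the calibrator, giving a ratio near 0.5.
ratio = copy_ratio_ddct(27.0, 24.0, 26.0, 24.0)
```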

  11. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  12. Large-Scale Distribution and Activity of Prokaryotes in Deep-Sea Surface Sediments of the Mediterranean Sea and the Adjacent Atlantic Ocean

    PubMed Central

    Giovannelli, Donato; Molari, Massimiliano; d’Errico, Giuseppe; Baldrighi, Elisa; Pala, Claudia; Manini, Elena

    2013-01-01

    The deep-sea represents a substantial portion of the biosphere and has a major influence on carbon cycling and global biogeochemistry. Benthic deep-sea prokaryotes have crucial roles in this ecosystem, with their recycling of organic matter from the photic zone. Despite this, little is known about the large-scale distribution of prokaryotes in the surface deep-sea sediments. To assess the influence of environmental and trophic variables on the large-scale distribution of prokaryotes, we investigated the prokaryotic assemblage composition (Bacteria to Archaea and Euryarchaeota to Crenarchaeota ratio) and activity in the surface deep-sea sediments of the Mediterranean Sea and the adjacent North Atlantic Ocean. Prokaryotic abundance and biomass did not vary significantly across the Mediterranean Sea; however, there were depth-related trends in all areas. The abundance of prokaryotes was positively correlated with the sedimentary concentration of protein, an indicator of the quality and bioavailability of organic matter. Moving eastwards, the Bacteria contribution to the total prokaryotes decreased, which appears to be linked to the more oligotrophic conditions of the Eastern Mediterranean basins. Despite the increased importance of Archaea, the contributions of Crenarchaeota Marine Group I to the total pool was relatively constant across the investigated stations, with the exception of Matapan-Vavilov Deep, in which Euryarchaeota Marine Group II dominated. Overall, our data suggest that deeper areas of the Mediterranean Sea share more similar communities with each other than with shallower sites. Freshness and quality of sedimentary organic matter were identified through Generalized Additive Model analysis as the major factors for describing the variation in the prokaryotic community structure and activity in the surface deep-sea sediments. 
Longitude was also important in explaining the observed variability, which suggests that the overlying water masses might have a

  13. Large scale tracking algorithms.

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
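One standard way to blunt the combinatorial growth described above is to gate candidate pairings before association. The sketch below is gated greedy nearest-neighbour track-to-detection assignment, a deliberately simple stand-in for a multi-hypothesis tracker; the positions are invented:

```python
def associate(tracks, detections, gate):
    """Greedy gated nearest-neighbour association.

    tracks, detections: lists of (x, y) positions. Pairs whose squared
    distance exceeds gate**2 are never considered, which is what keeps
    the candidate set from exploding as target counts grow.
    Returns a dict {track_index: detection_index}.
    """
    pairs = []
    for i, (tx, ty) in enumerate(tracks):
        for j, (dx, dy) in enumerate(detections):
            d2 = (tx - dx) ** 2 + (ty - dy) ** 2
            if d2 <= gate * gate:
                pairs.append((d2, i, j))
    pairs.sort()                      # closest candidate pairs first
    assigned, used = {}, set()
    for d2, i, j in pairs:
        if i not in assigned and j not in used:
            assigned[i] = j
            used.add(j)
    return assigned

tracks = [(0.0, 0.0), (10.0, 0.0)]
dets = [(9.5, 0.2), (0.3, -0.1), (50.0, 50.0)]
links = associate(tracks, dets, gate=2.0)
```

The far detection at (50, 50) is gated out entirely; a multi-hypothesis tracker would instead keep several competing assignment hypotheses alive across frames, which is where the combinatorics come from.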

  14. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L.; Rickert, M.

    1997-04-01

Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual person's behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed of much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches are dependent both on the specific questions and on the prospective user community. The approaches reach from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.
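The single-bit, vehicle-based cellular-automaton simulations mentioned are in the spirit of the Nagel-Schreckenberg model. A minimal single-lane sketch of one parallel update; the parameters and the toy ring road are illustrative, not any project's configuration:

```python
import random

def nasch_step(cells, vmax=5, p_brake=0.3, rng=random):
    """One parallel update of a Nagel-Schreckenberg single-lane traffic CA.

    cells[i] is the speed of the vehicle in cell i, or None if empty.
    The road is a ring (periodic boundary conditions).
    """
    L = len(cells)
    new = [None] * L
    for i, v in enumerate(cells):
        if v is None:
            continue
        v = min(v + 1, vmax)                     # 1. accelerate
        free = 0                                  # count empty cells ahead
        while free < v and cells[(i + 1 + free) % L] is None:
            free += 1
        v = min(v, free)                          # 2. brake to avoid collision
        if v > 0 and rng.random() < p_brake:
            v -= 1                                # 3. random slowdown
        new[(i + v) % L] = v                      # 4. move
    return new

# Toy ring road: 6 cars on 20 cells, evolved for 50 steps
rng = random.Random(7)
road = [0 if i % 3 == 0 and i < 18 else None for i in range(20)]
for _ in range(50):
    road = nasch_step(road, rng=rng)
```

Each cell update touches only a handful of neighbours, which is what makes the model vectorizable and bit-packable on parallel hardware.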

  15. The Geographic Distribution of Loa loa in Africa: Results of Large-Scale Implementation of the Rapid Assessment Procedure for Loiasis (RAPLOA)

    PubMed Central

    Zouré, Honorat Gustave Marie; Wanji, Samuel; Noma, Mounkaïla; Amazigo, Uche Veronica; Diggle, Peter J.; Tekle, Afework Hailemariam; Remme, Jan H. F.

    2011-01-01

    Background Loiasis is a major obstacle to ivermectin treatment for onchocerciasis control and lymphatic filariasis elimination in central Africa. In communities with a high level of loiasis endemicity, there is a significant risk of severe adverse reactions to ivermectin treatment. Information on the geographic distribution of loiasis in Africa is urgently needed but available information is limited. The African Programme for Onchocerciasis Control (APOC) undertook large scale mapping of loiasis in 11 potentially endemic countries using a rapid assessment procedure for loiasis (RAPLOA) that uses a simple questionnaire on the history of eye worm. Methodology/Principal Findings RAPLOA surveys were done in a spatial sample of 4798 villages covering an area of 2500×3000 km centred on the heartland of loiasis in Africa. The surveys showed high risk levels of loiasis in 10 countries where an estimated 14.4 million people live in high risk areas. There was a strong spatial correlation among RAPLOA data, and kriging was used to produce spatially smoothed contour maps of the interpolated prevalence of eye worm and the predictive probability that the prevalence exceeds 40%. Conclusion/Significance The contour map of eye worm prevalence provides the first global map of loiasis based on actual survey data. It shows a clear distribution with two zones of hyper endemicity, large areas that are free of loiasis and several borderline or intermediate zones. The surveys detected several previously unknown hyperendemic foci, clarified the distribution of loiasis in the Central African Republic and large parts of the Republic of Congo and the Democratic Republic of Congo for which hardly any information was available, and confirmed known loiasis foci. The new maps of the prevalence of eye worm and the probability that the prevalence exceeds the risk threshold of 40% provide critical information for ivermectin treatment programs among millions of people in Africa. PMID:21738809
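The smoothed prevalence surface in the study above was produced by kriging. As a much simpler illustration of interpolating village-level prevalence onto a map, here is inverse-distance weighting, explicitly a stand-in rather than kriging (which additionally models the spatial covariance structure); the survey points are invented:

```python
def idw_estimate(samples, q, power=2.0):
    """Inverse-distance-weighted interpolation at query location q.

    samples: list of ((x, y), value) survey points.
    A simple stand-in for kriging, which would additionally fit a
    variogram and weight points by modeled spatial covariance.
    """
    num = den = 0.0
    for (x, y), v in samples:
        d2 = (x - q[0]) ** 2 + (y - q[1]) ** 2
        if d2 == 0.0:
            return v                   # exact at a survey point
        w = d2 ** (-power / 2.0)
        num += w * v
        den += w
    return num / den

# Hypothetical eye-worm prevalence (%) at four village locations
villages = [((0.0, 0.0), 10.0), ((1.0, 0.0), 50.0),
            ((0.0, 1.0), 30.0), ((1.0, 1.0), 60.0)]
mid = idw_estimate(villages, (0.5, 0.5))
```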

  16. The large-scale distribution of ammonia oxidizers in paddy soils is driven by soil pH, geographic distance, and climatic factors.

    PubMed

    Hu, Hang-Wei; Zhang, Li-Mei; Yuan, Chao-Lei; Zheng, Yong; Wang, Jun-Tao; Chen, Deli; He, Ji-Zheng

    2015-01-01

Paddy soils are widely distributed from temperate to tropical regions, and are characterized by intensive nitrogen fertilization practices in China. Mounting evidence has confirmed the functional importance of ammonia-oxidizing archaea (AOA) and bacteria (AOB) in soil nitrification, but little is known about their biogeographic distribution patterns in paddy ecosystems. Here, we used barcoded pyrosequencing to characterize the effects of climatic, geochemical and spatial factors on the distribution of ammonia oxidizers from 11 representative rice-growing regions (75-1945 km apart) of China. Potential nitrification rates varied greatly by more than three orders of magnitude, and were significantly correlated with the abundances of AOA and AOB. The community composition of ammonia oxidizers was affected by multiple factors, but changes in relative abundances of the major lineages could be best predicted by soil pH. The alpha diversity of AOA and AOB displayed contrasting trends over the gradients of latitude and atmospheric temperature, indicating a possible niche separation between AOA and AOB along the latitude. The Bray-Curtis dissimilarities in ammonia-oxidizing community structure significantly increased with increasing geographical distance, indicating that more geographically distant paddy fields tend to harbor more dissimilar ammonia oxidizers. Variation partitioning analysis revealed that spatial, geochemical and climatic factors could jointly explain the majority of the data variation, and were important drivers defining the ecological niches of AOA and AOB. Our findings suggest that both AOA and AOB are of functional importance in paddy soil nitrification, and ammonia oxidizers in paddy ecosystems exhibit large-scale biogeographic patterns shaped by soil pH, geographic distance, and climatic factors. PMID:26388866

  17. The large-scale distribution of ammonia oxidizers in paddy soils is driven by soil pH, geographic distance, and climatic factors

    PubMed Central

    Hu, Hang-Wei; Zhang, Li-Mei; Yuan, Chao-Lei; Zheng, Yong; Wang, Jun-Tao; Chen, Deli; He, Ji-Zheng

    2015-01-01

    Paddy soils distribute widely from temperate to tropical regions, and are characterized by intensive nitrogen fertilization practices in China. Mounting evidence has confirmed the functional importance of ammonia-oxidizing archaea (AOA) and bacteria (AOB) in soil nitrification, but little is known about their biogeographic distribution patterns in paddy ecosystems. Here, we used barcoded pyrosequencing to characterize the effects of climatic, geochemical and spatial factors on the distribution of ammonia oxidizers from 11 representative rice-growing regions (75–1945 km apart) of China. Potential nitrification rates varied greatly by more than three orders of magnitude, and were significantly correlated with the abundances of AOA and AOB. The community composition of ammonia oxidizer was affected by multiple factors, but changes in relative abundances of the major lineages could be best predicted by soil pH. The alpha diversity of AOA and AOB displayed contrasting trends over the gradients of latitude and atmospheric temperature, indicating a possible niche separation between AOA and AOB along the latitude. The Bray–Curtis dissimilarities in ammonia-oxidizing community structure significantly increased with increasing geographical distance, indicating that more geographically distant paddy fields tend to harbor more dissimilar ammonia oxidizers. Variation partitioning analysis revealed that spatial, geochemical and climatic factors could jointly explain majority of the data variation, and were important drivers defining the ecological niches of AOA and AOB. Our findings suggest that both AOA and AOB are of functional importance in paddy soil nitrification, and ammonia oxidizers in paddy ecosystems exhibit large-scale biogeographic patterns shaped by soil pH, geographic distance, and climatic factors. PMID:26388866
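The Bray-Curtis dissimilarity used in the two records above has a short closed form on abundance vectors. A minimal sketch; the OTU counts are invented for illustration:

```python
def bray_curtis(x, y):
    """Bray-Curtis dissimilarity between two abundance vectors.

    0 means identical composition; 1 means no taxa shared.
    """
    num = sum(abs(a - b) for a, b in zip(x, y))
    den = sum(a + b for a, b in zip(x, y))
    return num / den if den else 0.0

# Hypothetical OTU counts for two paddy-soil communities
site_a = [20, 5, 0, 75]
site_b = [10, 5, 30, 55]
d = bray_curtis(site_a, site_b)
```

Computing this for every pair of sites and regressing the dissimilarities against geographic distance is the standard distance-decay analysis the abstract describes.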

  18. Large-scale analysis of the prevalence and geographic distribution of HIV-1 non-B variants in the United States.

    PubMed

    Pyne, Michael T; Hackett, John; Holzmayer, Vera; Hillyard, David R

    2013-08-01

    The genetic diversity of human immunodeficiency virus type 1 (HIV-1) has significant implications for diagnosis, vaccine development, and clinical management of patients. Although HIV-1 subtype B is predominant in the United States, factors such as global travel, immigration, and military deployment have the potential to increase the proportion of non-subtype B infections. Limited data are available on the prevalence and distribution of non-B HIV-1 strains in the United States. We sought to retrospectively examine the prevalence, geographic distribution, diversity, and temporal trends of HIV-1 non-B infections in samples obtained by ARUP Laboratories, a national reference laboratory, from all regions of the United States. HIV-1 pol sequences from 24,386 specimens collected from 46 states between 2004 and September 2011 for drug resistance genotyping were analyzed using the REGA HIV-1 Subtyping Tool, version 2.0. Sequences refractory to subtype determination or reported as non-subtype B by this tool were analyzed by PHYLIP version 3.5 and Simplot version 3.5.1. Non-subtype B strains accounted for 3.27% (798/24,386) of specimens. The 798 non-B specimens were received from 37 states and included 5 subtypes, 23 different circulating recombinant forms (CRFs), and 39 unique recombinant forms (URFs). The non-subtype B prevalence varied from 0% in 2004 (0/54) to 4.12% in 2011 (201/4,884). This large-scale analysis reveals that the diversity of HIV-1 in the United States is high, with multiple subtypes, CRFs, and URFs circulating. Moreover, the geographic distribution of non-B variants is widespread. Data from HIV-1 drug resistance testing have the potential to significantly enhance the surveillance of HIV-1 variants in the United States. PMID:23761148

  19. The effects of Reynolds number, rotor incidence angle and surface roughness on the heat transfer distribution in a large-scale turbine rotor passage

    NASA Technical Reports Server (NTRS)

    Blair, M. F.

    1991-01-01

A combined experimental and computational program was conducted to examine the heat transfer distribution in a turbine rotor passage geometrically similar to the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP). Heat transfer was measured and computed for both the full span suction and pressure surfaces of the rotor airfoil as well as for the hub endwall surface. The objective of the program was to provide a benchmark-quality database for the assessment of rotor heat transfer computational techniques. The experimental portion of the study was conducted in a large scale, ambient temperature, rotating turbine model. The computational portion consisted of the application of a well-posed parabolized Navier-Stokes analysis to the calculation of the three-dimensional viscous flow through ducts simulating a gas turbine passage. The results of this assessment indicate that the procedure has the potential to predict the aerodynamics and the heat transfer in a gas turbine passage and can be used to develop detailed three dimensional turbulence models for the prediction of skin friction and heat transfer in complex three dimensional flow passages.

  20. Impact of interannual changes of large scale circulation and hydrography on the spatial distribution of beaked redfish (Sebastes mentella) in the Irminger Sea

    NASA Astrophysics Data System (ADS)

    Núñez-Riboni, Ismael; Kristinsson, Kristján; Bernreuther, Matthias; van Aken, Hendrik M.; Stransky, Christoph; Cisewski, Boris; Rolskiy, Alexey

    2013-12-01

    This study provides evidence of the influence of hydrography and large scale ocean circulation on the geographical distribution of beaked redfish (Sebastes mentella) in the Irminger Sea on the interannual time scale, from 1992 to 2011. The results reveal the average relationship of adult pelagic redfish to their physical habitat from 100 to 800 m depth: the most preferred latitude, longitude, depth, temperature and salinity for redfish are approximately 58°N, 41°W, 557 m, 4.5 °C and 34.87, respectively. The redfish habitat corresponds in a temperature-salinity (TS) diagram to a mixing triangle between East Greenland Current Water (EGCW), Labrador Sea Water (LSW) and Irminger Current Water (ICW). The geographical centre of mass of the redfish distribution (as revealed by acoustic fish density) indicates displacements from year to year. Changes in hydrographic conditions were investigated in detail for possible reasons for these displacements. Empirical Orthogonal Analysis reveals that maximum variations of water mass volume on an interannual time-scale in the study region correspond to ICW and LSW changes, while EGCW remains comparatively stable. Indices of redfish geographical centroid, LSW volume, ICW temperature and Subpolar Gyre (SPG) intensity suggest that the geographical redfish displacements are closely related to interannual changes of ICW modulated by the SPG intensity with a lag of 1 or 2 years. In comparison, LSW seems to have no impact on the redfish distribution at the studied depth range. The time lag between ICW and redfish displacements indicates an indirect influence of temperature on redfish. Hence, changes of chlorophyll-a (from satellite imagery), as a proxy for primary production, were used in a first approach to study the role of food availability. The analysis is based on acoustic and trawl data from nine expeditions coordinated by the International Council for the Exploration of the Sea (ICES), around 71,000 hydrographic stations from the

  1. The effects of Reynolds number, rotor incidence angle, and surface roughness on the heat transfer distribution in a large-scale turbine rotor passage

    NASA Technical Reports Server (NTRS)

    Blair, Michael F.; Anderson, Olof L.

    1989-01-01

A combined experimental and computational program was conducted to examine the heat transfer distribution in a turbine rotor passage geometrically similar to the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP). Heat transfer was measured and computed for both the full-span suction and pressure surfaces of the rotor airfoil as well as for the hub endwall surface. The primary objective of the program was to provide a benchmark-quality data base for the assessment of rotor passage heat transfer computational procedures. The experimental portion of the study was conducted in a large-scale, ambient temperature, rotating turbine model. Heat transfer data were obtained using thermocouple and liquid-crystal techniques to measure temperature distributions on the thin, electrically-heated skin of the rotor passage model. Test data were obtained for various combinations of Reynolds number, rotor incidence angle and model surface roughness. The data are reported in the form of contour maps of Stanton number. These heat distribution maps revealed numerous local effects produced by the three-dimensional flows within the rotor passage. Of particular importance were regions of local enhancement produced on the airfoil suction surface by the main-passage and tip-leakage vortices and on the hub endwall by the leading-edge horseshoe vortex system. The computational portion consisted of the application of a well-posed parabolized Navier-Stokes analysis to the calculation of the three-dimensional viscous flow through ducts simulating a gas turbine passage. These cases include a 90 deg turning duct, a gas turbine cascade simulating a stator passage, and a gas turbine rotor passage including Coriolis forces. The calculated results were evaluated using experimental data of the three-dimensional velocity fields, wall static pressures, and wall heat transfer on the suction surface of the turbine airfoil and on the end wall. Particular attention was paid to an

  2. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  3. Large-scale inhomogeneities and galaxy statistics

    NASA Technical Reports Server (NTRS)

    Schaeffer, R.; Silk, J.

    1984-01-01

The density fluctuations associated with the formation of large-scale cosmic pancake-like and filamentary structures are evaluated using the Zel'dovich approximation for the evolution of nonlinear inhomogeneities in the expanding universe. It is shown that the large-scale nonlinear density fluctuations in the galaxy distribution due to pancakes modify the standard scale-invariant correlation function xi(r) at scales comparable to the coherence length of adiabatic fluctuations. The typical contribution of pancakes and filaments to the J3 integral, and more generally to the moments of galaxy counts in a volume of approximately (15-40 h(sup -1) Mpc)(sup 3), provides a statistical test for the existence of large-scale inhomogeneities. An application to several recent three-dimensional data sets shows that, despite large observational uncertainties over the relevant scales, characteristic features may be present that can be attributed to pancakes in most, but not all, of the various galaxy samples.
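The correlation function xi(r) referred to above is estimated in practice by pair counting. Below is a toy "natural" estimator, DD/RR - 1, evaluated at a single separation on invented point sets; real galaxy analyses use more careful estimators (e.g. Landy-Szalay) with edge corrections:

```python
import random

def pair_count(points, r):
    """Number of point pairs with 3-D separation less than r (brute force)."""
    r2, n, count = r * r, len(points), 0
    for i in range(n):
        xi, yi, zi = points[i]
        for j in range(i + 1, n):
            xj, yj, zj = points[j]
            if (xi - xj) ** 2 + (yi - yj) ** 2 + (zi - zj) ** 2 < r2:
                count += 1
    return count

def xi_natural(data, randoms, r):
    """Natural estimator xi(r) = DD/RR - 1 at one separation scale."""
    return pair_count(data, r) / pair_count(randoms, r) - 1.0

rng = random.Random(3)
randoms = [(rng.random(), rng.random(), rng.random()) for _ in range(400)]
uniform = [(rng.random(), rng.random(), rng.random()) for _ in range(400)]
# Clustered toy sample: 40 tight clusters of 10 points each
clustered = []
for _ in range(40):
    cx, cy, cz = rng.random(), rng.random(), rng.random()
    for _ in range(10):
        clustered.append((cx + rng.gauss(0, 0.01),
                          cy + rng.gauss(0, 0.01),
                          cz + rng.gauss(0, 0.01)))
xi_c = xi_natural(clustered, randoms, 0.05)
xi_u = xi_natural(uniform, randoms, 0.05)
```

The clustered sample shows a large positive xi at small separations while the uniform sample stays near zero, which is the basic signal the J3 and counts-in-cells statistics integrate over.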

  4. Variations over time in latitudinal distribution of the large-scale magnetic fields in the solar atmosphere at heights from the photosphere to the source surface

    NASA Astrophysics Data System (ADS)

    Akhtemov, Z. S.; Andreyeva, O. A.; Rudenko, G. V.; Stepanian, N. N.; Fainshtein, V. G.

    2015-02-01

    Calculations of the magnetic field in the solar atmosphere using the "potential field-source surface" model have been used to study time variations in several parameters of the large-scale magnetic field at various heights during the last four solar cycles. At ten heights from the solar surface (R = Ro) to the source surface (R = 2.5Ro), we have constructed synoptic charts (SC) of the radial component Br of the estimated magnetic field. For these SC, we have identified 10-degree latitudinal zones. Within these zones, we found values of Sp (positive Br values averaged within the latitudinal zone over latitude and longitude), Sm (averaged modulus of negative Br values) and S+fields (the percentage of the latitudinal zone area occupied by positive Br values). At lower latitudes, cyclic variations in the Sp + Sm parameter are demonstrated to be similar (but not in detail) to time variations in Wolf numbers. Latitudes of 55° and higher exhibited virtually no cyclic peculiarities of time variations in this parameter. The authors believe that this indicates the differing nature of the large-scale magnetic field in the near-equatorial and polar regions of the solar atmosphere. At R = 2.5Ro, Sp + Sm cyclic variations are almost invisible at all latitudes and only slightly apparent near the equator. The analysis of S+fields variations revealed that at low latitudes at R = 2.5Ro during solar cycles 21, 22 and the ascending phase of cycle 23 there were almost no mixed-polarity periods. However, beginning from the maximum of cycle 23, mixed polarity was observed in the near-equatorial region until the end of the long solar activity minimum. An assumption has been made that this might have been one of the forerunners and manifestations of the prolonged minimum between cycles 23 and 24. It has been found that during solar activity minima there appears poleward motion of magnetic fields with polarity opposite to that of the field at the pole. We have estimated the velocity of such a
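    The zone parameters defined above (Sp, Sm and S+fields) reduce to simple averages over the Br samples in a latitudinal zone. A toy sketch, assuming a flat list of Br samples per zone (the actual charts average over latitude and longitude within each 10° zone):

    ```python
    def zone_stats(br_samples):
        # Sp: mean of positive Br values; Sm: mean modulus of negative Br values;
        # s_plus: percentage of the zone occupied by positive Br values.
        pos = [b for b in br_samples if b > 0]
        neg = [b for b in br_samples if b < 0]
        sp = sum(pos) / len(pos) if pos else 0.0
        sm = -sum(neg) / len(neg) if neg else 0.0
        s_plus = 100.0 * len(pos) / len(br_samples)
        return sp, sm, s_plus
    ```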

  5. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  6. Fractals and cosmological large-scale structure

    NASA Technical Reports Server (NTRS)

    Luo, Xiaochun; Schramm, David N.

    1992-01-01

    Observations of galaxy-galaxy and cluster-cluster correlations as well as other large-scale structure can be fit with a 'limited' fractal with dimension D of about 1.2. This is not a 'pure' fractal out to the horizon: the distribution shifts from power law to random behavior at some large scale. If the observed patterns and structures are formed through an aggregation growth process, the fractal dimension D can serve as an interesting constraint on the properties of the stochastic motion responsible for limiting the fractal structure. In particular, it is found that the observed fractal should have grown from two-dimensional sheetlike objects such as pancakes, domain walls, or string wakes. This result is generic and does not depend on the details of the growth process.
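    A fractal dimension of the kind quoted (D of about 1.2) is commonly estimated from the slope of the correlation integral, C(r) ∝ r^D, in the Grassberger-Procaccia sense. A small pure-Python sketch of that estimator (illustrative only, not the authors' analysis pipeline):

    ```python
    import math

    def correlation_integral(points, r):
        # C(r): fraction of distinct point pairs separated by less than r
        n = len(points)
        close = 0
        for i in range(n):
            for j in range(i + 1, n):
                if math.dist(points[i], points[j]) < r:
                    close += 1
        return 2.0 * close / (n * (n - 1))

    def fractal_dimension(points, r1, r2):
        # The log-log slope of C(r) between two probe radii estimates D
        c1 = correlation_integral(points, r1)
        c2 = correlation_integral(points, r2)
        return (math.log(c2) - math.log(c1)) / (math.log(r2) - math.log(r1))
    ```

    As a sanity check, points scattered along a line yield an estimate near D = 1, matching the intuition that a limited fractal of D ≈ 1.2 is close to filamentary.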

  7. Modeling of Carbon Tetrachloride Flow and Transport in the Subsurface of the 200 West Disposal Sites: Large-Scale Model Configuration and Prediction of Future Carbon Tetrachloride Distribution Beneath the 216-Z-9 Disposal Site

    SciTech Connect

    Oostrom, Mart; Thorne, Paul D.; Zhang, Z. F.; Last, George V.; Truex, Michael J.

    2008-12-17

    Three-dimensional simulations considered migration of dense, nonaqueous phase liquid (DNAPL) consisting of CT and co-disposed organics in the subsurface as a function of the properties and distribution of subsurface sediments and of the properties and disposal history of the waste. Simulations of CT migration were conducted using the Water-Oil-Air mode of the Subsurface Transport Over Multiple Phases (STOMP) simulator. A large-scale model was configured to model CT and wastewater discharge from the major CT and wastewater disposal sites.

  8. Microfluidic large-scale integration.

    PubMed

    Thorsen, Todd; Maerkl, Sebastian J; Quake, Stephen R

    2002-10-18

    We developed high-density microfluidic chips that contain plumbing networks with thousands of micromechanical valves and hundreds of individually addressable chambers. These fluidic devices are analogous to electronic integrated circuits fabricated using large-scale integration. A key component of these networks is the fluidic multiplexor, which is a combinatorial array of binary valve patterns that exponentially increases the processing power of a network by allowing complex fluid manipulations with a minimal number of inputs. We used these integrated microfluidic networks to construct the microfluidic analog of a comparator array and a microfluidic memory storage device whose behavior resembles random-access memory. PMID:12351675
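    The exponential gain of the fluidic multiplexor follows from binary addressing: n flow channels need only 2·log2(n) control lines, one line per address bit plus its complement. A sketch of this accounting (function names are mine, not from the paper):

    ```python
    import math

    def control_lines(n_channels):
        # Binary multiplexing: each address bit needs a 'bit' line and a
        # 'complement' line, so n channels take 2 * ceil(log2 n) control lines.
        return 2 * math.ceil(math.log2(n_channels))

    def address_pattern(channel, n_bits):
        # Which of the paired lines to pressurize in order to select one channel
        return [("bit" if (channel >> i) & 1 else "comp", i) for i in range(n_bits)]
    ```

    On this accounting, a device with on the order of 1000 chambers is addressable with about 20 control lines, which is the scaling the abstract describes.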

  9. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    SciTech Connect

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E\\gt 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_{1}^{\\alpha }=(4.4\\pm 1.0)\\times {{10}^{-2}}$, that has a chance probability $P(\\geqslant r_{1}^{\\alpha })=6.4\\times {{10}^{-5}}$, reinforcing the hint previously reported with vertical events alone.
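    The first-harmonic amplitude quoted above comes from a standard Rayleigh analysis: with N arrival directions, r1 = (2/N)·sqrt((Σ cos α)² + (Σ sin α)²), and an isotropic sky produces an amplitude at least r1 with probability exp(-N·r1²/4). A minimal sketch of that estimator (illustrative, not the Auger analysis code):

    ```python
    import math

    def rayleigh_first_harmonic(alphas):
        # First-harmonic amplitude and phase of N right ascensions (radians):
        # r1 = (2/N) * sqrt(a^2 + b^2), a = sum cos(alpha), b = sum sin(alpha)
        n = len(alphas)
        a = sum(math.cos(x) for x in alphas)
        b = sum(math.sin(x) for x in alphas)
        return 2.0 * math.hypot(a, b) / n, math.atan2(b, a)

    def chance_probability(r1, n):
        # Probability that an isotropic sky yields an amplitude >= r1
        return math.exp(-n * r1 * r1 / 4.0)
    ```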

  10. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    DOE PAGESBeta

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the $E\\gt 8$ EeV energy bin, with an amplitude for the first harmonic in right ascension $r_{1}^{\\alpha }=(4.4\\pm 1.0)\\times {{10}^{-2}}$, that has a chance probability $P(\\geqslant r_{1}^{\\alpha })=6.4\\times {{10}^{-5}}$, reinforcing the hint previously reported with vertical events alone.

  11. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    PubMed Central

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distributions within the Haihe basin differ: the remote-sensing-derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining the capabilities of, and information from, model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems.
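    The extended Kalman filter analysis step used for this kind of assimilation can be sketched in scalar form (purely illustrative; WEP-L assimilates distributed state vectors, and the observation operator h here is a stand-in):

    ```python
    def ekf_update(x_prior, p_prior, z, h, h_jac, r):
        # Scalar EKF analysis step: blend a model forecast (x_prior, p_prior)
        # with an observation z made through operator h, where h_jac is dh/dx
        # evaluated at x_prior and r is the observation-error variance.
        innovation = z - h(x_prior)
        s = h_jac * p_prior * h_jac + r   # innovation variance
        k = p_prior * h_jac / s           # Kalman gain
        return x_prior + k * innovation, (1.0 - k * h_jac) * p_prior
    ```

    With equal forecast and observation variances the analysis lands halfway between model and observation, and the posterior variance is halved, which is the intuition behind letting MODIS-derived evapotranspiration pull the model state toward the finer-scale spatial pattern.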

  12. Changes in the distribution of the grey mangrove Avicennia marina (Forsk.) using large scale aerial color infrared photographs: are the changes related to habitat modification for mosquito control?

    NASA Astrophysics Data System (ADS)

    Jones, J.; Dale, P. E. R.; Chandica, A. L.; Breitfuss, M. J.

    2004-09-01

    Runnelling, a method of habitat modification used for mosquito management in intertidal saltmarshes in Australia, alters marsh hydrology. The objective of this research was to assess if runnelling had affected the distribution of the grey mangrove ( Avicennia marina (Forsk.)) at a study site in southeast Queensland. Since runnelling is carried out in diverse marshes a second aim was to assess differences in mangrove colonisation in the two main saltmarsh species in the area. These are marine couch [ Sporobolus virginicus (L.) Kunth.] and samphire [ Sarcocornia quinqueflora (Bunge ex Ung.-Stern.)]. Runnels at the study site were in an area dominated by Sporobolus. The mangrove area was measured by classifying digital color infrared (CIR) data obtained from aerial photographs acquired in 1982, which was 3 years before runnelling, and in 1987, 1991 and 1999, 2-14 years after. Changes in the spatial extent of A. marina were identified using difference images produced from post-classification change detection. The results showed that runnels did not significantly influence the distribution of A. marina at the study site. At a more detailed level differences in A. marina establishment in the Sporobolus and Sarcocornia areas were determined from counts of trees on the aerial photographs. There was a greater proportion of mangroves in Sarcocornia than Sporobolus and this increased over time. This may be related to differences in density between the plant species, to grapsid crab activity or to other edaphic conditions. There may be implications for runnelling in Sarcocornia marshes. The large increase observed in A. marina in the area generally is likely to be related to factors such as catchment modification or tidal/sea-level changes. It is concluded that runnelling has not led to mangrove establishment in the Sporobolus dominated saltmarsh.

  13. The role of fine material and grain size distribution on excess pore pressure dissipation and particle support mechanisms in granular deposits based in large-scale physical experiments

    NASA Astrophysics Data System (ADS)

    Palucis, M. C.; Kaitna, R.; Tewoldebrhan, B.; Hill, K. M.; Dietrich, W. E.

    2011-12-01

    The dominant mechanisms behind sustained mobilization in granular debris flows are poorly understood, and experiments are needed to determine the conditions under which the fluid can fully support the coarse fraction. However, field-scale studies are difficult to instrument and constrain and laboratory studies suffer from scaling issues. A 4-m rotating drum located at UC Berkeley's Richmond Field Station allowed us to perform reproducible experiments with materials similar to those in the field to explore mechanisms relevant to slow pore fluid pressure dissipation. Specifically, we performed a series of experiments to assess the role of fines and grain size distribution on the rate of pore fluid pressure dissipation upon deposition of a granular mass. For each experiment we kept the total mass of the gravel particles constant and varied the amount of fines (from no fines to amounts found in an actual debris flow deposit) and the gravel particle size distribution (from a single grain size to a range found in natural flows). We first rotated each mixture in the drum, during which we monitored fluid pressures at the base of the flows (near the wall of the drum and at the center). Then we stopped the drum and continued to monitor the fluid pressures. Immediately upon stopping, the pore fluid pressure was nearly hydrostatic for the gravel-water flows, and any elevated pore pressure quickly dissipated. On the other hand, the mixtures with fines contents close to those found in actual debris flows had elevated pore pressures indicating they were almost fully liquefied. Furthermore, the rate of pore pressure dissipation was an order of magnitude slower than when no fines were present; the grain size distribution of the coarse fraction did not strongly influence the dissipation rates in either case. 
We also placed a cobble upon a fines-rich mixture after cessation of motion above the center pressure sensor, and observed that the pore fluid pressure rose instantly, bearing

  14. Spatial and temporal distributions of contaminant body burden and disease in Gulf of Mexico oyster populations: The role of local and large-scale climatic controls

    NASA Astrophysics Data System (ADS)

    Wilson, E. A.; Powell, E. N.; Wade, T. L.; Taylor, R. J.; Presley, B. J.; Brooks, J. M.

    1992-06-01

    As part of NOAA's Status and Trends Program, oysters were sampled from 43 sites throughout the Gulf of Mexico from Brownsville, Texas, to the Florida Everglades from 1986 to 1989. Oysters were analysed for body burden of a suite of metals and petroleum aromatic hydrocarbons (PAHs), the prevalence and intensity of the oyster pathogen Perkinsus marinus, and condition index. The contaminants fell into two groups based on the spatial distribution of body burden throughout the Gulf. Arsenic, selenium, mercury and cadmium were characterized by clinal reduction in similarity with distance, reminiscent of that followed by mean monthly temperature and precipitation. Zinc, copper, PAHs and silver showed no consistent geographic trend. Within local regions, industrial and agricultural land use and P. marinus prevalence and infection intensity frequently correlated with body burden. Contaminants and biological attributes followed one of three temporal trends. Zinc, copper and PAHs showed concordant shifts over 4 years throughout the eastern and southern Gulf. Mercury and cadmium showed concordant shifts in the northwestern Gulf. Selenium, arsenic, length, condition index and P. marinus prevalence and infection intensity showed concordant shifts throughout most of the entire Gulf. Concordant shifts suggest that climatic factors, the El Niño/Southern Oscillation being one example, exert a strong influence on biological attributes and contaminant body burdens in the Gulf. Correlative factors are those that probably affect or indicate the rate of tissue turnover and the frequency of reproduction; namely, temperature, disease intensity, condition index and length.

  15. Large scale topography of Io

    NASA Technical Reports Server (NTRS)

    Gaskell, R. W.; Synnott, S. P.

    1987-01-01

    To investigate the large scale topography of the Jovian satellite Io, both limb observations and stereographic techniques applied to landmarks are used. The raw data for this study consist of Voyager 1 images of Io: 800x800 arrays of picture elements, each of which can take on 256 possible brightness values. In analyzing these data it was necessary to identify and locate landmarks and limb points on the raw images, remove the image distortions caused by the camera electronics, and translate the corrected locations into positions relative to a reference geoid. Minimizing the uncertainty in the corrected locations is crucial to the success of this project. In the highest resolution frames, an error of a tenth of a pixel in image space location can lead to a 300 m error in true location. In the lowest resolution frames, the same error can lead to an uncertainty of several km.

  16. Challenges for Large Scale Simulations

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias

    2010-03-01

    With computational approaches becoming ubiquitous the growing impact of large scale computing on research influences both theoretical and experimental work. I will review a few examples in condensed matter physics and quantum optics, including the impact of computer simulations in the search for supersolidity, thermometry in ultracold quantum gases, and the challenging search for novel phases in strongly correlated electron systems. While only a decade ago such simulations needed the fastest supercomputers, many simulations can now be performed on small workstation clusters or even a laptop: what was previously restricted to a few experts can now potentially be used by many. Only part of the gain in computational capabilities is due to Moore's law and improvement in hardware. Equally impressive is the performance gain due to new algorithms - as I will illustrate using some recently developed algorithms. At the same time modern peta-scale supercomputers offer unprecedented computational power and allow us to tackle new problems and address questions that were impossible to solve numerically only a few years ago. While there is a roadmap for future hardware developments to exascale and beyond, the main challenges are on the algorithmic and software infrastructure side. Among the problems that face the computational physicist are: the development of new algorithms that scale to thousands of cores and beyond, a software infrastructure that lifts code development to a higher level and speeds up the development of new simulation programs for large scale computing machines, tools to analyze the large volume of data obtained from such simulations, and as an emerging field provenance-aware software that aims for reproducibility of the complete computational workflow from model parameters to the final figures. 
Interdisciplinary collaborations and collective efforts will be required, in contrast to the cottage-industry culture currently present in many areas of computational

  17. ADHydro: A Parallel Implementation of a Large-scale High-Resolution Multi-Physics Distributed Water Resources Model Using the Charm++ Run Time System

    NASA Astrophysics Data System (ADS)

    Steinke, R. C.; Ogden, F. L.; Lai, W.; Moreno, H. A.; Pureza, L. G.

    2014-12-01

    Physics-based watershed models are useful tools for hydrologic studies, water resources management and economic analyses in the contexts of climate, land-use, and water-use changes. This poster presents a parallel implementation of a quasi 3-dimensional, physics-based, high-resolution, distributed water resources model suitable for simulating large watersheds in a massively parallel computing environment. Developing this model is one of the objectives of the NSF EPSCoR RII Track II CI-WATER project, which is joint between Wyoming and Utah EPSCoR jurisdictions. The model, which we call ADHydro, is aimed at simulating important processes in the Rocky Mountain west, including: rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow, water management and irrigation. Model forcing is provided by the Weather Research and Forecasting (WRF) model, and ADHydro is coupled with the NOAH-MP land-surface scheme for calculating fluxes between the land and atmosphere. The ADHydro implementation uses the Charm++ parallel run time system. Charm++ is based on location transparent message passing between migrateable C++ objects. Each object represents an entity in the model such as a mesh element. These objects can be migrated between processors or serialized to disk allowing the Charm++ system to automatically provide capabilities such as load balancing and checkpointing. Objects interact with each other by passing messages that the Charm++ system routes to the correct destination object regardless of its current location. This poster discusses the algorithms, communication patterns, and caching strategies used to implement ADHydro with Charm++. The ADHydro model code will be released to the hydrologic community in late 2014.
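    The location-transparent messaging idea can be caricatured in a few lines: senders address object ids, never processor locations, so the runtime is free to migrate objects between processors. This is a toy Python sketch of the pattern only, not the Charm++ API (real chares carry entry methods, serialization, and load balancing):

    ```python
    class Router:
        # Toy message router: objects register under an id; senders address ids.
        # Migration amounts to re-registering an object, invisible to senders.
        def __init__(self):
            self.objects = {}

        def register(self, obj_id, obj):
            self.objects[obj_id] = obj

        def send(self, obj_id, msg):
            self.objects[obj_id].receive(msg)

    class MeshElement:
        # Stand-in for a mesh-element object that accumulates incoming messages
        def __init__(self, router, obj_id):
            self.inbox = []
            router.register(obj_id, self)

        def receive(self, msg):
            self.inbox.append(msg)
    ```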

  18. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  19. The large-scale distribution and internal geometry of the fall 2000 Po River flood deposit: Evidence from digital X-radiography

    USGS Publications Warehouse

    Wheatcroft, R.A.; Stevens, A.W.; Hunt, L.M.; Milligan, T.G.

    2006-01-01

    Event-response coring on the Po River prodelta (northern Adriatic Sea) coupled with shipboard digital X-radiography, resistivity profiling, and grain-size analyses permitted documentation of the initial distribution and physical properties of the October 2000 flood deposit. The digital X-radiography system comprises a constant-potential X-ray source and an amorphous silicon imager with an active area of 29??42 cm and 12-bit depth resolution. Objective image segmentation algorithms based on bulk density (brightness), layer contacts (edge detection) and small-scale texture (fabric) were used to identify the flood deposit. Results indicate that the deposit formed in water depths of 6-29 m immediately adjacent to the three main distributary mouths of the Po (Pila, Tolle and Gnocca/Goro). Maximal thickness was 36 cm at a 20-m site off the main mouth (Pila), but many other sites hadthicknesses >20 cm. The Po flood deposit has a complex internal stratigraphy, with multiple layers, a diverse suite of physical sedimentary structures (e.g., laminations, ripple cross bedding, lenticular bedding, soft-sediment deformation structures), and dramatic changes in grain size that imply rapid deposition and fluctuations in energy during emplacement. Based on the flood deposit volume and well-constrained measurements of deposit bulk density the mass of the flood deposit was estimated to be 16??109 kg, which is about two-thirds of the estimated suspended sediment load delivered by the river during the event. The locus of deposition, overall thickness, and stratigraphic complexity of the flood deposit can best be explained by the relatively long sediment throughput times of the Po River, whereby sediment is delivered to the ocean during a range of conditions (i.e., the storm responsible for the precipitation is long gone), the majority of which are reflective of the fair-weather condition. Sediment is therefore deposited proximal to the river mouths, where it can form thick, but

  20. Large-scale distribution and activity patterns of an extremely low-light-adapted population of green sulfur bacteria in the Black Sea.

    PubMed

    Marschall, Evelyn; Jogler, Mareike; Hessge, Uta; Overmann, Jörg

    2010-05-01

    The Black Sea chemocline represents the largest extant habitat of anoxygenic phototrophic bacteria and harbours a monospecific population of Chlorobium phylotype BS-1. High-sensitivity measurements of underwater irradiance and sulfide revealed that the optical properties of the overlying water column were similar across the Black Sea basin, whereas the vertical profiles of sulfide varied strongly between sampling sites and caused a dome-shaped three-dimensional distribution of the green sulfur bacteria. In the centres of the western and eastern basins the population of BS-1 reached upward to depths of 80 and 95 m, respectively, but was detected only at 145 m depth close to the shelf. Using highly concentrated chemocline samples from the centres of the western and eastern basins, the cells were found to be capable of anoxygenic photosynthesis under in situ light conditions and exhibited a photosynthesis-irradiance curve similar to low-light-adapted laboratory cultures of Chlorobium BS-1. Application of a highly specific RT-qPCR method which targets the internal transcribed spacer (ITS) region of the rrn operon of BS-1 demonstrated that only cells at the central station are physiologically active, in contrast to those at the Black Sea periphery. Based on the detection of ITS-DNA sequences in the flocculent surface layer of deep-sea sediments across the Black Sea, the population of BS-1 has occupied the major part of the basin for the last decade. The continued presence of intact but non-growing BS-1 cells at the periphery of the Black Sea indicates that the cells can survive long-distance transport and exhibit unusually low maintenance energy requirements. According to laboratory measurements, Chlorobium BS-1 has a maintenance energy requirement of approximately 1.6-4.9 × 10^(-15) kJ cell^(-1) day^(-1), which is the lowest value determined for any bacterial culture so far. 
Chlorobium BS-1 thus is particularly well adapted to survival under the extreme low-light conditions

  1. Large Scale Homing in Honeybees

    PubMed Central

    Pahl, Mario; Zhu, Hong; Tautz, Jürgen; Zhang, Shaowu

    2011-01-01

    Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses and by integrating their outbound flight paths. In order to investigate the capacity of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large scale homing is facilitated by global landmarks acting as beacons, and possibly the entire skyline panorama. PMID:21602920

  2. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.
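    The misalignment angle used as a discriminant above is simply the angle between the predicted peculiar-velocity vector v(R) and the CMB dipole apex direction. A minimal sketch (the vectors here are illustrative, not survey values):

    ```python
    import math

    def misalignment_angle(v, d):
        # Angle in degrees between velocity vector v and dipole direction d,
        # via the normalized dot product (clamped against rounding error).
        dot = sum(a * b for a, b in zip(v, d))
        norm = math.sqrt(sum(a * a for a in v)) * math.sqrt(sum(b * b for b in d))
        return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))
    ```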

  3. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.
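    The parallel-plate electrostatic actuators described above follow the textbook attractive-force law F = ε0·A·V²/(2g²) for plate area A, drive voltage V and gap g. A quick sketch with illustrative dimensions (not the device parameters reported in the paper):

    ```python
    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def electrostatic_force(area_m2, voltage_v, gap_m):
        # Attractive force between parallel plates: F = eps0 * A * V^2 / (2 g^2)
        return EPS0 * area_m2 * voltage_v ** 2 / (2.0 * gap_m ** 2)
    ```

    The inverse-square dependence on the gap is why such actuators achieve useful forces only across the thin, lithographically defined gaps the abstract describes.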

  4. Measurement of the steady surface pressure distribution on a single rotation large scale advanced prop-fan blade at Mach numbers from 0.03 to 0.78

    NASA Technical Reports Server (NTRS)

    Bushnell, Peter

    1988-01-01

    The aerodynamic pressure distribution was determined on a rotating Prop-Fan blade at the S1-MA wind tunnel facility operated by the Office National d'Etudes et de Recherches Aerospatiales (ONERA) in Modane, France. The pressure distributions were measured at thirteen radial stations on a single rotation Large Scale Advanced Prop-Fan (LAP/SR7) blade, for a sequence of operating conditions including inflow Mach numbers ranging from 0.03 to 0.78. Pressure distributions for more than one power coefficient and/or advance ratio setting were measured for most of the inflow Mach numbers investigated. Due to facility power limitations the Prop-Fan test installation was a two-bladed version of the eight-bladed design configuration. The power coefficient range investigated was therefore selected to cover typical power loading per blade conditions which occur within the Prop-Fan operating envelope. The experimental results provide an extensive source of information on the aerodynamic behavior of the swept Prop-Fan blade, including details which have been elusive to current computational models and do not appear in the two-dimensional airfoil data.

  5. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  6. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  7. Large-Scale Reform Comes of Age

    ERIC Educational Resources Information Center

    Fullan, Michael

    2009-01-01

    This article reviews the history of large-scale education reform and makes the case that large-scale or whole-system reform policies and strategies are becoming increasingly evident. The review briefly addresses the pre-1997 period, concluding that while the pressure for reform was mounting, there were very few examples of deliberate or…

  8. Large-scale infrared scene projectors

    NASA Astrophysics Data System (ADS)

    Murray, Darin A.

    1999-07-01

    Large-scale infrared scene projectors typically have unique opto-mechanical characteristics associated with their application. This paper outlines two large-scale zoom lens assemblies with different environmental and package constraints. Various challenges and their respective solutions are discussed.

  9. Gravity and large-scale nonlocal bias

    NASA Astrophysics Data System (ADS)

    Chan, Kwan Chuen; Scoccimarro, Román; Sheth, Ravi K.

    2012-04-01

    For Gaussian primordial fluctuations, the relationship between galaxy and matter overdensities (bias) is most often assumed to be local at the time of observation in the large-scale limit. This hypothesis, however, is unstable under time evolution, as we prove under several (increasingly realistic) sets of assumptions. In the simplest toy model, galaxies are created locally and linearly biased at a single formation time, and subsequently move with the dark matter (no velocity bias) conserving their comoving number density (no merging). We show that, after this formation time, the bias becomes unavoidably nonlocal and nonlinear at large scales. We identify the nonlocal gravitationally induced fields in which the galaxy overdensity can be expanded, showing that they can be constructed out of the invariants of the deformation tensor (Galileons), the main signature of which is a quadrupole field in second-order perturbation theory. In addition, we show that this result persists if we include an arbitrary evolution of the comoving number density of tracers. We then include velocity bias and show that new contributions appear; these are related to the breaking of Galilean invariance of the bias relation, a dipole field being the signature at second order. We test these predictions by studying the dependence of halo overdensities in cells of fixed dark matter density: measurements in simulations show that departures from the mean bias relation are strongly correlated with the nonlocal gravitationally induced fields identified by our formalism, suggesting that the halo distribution at the present time is indeed more closely related to the mass distribution at an earlier rather than the present time. However, the nonlocality seen in the simulations is not fully captured by assuming local bias in Lagrangian space. The effects of nonlocal bias seen in the simulations are most important for the most biased halos, as expected from our predictions. Accounting for these

  10. Synthesis of small and large scale dynamos

    NASA Astrophysics Data System (ADS)

    Subramanian, Kandaswamy

    Using a closure model for the evolution of magnetic correlations, we uncover an interesting plausible saturated state of the small-scale fluctuation dynamo (SSD) and a novel analogy between quantum mechanical tunnelling and the generation of large-scale fields. Large-scale fields develop via the α-effect, but as magnetic helicity can only change on a resistive timescale, the time it takes to organize the field into large scales increases with magnetic Reynolds number. This is very similar to the results obtained from simulations using the full MHD equations.

  11. Curvature constraints from large scale structure

    NASA Astrophysics Data System (ADS)

    Di Dio, Enea; Montanari, Francesco; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-06-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter ΩK with future galaxy surveys are affected by relativistic effects, which influence observations of the large-scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms, and terms involving the gravitational potential. As an application, we consider angle- and redshift-dependent power spectra, which are especially well suited for model-independent cosmological constraints. We compute our results for a representative deep, wide spectroscopic survey; they show the impact of relativistic corrections on estimation of the spatial curvature parameter. We show that constraints on the curvature parameter may be strongly biased if, in particular, cosmic magnification is not included in the analysis. Other relativistic effects turn out to be subdominant in the studied configuration. We analyze how the shift in the estimated best-fit value for the curvature and other cosmological parameters depends on the magnification bias parameter, and find that significant biases are to be expected if this term is not properly considered in the analysis.
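    The mechanics of a Fisher forecast like the one described above can be sketched generically. The following toy example (illustrative numbers only, not the paper's modified CLASS pipeline; the two parameters and the data covariance are assumptions) shows the standard recipe: build the Fisher matrix from parameter derivatives of the observable and the data covariance, then read marginalized 1σ errors off the inverse.

    ```python
    import numpy as np

    # Hypothetical two-parameter forecast: derivatives of a model observable
    # with respect to each parameter, per data bin, plus a diagonal data
    # covariance. All numbers are illustrative.
    derivs = np.array([[0.8, 0.1, 0.3],   # d(observable)/d(param 1) per bin
                       [0.2, 0.9, 0.4]])  # d(observable)/d(param 2) per bin
    cov = np.diag([0.05, 0.04, 0.06])     # data covariance per bin

    # Fisher matrix: F_ij = sum over bins of dO/dp_i * C^-1 * dO/dp_j
    fisher = derivs @ np.linalg.inv(cov) @ derivs.T

    # Marginalized 1-sigma forecast errors: sqrt of the diagonal of F^-1.
    errors = np.sqrt(np.diag(np.linalg.inv(fisher)))
    print(errors)
    ```

    Omitting a relevant term (such as cosmic magnification) from `derivs` changes the best-fit shift and the forecast errors, which is the kind of bias the paper quantifies.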

  12. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V.; Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  13. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  14. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  15. Contribution of peculiar shear motions to large-scale structure

    NASA Technical Reports Server (NTRS)

    Mueler, Hans-Reinhard; Treumann, Rudolf A.

    1994-01-01

    Self-gravitating shear flow instability simulations in a cold dark matter-dominated expanding Einstein-de Sitter universe have been performed. When the shear flow speed exceeds a certain threshold, a self-gravitating Kelvin-Helmholtz instability occurs, forming density voids and excesses along the shear flow layer which serve as seeds for large-scale structure formation. A possible mechanism for generating shear peculiar motions is velocity fluctuations induced by the density perturbations of the post-inflation era. In this scenario, short scales grow earlier than large scales. A model of this kind may contribute to the cellular structure of the luminous mass distribution in the universe.

  16. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

    We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models in the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density that we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.

  17. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification of the different existing approaches to dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed differ somewhat from those reviewed in other papers. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  18. Three-Dimensional Numerical Simulation on Passively Excited Flows by Distributed Local Hot Sources Settled at the D" Layer Below Hotspots and/or Large-Scale Cool Masses at Subduction Zones Within the Static Layered Mantle

    NASA Astrophysics Data System (ADS)

    Eguchi, T.; Matsubara, K.; Ishida, M.

    2001-12-01

    To unveil the dynamic processes associated with three-dimensional unsteady mantle convection, we carried out numerical simulations of flows passively excited by simplified local hot sources just above the CMB and large-scale cool masses beneath smoothed subduction zones. We used our own code, developed with the finite-difference method. The three basic equations are those for continuity, momentum (with the Boussinesq, i.e., incompressible, approximation), and thermal energy conservation. The viscosity of our model is sensitive to temperature. To obtain time integration with high precision, we used the Newton method. In detail, the size and thermal energy of the hot and cool sources are not uniform along the latitude, because we could not assign uniform local volumes for the sources within the finite-difference grid throughout the mantle; our results therefore carry some latitude dependence. First, we treated the case of the hotspots, neglecting the contribution of the subduction zones. The local hot sources below the currently active hotspots were introduced as dynamic driving forces in the initial condition. Before starting the calculation, we assumed that the mantle was statically layered with zero velocity. The thermal anomalies inserted instantaneously in the initial condition excite dynamically passive flows. The type of the initial hot sources was not 'plume' but 'thermal.' The simulation results show that local upwelling flows excited directly above the initial heat sources reached the upper mantle within approximately 30 My of the calculation. Each of the direct upwellings above the hotspots has its own dynamic potential to excite concentric down- and upwelling flows, alternately, at large distances. Simultaneously, the direct upwellings interact mutually within the spherical mantle.
As an interesting feature, we numerically observed secondary upwellings somewhere in a wide region covering east Eurasia

  19. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency managed a decade marked by a rapid expansion of funds and manpower in the first half and an almost equally rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  20. A Large Scale Computer Terminal Output Controller.

    ERIC Educational Resources Information Center

    Tucker, Paul Thomas

    This paper describes the design and implementation of a large scale computer terminal output controller which supervises the transfer of information from a Control Data 6400 Computer to a PLATO IV data network. It discusses the cost considerations leading to the selection of educational television channels rather than telephone lines for…

  1. Large Scale Commodity Clusters for Lattice QCD

    SciTech Connect

    A. Pochinsky; W. Akers; R. Brower; J. Chen; P. Dreher; R. Edwards; S. Gottlieb; D. Holmgren; P. Mackenzie; J. Negele; D. Richards; J. Simone; W. Watson

    2002-06-01

    We describe the construction of large scale clusters for lattice QCD computing being developed under the umbrella of the U.S. DoE SciDAC initiative. We discuss the study of floating point and network performance that drove the design of the cluster, and present our plans for future multi-Terascale facilities.

  2. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  3. ARPACK: Solving large scale eigenvalue problems

    NASA Astrophysics Data System (ADS)

    Lehoucq, Rich; Maschhoff, Kristi; Sorensen, Danny; Yang, Chao

    2013-11-01

    ARPACK is a collection of Fortran77 subroutines designed to solve large scale eigenvalue problems. The package is designed to compute a few eigenvalues and corresponding eigenvectors of a general n by n matrix A. It is most appropriate for large sparse or structured matrices A, where structured means that a matrix-vector product w ← Av requires only order n rather than the usual order n² floating point operations.
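    ARPACK itself is Fortran77, but the same implicitly restarted Arnoldi iteration is exposed through SciPy's `scipy.sparse.linalg.eigs` wrapper. A minimal sketch (the matrix and parameter choices are illustrative): compute a few extremal eigenpairs of a large sparse operator using only matrix-vector products.

    ```python
    import scipy.sparse as sp
    from scipy.sparse.linalg import eigs  # SciPy's ARPACK wrapper

    # A large sparse structured matrix: the 1-D Laplacian on 1000 points.
    n = 1000
    A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format='csr')

    # Ask ARPACK for the k=4 largest-magnitude eigenpairs ('LM').
    # Only products A @ v are needed, so A is never formed densely.
    vals, vecs = eigs(A, k=4, which='LM')

    print(sorted(vals.real, reverse=True))  # all close to 4.0 for this operator
    ```

    For this operator the largest eigenvalues are 2 - 2cos(kπ/(n+1)), which cluster just below 4; the point of the sketch is that the O(n) matrix-vector product is the only access to A that ARPACK requires.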

  4. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-07-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  5. Large-scale CFB combustion demonstration project

    SciTech Connect

    Nielsen, P.T.; Hebb, J.L.; Aquino, R.

    1998-04-01

    The Jacksonville Electric Authority's large-scale CFB demonstration project is described. Given the early stage of project development, the paper focuses on the project organizational structure, its role within the Department of Energy's Clean Coal Technology Demonstration Program, and the projected environmental performance. A description of the CFB combustion process is included.

  6. Decomposition and coordination of large-scale operations optimization

    NASA Astrophysics Data System (ADS)

    Cheng, Ruoyu

    Nowadays, highly integrated manufacturing has resulted in more and more large-scale industrial operations. As one of the most effective strategies to ensure high-level operations in modern industry, large-scale engineering optimization has garnered a great amount of interest from academic scholars and industrial practitioners. Large-scale optimization problems frequently occur in industrial applications, and many of them naturally present special structure or can be transformed to take special structure. Some decomposition and coordination methods have the potential to solve these problems at a reasonable speed. This thesis focuses on three classes of large-scale optimization problems: linear programming, quadratic programming, and mixed-integer programming problems. The main contributions include the design of structural complexity analysis for investigating scaling behavior and computational efficiency of decomposition strategies, novel coordination techniques and algorithms to improve the convergence behavior of decomposition and coordination methods, as well as the development of a decentralized optimization framework which embeds the decomposition strategies in a distributed computing environment. The complexity study can provide fundamental guidelines for practical applications of the decomposition and coordination methods. In this thesis, several case studies imply the viability of the proposed decentralized optimization techniques for real industrial applications. A pulp mill benchmark problem is used to investigate the applicability of the LP/QP decentralized optimization strategies, while a truck allocation problem in the decision support of mining operations is used to study the MILP decentralized optimization strategies.
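    The decomposition-and-coordination idea can be illustrated with textbook dual decomposition (this is a generic sketch, not code from the thesis; the subproblem data and step size are assumptions): two quadratic subproblems share one resource constraint, each is solved independently, and a coordinator adjusts a price until the shared constraint is satisfied.

    ```python
    # Minimize (x1 - a1)^2 + (x2 - a2)^2  subject to  x1 + x2 = b.
    # Dual decomposition: each subproblem minimizes its own cost plus
    # lam * x_i; the coordinator updates the price lam by dual ascent.
    a1, a2, b = 3.0, 5.0, 6.0   # illustrative data
    lam, step = 0.0, 0.5

    for _ in range(200):
        # Independent subproblem solves: argmin (x - a)^2 + lam * x.
        x1 = a1 - lam / 2.0
        x2 = a2 - lam / 2.0
        # Coordinator: raise the price if the shared resource is over-used.
        lam += step * (x1 + x2 - b)

    print(round(x1, 3), round(x2, 3), round(lam, 3))  # → 2.0 4.0 2.0
    ```

    The subproblem solves require no knowledge of each other, which is what makes the scheme amenable to the distributed computing environment described above; the price update is the only coordination traffic.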

  7. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  8. Large-scale extraction of proteins.

    PubMed

    Cunha, Teresa; Aires-Barros, Raquel

    2002-01-01

    The production of foreign proteins using a selected host with the necessary posttranslational modifications is one of the key successes of modern biotechnology. This methodology allows the industrial production of proteins that otherwise are produced in small quantities. However, the separation and purification of these proteins from the fermentation media constitutes a major bottleneck for the widespread commercialization of recombinant proteins. The major production costs (50-90%) for a typical biological product reside in the purification strategy. There is a need for efficient, effective, and economic large-scale bioseparation techniques to achieve high purity and high recovery while maintaining the biological activity of the molecule. Aqueous two-phase systems (ATPS) allow process integration, as separation and concentration of the target protein are achieved simultaneously, with subsequent removal and recycling of the polymer. The ease of scale-up combined with the high partition coefficients obtained allows their potential application in large-scale downstream processing of proteins produced by fermentation. The equipment and methodology for aqueous two-phase extraction of proteins on a large scale using mixer-settler and column contactors are described. The operation of the columns, either stagewise or differential, is summarized. A brief description of the methods used to account for mass transfer coefficients, hydrodynamic parameters of hold-up, drop size, and velocity, back mixing in the phases, and flooding performance, required for column design, is also provided. PMID:11876297
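    The role of the partition coefficient mentioned above can be made concrete with the standard single-stage mass-balance relation (a textbook relation, not taken from this chapter; the numbers below are illustrative): the fraction of protein recovered in the top phase follows from the partition coefficient K and the phase volume ratio R.

    ```python
    # Single-stage aqueous two-phase extraction yield (textbook relation):
    # K = C_top / C_bottom is the partition coefficient,
    # R = V_top / V_bottom is the phase volume ratio.
    def top_phase_yield(K, R):
        """Fraction of total protein recovered in the top phase."""
        return 1.0 / (1.0 + 1.0 / (K * R))

    # A high partition coefficient gives near-complete single-stage recovery.
    print(round(top_phase_yield(K=10.0, R=1.0), 3))  # → 0.909
    ```

    This is why the high partition coefficients attainable in ATPS matter for scale-up: with K around 10 a single equilibrium stage already recovers over 90% of the target protein.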

  9. Large scale processes in the solar nebula.

    NASA Astrophysics Data System (ADS)

    Boss, A. P.

    Most proposed chondrule formation mechanisms involve processes occurring inside the solar nebula, so the large scale (roughly 1 to 10 AU) structure of the nebula is of general interest for any chondrule-forming mechanism. Chondrules and Ca, Al-rich inclusions (CAIs) might also have been formed as a direct result of the large scale structure of the nebula, such as passage of material through high temperature regions. While recent nebula models do predict the existence of relatively hot regions, the maximum temperatures in the inner planet region may not be high enough to account for chondrule or CAI thermal processing, unless the disk mass is considerably greater than the minimum mass necessary to restore the planets to solar composition. Furthermore, it does not seem to be possible to achieve both rapid heating and rapid cooling of grain assemblages in such a large scale furnace. However, if the accretion flow onto the nebula surface is clumpy, as suggested by observations of variability in young stars, then clump-disk impacts might be energetic enough to launch shock waves which could propagate through the nebula to the midplane, thermally processing any grain aggregates they encounter, and leaving behind a trail of chondrules.

  10. Large-scale functional connectivity networks in the rodent brain.

    PubMed

    Gozzi, Alessandro; Schwarz, Adam J

    2016-02-15

    Resting-state functional Magnetic Resonance Imaging (rsfMRI) of the human brain has revealed multiple large-scale neural networks within a hierarchical and complex structure of coordinated functional activity. These distributed neuroanatomical systems provide a sensitive window on brain function and its disruption in a variety of neuropathological conditions. The study of macroscale intrinsic connectivity networks in preclinical species, where genetic and environmental conditions can be controlled and manipulated with high specificity, offers the opportunity to elucidate the biological determinants of these alterations. While rsfMRI methods are now widely used in human connectivity research, these approaches have only relatively recently been back-translated into laboratory animals. Here we review recent progress in the study of functional connectivity in rodent species, emphasising the ability of this approach to resolve large-scale brain networks that recapitulate neuroanatomical features of known functional systems in the human brain. These include, but are not limited to, a distributed set of regions identified in rats and mice that may represent a putative evolutionary precursor of the human default mode network (DMN). The impact and control of potential experimental and methodological confounds are also critically discussed. Finally, we highlight the enormous potential and some initial application of connectivity mapping in transgenic models as a tool to investigate the neuropathological underpinnings of the large-scale connectional alterations associated with human neuropsychiatric and neurological conditions. We conclude by discussing the translational potential of these methods in basic and applied neuroscience. PMID:26706448

  11. Quantum Noise in Large-Scale Coherent Nonlinear Photonic Circuits

    NASA Astrophysics Data System (ADS)

    Santori, Charles; Pelc, Jason S.; Beausoleil, Raymond G.; Tezak, Nikolas; Hamerly, Ryan; Mabuchi, Hideo

    2014-06-01

    A semiclassical simulation approach is presented for studying quantum noise in large-scale photonic circuits incorporating an ideal Kerr nonlinearity. A circuit solver is used to generate matrices defining a set of stochastic differential equations, in which the resonator field variables represent random samplings of the Wigner quasiprobability distributions. Although the semiclassical approach involves making a large-photon-number approximation, tests on one- and two-resonator circuits indicate satisfactory agreement between the semiclassical and full-quantum simulation results in the parameter regime of interest. The semiclassical model is used to simulate random errors in a large-scale circuit that contains 88 resonators and hundreds of components in total and functions as a four-bit ripple counter. The error rate as a function of on-state photon number is examined, and it is observed that the quantum fluctuation amplitudes do not increase as signals propagate through the circuit, an important property for scalability.

  12. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

    Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat [1-3]. Multi-agent simulations in particular are now commonplace in many fields [4, 5]. By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects where closed-form solutions are difficult to solve or impossible to derive [6]. To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact - for example, agent height is related to agent weight - they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application [7, 8]. Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large scale social simulations [9-11]. One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more
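    The two-part recipe described above (attributes drawn from distributions, plus relations between agents) can be sketched in a few lines. This is an illustrative toy, not the Hats codebase; the attribute names, distributions, and role weights are all assumptions.

    ```python
    import random

    random.seed(7)  # reproducible illustration

    # Each agent draws independent attributes from chosen distributions,
    # including a role drawn with heavily skewed weights.
    N = 100
    agents = [{'id': i,
               'height_cm': random.gauss(170, 10),
               'morale': random.uniform(0, 1),
               'role': random.choices(['benign', 'known', 'covert'],
                                      weights=[90, 5, 5])[0]}
              for i in range(N)]

    # Relations as a sparse random edge set: each agent knows ~5 others.
    edges = {(i, j) for i in range(N)
             for j in random.sample(range(N), 5) if i != j}

    print(len(agents), len(edges) > 0)
    ```

    In a real population generator the edge set would come from a social-network model rather than uniform sampling, which is exactly the graph-theoretic layer of complexity the abstract points to.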

  13. Large-Scale Optimization for Bayesian Inference in Complex Systems

    SciTech Connect

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their

  14. On the analysis of large-scale genomic structures.

    PubMed

    Oiwa, Nestor Norio; Goldman, Carla

    2005-01-01

    We apply methods from statistical physics (histograms, correlation functions, fractal dimensions, and singularity spectra) to characterize the large-scale structure of the distribution of nucleotides along genomic sequences. We discuss the role of the extension of noncoding segments ("junk DNA") for the genomic organization, and the connection between the coding segment distribution and the chromatin condensation of higher eukaryotes. The following sequences taken from GenBank were analyzed: the complete genome of Xanthomonas campestris, the complete genome of yeast, chromosome V of Caenorhabditis elegans, and human chromosome XVII around gene BRCA1. The results are compared with random and periodic sequences and with those generated by simple and generalized fractal Cantor sets. PMID:15858230
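    Of the tools listed above, the two-point correlation function is the simplest to sketch. The following toy (illustrative only; the paper's actual estimators and sequences differ) maps one nucleotide to an indicator signal along a synthetic random sequence and estimates its correlation at a fixed separation, which should vanish in the absence of genomic structure.

    ```python
    import random

    random.seed(1)  # reproducible illustration

    # Synthetic uncorrelated sequence as a null model.
    seq = ''.join(random.choice('ACGT') for _ in range(20000))

    # Indicator signal for one nucleotide along the sequence.
    x = [1.0 if base == 'G' else 0.0 for base in seq]
    mean = sum(x) / len(x)

    def correlation(x, r, mean):
        """Two-point correlation <x(i) x(i+r)> - <x>^2 at separation r."""
        pairs = [x[i] * x[i + r] for i in range(len(x) - r)]
        return sum(pairs) / len(pairs) - mean ** 2

    # Near zero for an uncorrelated sequence; real genomic sequences show
    # long-range structure, i.e., slowly decaying correlations.
    print(correlation(x, 10, mean))
    ```

    Scanning `r` over several decades and fitting the decay of this estimator is what distinguishes random, periodic, and long-range-correlated (e.g., Cantor-set-like) sequences.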

  15. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPUs) are currently used as a cost-effective platform for computer simulations and big-data processing. Large-scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve excellent efficiency for applications in statistical mechanics, particle dynamics, and network analysis by using suitable memory access patterns and mechanisms such as CUDA streams and profiling tools. Similar concepts and techniques may also be applied to other problems, such as the solution of partial differential equations.

  16. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  17. Nonthermal Components in the Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Miniati, Francesco

    2004-12-01

    I address the issue of nonthermal processes in the large scale structure of the universe. After reviewing the properties of cosmic shocks and their role as particle accelerators, I discuss the main observational results, from radio to γ-ray, and describe the processes thought to be responsible for the observed nonthermal emissions. Finally, I emphasize the important role of γ-ray astronomy for progress in the field. Non-detections at these photon energies have already allowed us to draw important conclusions. Future observations will tell us more about the physics of the intracluster medium, shock dissipation, and CR acceleration.

  18. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  19. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments measuring the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we have now securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  20. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite-difference time-domain method is used to calculate the band structures of the proposed metamaterials.

  1. Large-scale Globally Propagating Coronal Waves

    NASA Astrophysics Data System (ADS)

    Warmuth, Alexander

    2015-09-01

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous space-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  2. Upscaling of elastic properties for large scale geomechanical simulations

    NASA Astrophysics Data System (ADS)

    Chalon, F.; Mainguy, M.; Longuemare, P.; Lemonnier, P.

    2004-09-01

    Large scale geomechanical simulations are being increasingly used to model the compaction of stress-dependent reservoirs, predict the long-term integrity of underground radioactive waste disposals, and analyse the viability of hot-dry-rock geothermal sites. These large scale simulations require the definition of homogeneous mechanical properties for each geomechanical cell, whereas the rock properties are expected to vary at a smaller scale. Therefore, this paper proposes a new methodology that makes it possible to define the equivalent mechanical properties of the geomechanical cells using the fine-scale information given in the geological model. This methodology is implemented on a synthetic reservoir case, and two upscaling procedures providing the effective elastic properties of Hooke's law are tested. The first upscaling procedure is an analytical method for a perfectly stratified rock mass, whereas the second procedure computes lower and upper bounds of the equivalent properties with no assumption on the small-scale heterogeneity distribution. Both procedures are applied to one geomechanical cell extracted from the reservoir structure. The results show that the analytical and numerical upscaling procedures provide accurate estimations of the effective parameters. Furthermore, a large scale simulation using the homogenized properties of each geomechanical cell calculated with the analytical method demonstrates that the overall behaviour of the reservoir structure is well reproduced for two different loading cases.
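The idea of bracketing an effective modulus between bounds can be sketched in a few lines. This is a minimal scalar illustration using the classical Voigt (arithmetic) and Reuss (harmonic) averages with made-up moduli and volume fractions; the paper's actual procedures operate on full stiffness tensors, not a single modulus.

```python
def voigt_bound(moduli, fractions):
    # Arithmetic average: upper bound on the effective modulus
    return sum(m * f for m, f in zip(moduli, fractions))

def reuss_bound(moduli, fractions):
    # Harmonic average: lower bound; exact for a stack of layers
    # loaded perpendicular to the layering (the stratified case)
    return 1.0 / sum(f / m for m, f in zip(moduli, fractions))

# Hypothetical two-material cell: 60% stiff rock, 40% softer rock (GPa)
E = [30.0, 10.0]
phi = [0.6, 0.4]
upper = voigt_bound(E, phi)  # 0.6*30 + 0.4*10 = 22.0 GPa
lower = reuss_bound(E, phi)  # 1/(0.6/30 + 0.4/10) ~ 16.67 GPa
```

The true effective modulus of any microstructure with these constituents lies between the two values, which is what makes bound-based upscaling useful when the small-scale heterogeneity distribution is unknown.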

  3. Large-scale flow generation by inhomogeneous helicity

    NASA Astrophysics Data System (ADS)

    Yokoi, N.; Brandenburg, A.

    2016-03-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with nonuniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of homogeneous turbulent helicity. This result confirms the validity of the inhomogeneous helicity effect in large-scale flow generation and suggests that a vortex dynamo is possible even in incompressible turbulence where there is no baroclinicity effect.

  4. Reliability assessment for components of large scale photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have significantly shifted from independent power generation systems to large-scale grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is considered in the various sequential and parallel fault combinations in order to find all realistic ways in which the top or undesired events can occur. Additionally, the method can identify areas on which planned maintenance should focus. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system, but also to optimize the maintenance costs, the latter by informing the operators about the status of system components. The flexibility of the approach in monitoring system applications helps ensure secure operation of the system. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate more system maintenance plans and diagnostic strategies.
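The combination of exponential component models with series/parallel fault logic can be sketched directly. The failure rates and the panel-plus-redundant-inverter layout below are invented for illustration and are not taken from the study.

```python
import math

def reliability(rate, t):
    # Exponential model: probability a component survives to time t
    return math.exp(-rate * t)

def series(rels):
    # All components must work; any single failure fails the branch
    # (an OR gate on the fault side of the tree)
    p = 1.0
    for r in rels:
        p *= r
    return p

def parallel(rels):
    # Redundant branch: fails only if every unit fails
    # (an AND gate on the fault side of the tree)
    q = 1.0
    for r in rels:
        q *= 1.0 - r
    return 1.0 - q

# Hypothetical PV branch: a panel string feeding two redundant inverters
t = 8760.0                        # one year, in hours
r_panel = reliability(2e-6, t)    # illustrative failure rates per hour
r_inv = reliability(1e-5, t)
r_system = series([r_panel, parallel([r_inv, r_inv])])
```

Comparing `r_system` against the non-redundant layout `series([r_panel, r_inv])` quantifies how much the redundant inverter improves annual reliability, which is the kind of question the fault-tree analysis answers at system scale.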

  5. Lateral stirring of large-scale tracer fields by altimetry

    NASA Astrophysics Data System (ADS)

    Dencausse, Guillaume; Morrow, Rosemary; Rogé, Marine; Fleury, Sara

    2014-01-01

    Ocean surface fronts and filaments have a strong impact on the global ocean circulation and biogeochemistry. Surface Lagrangian advection with time-evolving altimetric geostrophic velocities can be used to simulate the submesoscale front and filament structures in large-scale tracer fields. We study this technique in the Southern Ocean region south of Tasmania, a domain marked by strong meso- to submesoscale features such as the fronts of the Antarctic Circumpolar Current (ACC). Starting with large-scale surface tracer fields that we stir with altimetric velocities, we determine "advected" fields which compare well with high-resolution in situ or satellite tracer data. We find that fine scales are best represented in a statistical sense after an optimal advection time of ~2 weeks, with enhanced signatures of the ACC fronts and better spectral energy. The technique works best in moderate to high EKE regions where lateral advection dominates. This technique may be used to infer the distribution of unresolved small scales in any physical or biogeochemical surface tracer that is dominated by lateral advection. Submesoscale dynamics also impact the subsurface of the ocean, and the Lagrangian advection at depth shows promising results. Finally, we show that climatological tracer fields computed from the advected large-scale fields display improved fine-scale mean features, such as the ACC fronts, which can be useful in the context of ocean modelling.
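The core of the technique, integrating tracer particles through a velocity field, reduces to a short Lagrangian integrator. The sketch below uses an idealized solid-body-rotation field in place of gridded altimetric geostrophic velocities, which a real application would interpolate in space and time.

```python
import math

def velocity(x, y):
    # Idealized steady velocity field (solid-body rotation); stands in
    # for interpolated altimetric geostrophic velocities (u, v)
    return -y, x

def advect(x, y, dt, n_steps):
    # Midpoint (RK2) Lagrangian integration of one tracer particle
    for _ in range(n_steps):
        u1, v1 = velocity(x, y)
        xm, ym = x + 0.5 * dt * u1, y + 0.5 * dt * v1
        u2, v2 = velocity(xm, ym)
        x, y = x + dt * u2, y + dt * v2
    return x, y

# One full revolution: a particle starting on the unit circle should
# return close to its start while conserving its radius
x_end, y_end = advect(1.0, 0.0, dt=0.01, n_steps=628)
```

Stirring a full tracer field amounts to running this integration backward in time for every grid point and sampling the large-scale field at the origin positions.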

  6. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations quickly become unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being.
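The pairwise-distance kernel at the heart of 2-point correlation statistics is easy to state, and its naive O(n²) cost is exactly what multitree algorithms attack. The sketch below uses small invented 2-D samples (not astronomical catalogs) to show how an excess of close pairs over a uniform sample signals clustering.

```python
import random

def pair_count(points, r):
    # Naive O(n^2) count of pairs closer than r; tree-based algorithms
    # accelerate this same kernel by pruning whole regions at once
    r2 = r * r
    n = len(points)
    count = 0
    for i in range(n):
        xi, yi = points[i]
        for j in range(i + 1, n):
            dx, dy = xi - points[j][0], yi - points[j][1]
            if dx * dx + dy * dy < r2:
                count += 1
    return count

rng = random.Random(1)
uniform = [(rng.random(), rng.random()) for _ in range(400)]
# A clustered sample: points scattered around a few "halo" centres
centres = [(rng.random(), rng.random()) for _ in range(8)]
clustered = [(cx + rng.gauss(0, 0.02), cy + rng.gauss(0, 0.02))
             for cx, cy in centres for _ in range(50)]

dd_clustered = pair_count(clustered, 0.05)
dd_uniform = pair_count(uniform, 0.05)
# The clustered sample has far more close pairs at this separation.
```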

  7. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results yield experimentally testable predictions for the glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.

  8. Large-scale databases of proper names.

    PubMed

    Conley, P; Burgess, C; Hage, D

    1999-05-01

    Few tools for research in proper names have been available--specifically, there is no large-scale corpus of proper names. Two corpora of proper names were constructed, one based on U.S. phone book listings, the other derived from a database of Usenet text. Name frequencies from both corpora were compared with human subjects' reaction times (RTs) to the proper names in a naming task. Regression analysis showed that the Usenet frequencies contributed to predictions of human RT, whereas phone book frequencies did not. In addition, semantic neighborhood density measures derived from the HAL corpus were compared with the subjects' RTs and found to be a better predictor of RT than was frequency in either corpus. These new corpora are freely available online for download. Potential uses for these corpora range from using the names as stimuli in experiments to using the corpus data in software applications. PMID:10495803
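The regression step, predicting reaction time from a corpus frequency measure, can be sketched with ordinary least squares. The numbers below are invented to mirror the direction of the abstract's finding (higher-frequency names tend to be named faster); they are not the study's data.

```python
def fit_line(xs, ys):
    # Ordinary least squares for y = a + b*x via the normal equations
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    b = sxy / sxx
    a = my - b * mx
    return a, b

# Hypothetical data: naming RT (ms) against corpus log-frequency
log_freq = [1.0, 2.0, 3.0, 4.0, 5.0]
rt_ms = [720.0, 690.0, 655.0, 630.0, 600.0]
a, b = fit_line(log_freq, rt_ms)  # negative slope: frequent names are faster
```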

  9. Estimation of large-scale dimension densities.

    PubMed

    Raab, C; Kurths, J

    2001-07-01

    We propose a technique to calculate large-scale dimension densities in both higher-dimensional spatio-temporal systems and low-dimensional systems from only a few data points, where known methods usually have an unsatisfactory scaling behavior. This is mainly due to boundary and finite-size effects. With our rather simple method, we normalize boundary effects and get a significant correction of the dimension estimate. This straightforward approach is based on rather general assumptions. So even weak coherent structures obtained from small spatial couplings can be detected with this method, which is impossible by using the Lyapunov-dimension density. We demonstrate the efficiency of our technique for coupled logistic maps, coupled tent maps, the Lorenz attractor, and the Roessler attractor. PMID:11461376
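For context, the standard correlation-sum estimate of dimension (Grassberger-Procaccia) can be written compactly; the paper's contribution is a boundary-corrected density on top of this kind of estimate, not the basic estimator shown here. The test sets below are invented: points on a line should give dimension near 1, points filling a square near 2.

```python
import math
import random

def correlation_dimension(points, r1, r2):
    # Slope of log C(r) vs log r between two radii, where C(r) is the
    # fraction of point pairs closer than r (the correlation sum)
    def corr_sum(r):
        n, c = len(points), 0
        for i in range(n):
            for j in range(i + 1, n):
                if math.dist(points[i], points[j]) < r:
                    c += 1
        return 2.0 * c / (n * (n - 1))
    return ((math.log(corr_sum(r2)) - math.log(corr_sum(r1)))
            / (math.log(r2) - math.log(r1)))

rng = random.Random(7)
ts = [rng.random() for _ in range(600)]
line = [(t, 2 * t) for t in ts]                              # 1-D set
square = [(rng.random(), rng.random()) for _ in range(600)]  # 2-D set

d1 = correlation_dimension(line, 0.05, 0.2)    # near 1
d2 = correlation_dimension(square, 0.05, 0.2)  # near 2
```

The mild underestimate for the square is exactly the boundary and finite-size effect the abstract refers to, which motivates the normalization the authors propose.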

  10. The challenge of large-scale structure

    NASA Astrophysics Data System (ADS)

    Gregory, S. A.

    1996-03-01

    The tasks that I have assumed for myself in this presentation include three separate parts. The first, appropriate to the particular setting of this meeting, is to review the basic work of the founding of this field; the appropriateness comes from the fact that W. G. Tifft made immense contributions that are not often realized by the astronomical community. The second task is to outline the general tone of the observational evidence for large scale structures. (Here, in particular, I cannot claim to be complete. I beg forgiveness from any workers who are left out by my oversight for lack of space and time.) The third task is to point out some of the major aspects of the field that may represent the clues by which some brilliant sleuth will ultimately figure out how galaxies formed.

  11. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high-technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  12. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  13. Batteries for Large Scale Energy Storage

    SciTech Connect

    Soloveichik, Grigorii L.

    2011-07-15

    In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β″-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.

  14. Large scale water lens for solar concentration.

    PubMed

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials, normal tap water and hyper-elastic linear low density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. Their optical properties were modeled with a raytracing software based on the lens shape. We have achieved a good match of experimental and theoretical data by considering wavelength dependent concentration factor, absorption and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation. PMID:26072893
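The focal-length dependence on lens shape can be illustrated with the thin-lens lensmaker's equation. The plano-convex geometry and radius below are illustrative assumptions; the paper's actual foil lens shape is load-dependent and was modeled by raytracing and finite elements rather than this idealization.

```python
# Thin-lens sketch of a plano-convex water lens: flat top surface and a
# convex lower surface of radius R, giving 1/f = (n - 1) / R.
N_WATER = 1.33  # approximate refractive index of water for visible light

def focal_length(radius_m):
    # Lensmaker's equation specialized to the plano-convex case
    return radius_m / (N_WATER - 1.0)

f = focal_length(0.5)  # hypothetical 0.5 m radius of curvature
```

Because water's index is low compared with glass, the focal length is roughly 1.5 times the radius of curvature, which is why changing the water volume (and hence the foil's shape) shifts the concentration factor so strongly.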

  15. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as "nuclear pasta" are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals, we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm^-3, proton fractions 0.05

  16. Large-scale simulations of reionization

    SciTech Connect

    Kohler, Katharina; Gnedin, Nickolay Y.; Hamilton, Andrew J.S.; /JILA, Boulder

    2005-11-01

    We use cosmological simulations to explore the large-scale effects of reionization. Since reionization is a process that involves a large dynamic range--from galaxies to rare bright quasars--we need to be able to cover a significant volume of the universe in our simulation without losing the important small scale effects from galaxies. Here we have taken an approach that uses clumping factors derived from small scale simulations to approximate the radiative transfer on the sub-cell scales. Using this technique, we can cover a simulation size up to 1280 h^-1 Mpc with 10 h^-1 Mpc cells. This allows us to construct synthetic spectra of quasars similar to observed spectra of SDSS quasars at high redshifts and compare them to the observational data. These spectra can then be analyzed for HII region sizes, the presence of the Gunn-Peterson trough, and the Lyman-α forest.

  17. The XMM Large Scale Structure Survey

    NASA Astrophysics Data System (ADS)

    Pierre, Marguerite

    2005-10-01

    We propose to complete, by an additional 5 deg2, the XMM-LSS Survey region overlying the Spitzer/SWIRE field. This field already has CFHTLS and Integral coverage, and will encompass about 10 deg2. The resulting multi-wavelength medium-depth survey, which complements XMM and Chandra deep surveys, will provide a unique view of large-scale structure over a wide range of redshift, and will show active galaxies in the full range of environments. The complete coverage by optical and IR surveys provides high-quality photometric redshifts, so that cosmological results can quickly be extracted. In the spirit of a Legacy survey, we will make the raw X-ray data immediately public. Multi-band catalogues and images will also be made available on short time scales.

  18. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  19. Large-Scale Astrophysical Visualization on Smartphones

    NASA Astrophysics Data System (ADS)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays, digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  20. Estimation of large-scale dimension densities

    NASA Astrophysics Data System (ADS)

    Raab, Corinna; Kurths, Jürgen

    2001-07-01

    We propose a technique to calculate large-scale dimension densities in both higher-dimensional spatio-temporal systems and low-dimensional systems from only a few data points, where known methods usually have an unsatisfactory scaling behavior. This is mainly due to boundary and finite-size effects. With our rather simple method, we normalize boundary effects and get a significant correction of the dimension estimate. This straightforward approach is based on rather general assumptions. So even weak coherent structures obtained from small spatial couplings can be detected with this method, which is impossible by using the Lyapunov-dimension density. We demonstrate the efficiency of our technique for coupled logistic maps, coupled tent maps, the Lorenz attractor, and the Roessler attractor.

  1. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: 1. Several commercial DBMS systems have demonstrated storage and ad-hoc query access to Terabyte data sets. 2. Several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme. 3. Several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data. 4. In some cases, performance is a moot issue. This is true in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  2. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  3. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
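    The quasi-Newton machinery in feature 1 can be illustrated with the standard BFGS update, which rebuilds curvature information from steps and gradient changes. The sketch below is a generic full-space BFGS update on a 2x2 quadratic, not the thesis's reduced-Hessian algorithm or its MINOS-based implementation; the matrix A and the helper bfgs_update are illustrative.

```python
def bfgs_update(B, s, y):
    """BFGS update of a 2x2 Hessian approximation B (list of lists)
    from step s and gradient change y; the updated matrix satisfies
    the secant condition B_new @ s = y."""
    def mv(M, v):
        return [M[0][0] * v[0] + M[0][1] * v[1],
                M[1][0] * v[0] + M[1][1] * v[1]]
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1]
    Bs = mv(B, s)
    sBs = dot(s, Bs)
    ys = dot(y, s)
    return [[B[i][j] - Bs[i] * Bs[j] / sBs + y[i] * y[j] / ys
             for j in range(2)] for i in range(2)]

# One update from the identity, using curvature data of the quadratic
# f(x) = 0.5 * x^T A x with (illustrative) Hessian A:
A = [[4.0, 1.0], [1.0, 3.0]]
s = [1.0, 0.0]                                   # step taken
y = [A[0][0] * s[0] + A[0][1] * s[1],
     A[1][0] * s[0] + A[1][1] * s[1]]            # gradient change y = A s
B0 = [[1.0, 0.0], [0.0, 1.0]]
B1 = bfgs_update(B0, s, y)                       # now B1 @ s == y
```

An SQP method applies such updates to the (reduced) Hessian of the Lagrangian rather than to a plain objective Hessian, but the secant mechanics are the same.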

  4. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  6. Large-scale sodium spray fire code validation (SOFICOV) test

    SciTech Connect

    Jeppson, D.W.; Muhlestein, L.D.

    1985-01-01

    A large-scale sodium spray fire code validation test was performed in the HEDL 850-m³ Containment System Test Facility (CSTF) as part of the Sodium Spray Fire Code Validation (SOFICOV) program. Six hundred fifty-eight kilograms of sodium was sprayed in an air atmosphere for a period of 2400 s. The sodium spray droplet sizes and spray pattern distribution were estimated. The containment atmosphere temperature and pressure response, containment wall temperature response and sodium reaction rate with oxygen were measured. These results are compared to post-test predictions using the SPRAY and NACOM computer codes.

  7. The large-scale morphology of IRAS galaxies

    NASA Technical Reports Server (NTRS)

    Babul, Arif; Starkman, Glenn D.; Strauss, Michael A.

    1993-01-01

    At present, visual inspection is the only method for comparing the large-scale morphologies in the distribution of galaxies to those in model universes generated by N-body simulations. To remedy the situation, we have developed a set of three structure functions (S_1, S_2, S_3) that quantify the degree of large-scale prolateness, oblateness, and sphericity/uniformity of a 3-D particle distribution and have applied them to a volume-limited (<= 4000 km/s) sample of 699 IRAS galaxies with f_60 > 1.2 Jy. To determine the structure functions, we randomly select 500 galaxies as origins of spherical windows of radius R_w, locate the centroid of the galaxies in the window (assuming all galaxies have equal mass) and then compute the principal moments of inertia (I_1, I_2, I_3) about the centroid. Each S_i is a function of I_2/I_1 and I_3/I_1. S_1, S_2, and S_3 tend to unity for highly prolate, oblate, and uniform distributions, respectively, and tend to zero otherwise. The resulting 500 values of S_i at each scale R_w are used to construct a histogram.
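    The windowed moment computation can be sketched as follows (the exact mappings from moment ratios to S_1-S_3 are defined in the paper and not reproduced here). The sample cloud is constructed so its principal axes roughly coincide with the coordinate axes, letting the diagonal of the inertia tensor stand in for a full eigendecomposition; that simplification, and the cloud itself, are illustrative assumptions.

```python
def principal_moments(points):
    """Sorted principal moments of inertia about the centroid for
    equal-mass points, assuming (approximately) axis-aligned
    principal axes so the tensor diagonal suffices."""
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    ixx = iyy = izz = 0.0
    for x, y, z in points:
        dx, dy, dz = x - cx, y - cy, z - cz
        ixx += dy * dy + dz * dz
        iyy += dx * dx + dz * dz
        izz += dx * dx + dy * dy
    return sorted([ixx, iyy, izz])

# Elongated ("prolate") cloud along z with small transverse scatter.
cloud = [(0.1 * (-1) ** k, 0.1 * (-1) ** (k // 2), float(k))
         for k in range(-5, 6)]
i1, i2, i3 = principal_moments(cloud)
ratio2, ratio3 = i2 / i1, i3 / i1   # large ratios signal prolateness
```

A prolate distribution yields one small moment (about the long axis) and two large ones, which is the regime where S_1 approaches unity.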

  8. Large-Scale Statistics for Cu Electromigration

    NASA Astrophysics Data System (ADS)

    Hauschildt, M.; Gall, M.; Hernandez, R.

    2009-06-01

    Even after the successful introduction of Cu-based metallization, the electromigration failure risk has remained one of the important reliability concerns for advanced process technologies. The observation of strong bimodality for the electron up-flow direction in dual-inlaid Cu interconnects has added complexity, but is now widely accepted. The failure voids can occur both within the via ("early" mode) or within the trench ("late" mode). More recently, bimodality has been reported also in down-flow electromigration, leading to very short lifetimes due to small, slit-shaped voids under vias. For a more thorough investigation of these early failure phenomena, specific test structures were designed based on the Wheatstone Bridge technique. The use of these structures enabled an increase of the tested sample size to nearly 675,000, allowing a direct analysis of electromigration failure mechanisms at the single-digit ppm regime. Results indicate that down-flow electromigration exhibits bimodality at very small percentage levels, not readily identifiable with standard testing methods. The activation energy for the down-flow early failure mechanism was determined to be 0.83±0.02 eV. Within the small error bounds of this large-scale statistical experiment, this value is significantly lower than the usually reported activation energy of 0.90 eV for electromigration-induced diffusion along Cu/SiCN interfaces. Due to the advantages of the Wheatstone Bridge technique, we were also able to expand the experimental temperature range down to 150 °C, quite close to typical operating conditions of up to 125 °C. As a result of the lowered activation energy, we conclude that the down-flow early failure mode may control the chip lifetime at operating conditions. The slit-like character of the early failure void morphology also raises concerns about the validity of the Blech effect for this mechanism. A very small amount of Cu depletion may cause failure even before a
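    The practical weight of a lower activation energy can be seen in the thermal acceleration factor used to extrapolate stress-test lifetimes to operating conditions, AF = exp((Ea/k)·(1/T_use − 1/T_stress)). The sketch below compares the reported 0.83 eV against the usual 0.90 eV; the 300 °C stress temperature is an illustrative assumption, not a value from the abstract.

```python
# Thermal part of Black's-equation lifetime extrapolation.
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def thermal_acceleration(ea_ev, t_stress_c, t_use_c):
    """Acceleration factor between a stress and a use temperature
    for a thermally activated failure mechanism."""
    t_stress = t_stress_c + 273.15
    t_use = t_use_c + 273.15
    return math.exp(ea_ev / K_B * (1.0 / t_use - 1.0 / t_stress))

af_083 = thermal_acceleration(0.83, 300.0, 125.0)
af_090 = thermal_acceleration(0.90, 300.0, 125.0)
# A lower Ea (0.83 vs. 0.90 eV) gives a smaller acceleration factor,
# i.e. less lifetime margin when extrapolating to use conditions.
```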

  9. Large Scale Computer Simulation of Erythrocyte Membranes

    NASA Astrophysics Data System (ADS)

    Harvey, Cameron; Revalee, Joel; Laradji, Mohamed

    2007-11-01

    The cell membrane is crucial to the life of the cell. Apart from partitioning the inner and outer environment of the cell, it also acts as a support for the complex and specialized molecular machinery important for both the mechanical integrity of the cell and its multitude of physiological functions. Due to its relative simplicity, the red blood cell has been a favorite experimental prototype for investigations of the structural and functional properties of the cell membrane. The erythrocyte membrane is a composite quasi-two-dimensional structure composed essentially of a self-assembled fluid lipid bilayer and a polymerized protein meshwork, referred to as the cytoskeleton or membrane skeleton. In the case of the erythrocyte, the polymer meshwork is mainly composed of spectrin, anchored to the bilayer through specialized proteins. Using a coarse-grained model, recently developed by us, of self-assembled lipid membranes with implicit solvent and soft-core potentials, we simulated large-scale red-blood-cell bilayers with dimensions ~10^-1 μm^2, with explicit cytoskeleton. Our aim is to investigate the renormalization of the elastic properties of the bilayer due to the underlying spectrin meshwork.

  10. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large-scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers, which was indicative of significant oxidation. Footprints of downwind dissemination of the fire-released fibers were measured to 19.1 km from the fire.
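    The reported release fractions translate into modest absolute masses. The back-of-the-envelope sketch below uses the 45 kg minimum composite mass per burn as the fiber mass, which slightly overstates the release since 45 kg is composite, not pure fiber.

```python
def released_mass(fiber_kg, fraction):
    """Mass of single fibers released for a given release fraction."""
    return fiber_kg * fraction

low = released_mass(45.0, 0.002)    # 0.2 % source-test fraction -> 0.09 kg
high = released_mass(45.0, 0.006)   # 0.6 % dissemination-test fraction -> 0.27 kg
```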

  11. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high-quality atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software, to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome-wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large-scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  12. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets has recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations.

  13. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  15. Large-scale electromagnetic modeling for multiple inhomogeneous domains

    NASA Astrophysics Data System (ADS)

    Zhdanov, M. S.; Endo, M.; Cuma, M.

    2008-12-01

    We develop a new formulation of the integral equation (IE) method for three-dimensional (3D) electromagnetic (EM) field computation in large-scale models with multiple inhomogeneous domains. This problem arises in many practical applications including modeling the EM fields within the complex geoelectrical structures in geophysical exploration. In geophysical applications, it is difficult to describe an earth structure using a horizontally layered background conductivity model, which is required for the efficient implementation of the conventional IE approach. As a result, a large domain of interest with anomalous conductivity distribution needs to be discretized, which complicates the computations. The new method allows us to consider multiple inhomogeneous domains, where the conductivity distribution is different from that of the background, and to use independent discretizations for different domains. This reduces dramatically the computational resources required for large-scale modeling. In addition, by using this method, we can analyze the response of each domain separately without an inappropriate use of the superposition principle for the EM field calculations. The method was carefully tested for modeling the marine controlled-source electromagnetic (MCSEM) fields for complex geoelectrical structures with multiple inhomogeneous domains, such as a seafloor with rough bathymetry, salt domes, and reservoirs. We have also used this technique to investigate the return induction effects from regional geoelectrical structures, e.g., seafloor bathymetry and salt domes, which can distort the EM response from the geophysical exploration target.

  16. An informal paper on large-scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Ho, Y. C.

    1975-01-01

    Large scale systems are defined as systems requiring more than one decision maker to control the system. Decentralized control and decomposition are discussed for large scale dynamic systems. Information and many-person decision problems are analyzed.

  17. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large-scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing code and, equally important, on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first
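    The derivative machinery underlying direct sensitivity computation can be illustrated with a minimal forward-mode automatic differentiation scheme based on dual numbers. This is a generic textbook sketch, not the report's hybrid AD tooling; the Dual class and f are illustrative.

```python
# Forward-mode AD: numbers of the form a + b*eps with eps**2 = 0.
# The eps coefficient carries the derivative through every operation.
class Dual:
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps.
        return Dual(self.val * other.val,
                    self.val * other.dot + self.dot * other.val)

    __rmul__ = __mul__

def f(x):
    return 3 * x * x + 2 * x + 1   # f'(x) = 6x + 2

x = Dual(2.0, 1.0)   # seed the sensitivity dx/dx = 1
y = f(x)             # y.val = f(2), y.dot = f'(2)
```

Reverse-mode (adjoint) differentiation computes the same derivatives but propagates sensitivities backwards, which is why it is preferred when there are many inputs and few outputs, as in the contamination-event reconstruction described above.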

  18. International space station. Large scale integration approach

    NASA Astrophysics Data System (ADS)

    Cohen, Brad

    The International Space Station is the most complex large-scale integration program in development today. The approach developed for specification, subsystem development, and verification lays a firm basis on which future programs of this nature can be based. The International Space Station is composed of many critical items, hardware and software, built by numerous International Partners, NASA Institutions, and U.S. Contractors, and is launched over a period of five years. Each launch creates a unique configuration that must be safe, survivable, operable, and able to support ongoing assembly to arrive at the assembly-complete configuration in 2003. The approach to integrating each of the modules into a viable spacecraft while continuing the assembly is a challenge in itself. Added to this challenge are the severe schedule constraints and the lack of an "Iron Bird", which prevents assembly and checkout of each on-orbit configuration prior to launch. This paper will focus on the following areas: 1) Specification development process, explaining how the requirements and specifications were derived using a modular concept driven by launch vehicle capability. Each module is composed of components of subsystems versus completed subsystems. 2) Approach to stage (each stage consists of the launched module added to the current on-orbit spacecraft) specifications. Specifically, how each launched module and stage ensures support of the current and future elements of the assembly. 3) Verification approach, which, due to the schedule constraints, is primarily analysis supported by testing. Specifically, how the interfaces are ensured to mate and function on-orbit when they cannot be mated before launch. 4) Lessons learned. Where can we improve this complex system design and integration task?

  19. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in the chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in the gaseous number of moles. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to the surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means of chamber overpressure mitigation for those tests producing the most total heat release and thus was determined to be a feasible mitigation
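    The relative contributions of heating and mole-count increase to the sealed-chamber pressure rise follow from the ideal gas law, P2/P1 = (n2·T2)/(n1·T1). The numbers in the sketch below are illustrative, not measurements from the tests.

```python
def pressure_ratio(n1, t1_k, n2, t2_k):
    """Sealed-chamber pressure ratio from the ideal gas law,
    P2/P1 = (n2 * T2) / (n1 * T1)."""
    return (n2 * t2_k) / (n1 * t1_k)

# An (assumed) 30 K average gas-temperature rise with an (assumed)
# 1 % increase in moles: heating dominates the pressure rise.
ratio = pressure_ratio(1.00, 300.0, 1.01, 330.0)   # ~1.11
```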

  20. Synchronization of coupled large-scale Boolean networks

    SciTech Connect

    Li, Fangfei

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
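    Complete synchronization of coupled Boolean networks can be illustrated with a toy drive-response pair: the drive network evolves freely, the response network receives the drive's first node as a coupling input, and synchronization means the two state vectors eventually coincide. The 3-node update rules below are illustrative and unrelated to the paper's aggregation algorithm.

```python
def step(state, drive_signal=None):
    """One synchronous update of a toy 3-node Boolean network:
    x1' = x2 AND x3, x2' = NOT x1, x3' = x1 OR x2.
    If drive_signal is given, it overrides node 1 (the coupling)."""
    x1, x2, x3 = state
    new = (x2 and x3, not x1, x1 or x2)
    if drive_signal is not None:
        new = (drive_signal, new[1], new[2])
    return tuple(bool(v) for v in new)

drive = (True, False, True)
response = (False, True, False)   # different initial condition
for _ in range(10):
    drive_next = step(drive)
    # Unidirectional coupling: the response copies the drive's node 1.
    response = step(response, drive_signal=drive_next[0])
    drive = drive_next

synchronized = drive == response  # complete synchronization reached
```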

  1. Large-scale quantum networks based on graphs

    NASA Astrophysics Data System (ADS)

    Epping, Michael; Kampermann, Hermann; Bruß, Dagmar

    2016-05-01

    Society relies and depends increasingly on information exchange and communication. In the quantum world, security and privacy is a built-in feature for information processing. The essential ingredient for exploiting these quantum advantages is the resource of entanglement, which can be shared between two or more parties. The distribution of entanglement over large distances constitutes a key challenge for current research and development. Due to losses of the transmitted quantum particles, which typically scale exponentially with the distance, intermediate quantum repeater stations are needed. Here we show how to generalise the quantum repeater concept to the multipartite case, by describing large-scale quantum networks, i.e. network nodes and their long-distance links, consistently in the language of graphs and graph states. This unifying approach comprises both the distribution of multipartite entanglement across the network, and the protection against errors via encoding. The correspondence to graph states also provides a tool for optimising the architecture of quantum networks.

  2. Scalable parallel distance field construction for large-scale applications

    SciTech Connect

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan -Liu; Kolla, Hemanth; Chen, Jacqueline H.

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial location. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.
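    The serial building block behind distance-field construction is a multi-source sweep outward from the surface cells. The sketch below computes hop (Manhattan) distances on a 2D grid with a breadth-first search; the paper's actual contribution, the distributed parallel distance tree, is not reproduced here.

```python
from collections import deque

def distance_field(width, height, surface_cells):
    """Hop distance from every grid cell to the nearest surface cell,
    via a 4-connected multi-source breadth-first search."""
    INF = float("inf")
    dist = [[INF] * width for _ in range(height)]
    queue = deque()
    for (x, y) in surface_cells:
        dist[y][x] = 0            # seed all surface cells at distance 0
        queue.append((x, y))
    while queue:
        x, y = queue.popleft()
        for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nx, ny = x + dx, y + dy
            if 0 <= nx < width and 0 <= ny < height and dist[ny][nx] == INF:
                dist[ny][nx] = dist[y][x] + 1
                queue.append((nx, ny))
    return dist

field = distance_field(5, 5, [(2, 2)])   # single surface point at the centre
```

Distributing this computation requires exchanging frontier cells across partition boundaries each sweep, which is the communication cost the parallel distance tree is designed to reduce.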

  4. Systematic renormalization of the effective theory of Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Akbar Abolhasani, Ali; Mirbabayi, Mehrdad; Pajer, Enrico

    2016-05-01

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as it is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contributions of short distance perturbations to large scale density contrast δ and momentum density π(k) scale as k² and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.

  5. Semantic overlay network for large-scale spatial information indexing

    NASA Astrophysics Data System (ADS)

    Zou, Zhiqiang; Wang, Yue; Cao, Kai; Qu, Tianshan; Wang, Zhongmin

    2013-08-01

    The increased demand for online spatial information services poses new challenges to the combined field of Computer Science and Geographic Information Science. Amongst others, these include fast indexing of spatial data in distributed networks. In this paper we propose a novel semantic overlay network for large-scale multi-dimensional spatial information indexing, called SON_LSII, which has a hybrid structure integrating a semantic quad-tree and a Chord ring. SON_LSII is a small-world overlay network that achieves a very competitive trade-off between indexing efficiency and maintenance overhead. To create SON_LSII, we use an effective semantic clustering strategy that considers two aspects: the semantics of the spatial information each peer holds in the overlay network, and physical network performance. Based on SON_LSII, a mapping method is used to reduce the multi-dimensional features to a single dimension, and an efficient indexing algorithm is presented to support complex range queries of spatial information with a massive number of concurrent users. The results from extensive experiments demonstrate that SON_LSII is superior to existing overlay networks in various respects, including scalability, maintenance, rate of indexing hits, indexing logical hops, and adaptability. Thus, the proposed SON_LSII can be used for large-scale spatial information indexing.
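
    The Chord-ring half of SON_LSII's hybrid structure can be sketched with consistent hashing: keys (peer names or quad-tree cell labels) hash onto a circular identifier space, and each key is owned by its successor node. All names and parameters below are illustrative assumptions, not taken from the paper:

```python
import hashlib
from bisect import bisect_right

M = 2 ** 16  # identifier-space size; SON_LSII's actual parameters are not given in the abstract

def chord_id(key: str) -> int:
    """Hash a key (peer name or spatial-tile label) onto the ring."""
    return int(hashlib.sha1(key.encode()).hexdigest(), 16) % M

def successor(node_ids, key_id):
    """Chord lookup: the first node id >= key_id, wrapping around the ring."""
    ids = sorted(node_ids)
    i = bisect_right(ids, key_id - 1)
    return ids[i % len(ids)]

# Hypothetical peers and a hypothetical quad-tree cell label as a key.
peers = [chord_id(f"peer-{n}") for n in range(8)]
owner = successor(peers, chord_id("quadtree-cell:120.3E-31.2N"))
```

    A real Chord deployment maintains finger tables for O(log n) routing; the linear successor scan above is only the lookup rule itself.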

  6. Simulating subsurface heterogeneity improves large-scale water resources predictions

    NASA Astrophysics Data System (ADS)

    Hartmann, A. J.; Gleeson, T.; Wagener, T.; Wada, Y.

    2014-12-01

    Heterogeneity is abundant everywhere across the hydrosphere. It exists in the soil, the vadose zone and the groundwater. In large-scale hydrological models, subsurface heterogeneity is usually not considered. Instead, average or representative values are chosen for each of the simulated grid cells, not incorporating any sub-grid variability. This may lead to unreliable predictions when the models are used for assessing future water resources availability, floods or droughts, or when they are used for recommendations for more sustainable water management. In this study we use a novel, large-scale model that takes sub-grid heterogeneity into account in the simulation of groundwater recharge by using statistical distribution functions. We choose all regions over Europe that are underlain by carbonate rock (~35% of the total area), because the well-understood dissolvability of carbonate rocks (karstification) allows for assessing the strength of subsurface heterogeneity. Applying the model with historic data and future climate projections, we show that subsurface heterogeneity lowers the vulnerability of groundwater recharge to hydro-climatic extremes and future climate change. Comparing our simulations with the PCR-GLOBWB model, we quantify the deviations of the simulations for different sub-regions in Europe.

  7. The combustion behavior of large scale lithium titanate battery

    PubMed Central

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

    Safety remains a major obstacle to the large-scale application of lithium batteries, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large-scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated until they caught fire. The variation in flame size is depicted to characterize the combustion behavior directly, while the mass loss rate, temperature, and heat release rate are used to analyze the underlying reactions. Based on the observed phenomena, the combustion process is divided into three basic stages; at higher SOC it becomes more complicated, with sudden ejections of smoke. The reason is a phase change in the Li(NixCoyMnz)O2 material from a layered structure to a spinel structure. The critical ignition temperatures are 112–121°C on the anode tab and 139–147°C on the upper surface for all cells, but the heating time and combustion time become shorter with increasing SOC. The results indicate that the battery fire hazard increases with SOC. Analysis suggests that internal short circuits and the Li+ distribution are the main causes of these differences. PMID:25586064

  8. Effects of subsurface heterogeneity on large-scale hydrological predictions

    NASA Astrophysics Data System (ADS)

    Hartmann, Andreas; Gleeson, Tom; Wagener, Thorsten; Wada, Yoshihide

    2015-04-01

    Heterogeneity is abundant everywhere across the hydrosphere. It exists in the soil, the vadose zone and the groundwater, producing preferential flow and complex threshold behavior. In large-scale hydrological models, subsurface heterogeneity is usually not considered. Instead, average or representative values are chosen for each of the simulated grid cells, not incorporating any sub-grid variability. This may lead to unreliable predictions when the models are used for assessing future water resources availability, floods or droughts, or when they are used for recommendations for more sustainable water management. In this study we use a novel, large-scale model that takes sub-grid heterogeneity into account in the simulation of groundwater recharge by using statistical distribution functions. We choose all regions over Europe that are underlain by carbonate rock (~35% of the total area), because the well-understood dissolvability of carbonate rocks (karstification) allows for assessing the strength of subsurface heterogeneity. Applying the model with historic data and future climate projections, we show that subsurface heterogeneity results in (1) larger present-day groundwater recharge and (2) greater vulnerability to climate in terms of long-term decrease and hydrological extremes.

  9. The combustion behavior of large scale lithium titanate battery

    NASA Astrophysics Data System (ADS)

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

    Safety remains a major obstacle to the large-scale application of lithium batteries, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large-scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated until they caught fire. The variation in flame size is depicted to characterize the combustion behavior directly, while the mass loss rate, temperature, and heat release rate are used to analyze the underlying reactions. Based on the observed phenomena, the combustion process is divided into three basic stages; at higher SOC it becomes more complicated, with sudden ejections of smoke. The reason is a phase change in the Li(NixCoyMnz)O2 material from a layered structure to a spinel structure. The critical ignition temperatures are 112-121°C on the anode tab and 139-147°C on the upper surface for all cells, but the heating time and combustion time become shorter with increasing SOC. The results indicate that the battery fire hazard increases with SOC. Analysis suggests that internal short circuits and the Li+ distribution are the main causes of these differences.

  10. Ecohydrological modeling for large-scale environmental impact assessment.

    PubMed

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach level accuracy similar to regional-scale models thereby allowing for impacts assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. PMID:26595397

  11. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  12. Autonomic Computing Paradigm For Large Scale Scientific And Engineering Applications

    NASA Astrophysics Data System (ADS)

    Hariri, S.; Yang, J.; Zhang, Y.

    2005-12-01

    Large-scale distributed scientific applications are highly adaptive and heterogeneous in terms of their computational requirements. The computational complexity associated with each computational region or domain varies continuously and dramatically both in space and time throughout the whole life cycle of the application execution. Furthermore, the underlying distributed computing environment is similarly complex and dynamic in the availabilities and capacities of the computing resources. Together, these challenges make the current paradigms, which are based on passive components and static compositions, ineffectual. The Autonomic Computing paradigm is an approach that efficiently addresses the complexity and dynamism of large-scale scientific and engineering applications and realizes their self-management. In this presentation, we present an Autonomic Runtime Manager (ARM) that supports the development of autonomic applications. The ARM includes two modules: an online monitoring and analysis module and an autonomic planning and scheduling module. The ARM behaves as a closed-loop control system that dynamically controls and manages the execution of the applications at runtime. It regularly senses the state changes of both the applications and the underlying computing resources. It then uses this runtime information and prior knowledge about the application behavior and its physics to identify the appropriate solution methods as well as the required computing and storage resources. Consequently this approach enables us to develop autonomic applications, which are capable of self-management and self-optimization. We have developed and implemented the autonomic computing paradigms for several large scale applications such as wild fire simulations, simulations of flow through variably saturated geologic formations, and life sciences. The distributed wildfire simulation models the wildfire spread behavior by considering such factors as fuel

  13. Biased galaxy formation and large-scale structure

    NASA Astrophysics Data System (ADS)

    Berlind, Andreas Alan

    The biased relation between the galaxy and mass distributions lies at the intersection of large scale structure in the universe and the process of galaxy formation. I study the nature of galaxy bias and its connections to galaxy clustering and galaxy formation physics. Galaxy bias has traditionally been viewed as an obstacle to constraining cosmological parameters by studying galaxy clustering. I examine the effect of bias on measurements of the cosmological density parameter Ω_m by techniques that exploit the gravity-induced motions of galaxies. Using a variety of environmental bias models applied to N-body simulations, I find that, in most cases, the quantity estimated by these techniques is the value of Ω_m^0.6/b_s, where b_s is the ratio of rms galaxy fluctuations to rms mass fluctuations on large scales. Moreover, I find that different methods should, in principle, agree with each other and it is thus unlikely that non-linear or scale-dependent bias is responsible for the discrepancies that exist among current measurements. One can also view the influence of bias on galaxy clustering as a strength rather than a weakness, since it provides us with a potentially powerful way to constrain galaxy formation theories. With this goal in mind, I develop the "Halo Occupation Distribution" (HOD), a physically motivated and complete formulation of bias that is based on the distribution of galaxies within virialized dark matter halos. I explore the sensitivity of galaxy clustering statistics to features of the HOD and focus on how the HOD may be empirically constrained from galaxy clustering data. I make the connection to the physics of galaxy formation by studying the HOD predicted by the two main theoretical methods of modeling galaxy formation. 
I find that, despite many differences between them, the two methods predict the same HOD, suggesting that galaxy bias is determined by robust features of the hierarchical galaxy formation process rather than details of gas cooling
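
    The Halo Occupation Distribution described above can be illustrated with its standard-form parameterization: a deterministic central galaxy above a halo-mass threshold plus a power-law mean satellite count with Poisson scatter. The parameter values below (M_min, M1, alpha) are illustrative assumptions, not the thesis's fitted values:

```python
import math
import random

def mean_occupation(M, M_min=1e12, M1=2e13, alpha=1.0):
    """Standard-form HOD (illustrative parameters): one central galaxy
    above a halo-mass threshold, power-law mean satellite count."""
    n_cen = 1.0 if M >= M_min else 0.0
    n_sat = n_cen * (M / M1) ** alpha
    return n_cen, n_sat

def sample_galaxies(M, rng=random.Random(0)):
    """Draw a galaxy count for one halo: deterministic central,
    Poisson-distributed satellites (Knuth inversion, stdlib-only)."""
    n_cen, n_sat = mean_occupation(M)
    if n_cen == 0.0:
        return 0
    L, k, p = math.exp(-n_sat), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return 1 + k  # central plus k satellites
        k += 1
```

    Populating the halos of an N-body simulation with `sample_galaxies` is what turns a mass field into a mock galaxy catalog whose clustering can be compared with data.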

  14. Advanced Large-scale Integrated Computational Environment

    Energy Science and Technology Software Center (ESTSC)

    1998-10-27

    The ALICE Memory Snooper is a software applications programming interface (API) and library for use in implementing computational steering systems. It allows distributed memory parallel programs to publish variables in the computation that may be accessed over the Internet. In this way, users can examine and even change the variables in their running application remotely. The API and library ensure the consistency of the variables across the distributed memory system.

  15. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention of large-area and low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. 
Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  16. An Automated System of Knickpoint Definition and Extraction from a Digital Elevation Model (DEM): Implications for Efficient Large-Scale Mapping and Statistical Analyses of Knickpoint Distributions in Fluvial Networks

    NASA Astrophysics Data System (ADS)

    Neely, A. B.; Bookhagen, B.; Burbank, D. W.

    2014-12-01

    Knickpoints, or convexities in a stream's longitudinal profile, often delineate boundaries in stream networks that separate reaches eroding at different rates resulting from sharp temporal or spatial changes in uplift rate, contributing drainage area, precipitation, or bedrock lithology. We explicitly defined the geometry of a knickpoint in a manner that can be identified using an algorithm which operates in accordance with the stream power incision model, using a chi-plot analysis approach. This method allows for comparison between the real stream profile extracted from a DEM, and a linear best-fit line profile in chi-elevation space, representing a steady state theoretical stream functioning in accordance to uniform temporal and spatial conditions listed above. Assessing where the stream of interest is "under-steepened" and "over-steepened" with respect to a theoretical linear profile reveals knickpoints as certain points of slope inflection, extractable by our algorithm. We tested our algorithm on a 1 m resolution LiDAR DEM of Santa Cruz Island (SCI), a tectonically active island 25 km south of Santa Barbara, CA with an estimated uplift rate between 0.5 and 1.2 mm/yr calculated from uplifted paleoshorelines. We have identified 1025 knickpoints using our algorithm and compared the position of these knickpoints to a similarly-sized dataset of knickpoints manually selected from distance-elevation longitudinal stream profiles for the same region. Our algorithm reduced mapping time by 99.3% and agreed with knickpoint positions from the manually selected knickpoint map for 85% of the 1025 knickpoints. Discrepancies can arise from inconsistencies in manual knickpoint selection that are not present in an automated computation. Additionally, the algorithm measures useful characteristics for each knickpoint allowing for quick statistical analyses. Histograms of knickpoint elevation and chi coordinate have a three-peaked distribution, possibly expressing 3 levels of uplifted
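
    The chi-plot analysis the authors describe rests on the chi coordinate of the stream-power model, χ(x) = ∫ (A0/A)^θ dx integrated upstream from the outlet. A minimal sketch, assuming a reference concavity θ = 0.45 (a common default, not stated in the abstract):

```python
def chi_coordinate(distances, areas, A0=1.0, theta=0.45):
    """Integrate chi = ∫ (A0/A)^theta dx along a stream profile.
    `distances` are along-stream distances from the outlet (increasing);
    `areas` are the drainage areas at those points. Trapezoid-free
    left-point integration keeps the sketch short."""
    chi = [0.0]
    for i in range(1, len(distances)):
        dx = distances[i] - distances[i - 1]
        chi.append(chi[-1] + (A0 / areas[i]) ** theta * dx)
    return chi

# In chi-elevation space a steady-state profile plots as a straight line;
# knickpoints appear as inflections between under- and over-steepened reaches.
chi = chi_coordinate([0.0, 100.0, 200.0], [1000.0, 1000.0, 1000.0])
```

    With uniform drainage area the chi coordinate grows linearly with distance, which is the baseline against which the algorithm's inflection test operates.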

  17. Stochastic pattern transitions in large scale swarms

    NASA Astrophysics Data System (ADS)

    Schwartz, Ira; Lindley, Brandon; Mier-Y-Teran, Luis

    2013-03-01

    We study the effects of time dependent noise and discrete, randomly distributed time delays on the dynamics of a large coupled system of self-propelling particles. Bifurcation analysis on a mean field approximation of the system reveals that the system possesses patterns with certain universal characteristics that depend on distinguished moments of the time delay distribution. We show both theoretically and numerically that although bifurcations of simple patterns, such as translations, change stability only as a function of the first moment of the time delay distribution, more complex bifurcating patterns depend on all of the moments of the delay distribution. In addition, we show that for sufficiently large values of the coupling strength and/or the mean time delay, there is a noise intensity threshold, dependent on the delay distribution width, that forces a transition of the swarm from a misaligned state into an aligned state. We show that this alignment transition exhibits hysteresis when the noise intensity is taken to be time dependent. Research supported by the Office of Naval Research

  18. Large-scale coastal evolution of Louisiana's barrier islands

    USGS Publications Warehouse

    List, Jeffrey H.; Jaffe, Bruce E.; Sallenger, Asbury H., Jr.

    1991-01-01

    The prediction of large-scale coastal change is an extremely important, but distant goal. Here we describe some of our initial efforts in this direction, using historical bathymetric information along a 150 km reach of the rapidly evolving barrier island coast of Louisiana. Preliminary results suggest that the relative sea level rise rate, though extremely high in the area, has played a secondary role in coastal erosion over the last 100 years, with longshore transport of sand-sized sediment being the primary cause. Prediction of future conditions is hampered by a general lack of erosion processes understanding; however, an examination of the changing volumes of sand stored in a large ebb-tidal delta system suggests a continued high rate of shoreline retreat driven by the longshore re-distribution of sand.

  19. Large scale magnetic fields in galaxies at high redshifts

    NASA Astrophysics Data System (ADS)

    Bernet, M. L.; Miniati, F.; Lilly, S. J.; Kronberg, P. P.; Dessauges-Zavadsky, M.

    2012-09-01

    In a recent study we have used a large sample of extragalactic radio sources to investigate the redshift evolution of the Rotation Measure (RM) of polarized quasars up to z ≈ 3.0. We found that the dispersion in the RM distribution of quasars increases at higher redshifts and hypothesized that MgII intervening systems were responsible for the observed trend. To test this hypothesis, we have recently obtained high-resolution UVES/VLT spectra for 76 quasars in our sample and in the redshift range 0.6 < z < 2.0. We found a clear correlation between the presence of strong MgII systems and large RMs. This implies that normal galaxies at z ≈ 1 already had large-scale magnetic fields comparable to those seen today.

  20. Complex modular structure of large-scale brain networks

    NASA Astrophysics Data System (ADS)

    Valencia, M.; Pastor, M. A.; Fernández-Seara, M. A.; Artieda, J.; Martinerie, J.; Chavez, M.

    2009-06-01

    Modular structure is ubiquitous among real-world networks from related proteins to social groups. Here we analyze the modular organization of brain networks at a large scale (voxel level) extracted from functional magnetic resonance imaging signals. By using a random-walk-based method, we unveil the modularity of brain webs and show modules with a spatial distribution that matches anatomical structures with functional significance. The functional role of each node in the network is studied by analyzing its patterns of inter- and intramodular connections. Results suggest that the modular architecture constitutes the structural basis for the coexistence of functional integration of distant and specialized brain areas during normal brain activities at rest.

  1. Monitoring large-scale precipitation over the globe

    SciTech Connect

    Xie, Pingping; Arkin, P.A.

    1997-11-01

    A previously developed algorithm was used to produce global monthly precipitation analyses on a 2.5° latitude/longitude grid for a 17-year period from 1979 to 1995. This paper reports the construction of the data set and describes some of its applications. Seven kinds of individual data sources were used, including gauge observations, estimates inferred from satellite observations, and the National Centers for Environmental Prediction (NCEP)/National Center for Atmospheric Research (NCAR) reanalysis. The merged analysis was applied to investigate the annual and interannual variability in large-scale precipitation. The mean distribution and the annual cycle of the 17-year merged analysis exhibited reasonable agreement with existing long-term means but with major differences over the eastern Pacific. The interannual variability associated with the El Niño-Southern Oscillation is similar to previous findings, but with substantial details over the ocean. 13 refs., 4 figs.

  2. Double inflation - A possible resolution of the large-scale structure problem

    NASA Technical Reports Server (NTRS)

    Turner, Michael S.; Villumsen, Jens V.; Vittorio, Nicola; Silk, Joseph; Juszkiewicz, Roman

    1987-01-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Omega = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of about 100 Mpc, while the small-scale structure over less than about 10 Mpc resembles that in a low-density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations.

  3. Large scale properties of the Webgraph

    NASA Astrophysics Data System (ADS)

    Donato, D.; Laura, L.; Leonardi, S.; Millozzi, S.

    2004-03-01

    In this paper we present an experimental study of the properties of web graphs. We study a large 2001 crawl of 200M pages and about 1.4 billion edges made available by the WebBase project at Stanford. We report our experimental findings on the topological properties of such graphs, such as the number of bipartite cores and the distributions of degree, PageRank values, and strongly connected components.
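
    A degree-distribution tally of the kind reported for the WebBase crawl can be sketched from a directed edge list (stdlib only; this is an illustrative sketch, not the WebBase tooling):

```python
from collections import Counter

def degree_distribution(edges):
    """From a directed edge list, return two histograms: how many nodes
    have each in-degree value, and how many have each out-degree value.
    Power-law degree distributions show up as straight lines when these
    histograms are plotted on log-log axes."""
    indeg, outdeg = Counter(), Counter()
    for u, v in edges:
        outdeg[u] += 1
        indeg[v] += 1
    return Counter(indeg.values()), Counter(outdeg.values())

in_hist, out_hist = degree_distribution([(1, 2), (1, 3), (2, 3)])
```

    For a crawl of the WebBase scale this tally would be streamed over the edge file rather than materialized in memory.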

  4. Large-scale spatial population databases in infectious disease research

    PubMed Central

    2012-01-01

    Modelling studies on the spatial distribution and spread of infectious diseases are becoming increasingly detailed and sophisticated, with global risk mapping and epidemic modelling studies now popular. Yet, in deriving populations at risk of disease estimates, these spatial models must rely on existing global and regional datasets on population distribution, which are often based on outdated and coarse resolution data. Moreover, a variety of different methods have been used to model population distribution at large spatial scales. In this review we describe the main global gridded population datasets that are freely available for health researchers and compare their construction methods, and highlight the uncertainties inherent in these population datasets. We review their application in past studies on disease risk and dynamics, and discuss how the choice of dataset can affect results. Moreover, we highlight how the lack of contemporary, detailed and reliable data on human population distribution in low income countries is proving a barrier to obtaining accurate large-scale estimates of population at risk and constructing reliable models of disease spread, and suggest research directions required to further reduce these barriers. PMID:22433126

  5. Locating inefficient links in a large-scale transportation network

    NASA Astrophysics Data System (ADS)

    Sun, Li; Liu, Like; Xu, Zhongzhi; Jie, Yang; Wei, Dong; Wang, Pu

    2015-02-01

    Based on data from geographical information system (GIS) and daily commuting origin-destination (OD) matrices, we estimated the distribution of traffic flow in the San Francisco road network and studied Braess's paradox in a large-scale transportation network with realistic travel demand. We measured the variation of total travel time ΔT when a road segment is closed, and found that |ΔT| follows a power-law distribution if ΔT < 0 or ΔT > 0. This implies that most roads have a negligible effect on the efficiency of the road network, while the failure of a few crucial links would result in severe travel delays, and closure of a few inefficient links would counter-intuitively reduce travel costs considerably. Generating three theoretical networks, we discovered that the heterogeneously distributed travel demand may be the origin of the observed power-law distributions of |ΔT|. Finally, a genetic algorithm was used to pinpoint inefficient link clusters in the road network. We found that closing specific road clusters would further improve the transportation efficiency.
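
    The ΔT bookkeeping can be sketched on a toy network: close one directed link and re-sum shortest-path travel times over all OD pairs. This sketch has no congestion feedback, so unlike the study's traffic model it can never produce ΔT < 0 (Braess's paradox requires flow-dependent link costs); all names are hypothetical:

```python
import heapq

def shortest_time(graph, src, dst):
    """Dijkstra over a dict {node: [(neighbor, travel_time), ...]}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dst:
            return d
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return float("inf")

def delta_T(graph, od_pairs, edge):
    """Change in total travel time when one directed edge is closed.
    Fixed shortest-path routing, no congestion feedback: this reproduces
    only the Delta-T measurement, not the paradox itself."""
    base = sum(shortest_time(graph, s, t) for s, t in od_pairs)
    pruned = {n: [(m, w) for m, w in nbrs if (n, m) != edge]
              for n, nbrs in graph.items()}
    return sum(shortest_time(pruned, s, t) for s, t in od_pairs) - base
```

    Repeating `delta_T` over every link and histogramming the results is the experiment the abstract describes, with an equilibrium traffic-assignment model in place of plain Dijkstra.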

  6. Investigation of Coronal Large Scale Structures Utilizing Spartan 201 Data

    NASA Technical Reports Server (NTRS)

    Guhathakurta, Madhulika

    1998-01-01

    Two telescopes flew aboard Spartan 201, a small satellite launched from the Space Shuttle on April 8, 1993, September 8, 1994, September 7, 1995, and November 20, 1997. The main objective of the mission was to answer some of the most fundamental open questions of solar physics: what accelerates the solar wind, and what heats the corona? The two telescopes are 1) the Ultraviolet Coronal Spectrometer (UVCS), provided by the Smithsonian Astrophysical Observatory, which uses ultraviolet emissions from neutral hydrogen and ions in the corona to determine velocities of the coronal plasma within the solar wind source region, as well as the temperature and density distributions of protons, and 2) the White Light Coronagraph (WLC), provided by NASA's Goddard Space Flight Center, which measures visible light to determine the density distribution of coronal electrons within the same region. The PI had primary responsibility for the development and application of the computer codes necessary for scientific data analysis and for instrument calibration of the white-light coronagraph over the entire Spartan mission, and was responsible for the science output from the WLC instrument. The PI has also investigated coronal density distributions in large-scale structures using numerical models that are mathematically sufficient to reproduce the details of the observed brightness and polarized-brightness distributions found in Spartan 201 data.

  7. Large-scale spatial population databases in infectious disease research.

    PubMed

    Linard, Catherine; Tatem, Andrew J

    2012-01-01

    Modelling studies on the spatial distribution and spread of infectious diseases are becoming increasingly detailed and sophisticated, with global risk mapping and epidemic modelling studies now popular. Yet, in deriving estimates of populations at risk of disease, these spatial models must rely on existing global and regional datasets on population distribution, which are often based on outdated and coarse-resolution data. Moreover, a variety of methods have been used to model population distribution at large spatial scales. In this review we describe the main global gridded population datasets that are freely available to health researchers, compare their construction methods, and highlight the uncertainties inherent in these datasets. We review their application in past studies of disease risk and dynamics, and discuss how the choice of dataset can affect results. Finally, we highlight how the lack of contemporary, detailed and reliable data on human population distribution in low-income countries is proving a barrier to obtaining accurate large-scale estimates of populations at risk and to constructing reliable models of disease spread, and we suggest research directions required to further reduce these barriers. PMID:22433126
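    The dataset-choice effect the review discusses can be illustrated with a minimal toy sketch (every grid, value, and the risk mask below are invented, not drawn from any of the datasets reviewed): aggregating a fine population surface to a coarser grid and then applying the same at-risk mask can change the population-at-risk estimate substantially, because coarse cells smear people across the risk-zone boundary.

```python
def population_at_risk(pop_grid, risk_mask):
    """Sum population over cells flagged as at-risk (grids at same resolution)."""
    return sum(p for prow, rrow in zip(pop_grid, risk_mask)
                 for p, r in zip(prow, rrow) if r)

def downsample_2x(pop_grid):
    """Aggregate a fine grid into 2x2 blocks, as a coarser dataset would store it."""
    n = len(pop_grid) // 2
    return [[sum(pop_grid[2*i + di][2*j + dj] for di in (0, 1) for dj in (0, 1))
             for j in range(n)] for i in range(n)]

def upsample_2x(pop_grid):
    """Spread each coarse cell evenly over its four fine cells (areal weighting)."""
    out = []
    for row in pop_grid:
        fine_row = [v / 4.0 for v in row for _ in (0, 1)]
        out.append(fine_row)
        out.append(list(fine_row))
    return out

# Hypothetical 4x4 population surface: everyone lives in column 1,
# just outside the at-risk zone (e.g. a modelled transmission area) in column 0.
fine = [[0, 20, 0, 0] for _ in range(4)]
risk = [[1, 0, 0, 0] for _ in range(4)]

par_fine = population_at_risk(fine, risk)                        # true PAR: 0
par_coarse = population_at_risk(upsample_2x(downsample_2x(fine)), risk)
print(par_fine, par_coarse)                                      # 0 vs 40.0
```

    The same total population and the same risk mask yield a population-at-risk of 0 at fine resolution but 40 after coarse aggregation, which is the kind of resolution-driven discrepancy the review flags in real gridded datasets.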

  8. Probes of large-scale structure in the universe

    NASA Technical Reports Server (NTRS)

    Suto, Yasushi; Gorski, Krzysztof; Juszkiewicz, Roman; Silk, Joseph

    1988-01-01

    A general formalism is developed which shows that the gravitational instability theory for the origin of the large-scale structure of the universe is now capable of critically confronting observational results on cosmic background radiation angular anisotropies, large-scale bulk motions, and large-scale clumpiness in the galaxy counts. The results indicate that presently advocated cosmological models will have considerable difficulty in simultaneously explaining the observational results.

  9. Penetration of Large Scale Electric Field to Inner Magnetosphere

    NASA Astrophysics Data System (ADS)

    Chen, S. H.; Fok, M. C. H.; Sibeck, D. G.; Wygant, J. R.; Spence, H. E.; Larsen, B.; Reeves, G. D.; Funsten, H. O.

    2015-12-01

    The direct penetration of large-scale global electric fields into the inner magnetosphere is a critical element in controlling how background thermal plasma populates the radiation belts. These plasma populations provide the source of particles and free energy needed for the generation and growth of various plasma waves that, at critical resonances in time and phase space, can scatter or energize radiation belt particles and thereby regulate the flux of relativistic electrons in the system. At high geomagnetic activity levels, the distribution of large-scale electric fields serves as an important indicator of how the prevalence of strong wave-particle interactions extends across local times and radial distances. To understand the complex relationship between global electric fields and thermal plasmas, particularly the effects of the ionospheric dynamo and magnetospheric convection and their relation to geomagnetic activity, we analyze electric field and cold plasma measurements from the Van Allen Probes over a period of more than two years and simulate a geomagnetic storm event using the Coupled Inner Magnetosphere-Ionosphere Model (CIMI). Our statistical analysis of the Van Allen Probes measurements and CIMI simulations of the March 17, 2013 storm event indicates that: (1) the global dawn-dusk electric field can penetrate the inner magnetosphere inside the inner belt, below L~2. (2) Stronger convection occurred in the dusk and midnight sectors than in the noon and dawn sectors. (3) Strong convection at multiple locations exists at all activity levels but becomes more complex at higher activity levels. (4) At high activity levels, the strongest convection occurs in the midnight sector at larger distances from the Earth and in the dusk sector at closer distances. (5) Two plasma populations with distinct ion temperature anisotropies are divided at L-shell ~2, indicating distinct heating mechanisms between the inner and outer radiation belts. (6) CIMI

  10. Large-Scale Structures of Planetary Systems

    NASA Astrophysics Data System (ADS)

    Murray-Clay, Ruth; Rogers, Leslie A.

    2015-12-01

    A class of solar system analogs has yet to be identified among the large crop of planetary systems now observed. However, since most observed worlds are more easily detectable than direct analogs of the Sun's planets, the frequency of systems with structures similar to our own remains unknown. Identifying the range of possible planetary system architectures is complicated by the large number of physical processes that affect the formation and dynamical evolution of planets. I will present two ways of organizing planetary system structures. First, I will suggest that relatively few physical parameters are likely to differentiate the qualitative architectures of different systems. Solid mass in a protoplanetary disk is perhaps the most obvious possible controlling parameter, and I will give predictions for correlations between planetary system properties that we would expect to be present if this is the case. In particular, I will suggest that the solar system's structure is representative of low-metallicity systems that nevertheless host giant planets. Second, the disk structures produced as young stars are fed by their host clouds may play a crucial role. Using the observed distribution of RV giant planets as a function of stellar mass, I will demonstrate that invoking ice lines to determine where gas giants can form requires fine tuning. I will suggest that instead, disk structures built during early accretion have lasting impacts on giant planet distributions, and disk clean-up differentially affects the orbital distributions of giant and lower-mass planets. These two organizational hypotheses have different implications for the solar system's context, and I will suggest observational tests that may allow them to be validated or falsified.