Science.gov

Sample records for mammogrid large-scale distributed

  1. The Large-Scale Distribution of Galaxies

    NASA Astrophysics Data System (ADS)

    Flin, Piotr

    A review of the large-scale structure of the Universe is given. A connection is made with the titanic work of Johannes Kepler in many areas of astronomy and cosmology. Special attention is given to the spatial distribution of galaxies, voids, and walls (the cellular structure of the Universe). Finally, the author concludes that the large-scale structure of the Universe can be observed on a much greater scale than was thought twenty years ago.

  2. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  3. Very Large Scale Distributed Information Processing Systems

    DTIC Science & Technology

    1991-09-27

    [Citation fragments from the report's bibliography:] "Reliable Distributed Database Management", Proc. of the IEEE, May 1987, pp. 601-620. [GOTT88] Gottlob, Georg and Roberto Zicari, "Closed World Databases..." Gottlob and Gio Wiederhold, "Interfacing Relational Databases and Prolog Efficiently," in Proceedings 2nd Expert Database Systems Conference, pp. 141

  4. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

    Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in the occurrence and reactivation of such landslides. The major geoscientific problems related to such large-scale landslides are 1) difficulties in their identification and delineation, 2) their role as sources of small-scale failures, and 3) reactivation. Only a few scientific publications concerning large-scale landslides in Nepal exist. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. Therefore, this study explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines to identify large-scale landslides based on their typical characteristics and using a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters, and logistic regression, an equation for the large-scale landslide distribution probability is also derived. The equation is validated by applying it to another area: there, the area under the receiver operating characteristic curve of the landslide distribution probability is 0.699, and the distribution probability value could explain > 65% of existing landslides. Therefore, the regression equation can be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
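
    The logistic-regression form of such a distribution-probability equation can be sketched as follows; the predictor names and coefficient values are hypothetical illustrations, not the coefficients fitted in the paper.

    ```python
    import math

    def landslide_probability(slope_deg, relief_m, dist_fault_km,
                              b0=-4.0, b1=0.06, b2=0.002, b3=-0.3):
        """Logistic-regression distribution probability (hypothetical coefficients).

        The linear predictor z is mapped into (0, 1) by the logistic function.
        """
        z = b0 + b1 * slope_deg + b2 * relief_m + b3 * dist_fault_km
        return 1.0 / (1.0 + math.exp(-z))

    # Steeper slopes with higher relief near a fault score higher.
    p_gentle = landslide_probability(10, 300, 20)
    p_steep = landslide_probability(40, 1500, 2)
    print(round(p_gentle, 4), round(p_steep, 4))
    ```

    A fitted model of this shape is what an area-under-ROC-curve value such as 0.699 would then evaluate on an independent validation area.
    
    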

  5. Efficient Distributed Test Architectures for Large-Scale Systems

    NASA Astrophysics Data System (ADS)

    de Almeida, Eduardo Cunha; Marynowski, Jõao Eugenio; Sunyé, Gerson; Le Traon, Yves; Valduriez, Patrick

    Typical testing architectures for distributed software rely on a centralized test controller, which decomposes test cases into steps and deploys them across distributed testers. The controller also guarantees the correct execution of test steps through synchronization messages. These architectures do not scale when testing large-scale distributed systems, because the cost of synchronization management may increase the cost of a test and even prevent its execution. This paper presents a distributed architecture to synchronize the test execution sequence. The approach organizes the testers in a tree, where messages are exchanged only between parents and children. The experimental evaluation shows that the synchronization management overhead can be reduced by several orders of magnitude. We conclude that testing architectures should scale up along with the distributed system under test.
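
    The scaling argument can be sketched with a simple accounting model (the function name and message counting are illustrative, not the authors' implementation): a centralized controller must synchronize with every tester, whereas a node in a k-ary tester tree only talks to its parent and at most k children.

    ```python
    def max_sync_load(n_testers, k=None):
        """Maximum number of peers any single node must synchronize with.

        k=None models a centralized controller; otherwise a k-ary tester tree.
        """
        if k is None:
            return n_testers - 1   # the controller talks to every tester
        return k + 1               # a tree node: at most k children plus 1 parent

    central = max_sync_load(10_000)
    tree = max_sync_load(10_000, k=4)
    print(central, tree)  # per-node load drops from n-1 to k+1
    ```

    The worst-case per-node synchronization load becomes independent of the number of testers, which is the property that lets the test harness grow with the system under test.
    
    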

  6. Distributed Coordinated Control of Large-Scale Nonlinear Networks

    DOE PAGES

    Kundu, Soumya; Anghel, Marian

    2015-11-08

    We provide a distributed coordinated approach to the stability analysis and control design of large-scale nonlinear dynamical systems using a vector Lyapunov functions approach. In this formulation the large-scale system is decomposed into a network of interacting subsystems, and the stability of the system is analyzed through a comparison system. However, finding such a comparison system is not trivial. In this work, we propose a sum-of-squares based, completely decentralized approach for computing the comparison systems for networks of nonlinear systems. Moreover, based on the comparison systems, we introduce a distributed optimal control strategy in which the individual subsystems (agents) coordinate with their immediate neighbors to design local control policies that can exponentially stabilize the full system under initial disturbances. We illustrate the control algorithm on a network of interacting Van der Pol systems.
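
    As a toy illustration of the setting only (not the authors' sum-of-squares construction), here is a crude Euler simulation of two weakly coupled Van der Pol oscillators, each stabilized by a purely local damping feedback u_i = -k*v_i computed from its own state; the parameter values are assumptions for the sketch.

    ```python
    def simulate(k=3.0, mu=1.0, c=0.1, dt=0.001, steps=20000):
        """Euler-integrate two coupled Van der Pol oscillators with local feedback.

        Each agent applies u_i = -k * v_i using only its own state (decentralized);
        c is the coupling strength between the two subsystems.
        Returns the final state norm, which should decay when k exceeds mu.
        """
        x1, v1, x2, v2 = 0.5, 0.0, -0.4, 0.0
        for _ in range(steps):
            a1 = mu * (1 - x1 ** 2) * v1 - x1 + c * (x2 - x1) - k * v1
            a2 = mu * (1 - x2 ** 2) * v2 - x2 + c * (x1 - x2) - k * v2
            x1, v1 = x1 + dt * v1, v1 + dt * a1
            x2, v2 = x2 + dt * v2, v2 + dt * a2
        return (x1 ** 2 + v1 ** 2 + x2 ** 2 + v2 ** 2) ** 0.5

    final_norm = simulate()
    print(final_norm)  # near zero: the decentralized feedback stabilizes the network
    ```
    
    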

  7. Distributed Coordinated Control of Large-Scale Nonlinear Networks

    SciTech Connect

    Kundu, Soumya; Anghel, Marian

    2015-11-08

    We provide a distributed coordinated approach to the stability analysis and control design of large-scale nonlinear dynamical systems using a vector Lyapunov functions approach. In this formulation the large-scale system is decomposed into a network of interacting subsystems, and the stability of the system is analyzed through a comparison system. However, finding such a comparison system is not trivial. In this work, we propose a sum-of-squares based, completely decentralized approach for computing the comparison systems for networks of nonlinear systems. Moreover, based on the comparison systems, we introduce a distributed optimal control strategy in which the individual subsystems (agents) coordinate with their immediate neighbors to design local control policies that can exponentially stabilize the full system under initial disturbances. We illustrate the control algorithm on a network of interacting Van der Pol systems.

  8. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    NASA Astrophysics Data System (ADS)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  9. Large-scale mass distribution in the Illustris simulation

    NASA Astrophysics Data System (ADS)

    Haider, M.; Steinhauser, D.; Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Hernquist, L.

    2016-04-01

    Observations at low redshifts thus far fail to account for all of the baryons expected in the Universe according to cosmological constraints. A large fraction of the baryons presumably resides in a thin and warm-hot medium between the galaxies, where they are difficult to observe due to their low densities and high temperatures. Cosmological simulations of structure formation can be used to verify this picture and provide quantitative predictions for the distribution of mass in different large-scale structure components. Here we study the distribution of baryons and dark matter at different epochs using data from the Illustris simulation. We identify regions of different dark matter density with the primary constituents of large-scale structure, allowing us to measure mass and volume of haloes, filaments and voids. At redshift zero, we find that 49 per cent of the dark matter and 23 per cent of the baryons are within haloes more massive than the resolution limit of 2 × 10⁸ M⊙. The filaments of the cosmic web host a further 45 per cent of the dark matter and 46 per cent of the baryons. The remaining 31 per cent of the baryons reside in voids. The majority of these baryons have been transported there through active galactic nuclei feedback. We note that the feedback model of Illustris is too strong for heavy haloes, therefore it is likely that we are overestimating this amount. Categorizing the baryons according to their density and temperature, we find that 17.8 per cent of them are in a condensed state, 21.6 per cent are present as cold, diffuse gas, and 53.9 per cent are found in the state of a warm-hot intergalactic medium.

  10. The large scale dust distribution in the inner galaxy

    NASA Technical Reports Server (NTRS)

    Hauser, M. G.; Dwek, E.; Gezari, D.; Silverberg, R.; Kelsall, T.; Stier, M.; Cheung, L.

    1983-01-01

    Initial results are presented from a new large-scale survey of the first quadrant of the galactic plane at wavelengths of 160, 260, and 300 microns. The submillimeter wavelength emission, interpreted as thermal radiation by dust grains, reveals an optically thin disk of angular width about 0.09 deg (FWHM) with a mean dust temperature of 23 K and significant variation of the dust mass column density. Comparison of the dust column density with the gas column density inferred from CO survey data shows a striking spatial correlation. The mean luminosity per hydrogen atom is found to be 2.5 × 10⁻³⁰ W/H, implying a radiant energy density in the vicinity of the dust an order of magnitude larger than in the solar neighborhood. The data favor dust in molecular clouds as the dominant submillimeter radiation source.

  11. Secure Large-Scale Airport Simulations Using Distributed Computational Resources

    NASA Technical Reports Server (NTRS)

    McDermott, William J.; Maluf, David A.; Gawdiak, Yuri; Tran, Peter; Clancy, Dan (Technical Monitor)

    2001-01-01

    To fully conduct research that will support the far-term concepts, technologies, and methods required to improve the safety of air transportation, a simulation environment of the requisite degree of fidelity must first be in place. The Virtual National Airspace Simulation (VNAS) will provide the underlying infrastructure necessary for such a simulation system. Aerospace-specific knowledge management services, such as intelligent data-integration middleware, will support the management of information associated with this complex and critically important operational environment. This simulation environment, in conjunction with a distributed network of supercomputers and high-speed network connections to aircraft and to Federal Aviation Administration (FAA), airline, and other data sources, will provide the capability to continuously monitor and measure operational performance against expected performance. The VNAS will also provide the tools to use this performance baseline to obtain a perspective of what is happening today and of the potential impact of proposed changes before they are introduced into the system.

  12. The galaxy distribution and the large-scale structure of the universe

    NASA Technical Reports Server (NTRS)

    Geller, M. J.; Kurtz, M. J.; De Lapparent, V.

    1986-01-01

    Data related to the large-scale galaxy distribution are discussed. The galaxy counts of Shane-Wirtanen (1967) are analyzed, and the effects of residual systematic errors on measurements of the galaxy distribution are considered. The analysis reveals that the Shane-Wirtanen data are not applicable to the study of large-scale structure. A model capable of measuring galaxy correlation functions on scales greater than about 10 Mpc is evaluated.

  13. Design of Availability-Dependent Distributed Services in Large-Scale Uncooperative Settings

    ERIC Educational Resources Information Center

    Morales, Ramses Victor

    2009-01-01

    Thesis Statement: "Availability-dependent global predicates can be efficiently and scalably realized for a class of distributed services, in spite of specific selfish and colluding behaviors, using local and decentralized protocols". Several types of large-scale distributed systems spanning the Internet have to deal with availability variations…

  14. Framing Innovation: The Role of Distributed Leadership in Gaining Acceptance of Large-Scale Technology Initiatives

    ERIC Educational Resources Information Center

    Turner, Henry J.

    2014-01-01

    This dissertation of practice utilized a multiple case-study approach to examine distributed leadership within five school districts that were attempting to gain acceptance of a large-scale 1:1 technology initiative. Using frame theory and distributed leadership theory as theoretical frameworks, this study interviewed each district's…

  15. Framing Innovation: The Role of Distributed Leadership in Gaining Acceptance of Large-Scale Technology Initiatives

    ERIC Educational Resources Information Center

    Turner, Henry J.

    2014-01-01

    This dissertation of practice utilized a multiple case-study approach to examine distributed leadership within five school districts that were attempting to gain acceptance of a large-scale 1:1 technology initiative. Using frame theory and distributed leadership theory as theoretical frameworks, this study interviewed each district's…

  16. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    PubMed Central

    He, Hui; Fan, Guotao; Ye, Jianwei; Zhang, Weizhe

    2013-01-01

    Research on early warning systems for large-scale network security incidents is of great significance: such a system can improve the network's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system that combines active measurement and anomaly detection is presented in this paper. The key visualization algorithms and technology of the system are the main focus. Plane visualization of the large-scale network system is realized with a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, a subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a single topology by an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies. PMID:24191145
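
    The force-based combination step can be sketched as one iteration in the style of a generic force-directed layout, where all node pairs repel and edges attract (the MLkP/CR partitioning algorithm and the paper's exact force model are not reproduced; this is a generic sketch with assumed constants):

    ```python
    import math

    def layout_step(pos, edges, k=1.0, step=0.05):
        """One force-directed iteration: all pairs repel, edge endpoints attract."""
        disp = {v: [0.0, 0.0] for v in pos}
        verts = list(pos)
        for i, u in enumerate(verts):              # pairwise repulsion ~ k^2 / d
            for v in verts[i + 1:]:
                dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
                d = math.hypot(dx, dy) or 1e-9
                f = k * k / d
                disp[u][0] += f * dx / d; disp[u][1] += f * dy / d
                disp[v][0] -= f * dx / d; disp[v][1] -= f * dy / d
        for u, v in edges:                         # edge attraction ~ d^2 / k
            dx, dy = pos[u][0] - pos[v][0], pos[u][1] - pos[v][1]
            d = math.hypot(dx, dy) or 1e-9
            f = d * d / k
            disp[u][0] -= f * dx / d; disp[u][1] -= f * dy / d
            disp[v][0] += f * dx / d; disp[v][1] += f * dy / d
        return {v: (pos[v][0] + step * disp[v][0], pos[v][1] + step * disp[v][1])
                for v in pos}

    # Two unconnected sub-topology anchors placed too close are pushed apart.
    pos = layout_step({"a": (0.0, 0.0), "b": (0.1, 0.0)}, edges=[])
    print(pos)
    ```
    
    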

  17. A topology visualization early warning distribution algorithm for large-scale network security incidents.

    PubMed

    He, Hui; Fan, Guotao; Ye, Jianwei; Zhang, Weizhe

    2013-01-01

    Research on early warning systems for large-scale network security incidents is of great significance: such a system can improve the network's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system that combines active measurement and anomaly detection is presented in this paper. The key visualization algorithms and technology of the system are the main focus. Plane visualization of the large-scale network system is realized with a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, a subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a single topology by an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.

  18. Galaxy clusters in visible light (I): catalogues, large-scale distribution, and general properties.

    NASA Astrophysics Data System (ADS)

    Bian, Yulin

    1995-12-01

    While the nature, behaviour, and evolution of galaxy clusters form a very wide research field, only some of their optical properties are covered in the present review. The article is divided into two parts, of which this is the first, devoted to cluster catalogues, large-scale distribution, and some general characteristics of galaxy clusters.

  19. Analysis of Large-Scale Matter Distribution with the Minimal Spanning Tree Technique

    NASA Astrophysics Data System (ADS)

    Doroshkevich, A.; Turchaninov, V.

    The application of the Minimal Spanning Tree technique to the description of the large-scale object distribution in observed and simulated catalogues is demonstrated. We show that the distribution can be roughly described as a system of high-density filaments, half of which are accumulated in wall-like condensations.
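
    The minimal spanning tree of a point catalogue can be built with Kruskal's algorithm on the complete Euclidean graph; long MST edges are then the natural candidates for cutting to separate filamentary structures (any separation threshold would be an assumption, so none is shown here):

    ```python
    import math
    from itertools import combinations

    def euclidean_mst(points):
        """Kruskal's algorithm: sort all pairwise edges, keep those joining components."""
        parent = list(range(len(points)))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path halving
                i = parent[i]
            return i

        edges = sorted((math.dist(points[i], points[j]), i, j)
                       for i, j in combinations(range(len(points)), 2))
        mst = []
        for w, i, j in edges:
            ri, rj = find(i), find(j)
            if ri != rj:            # joining two components: keep the edge
                parent[ri] = rj
                mst.append((w, i, j))
        return mst

    # Four points: three clustered, one far away; the MST has n-1 = 3 edges.
    mst = euclidean_mst([(0, 0), (1, 0), (0, 1), (5, 5)])
    print(len(mst), sum(w for w, _, _ in mst))
    ```

    In a catalogue analysis, the distribution of MST edge lengths (rather than this toy example) is what distinguishes filaments from wall-like condensations.
    
    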

  20. Statistics of density maxima and the large-scale matter distribution

    NASA Technical Reports Server (NTRS)

    Kaiser, N.

    1986-01-01

    High peaks in Gaussian noise display enhanced clustering. The enhancement takes two forms: on large scales one obtains a linear amplification of the correlation function which is independent of scale. On smaller scales, but larger than the mass scale of the peaks themselves, a nonlinear (exponential) enhancement of the number density of high peaks in overdense regions arises. The large-scale correlations of Abell's rich clusters can be understood as a manifestation of this phenomenon. If the formation of bright galaxies favors the high overdensity peaks then the number of galaxies (per unit mass) in clusters and groups may be considerably enhanced. Consequences of these ideas for the density parameter and the large-scale matter distribution are discussed.
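
    The two enhancement regimes described above are standard peak-statistics results; as a sketch in the high-peak limit (standard forms, not transcribed from the paper), for peaks above threshold ν in a Gaussian field with rms σ₀:

    ```latex
    % Large scales: scale-independent linear amplification of the correlation function
    \xi_{\mathrm{pk}}(r) \;\simeq\; \left(\frac{\nu}{\sigma_0}\right)^{2} \xi_{\rho}(r)

    % Smaller scales: exponential (nonlinear) enhancement of the peak number density
    % in a region of background overdensity \delta
    n_{\mathrm{pk}}(\delta) \;\simeq\; n_{\mathrm{pk}}\,
    \exp\!\left(\frac{\nu\,\delta}{\sigma_0}\right)
    ```

    The first relation is the linear amplification invoked for the Abell cluster correlations; the second drives the enhanced galaxy counts in clusters and groups.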

  1. Multi-level structure in the large scale distribution of optically luminous galaxies

    NASA Astrophysics Data System (ADS)

    Deng, Xin-fa; Deng, Zu-gan; Liu, Yong-zhen

    1992-04-01

    Fractal dimensions in the large-scale distribution of galaxies have been calculated with the method given by Wen et al. [1]. Samples are taken from the CfA redshift survey in the northern and southern galactic hemispheres [2], and the results from the two regions are compared with each other. There are significant differences between the distributions in the two regions; however, our analyses do show some common features. All subsamples distinctly show multi-level fractal character. Combining this with results from analyses of samples of IRAS galaxies and from redshift surveys in pencil-beam fields [3,4], we suggest that multi-level fractal structure is most likely a general and important character of the large-scale distribution of galaxies. The possible implications of this character are discussed.
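
    A box-counting estimate is the generic idea behind such fractal-dimension measurements (the specific method of Wen et al. is not reproduced; this is a textbook sketch): count occupied grid cells N(s) at two box sizes and take the slope of log N against log(1/s).

    ```python
    import math

    def box_count(points, size):
        """Number of grid boxes of side `size` occupied by at least one point."""
        return len({(int(x // size), int(y // size)) for x, y in points})

    def box_dimension(points, s1, s2):
        """Slope of log N(s) versus log(1/s) between two box sizes s1 > s2."""
        n1, n2 = box_count(points, s1), box_count(points, s2)
        return math.log(n2 / n1) / math.log(s1 / s2)

    # Points along a line segment: the box-counting dimension should be about 1.
    line = [(i / 1000, 0.0) for i in range(1000)]
    print(box_dimension(line, 0.1, 0.01))
    ```

    A multi-level fractal character shows up as a dimension that changes with the range of scales over which the slope is fitted.
    
    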

  2. Spatial distribution of GRB and large scale structure of the Universe

    NASA Astrophysics Data System (ADS)

    Bagoly, Zsolt; Racz, Istvan; Gyorgy Balazs, Lajos; Toth, Viktor; Horvath, Istvan

    2015-08-01

    We studied the distribution of starburst galaxies from the Millennium XXL database at z=0.82. First we examined the starburst distribution in the classical Millennium I simulation, using the De Lucia (2006) database, which is based on a semi-analytical model of galaxy formation. We found a relationship between the starburst galaxies and the dark matter density distribution in Millennium I, and we determined the Millennium I to Millennium XXL transformation factor. We simulated a starburst galaxy sample with a Markov Chain Monte Carlo method using the Metropolis-Hastings algorithm. The connection between the homogeneity of the large-scale structure and the distribution of starburst groups on a defined scale was also checked.
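
    The Metropolis-Hastings step used in such Markov Chain Monte Carlo sampling can be sketched for a one-dimensional target; the paper's actual target (a dark-matter density field) is replaced here by a unit Gaussian purely for illustration.

    ```python
    import math
    import random

    def metropolis_hastings(log_density, n_samples, x0=0.0, step=1.0, seed=42):
        """Random-walk Metropolis sampler for an unnormalized log density."""
        rng = random.Random(seed)
        x, samples = x0, []
        for _ in range(n_samples):
            proposal = x + rng.gauss(0.0, step)
            # Accept with probability min(1, p(proposal) / p(x)).
            if math.log(rng.random() + 1e-300) < log_density(proposal) - log_density(x):
                x = proposal
            samples.append(x)
        return samples

    # Target: standard normal; the sample mean should approach 0.
    samples = metropolis_hastings(lambda x: -0.5 * x * x, 20000)
    print(sum(samples) / len(samples))
    ```

    Replacing the lambda with the log of a density field evaluated at a proposed galaxy position gives the kind of sampler the abstract describes.
    
    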

  3. NASA's Information Power Grid: Large Scale Distributed Computing and Data Management

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Vaziri, Arsi; Hinke, Tom; Tanner, Leigh Ann; Feiereisen, William J.; Thigpen, William; Tang, Harry (Technical Monitor)

    2001-01-01

    Large-scale science and engineering are done through the interaction of people, heterogeneous computing resources, information systems, and instruments, all of which are geographically and organizationally dispersed. The overall motivation for Grids is to facilitate the routine interactions of these resources in order to support large-scale science and engineering. Multi-disciplinary simulations provide a good example of a class of applications that are very likely to require aggregation of widely distributed computing, data, and intellectual resources. Such simulations - e.g., whole-system aircraft simulation and whole-system living-cell simulation - require integrating applications and data that are developed by different teams of researchers, frequently in different locations. The research teams are the only ones that have the expertise to maintain and improve the simulation code and/or the body of experimental data that drives the simulations. This results in an inherently distributed computing and data management environment.

  4. Generalizations of the Alternating Direction Method of Multipliers for Large-Scale and Distributed Optimization

    DTIC Science & Technology

    2014-05-01

    ...global convergence and further show its linear convergence under a variety of scenarios, which cover a wide range of applications. The derived rate of... efficiency, flexibility and applicability for large-scale and distributed optimization problems. We also make important extensions to the convergence...
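
    The alternating direction method of multipliers that the report generalizes can be illustrated on the smallest possible problem: a one-dimensional lasso, min over x of ½(x − a)² + λ|z| subject to x = z (the problem and parameter values are illustrative, not from the report).

    ```python
    def soft_threshold(v, t):
        """Proximal operator of t * |z| (shrinkage)."""
        return max(v - t, 0.0) - max(-v - t, 0.0)

    def admm_lasso_1d(a, lam, rho=1.0, iters=200):
        """ADMM for min 1/2 (x - a)^2 + lam * |z|  subject to  x = z."""
        x = z = u = 0.0
        for _ in range(iters):
            x = (a + rho * (z - u)) / (1.0 + rho)  # x-update: quadratic solve
            z = soft_threshold(x + u, lam / rho)   # z-update: shrinkage
            u = u + x - z                          # scaled dual update
        return z

    # Closed-form solution is soft_threshold(a, lam) = 2.0 for a=3, lam=1.
    print(admm_lasso_1d(3.0, 1.0))
    ```

    The three-step x/z/u structure is exactly what distributed ADMM variants parallelize across agents, with the dual update enforcing consensus.
    
    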

  5. Spatial distribution of GRBs and large scale structure of the Universe

    NASA Astrophysics Data System (ADS)

    Bagoly, Zsolt; Rácz, István I.; Balázs, Lajos G.; Tóth, L. Viktor; Horváth, István

    We studied the space distribution of starburst galaxies from the Millennium XXL database at z = 0.82. We examined the starburst distribution in the classical Millennium I simulation (De Lucia et al. 2006), which uses a semi-analytical model for the genesis of the galaxies. We simulated a starburst galaxy sample with a Markov Chain Monte Carlo method. The connection between the homogeneity of the large-scale structure and the distribution of starburst groups on a defined scale (Kofman and Shandarin 1998; Suhhonenko et al. 2011; Liivamägi et al. 2012; Park et al. 2012; Horvath et al. 2014; Horvath et al. 2015) was also checked.

  6. Cost Distribution of Environmental Flow Demands in a Large Scale Multi-Reservoir System

    NASA Astrophysics Data System (ADS)

    Marques, G.; Tilmant, A.

    2014-12-01

    This paper investigates the recovery of a prescribed flow regime through reservoir system reoperation, focusing on the associated costs and losses imposed on different power plants depending on flows, power plant and reservoir characteristics, and system topology. In large-scale reservoir systems such a cost distribution is not trivial, and it should be properly evaluated to identify coordinated operating solutions that avoid penalizing a single reservoir. The method combines an efficient stochastic dual dynamic programming algorithm for reservoir optimization with environmental flow targets of specific magnitude, duration, and return period whose effects on fish recruitment are already known. Results indicate that the effect of meeting the environmental flow demands is distributed very unevenly throughout the reservoir cascade: in some reservoirs power production and revenue increase, while in others they are reduced. Most importantly, for the example system modeled here (10 reservoirs in the Parana River basin, Brazil), meeting the target environmental flows was possible without reducing the total energy produced in the year, at a cost of $25 million/year in foregone hydropower revenues (a 3% reduction). Finally, the results and methods are useful in (a) quantifying the foregone hydropower and revenues resulting from meeting a specific environmental flow demand, (b) identifying the distribution and reallocation of the foregone hydropower and revenue across a large-scale system, and (c) identifying optimal reservoir operating strategies to meet environmental flow demands in a large-scale multi-reservoir system.

  7. Spatial distribution of ultra-diffuse galaxies within large-scale structures

    NASA Astrophysics Data System (ADS)

    Román, Javier; Trujillo, Ignacio

    2017-06-01

    Taking advantage of the Sloan Digital Sky Survey Stripe82 data, we have explored the spatial distribution of ultra-diffuse galaxies (UDGs) within an area of 8 × 8 Mpc² centred around the galaxy cluster Abell 168 (z = 0.045). This intermediate massive cluster (σ = 550 km s-1) is surrounded by a complex large-scale structure. Our work confirms the presence of UDGs in the cluster and in the large-scale structure that surrounds it, and it is the first detection of UDGs outside clusters. Approximately 50 per cent of the UDGs analysed in the selected area inhabit the cluster region (˜11 ± 5 per cent in the core and ˜39 ± 9 per cent in the outskirts), whereas the remaining UDGs are found outside the main cluster structure (˜50 ± 11 per cent). The colours and the spatial distribution of the UDGs within this large-scale structure are more similar to dwarf galaxies than to L⋆ galaxies, suggesting that most UDGs could be bona fide dwarf galaxies.

  8. Large-Scale Ichthyoplankton and Water Mass Distribution along the South Brazil Shelf

    PubMed Central

    de Macedo-Soares, Luis Carlos Pinto; Garcia, Carlos Alberto Eiras; Freire, Andrea Santarosa; Muelbert, José Henrique

    2014-01-01

    Ichthyoplankton is an essential component of pelagic ecosystems, and environmental factors play an important role in determining its distribution. We have investigated simultaneous latitudinal and cross-shelf gradients in ichthyoplankton abundance to test the hypothesis that the large-scale distribution of fish larvae in the South Brazil Shelf is associated with water mass composition. Vertical plankton tows were collected between 21°27′ and 34°51′S at 107 stations, in austral late spring and early summer seasons. Samples were taken with a conical-cylindrical plankton net from the depth of chlorophyll maxima to the surface in deep stations, or from 10 m from the bottom to the surface in shallow waters. Salinity and temperature were obtained with a CTD/rosette system, which provided seawater for chlorophyll-a and nutrient concentrations. The influence of water mass on larval fish species was studied using Indicator Species Analysis, whereas environmental effects on the distribution of larval fish species were analyzed by Distance-based Redundancy Analysis. Larval fish species were associated with specific water masses: in the north, Sardinella brasiliensis was found in Shelf Water; whereas in the south, Engraulis anchoita inhabited the Plata Plume Water. At the slope, Tropical Water was characterized by the bristlemouth Cyclothone acclinidens. The concurrent analysis showed the importance of both cross-shelf and latitudinal gradients on the large-scale distribution of larval fish species. Our findings reveal that ichthyoplankton composition and large-scale spatial distribution are determined by water mass composition in both latitudinal and cross-shelf gradients. PMID:24614798

  9. Large-Scale Geographic Variation in Distribution and Abundance of Australian Deep-Water Kelp Forests

    PubMed Central

    Marzinelli, Ezequiel M.; Williams, Stefan B.; Babcock, Russell C.; Barrett, Neville S.; Johnson, Craig R.; Jordan, Alan; Kendrick, Gary A.; Pizarro, Oscar R.; Smale, Dan A.; Steinberg, Peter D.

    2015-01-01

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are main habitat-formers in temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution along the continent of these kelp forests, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia’s Integrated Marine Observing System (IMOS) to survey 157,000 m² of seabed, of which ca 13,000 m² were used to quantify kelp covers at multiple spatial scales (10–100 m to 100–1,000 km) and depths (15–60 m) across several regions ca 2–6° latitude apart along the East and West coasts of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. Maximum depth of kelp occurrence was 40–50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow for the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves. PMID:25693066

  10. Large-scale ichthyoplankton and water mass distribution along the South Brazil Shelf.

    PubMed

    de Macedo-Soares, Luis Carlos Pinto; Garcia, Carlos Alberto Eiras; Freire, Andrea Santarosa; Muelbert, José Henrique

    2014-01-01

    Ichthyoplankton is an essential component of pelagic ecosystems, and environmental factors play an important role in determining its distribution. We have investigated simultaneous latitudinal and cross-shelf gradients in ichthyoplankton abundance to test the hypothesis that the large-scale distribution of fish larvae in the South Brazil Shelf is associated with water mass composition. Vertical plankton tows were collected between 21°27' and 34°51'S at 107 stations, in austral late spring and early summer seasons. Samples were taken with a conical-cylindrical plankton net from the depth of chlorophyll maxima to the surface in deep stations, or from 10 m from the bottom to the surface in shallow waters. Salinity and temperature were obtained with a CTD/rosette system, which provided seawater for chlorophyll-a and nutrient concentrations. The influence of water mass on larval fish species was studied using Indicator Species Analysis, whereas environmental effects on the distribution of larval fish species were analyzed by Distance-based Redundancy Analysis. Larval fish species were associated with specific water masses: in the north, Sardinella brasiliensis was found in Shelf Water; whereas in the south, Engraulis anchoita inhabited the Plata Plume Water. At the slope, Tropical Water was characterized by the bristlemouth Cyclothone acclinidens. The concurrent analysis showed the importance of both cross-shelf and latitudinal gradients on the large-scale distribution of larval fish species. Our findings reveal that ichthyoplankton composition and large-scale spatial distribution are determined by water mass composition in both latitudinal and cross-shelf gradients.

  11. Large-scale geographic variation in distribution and abundance of Australian deep-water kelp forests.

    PubMed

    Marzinelli, Ezequiel M; Williams, Stefan B; Babcock, Russell C; Barrett, Neville S; Johnson, Craig R; Jordan, Alan; Kendrick, Gary A; Pizarro, Oscar R; Smale, Dan A; Steinberg, Peter D

    2015-01-01

    Despite the significance of marine habitat-forming organisms, little is known about their large-scale distribution and abundance in deeper waters, where they are difficult to access. Such information is necessary to develop sound conservation and management strategies. Kelps are the main habitat-formers on temperate reefs worldwide; however, these habitats are highly sensitive to environmental change. The kelp Ecklonia radiata is the major habitat-forming organism on subtidal reefs in temperate Australia. Here, we provide large-scale ecological data encompassing the latitudinal distribution of these kelp forests along the continent, which is a necessary first step towards quantitative inferences about the effects of climatic change and other stressors on these valuable habitats. We used the Autonomous Underwater Vehicle (AUV) facility of Australia's Integrated Marine Observing System (IMOS) to survey 157,000 m2 of seabed, of which ca 13,000 m2 were used to quantify kelp cover at multiple spatial scales (10-100 m to 100-1,000 km) and depths (15-60 m) across several regions ca 2-6° of latitude apart along the east and west coasts of Australia. We investigated the large-scale geographic variation in distribution and abundance of deep-water kelp (>15 m depth) and their relationships with physical variables. Kelp cover generally increased with latitude despite great variability at smaller spatial scales. The maximum depth of kelp occurrence was 40-50 m. Kelp latitudinal distribution along the continent was most strongly related to water temperature and substratum availability. These extensive survey data, coupled with ongoing AUV missions, will allow the detection of long-term shifts in the distribution and abundance of habitat-forming kelp and the organisms they support on a continental scale, and provide information necessary for successful implementation and management of conservation reserves.

  12. A Web-based Distributed Voluntary Computing Platform for Large Scale Hydrological Computations

    NASA Astrophysics Data System (ADS)

    Demir, I.; Agliamzanov, R.

    2014-12-01

    Distributed volunteer computing can enable researchers and scientists to form large parallel computing environments that harness the computing power of the millions of computers on the Internet, and use them to run large-scale environmental simulations and models that serve the common good of local communities and the world. Recent developments in web technologies and standards allow client-side scripting languages to run at speeds close to those of native applications, and to utilize the power of Graphics Processing Units (GPUs). Using a client-side scripting language, JavaScript, we have developed an open distributed computing framework that makes it easy for researchers to write their own hydrologic models and run them on volunteer computers. Website owners can easily enable visitors to volunteer their computing resources toward running advanced hydrological models and simulations. Using a web-based system allows users to start volunteering their computational resources within seconds, without installing any software. The framework distributes the model simulation to thousands of nodes in small spatial and computational units. A relational database system is used for managing data connections and queue management for the distributed computing nodes. In this paper, we present a web-based distributed volunteer computing platform that enables large-scale hydrological simulations and model runs in an open and integrated environment.
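    The queue-based distribution described above can be sketched as a minimal server-side work queue that splits the model domain into tiles and hands them to volunteers. This is an illustrative assumption of the design, not the authors' implementation; all names and the tile decomposition are made up:

```python
import queue

class VolunteerQueue:
    """Toy work queue: split a model domain into tiles, hand tiles to
    volunteer nodes, and collect partial results (hypothetical sketch)."""

    def __init__(self, n_tiles):
        self.pending = queue.Queue()
        for tile_id in range(n_tiles):
            self.pending.put(tile_id)
        self.results = {}

    def checkout(self):
        """A volunteer asks for work; returns a tile id or None when done."""
        try:
            return self.pending.get_nowait()
        except queue.Empty:
            return None

    def submit(self, tile_id, value):
        """A volunteer returns the result computed for its tile."""
        self.results[tile_id] = value

    def done(self):
        return self.pending.empty() and len(self.results) > 0

# Simulate volunteers checking out tiles and returning results.
q = VolunteerQueue(n_tiles=4)
while (tile := q.checkout()) is not None:
    q.submit(tile, tile * 2.0)  # stand-in for a hydrologic computation
```

    In the real system the queue would live behind a web endpoint and the per-tile computation would run in the visitor's browser.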

  13. Multi-agent based control of large-scale complex systems employing distributed dynamic inference engine

    NASA Astrophysics Data System (ADS)

    Zhang, Daili

    Increasing societal demand for automation has led to considerable efforts to control large-scale complex systems, especially in the area of autonomous intelligent control methods. The control system of a large-scale complex system needs to satisfy four system level requirements: robustness, flexibility, reusability, and scalability. Corresponding to the four system level requirements, there arise four major challenges. First, it is difficult to get accurate and complete information. Second, the system may be physically highly distributed. Third, the system evolves very quickly. Fourth, emergent global behaviors of the system can be caused by small disturbances at the component level. The Multi-Agent Based Control (MABC) method as an implementation of distributed intelligent control has been the focus of research since the 1970s, in an effort to solve the above-mentioned problems in controlling large-scale complex systems. However, to the author's best knowledge, all MABC systems for large-scale complex systems with significant uncertainties are problem-specific and thus difficult to extend to other domains or larger systems. This situation is partly due to the control architecture of multiple agents being determined by agent to agent coupling and interaction mechanisms. Therefore, the research objective of this dissertation is to develop a comprehensive, generalized framework for the control system design of general large-scale complex systems with significant uncertainties, with the focus on distributed control architecture design and distributed inference engine design. A Hybrid Multi-Agent Based Control (HyMABC) architecture is proposed by combining hierarchical control architecture and module control architecture with logical replication rings. First, it decomposes a complex system hierarchically; second, it combines the components in the same level as a module, and then designs common interfaces for all of the components in the same module; third, replications

  14. Multiplexed extremely short distributed Bragg reflector fiber laser array for large-scale sensing applications

    NASA Astrophysics Data System (ADS)

    Wong, Allan C. L.; Chung, W. H.; Tam, H. Y.; Lu, C.

    2010-12-01

    We report the fabrication of extremely short linear-cavity distributed Bragg reflector fiber lasers (DBR-FLs). Each laser has a total length of 7 mm, with a nominal cavity length of only 0.4 mm. The FL has a linewidth and polarization beat frequency of 220 Hz and 18.9 MHz, respectively. The relaxation oscillation frequency and its relative peak are 120 kHz and -75 dB/Hz, respectively. The FL exhibited low-noise characteristics, with an intensity noise of -107 dB/Hz at 1 MHz. We constructed a FL sensor array that has great potential for large-scale, high-sensitivity sensing applications.

  15. Distributed weighted least-squares estimation with fast convergence for large-scale systems.

    PubMed

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm will be maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performance of the proposed methods.
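    One generic way to realize this kind of distributed computation is gossip averaging of each sub-system's normal-equation contributions, after which every node can solve for the same global weighted least-squares estimate. This is a minimal sketch under that assumption, not the paper's algorithm (which additionally maximizes the convergence rate via a scaling parameter and preconditioning):

```python
import numpy as np

rng = np.random.default_rng(0)
x_true = np.array([1.0, -2.0, 0.5])

# Four sub-systems, each with a local linear measurement y_i = A_i x + noise.
A = [rng.standard_normal((5, 3)) for _ in range(4)]
y = [Ai @ x_true + 0.01 * rng.standard_normal(5) for Ai in A]
W = [np.eye(5) for _ in A]  # measurement weight matrices

# Local normal-equation contributions H_i = A_i' W_i A_i, b_i = A_i' W_i y_i.
H = [Ai.T @ Wi @ Ai for Ai, Wi, yi in zip(A, W, y)]
b = [Ai.T @ Wi @ yi for Ai, Wi, yi in zip(A, W, y)]

# Gossip averaging on a ring: repeated neighbor averaging drives every
# node's (H, b) toward the network-wide average, so each node ends up
# solving the same global system (mean H) x = (mean b).
for _ in range(200):
    H = [(H[i] + H[(i + 1) % 4] + H[i - 1]) / 3 for i in range(4)]
    b = [(b[i] + b[(i + 1) % 4] + b[i - 1]) / 3 for i in range(4)]

x_hat = np.linalg.solve(H[0], b[0])  # node 0's estimate of x
```

    After convergence every node solves the same averaged system, so all nodes agree on the global estimate without ever exchanging raw measurements.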

  16. Upper limit on periodicity in the three-dimensional large-scale distribution of matter

    NASA Technical Reports Server (NTRS)

    Tytler, David; Sandoval, John; Fan, Xiao-Ming

    1993-01-01

    A search for large-scale periodicity in the 3D distribution of 268 Mg II QSO absorption systems which are distributed over 60 percent of the sky, at redshifts 0.1-2.0 is presented. The scalar 3D comoving separations of all pairs of absorption systems are calculated, and peaks in the power spectrum of the distribution of those separations are searched for. The present 95-percent confidence upper limit on the amplitude of a possible periodic fluctuation in the density of galaxies is between one-fourth and three-fourths of the amplitude implied by the data of Broadhurst et al. (1990), depending on the extent to which the wavelength varies and the phase of the signal drifts down lines of sight. A description is presented of how QSO absorption systems sample the 3D population of absorbers and how 3D positions can be represented by their scalar separations.

  17. On distributed wavefront reconstruction for large-scale adaptive optics systems.

    PubMed

    de Visser, Cornelis C; Brunner, Elisabeth; Verhaegen, Michel

    2016-05-01

    The distributed-spline-based aberration reconstruction (D-SABRE) method is proposed for distributed wavefront reconstruction with applications to large-scale adaptive optics systems. D-SABRE decomposes the wavefront sensor domain into any number of partitions and solves a local wavefront reconstruction problem on each partition using multivariate splines. D-SABRE accuracy is within 1% of a global approach, with a speedup that scales quadratically with the number of partitions. D-SABRE is compared to the distributed cumulative reconstruction (CuRe-D) method in open-loop and closed-loop simulations using the YAO adaptive optics simulation tool. D-SABRE accuracy exceeds that of CuRe-D for low levels of decomposition, and D-SABRE proved to be more robust to variations in the loop gain.

  18. Large-Scale Distributed Computational Fluid Dynamics on the Information Power Grid Using Globus

    NASA Technical Reports Server (NTRS)

    Barnard, Stephen; Biswas, Rupak; Saini, Subhash; VanderWijngaart, Robertus; Yarrow, Maurice; Zechtzer, Lou; Foster, Ian; Larsson, Olle

    1999-01-01

    This paper describes an experiment in which a large-scale scientific application developed for tightly-coupled parallel machines is adapted to the distributed execution environment of the Information Power Grid (IPG). A brief overview of the IPG and a description of the computational fluid dynamics (CFD) algorithm are given. The Globus metacomputing toolkit is used as the enabling device for the geographically-distributed computation. Modifications related to latency hiding and load balancing were required for an efficient implementation of the CFD application in the IPG environment. Performance results on a pair of SGI Origin 2000 machines indicate that real scientific applications can be effectively implemented on the IPG; however, a significant amount of continued effort is required to make such an environment useful and accessible to scientists and engineers.

  19. New Distributed Multipole Methods for Accurate Electrostatics for Large-Scale Biomolecular Simulations

    NASA Astrophysics Data System (ADS)

    Sagui, Celeste

    2006-03-01

    An accurate and numerically efficient treatment of electrostatics is essential for biomolecular simulations, as this stabilizes much of the delicate 3D structure associated with biomolecules. Currently, force fields such as AMBER and CHARMM assign "partial charges" to every atom in a simulation in order to model the interatomic electrostatic forces, so that the calculation of the electrostatics rapidly becomes the computational bottleneck in large-scale simulations. There are two main issues associated with the current treatment of classical electrostatics: (i) how does one eliminate the artifacts associated with the point charges (e.g., the underdetermined nature of the current RESP fitting procedure for large, flexible molecules) used in the force fields in a physically meaningful way? (ii) how does one efficiently simulate the very costly long-range electrostatic interactions? Recently, we have dealt with both of these challenges as follows. In order to improve the description of the molecular electrostatic potentials (MEPs), a new distributed multipole analysis based on localized functions -- Wannier, Boys, and Edmiston-Ruedenberg -- was introduced, which allows for a first-principles calculation of the partial charges and multipoles. Through a suitable generalization of the particle mesh Ewald (PME) and multigrid methods, one can treat electrostatic multipoles all the way to hexadecapoles without prohibitive extra costs. The importance of these methods for large-scale simulations will be discussed, and exemplified by simulations of polarizable DNA models.

  20. Large-scale galaxy distribution in the Las Campanas Redshift Survey

    NASA Astrophysics Data System (ADS)

    Doroshkevich, A. G.; Tucker, D. L.; Fong, R.; Turchaninov, V.; Lin, H.

    2001-04-01

    We make use of three-dimensional clustering analysis, inertia tensor methods, and the minimal spanning tree technique to estimate some physical and statistical characteristics of the large-scale galaxy distribution and, in particular, of the sample of overdense regions seen in the Las Campanas Redshift Survey (LCRS). Our investigation provides additional evidence for a network of structures found in our core sampling analysis of the LCRS: a system of rich sheet-like structures, which in turn surround large underdense regions criss-crossed by a variety of filamentary structures. We find that the overdense regions contain ~40-50 per cent of LCRS galaxies and have proper sizes similar to those of nearby superclusters. The formation of such structures can be roughly described as a non-linear compression of protowalls of typical cross-sectional size ~20-25 h^-1 Mpc; this scale is ~5 times the conventional value for the onset of non-linear clustering - to wit, r0, the autocorrelation length for galaxies. The comparison with available simulations and theoretical estimates shows that the formation of structure elements with parameters similar to those observed is presently possible only in low-density cosmological models, Ωm h ~ 0.2-0.3, with a suitable large-scale bias between galaxies and dark matter.

  1. The impact of the stratospheric ozone distribution on large-scale tropospheric systems over South America

    NASA Astrophysics Data System (ADS)

    Da Silva, L. A.; Vieira, L. A.; Prestes, A.; Pacini, A. A.; Rigozo, N. R.

    2013-12-01

    Most of the large-scale changes of the climate can be attributed to the cumulative impact of human activities since the beginning of the industrial revolution. However, the impact of natural drivers on the present climate change is still under debate, especially on regional scales. These regional changes over South America can potentially affect large vulnerable populations in the near future. Here, we show that the distribution of stratospheric ozone can affect climate patterns over South America and the adjoining oceans. The impact of the stratospheric ozone distribution was evaluated employing the Global Atmospheric-Ocean Model developed by the Goddard Institute for Space Studies (GISS Model E). We conducted two numerical experiments: in the first we used a realistic distribution of the stratospheric ozone, while in the second we employed a uniform longitudinal distribution. We integrated each model over 60 years. We find that the distribution of stratospheric ozone has a strong influence on the Intertropical Convergence Zone (ITCZ) and the South Atlantic Convergence Zone (SACZ). However, the Upper Tropospheric Cyclonic Vortex (UTCV) is not affected by the ozone distribution.

  2. Shared and Distributed Memory Parallel Security Analysis of Large-Scale Source Code and Binary Applications

    SciTech Connect

    Quinlan, D; Barany, G; Panas, T

    2007-08-30

    Many forms of security analysis on large scale applications can be substantially automated but the size and complexity can exceed the time and memory available on conventional desktop computers. Most commercial tools are understandably focused on such conventional desktop resources. This paper presents research work on the parallelization of security analysis of both source code and binaries within our Compass tool, which is implemented using the ROSE source-to-source open compiler infrastructure. We have focused on both shared and distributed memory parallelization of the evaluation of rules implemented as checkers for a wide range of secure programming rules, applicable to desktop machines, networks of workstations and dedicated clusters. While Compass as a tool focuses on source code analysis and reports violations of an extensible set of rules, the binary analysis work uses the exact same infrastructure but is less well developed into an equivalent final tool.

  3. Multiple antibiotic resistance genes distribution in ten large-scale membrane bioreactors for municipal wastewater treatment.

    PubMed

    Sun, Yanmei; Shen, Yue-Xiao; Liang, Peng; Zhou, Jizhong; Yang, Yunfeng; Huang, Xia

    2016-12-01

    Wastewater treatment plants are thought to be potential reservoirs of antibiotic resistance genes. In this study, GeoChip was used for analyzing multiple antibiotic resistance genes, including four multidrug efflux system gene groups and three β-lactamase genes, in ten large-scale membrane bioreactors (MBRs) for municipal wastewater treatment. Results revealed that the diversity of antibiotic resistance genes varied considerably among MBRs, but about 40% of the antibiotic resistance genes were common to all. The average signal intensity of each antibiotic resistance group was similar among MBRs; nevertheless, the total abundance of each group varied remarkably, and the dominant resistance gene groups differed in individual MBRs. The antibiotic resistance genes derived mainly from Proteobacteria and Actinobacteria. Further study indicated that the TN, TP and COD of the influent, and the temperature and conductivity of the mixed liquor, were significantly (P<0.05) correlated with the distribution of multiple antibiotic resistance genes in MBRs. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Q Value-Based Dynamic Programming with Boltzmann Distribution in Large Scale Road Network

    NASA Astrophysics Data System (ADS)

    Yu, Shanqing; Xu, Yelei; Mabu, Shingo; Mainali, Manoj Kanta; Shimada, Kaoru; Hirasawa, Kotaro

    In this paper, a global optimal traffic assignment strategy, Q value-based Dynamic Programming with Boltzmann Distribution, is applied to the Kitakyushu City traffic system. The main idea of the proposed traffic assignment strategy is to calculate the expected traveling time for each origin-destination pair and the probability of selecting the next section, and then to generate a considerable number of route candidates for the drivers based on the calculated probability. In the simulation, how to select the temperature parameter and the number of route candidates is discussed in detail. The comparison between the proposed method and shortest-path algorithms indicates that the proposed method can reduce the risk of traffic congestion and effectively lower traveling cost. In addition, the computation time is reported to demonstrate the feasibility of the proposed method in large-scale networks.
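    The route-candidate generation rests on the Boltzmann selection rule: the probability of choosing a next road section decays exponentially with its expected traveling time Q, moderated by the temperature parameter. A minimal sketch of that rule, with made-up section names and Q-values:

```python
import math
import random

def boltzmann_probs(q_values, temperature):
    """Probability of selecting each next road section, proportional to
    exp(-Q/T), where Q is the expected traveling time (lower is better)."""
    weights = {s: math.exp(-q / temperature) for s, q in q_values.items()}
    total = sum(weights.values())
    return {s: w / total for s, w in weights.items()}

# Hypothetical expected traveling times (minutes) for three next sections.
q = {"A": 10.0, "B": 12.0, "C": 20.0}
probs = boltzmann_probs(q, temperature=5.0)

# A cooler temperature concentrates probability on the fastest section;
# a hotter one spreads it out, producing more diverse route candidates.
rng = random.Random(0)
next_section = rng.choices(list(probs), weights=list(probs.values()))[0]
```

    Repeating the weighted draw from origin to destination yields one route candidate; the paper's trade-off between congestion risk and travel cost is governed by how the temperature is chosen.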

  5. LARGE SCALE DISTRIBUTED PARAMETER MODEL OF MAIN MAGNET SYSTEM AND FREQUENCY DECOMPOSITION ANALYSIS

    SciTech Connect

    ZHANG,W.; MARNERIS, I.; SANDBERG, J.

    2007-06-25

    A large accelerator main magnet system consists of hundreds, even thousands, of dipole magnets. They are linked together under selected configurations to provide highly uniform dipole fields when powered. Distributed capacitance, insulation resistance, coil resistance, magnet inductance, and coupling inductance of upper and lower pancakes make each magnet a complex network. When all dipole magnets are chained together in a circle, they become a coupled pair of very high order complex ladder networks. In this study, a network of more than a thousand inductive, capacitive, and resistive elements is used to model an actual system. The circuit is a large-scale network whose equivalent polynomial form is of degree several hundred. Analysis of this high-order circuit and simulation of the response of any or all components is often computationally infeasible. We present a frequency decomposition approach to effectively simulate and analyze magnet configurations and power supply topologies.

  6. cOSPREY: A Cloud-Based Distributed Algorithm for Large-Scale Computational Protein Design.

    PubMed

    Pan, Yuchao; Dong, Yuxi; Zhou, Jingtian; Hallen, Mark; Donald, Bruce R; Zeng, Jianyang; Xu, Wei

    2016-09-01

    Finding the global minimum energy conformation (GMEC) of a huge combinatorial search space is the key challenge in computational protein design (CPD) problems. Traditional algorithms lack a scalable and efficient distributed design scheme, preventing researchers from taking full advantage of current cloud infrastructures. We design cloud OSPREY (cOSPREY), an extension to a widely used protein design software OSPREY, to allow the original design framework to scale to the commercial cloud infrastructures. We propose several novel designs to integrate both algorithm and system optimizations, such as GMEC-specific pruning, state search partitioning, asynchronous algorithm state sharing, and fault tolerance. We evaluate cOSPREY on three different cloud platforms using different technologies and show that it can solve a number of large-scale protein design problems that have not been possible with previous approaches.

  7. cOSPREY: A Cloud-Based Distributed Algorithm for Large-Scale Computational Protein Design

    PubMed Central

    Pan, Yuchao; Dong, Yuxi; Zhou, Jingtian; Hallen, Mark; Donald, Bruce R.; Xu, Wei

    2016-01-01

    Finding the global minimum energy conformation (GMEC) of a huge combinatorial search space is the key challenge in computational protein design (CPD) problems. Traditional algorithms lack a scalable and efficient distributed design scheme, preventing researchers from taking full advantage of current cloud infrastructures. We design cloud OSPREY (cOSPREY), an extension to a widely used protein design software OSPREY, to allow the original design framework to scale to the commercial cloud infrastructures. We propose several novel designs to integrate both algorithm and system optimizations, such as GMEC-specific pruning, state search partitioning, asynchronous algorithm state sharing, and fault tolerance. We evaluate cOSPREY on three different cloud platforms using different technologies and show that it can solve a number of large-scale protein design problems that have not been possible with previous approaches. PMID:27154509

  8. Calibration of a large-scale semi-distributed hydrological model for the continental United States

    NASA Astrophysics Data System (ADS)

    Li, S.; Lohmann, D.

    2011-12-01

    Recent major flood losses have raised the awareness of flood risk worldwide. In large-scale (e.g., country-wide) flood simulation, a semi-distributed hydrological model shows its advantage in capturing the spatial heterogeneity of hydrological characteristics within a basin at relatively low computational cost. However, it is still very challenging to calibrate such a model over large scales and a wide variety of hydroclimatic conditions. The objectives of this study are (1) to compare the effectiveness of state-of-the-art evolutionary multiobjective algorithms in calibrating a semi-distributed hydrological model used in the RMS flood loss model; and (2) to calibrate the model over the entire continental United States. First, the computational efficiency of the following four algorithms is evaluated: the Non-Dominated Sorted Genetic Algorithm II (NSGAII), the Strength Pareto Evolutionary Algorithm 2 (SPEA2), the Epsilon-Dominance Non-Dominated Sorted Genetic Algorithm II (ɛ-NSGAII), and the Epsilon-Dominance Multi-Objective Evolutionary Algorithm (ɛMOEA). The test was conducted on four river basins with a wide variety of hydro-climatic conditions in the US. The optimization objectives include RMSE and high-flow RMSE. Results of the analysis indicate that NSGAII has the best performance in terms of effectiveness and stability. We then applied the modified version of NSGAII to calibrate the hydrological model over the entire continental US. Comparison with observations and published data shows that the calibrated model performs well overall. This well-calibrated model allows more accurate modeling of flood risk and loss in the continental United States. Furthermore, it will allow underwriters to better manage the exposure.
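    All four algorithms compared above build on Pareto dominance between objective vectors such as (RMSE, high-flow RMSE). A minimal sketch of the dominance test and of extracting the first non-dominated front; the population values are invented for illustration:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(points):
    """Non-dominated subset of a population of objective vectors,
    e.g. (RMSE, high-flow RMSE) pairs from candidate parameter sets."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical calibration trade-offs: (RMSE, high-flow RMSE).
pop = [(0.9, 1.4), (1.1, 1.0), (1.0, 1.2), (1.3, 1.5)]
front = first_front(pop)  # (1.3, 1.5) is dominated by (0.9, 1.4)
```

    NSGA-II repeatedly peels off such fronts (plus a crowding-distance tie-break) to rank the population; the epsilon-dominance variants coarsen the dominance test on an objective-space grid.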

  9. Spatially-explicit estimation of geographical representation in large-scale species distribution datasets.

    PubMed

    Kalwij, Jesse M; Robertson, Mark P; Ronk, Argo; Zobel, Martin; Pärtel, Meelis

    2014-01-01

    Much ecological research relies on existing multispecies distribution datasets. Such datasets, however, can vary considerably in quality, extent, resolution or taxonomic coverage. We provide a framework for a spatially-explicit evaluation of geographical representation within large-scale species distribution datasets, using the comparison of an occurrence atlas with a range atlas dataset as a working example. Specifically, we compared occurrence maps for 3773 taxa from the widely-used Atlas Florae Europaeae (AFE) with digitised range maps for 2049 taxa of the lesser-known Atlas of North European Vascular Plants. We calculated the level of agreement at a 50-km spatial resolution using average latitudinal and longitudinal species range, and area of occupancy. Agreement in species distribution was calculated and mapped using Jaccard similarity index and a reduced major axis (RMA) regression analysis of species richness between the entire atlases (5221 taxa in total) and between co-occurring species (601 taxa). We found no difference in distribution ranges or in the area of occupancy frequency distribution, indicating that atlases were sufficiently overlapping for a valid comparison. The similarity index map showed high levels of agreement for central, western, and northern Europe. The RMA regression confirmed that geographical representation of AFE was low in areas with a sparse data recording history (e.g., Russia, Belarus and the Ukraine). For co-occurring species in south-eastern Europe, however, the Atlas of North European Vascular Plants showed remarkably higher richness estimations. Geographical representation of atlas data can be much more heterogeneous than often assumed. Level of agreement between datasets can be used to evaluate geographical representation within datasets. Merging atlases into a single dataset is worthwhile in spite of methodological differences, and helps to fill gaps in our knowledge of species distribution ranges. Species distribution
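    The Jaccard similarity used above to map agreement between the two atlases is straightforward to state in code; the 50-km grid-cell identifiers below are hypothetical:

```python
def jaccard(cells_a, cells_b):
    """Jaccard similarity between two sets of occupied 50-km grid cells
    (1.0 = identical distributions, 0.0 = no shared cells)."""
    a, b = set(cells_a), set(cells_b)
    if not a and not b:
        return 1.0  # both empty: treat as perfect agreement
    return len(a & b) / len(a | b)

# Hypothetical occupied cells for one taxon in each atlas.
afe_cells   = {"35VNL", "35VNM", "35VPL", "36VUR"}
anevp_cells = {"35VNM", "35VPL", "36VUR", "36VVR"}
similarity = jaccard(afe_cells, anevp_cells)  # 3 shared / 5 total = 0.6
```

    Computing this index per grid cell over the co-occurring taxa gives the agreement map described in the abstract.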

  10. A practical large scale/high speed data distribution system using 8 mm libraries

    NASA Technical Reports Server (NTRS)

    Howard, Kevin

    1993-01-01

    Eight mm tape libraries are known primarily for their small size, large storage capacity, and low cost. However, many applications require an additional attribute which, heretofore, has been lacking -- high transfer rate. Transfer rate is particularly important in a large scale data distribution environment -- an environment in which 8 mm tape should play a very important role. Data distribution is a natural application for 8 mm for several reasons: most large laboratories have access to 8 mm tape drives, 8 mm tapes are upwardly compatible, 8 mm media are very inexpensive, 8 mm media are light weight (important for shipping purposes), and 8 mm media densely pack data (5 gigabytes now and 15 gigabytes on the horizon). If the transfer rate issue were resolved, 8 mm could offer a good solution to the data distribution problem. To that end Exabyte has analyzed four ways to increase its transfer rate: native drive transfer rate increases, data compression at the drive level, tape striping, and homogeneous drive utilization. Exabyte is actively pursuing native drive transfer rate increases and drive level data compression. However, for non-transmitted bulk data applications (which include data distribution) the other two methods (tape striping and homogeneous drive utilization) hold promise.
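    Ignoring controller and striping overhead, the aggregate transfer rate from combining tape striping with drive-level compression is simply multiplicative. A toy calculation; the rates and ratios below are illustrative assumptions, not Exabyte specifications:

```python
def striped_transfer_rate(native_rate_mb_s, n_drives, compression_ratio=1.0):
    """Idealized aggregate transfer rate when a data stream is striped
    across several tape drives, each optionally compressing its share
    (ignores controller overhead and uneven stripe loading)."""
    return native_rate_mb_s * n_drives * compression_ratio

# One drive at an assumed 0.5 MB/s native rate, versus four striped
# drives each achieving an assumed 2:1 compression ratio.
single  = striped_transfer_rate(0.5, 1)
striped = striped_transfer_rate(0.5, 4, compression_ratio=2.0)
```

    The same arithmetic shows why the four approaches in the abstract compound: a native-rate increase raises the first factor, striping the second, and compression the third.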

  11. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process for observing and improving the reliability and performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events is generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and can be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding endpoint management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning. The filtering mechanism represents an intrinsic component integrated
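    The event-filtering idea, forwarding to each management tool only the events that match its subscription predicate, can be sketched as a toy publish/subscribe filter. This is an illustrative sketch, not the paper's actual architecture:

```python
class EventFilter:
    """Toy subscription filter: forward only events whose attributes
    satisfy a subscriber's predicate, reducing monitoring traffic."""

    def __init__(self):
        self.subscriptions = []  # (predicate, sink) pairs

    def subscribe(self, predicate, sink):
        self.subscriptions.append((predicate, sink))

    def publish(self, event):
        for predicate, sink in self.subscriptions:
            if predicate(event):
                sink.append(event)

debugger_feed = []
f = EventFilter()
# A hypothetical debugging tool only cares about error events from node 7.
f.subscribe(lambda e: e["level"] == "error" and e["node"] == 7, debugger_feed)
f.publish({"level": "info",  "node": 7, "msg": "heartbeat"})
f.publish({"level": "error", "node": 7, "msg": "deadlock suspected"})
f.publish({"level": "error", "node": 3, "msg": "timeout"})
```

    Only the matching event reaches the subscriber, which is the traffic-reduction effect the architecture scales up by distributing such filters across the system.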

  12. Arctic Ice Algae Distribution as Function of Large Scale Sea Ice Variables

    NASA Astrophysics Data System (ADS)

    Flores, H.; Castellani, G.; Lange, B. A.; David, C.; Katlein, C.; Peeken, I.; Nicolaus, M.; Losch, M. J.; van Franeker, J. A.

    2016-02-01

    One of the most pronounced impacts of climate change is the declining sea ice cover in the Arctic Ocean, which has implications for sea-ice-associated ecosystems that are strongly dependent on carbon produced by ice algae. In order to understand these ecosystems there is a need to understand the interaction between the physical and biological components of sea ice. Our current understanding of Arctic sea ice algae is based on observations with limited spatial coverage. Therefore, we aim to model the spatial distribution of ice algae on a basin scale. Current sea-ice-ocean models allow the representation of sea-ice variability on a scale of a few km. Large-scale characteristics of sea ice, such as age, deformation, and snow cover, affect small-scale ice properties such as salinity, porosity, and light transmission. The latter directly affect the sea ice algae content, but to what extent is not yet well understood. In this work we present a new parameterization of sea-ice algae content, developed with the aim of modeling algae content and variability based on large-scale sea-ice characteristics. This parameterization is tuned with data collected during a ship-based campaign to the Eastern Central Arctic in summer 2012. Sea-ice thickness and under-ice spectral surveys over different sea ice regimes were conducted with a Surface and Under Ice Trawl (SUIT) and a Remotely Operated Vehicle (ROV). In addition, ice cores were extracted at several sites for chl a analysis. We use a coupled sea-ice-ocean model with a spatial scale of 10 km and show here the results for the temporal evolution of algae content in sea ice.

  13. Distributed weighted least-squares estimation with fast convergence for large-scale systems☆

    PubMed Central

    Marelli, Damián Edgardo; Fu, Minyue

    2015-01-01

    In this paper we study a distributed weighted least-squares estimation problem for a large-scale system consisting of a network of interconnected sub-systems. Each sub-system is concerned with a subset of the unknown parameters and has a measurement linear in the unknown parameters with additive noise. The distributed estimation task is for each sub-system to compute the globally optimal estimate of its own parameters using its own measurement and information shared with the network through neighborhood communication. We first provide a fully distributed iterative algorithm to asymptotically compute the global optimal estimate. The convergence rate of the algorithm is maximized using a scaling parameter and a preconditioning method. This algorithm works for a general network. For a network without loops, we also provide a different iterative algorithm to compute the global optimal estimate which converges in a finite number of steps. We include numerical experiments to illustrate the performance of the proposed methods. PMID:25641976
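    The globally optimal weighted least-squares estimate solves the normal equations, x_hat = (A' W A)^{-1} A' W b. One way to see how an iterative scheme with a tunable scaling parameter can converge to it is a scaled-gradient (Richardson) iteration, in which each sub-system only ever forms products with its own rows of A. This toy sketch is not the authors' algorithm; the step size alpha stands in for their scaling parameter, and the problem data are invented.

```python
import numpy as np

# Stacked measurement model b = A x + noise, with per-measurement weights W.
A = np.array([[1.0, 0.0],
              [0.0, 2.0],
              [1.0, 1.0]])
W = np.diag([1.0, 2.0, 1.0])
x_true = np.array([1.0, -1.0])
b = A @ x_true  # noise-free here so the optimum is x_true exactly

# Closed-form global WLS estimate from the normal equations.
x_direct = np.linalg.solve(A.T @ W @ A, A.T @ W @ b)

# Richardson iteration on the normal equations:
#   x <- x + alpha * A^T W (b - A x)
# converges whenever alpha < 2 / lambda_max(A^T W A); each term of
# A^T W (b - A x) is a sum of per-row (per-sub-system) contributions.
alpha = 0.1
x = np.zeros(2)
for _ in range(500):
    x = x + alpha * (A.T @ W @ (b - A @ x))
```

    The choice of alpha controls the convergence rate, which is the role the paper's scaling parameter and preconditioner play in a more sophisticated form.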

  14. Large-scale asynchronous and distributed multidimensional replica exchange molecular simulations and efficiency analysis.

    PubMed

    Xia, Junchao; Flynn, William F; Gallicchio, Emilio; Zhang, Bin W; He, Peng; Tan, Zhiqiang; Levy, Ronald M

    2015-09-05

    We describe methods to perform replica exchange molecular dynamics (REMD) simulations asynchronously (ASyncRE). The methods are designed to facilitate large scale REMD simulations on grid computing networks consisting of heterogeneous and distributed computing environments as well as on homogeneous high-performance clusters. We have implemented these methods on NSF (National Science Foundation) XSEDE (Extreme Science and Engineering Discovery Environment) clusters and BOINC (Berkeley Open Infrastructure for Network Computing) distributed computing networks at Temple University and Brooklyn College at CUNY (the City University of New York). They are also being implemented on the IBM World Community Grid. To illustrate the methods, we have performed extensive (more than 60 μs in aggregate) simulations for the beta-cyclodextrin-heptanoate host-guest system in the context of one- and two-dimensional ASyncRE, and we used the results to estimate absolute binding free energies using the binding energy distribution analysis method. We propose ways to improve the efficiency of REMD simulations: these include increasing the number of exchanges attempted after a specified molecular dynamics (MD) period up to the fast exchange limit and/or adjusting the MD period to allow sufficient internal relaxation within each thermodynamic state. Although ASyncRE simulations generally require long MD periods (>picoseconds) per replica exchange cycle to minimize the overhead imposed by heterogeneous computing networks, we found that it is possible to reach an efficiency similar to conventional synchronous REMD, by optimizing the combination of the MD period and the number of exchanges attempted per cycle.

  15. Large Scale Asynchronous and Distributed Multi-Dimensional Replica Exchange Molecular Simulations and Efficiency Analysis

    PubMed Central

    Xia, Junchao; Flynn, William F.; Gallicchio, Emilio; Zhang, Bin W.; He, Peng; Tan, Zhiqiang; Levy, Ronald M.

    2015-01-01

    We describe methods to perform replica exchange molecular dynamics (REMD) simulations asynchronously (ASyncRE). The methods are designed to facilitate large scale REMD simulations on grid computing networks consisting of heterogeneous and distributed computing environments as well as on homogeneous high-performance clusters. We have implemented these methods on NSF XSEDE clusters and BOINC distributed computing networks at Temple University and Brooklyn College at CUNY. They are also being implemented on the IBM World Community Grid. To illustrate the methods, we have performed extensive (more than 60 microseconds in aggregate) simulations for the beta-cyclodextrin-heptanoate host-guest system in the context of one- and two-dimensional ASyncRE, and we used the results to estimate absolute binding free energies using the Binding Energy Distribution Analysis Method (BEDAM). We propose ways to improve the efficiency of REMD simulations: these include increasing the number of exchanges attempted after a specified MD period up to the fast exchange limit, and/or adjusting the MD period to allow sufficient internal relaxation within each thermodynamic state. Although ASyncRE simulations generally require long MD periods (> picoseconds) per replica exchange cycle to minimize the overhead imposed by heterogeneous computing networks, we found that it is possible to reach an efficiency similar to conventional synchronous REMD, by optimizing the combination of the MD period and the number of exchanges attempted per cycle. PMID:26149645
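    The exchange step these REMD abstracts refer to uses the standard Metropolis criterion for swapping configurations between replicas held at inverse temperatures beta_i and beta_j: the swap is accepted with probability min(1, exp[(beta_i - beta_j)(E_i - E_j)]). Below is a minimal sketch of attempting several exchange sweeps per cycle, the quantity the authors propose increasing toward the fast exchange limit. Function and variable names are illustrative and not taken from the ASyncRE code.

```python
import math
import random

def attempt_exchanges(betas, energies, states, n_sweeps, rng):
    """Metropolis swap attempts between neighbouring replicas.

    betas    -- fixed inverse temperatures, one per replica slot
    energies -- potential energy of the configuration in each slot
    states   -- configuration labels; swapped together with their energies
    """
    n = len(betas)
    for sweep in range(n_sweeps):
        start = sweep % 2  # alternate even/odd neighbour pairs each sweep
        for i in range(start, n - 1, 2):
            j = i + 1
            delta = (betas[i] - betas[j]) * (energies[i] - energies[j])
            if rng.random() < min(1.0, math.exp(delta)):
                energies[i], energies[j] = energies[j], energies[i]
                states[i], states[j] = states[j], states[i]
    return states

# Two replicas where the swap is always favourable (delta = 2 > 0),
# so one sweep deterministically exchanges the configurations.
states = attempt_exchanges(betas=[1.0, 0.5], energies=[5.0, 1.0],
                           states=["a", "b"], n_sweeps=1,
                           rng=random.Random(0))
```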

  16. Large Scale Behavior and Droplet Size Distributions in Crude Oil Jets and Plumes

    NASA Astrophysics Data System (ADS)

    Katz, Joseph; Murphy, David; Morra, David

    2013-11-01

    The 2010 Deepwater Horizon blowout introduced several million barrels of crude oil into the Gulf of Mexico. Injected initially as a turbulent jet containing crude oil and gas, the spill caused formation of a subsurface plume stretching for tens of miles. The behavior of such buoyant multiphase plumes depends on several factors, such as the oil droplet and bubble size distributions, current speed, and ambient stratification. While large droplets quickly rise to the surface, fine ones together with entrained seawater form intrusion layers. Many elements of the physics of droplet formation by an immiscible turbulent jet and their resulting size distribution have not been elucidated, but are known to be significantly influenced by the addition of dispersants, which vary the Weber Number by orders of magnitude. We present experimental high speed visualizations of turbulent jets of sweet petroleum crude oil (MC 252) premixed with Corexit 9500A dispersant at various dispersant to oil ratios. Observations were conducted in a 0.9 m × 0.9 m × 2.5 m towing tank, where the large-scale behavior of the jet, both stationary and towed at various speeds to simulate cross-flow, has been recorded at high speed. Preliminary data on oil droplet size and spatial distributions were also measured using a videoscope and pulsed light sheet. Sponsored by Gulf of Mexico Research Initiative (GoMRI).

  17. Large-scale P2P network based distributed virtual geographic environment (DVGE)

    NASA Astrophysics Data System (ADS)

    Tan, Xicheng; Yu, Liang; Bian, Fuling

    2007-06-01

    The Virtual Geographic Environment (VGE) has attracted wide attention as a kind of software information system that helps us understand and analyze the real geographic environment, and it has been extended to application service systems in distributed environments, i.e., the distributed virtual geographic environment system (DVGE), with some notable achievements. However, limited by the massive data volumes of VGE, network bandwidth, numerous concurrent requests, and economic factors, DVGE still faces challenges and problems that prevent it from providing the public with high-quality service under the current network mode. The rapid development of peer-to-peer (P2P) network technology has offered new solutions to these challenges and problems: P2P networks can effectively publish and search network resources and thereby realize efficient sharing of information. Accordingly, this paper brings forth a large-scale P2P network extension of DVGE, with an in-depth study of the network framework, the routing mechanism, and DVGE data management on a P2P network.

  18. Large scale patterns of abundance and distribution of parasites in Mexican bumblebees.

    PubMed

    Gallot-Lavallée, Marie; Schmid-Hempel, Regula; Vandame, Rémy; Vergara, Carlos H; Schmid-Hempel, Paul

    2016-01-01

    Bumblebees are highly valued for their pollination services in natural ecosystems as well as for agricultural crops. These precious pollinators are known to be declining worldwide, and one major factor contributing to this decline is infection by parasites. Knowledge about parasites in wild bumblebee populations is thus of paramount importance for conservation purposes. We here report the geographical distribution of Crithidia and Nosema, two common parasites of bumblebees, in a yet poorly investigated country: Mexico. Based on sequence divergence of the Cytochrome b and Glycosomal glyceraldehyde phosphate dehydrogenase (gGPDAH) genes, we discovered the presence of a new Crithidia species, which is mainly distributed in the southern half of the country. It is placed by Bayesian inference as a sister species to C. bombi. We suggest the name Crithidia mexicana for this newly discovered organism. A population of C. expoeki was encountered concentrated on the flanks of the dormant volcanic mountain, Iztaccihuatl, and microsatellite data showed evidence of a bottleneck in this population. This study is the first to provide a large-scale insight into the health status of endemic bumblebees in Mexico, based on a large sample size (n=3,285 bees examined) over a variety of host species and habitats. Copyright © 2015 Elsevier Inc. All rights reserved.

  19. Tree Age Distributions Reveal Large-Scale Disturbance-Recovery Cycles in Three Tropical Forests

    PubMed Central

    Vlam, Mart; van der Sleen, Peter; Groenendijk, Peter; Zuidema, Pieter A.

    2017-01-01

    Over the past few decades there has been a growing realization that a large share of apparently ‘virgin’ or ‘old-growth’ tropical forests carries a legacy of past natural or anthropogenic disturbances that have a substantial effect on present-day forest composition, structure and dynamics. Yet, direct evidence of such disturbances is scarce and comparisons of disturbance dynamics across regions even more so. Here we present a tree-ring based reconstruction of disturbance histories from three tropical forest sites in Bolivia, Cameroon, and Thailand. We studied temporal patterns in tree regeneration of shade-intolerant tree species, because establishment of these trees is indicative of canopy disturbance. In three large areas (140–300 ha), stem disks and increment cores were collected for a total of 1154 trees (>5 cm diameter) from 12 tree species to estimate the age of every tree. Using these age estimates we produced population age distributions, which were analyzed for evidence of past disturbance. Our approach allowed us to reconstruct patterns of tree establishment over a period of around 250 years. In Bolivia, we found continuous regeneration rates of three species and a peaked age distribution of a long-lived pioneer species. In both Cameroon and Thailand we found irregular age distributions, indicating strongly reduced regeneration rates over a period of 10–60 years. Past fires, windthrow events or anthropogenic disturbances all provide plausible explanations for the reported variation in tree age across the three sites. Our results support the recent idea that the long-term dynamics of tropical forests are impacted by large-scale disturbance-recovery cycles, similar to those driving temperate forest dynamics. PMID:28105034

  20. Tree Age Distributions Reveal Large-Scale Disturbance-Recovery Cycles in Three Tropical Forests.

    PubMed

    Vlam, Mart; van der Sleen, Peter; Groenendijk, Peter; Zuidema, Pieter A

    2016-01-01

    Over the past few decades there has been a growing realization that a large share of apparently 'virgin' or 'old-growth' tropical forests carries a legacy of past natural or anthropogenic disturbances that have a substantial effect on present-day forest composition, structure and dynamics. Yet, direct evidence of such disturbances is scarce and comparisons of disturbance dynamics across regions even more so. Here we present a tree-ring based reconstruction of disturbance histories from three tropical forest sites in Bolivia, Cameroon, and Thailand. We studied temporal patterns in tree regeneration of shade-intolerant tree species, because establishment of these trees is indicative of canopy disturbance. In three large areas (140-300 ha), stem disks and increment cores were collected for a total of 1154 trees (>5 cm diameter) from 12 tree species to estimate the age of every tree. Using these age estimates we produced population age distributions, which were analyzed for evidence of past disturbance. Our approach allowed us to reconstruct patterns of tree establishment over a period of around 250 years. In Bolivia, we found continuous regeneration rates of three species and a peaked age distribution of a long-lived pioneer species. In both Cameroon and Thailand we found irregular age distributions, indicating strongly reduced regeneration rates over a period of 10-60 years. Past fires, windthrow events or anthropogenic disturbances all provide plausible explanations for the reported variation in tree age across the three sites. Our results support the recent idea that the long-term dynamics of tropical forests are impacted by large-scale disturbance-recovery cycles, similar to those driving temperate forest dynamics.

  1. Large-scale coronal temperature and density distributions, 1984-1992

    NASA Technical Reports Server (NTRS)

    Guhathakurta, M.; Fisher, R. R.; Altrock, R. C.

    1993-01-01

    We characterize the temperature and the density structure of the corona utilizing spectrophotometric observations at different heights but at the same latitude during the descending phase of cycle 21 through the ascending phase of cycle 22. The data include ground-based intensity observations of the green (Fe XIV 5303) and red (Fe X 6374) coronal forbidden lines, photospheric magnetographs from the National Solar Observatory, Kitt Peak, and synoptic maps of white-light K-coronal polarized brightness from the High Altitude Observatory. The plasma temperature, T, can be estimated from the intensity ratio Fe X/Fe XIV (where T is inversely proportional to the ratio), since both emission lines come from ionized states of Fe, and the ratio is only weakly dependent on density. Distributions of the electron temperature from the line ratio and of the polarized brightness, which yields the electron density of the corona, during the descending and the ascending phases of solar cycles 21 and 22 are presented. These data refer to structures of the corona which are relatively large scale, having a temporal coherence of at least two or more synoptic rotation periods, such as the streamer belts, the individual helmet streamers, and the larger coronal holes.

  2. Vertical Distributions of Sulfur Species Simulated by Large Scale Atmospheric Models in COSAM: Comparison with Observations

    SciTech Connect

    Lohmann, U.; Leaitch, W. R.; Barrie, Leonard A.; Law, K.; Yi, Y.; Bergmann, D.; Bridgeman, C.; Chin, M.; Christensen, J.; Easter, Richard C.; Feichter, J.; Jeuken, A.; Kjellstrom, E.; Koch, D.; Land, C.; Rasch, P.; Roelofs, G.-J.

    2001-11-01

    A comparison of large-scale models simulating atmospheric sulfate aerosols (COSAM) was conducted to increase our understanding of global distributions of sulfate aerosols and precursors. Earlier model comparisons focused on wet deposition measurements and sulfate aerosol concentrations in source regions at the surface. They found that different models simulated the observed sulfate surface concentrations mostly within a factor of two, but that the simulated column burdens and vertical profiles were very different amongst different models. In the COSAM exercise, one aspect is the comparison of sulfate aerosol and precursor gases above the surface. Vertical profiles of SO2, SO42-, oxidants and cloud properties were measured by aircraft during the North Atlantic Regional Experiment (NARE) in August/September 1993 off the coast of Nova Scotia and during the Second Eulerian Model Evaluation Field Study (EMEFSII) in central Ontario in March/April 1990. While no single model stands out as being best or worst, the general tendency is that those models simulating the full oxidant chemistry tend to agree best with observations, although differences in transport and treatment of clouds are important as well.

  3. Functional and large-scale testing of the ATLAS distributed analysis facilities with Ganga

    NASA Astrophysics Data System (ADS)

    Vanderster, D. C.; Elmsheuser, J.; Biglietti, M.; Galeazzi, F.; Serfon, C.; Slater, M.

    2010-04-01

    Effective distributed user analysis requires a system which meets the demands of running arbitrary user applications on sites with varied configurations and availabilities. The challenge of tracking such a system requires a tool to monitor not only the functional statuses of each grid site, but also to perform large-scale analysis challenges on the ATLAS grids. This work presents one such tool, the ATLAS GangaRobot, and the results of its use in tests and challenges. For functional testing, the GangaRobot performs daily tests of all sites; specifically, a set of exemplary applications are submitted to all sites and then monitored for success and failure conditions. These results are fed back into Ganga to improve job placements by avoiding currently problematic sites. For analysis challenges, a cloud is first prepared by replicating a number of desired DQ2 datasets across all the sites. Next, the GangaRobot is used to submit and manage a large number of jobs targeting these datasets. The high loads resulting from multiple parallel instances of the GangaRobot expose shortcomings in storage and network configurations. The results from a series of cloud-by-cloud analysis challenges starting in fall 2008 are presented.

  4. Rucio - The next generation of large scale distributed system for ATLAS Data Management

    NASA Astrophysics Data System (ADS)

    Garonne, V.; Vigne, R.; Stewart, G.; Barisits, M.; Beermann, T.; Lassnig, M.; Serfon, C.; Goossens, L.; Nairz, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address HEP experiments' scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities, with more than 140 petabytes spread worldwide across 130 sites and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on a conceptual data model and new technology to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads. We present the key concepts of Rucio, including its data organization/representation and a model of how to manage central group and user activities. The Rucio design, and the technology it employs, is described, specifically looking at its RESTful architecture and the various software components it uses. We also show the performance of the system.

  5. A study of residence time distribution using radiotracer technique in the large scale plant facility

    NASA Astrophysics Data System (ADS)

    Wetchagarun, S.; Tippayakul, C.; Petchrak, A.; Sukrod, K.; Khoonkamjorn, P.

    2017-06-01

    As the demand for troubleshooting of large industrial plants increases, radiotracer techniques, which can provide fast, online and effective detection of plant problems, have been continually developed. One promising application of radiotracers for troubleshooting in a process plant is the analysis of the Residence Time Distribution (RTD). In this paper, a study of RTD in a large-scale plant facility using a radiotracer technique is presented. The objective of this work is to gain experience with RTD analysis using a radiotracer technique in a “larger than laboratory” scale plant setup comparable to a real industrial application. The experiment was carried out at the sedimentation tank in the water treatment facility of the Thailand Institute of Nuclear Technology (Public Organization). Br-82 was selected for this work due to its chemical properties, its suitable half-life and its on-site availability. NH4Br in the form of an aqueous solution was injected into the system as the radiotracer. Six NaI detectors were placed along the pipelines and at the tank in order to determine the RTD of the system. The RTD and the Mean Residence Time (MRT) of the tank were analysed and calculated from the measured data. The experience and knowledge attained from this study are important for extending this technique to industrial facilities in the future.
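    The mean residence time referred to in this abstract is the first moment of the residence time distribution computed from the measured tracer response curve C(t): MRT = ∫t·C(t)dt / ∫C(t)dt. A minimal numerical sketch follows; the exponential tracer curve is synthetic (an ideal mixed tank), not the paper's detector data.

```python
import numpy as np

# Synthetic detector response: an ideal continuously stirred tank gives
# C(t) proportional to exp(-t/tau), whose mean residence time is exactly tau.
dt = 0.01
t = np.arange(0.0, 40.0, dt)       # time axis (e.g. minutes)
tau = 2.0
C = np.exp(-t / tau)               # tracer concentration (arbitrary units)

# First moment of the normalized RTD, E(t) = C(t) / integral(C):
# the uniform dt cancels in the ratio of the two Riemann sums.
mrt = np.sum(t * C) / np.sum(C)
```

    For real detector data the same ratio is taken over the background-corrected count-rate curve at the tank outlet.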

  6. Large-Scale Spatial Distribution Patterns of Echinoderms in Nearshore Rocky Habitats

    PubMed Central

    Iken, Katrin; Konar, Brenda; Benedetti-Cecchi, Lisandro; Cruz-Motta, Juan José; Knowlton, Ann; Pohle, Gerhard; Mead, Angela; Miloslavich, Patricia; Wong, Melisa; Trott, Thomas; Mieszkowska, Nova; Riosmena-Rodriguez, Rafael; Airoldi, Laura; Kimani, Edward; Shirayama, Yoshihisa; Fraschetti, Simonetta; Ortiz-Touzet, Manuel; Silva, Angelica

    2010-01-01

    This study examined echinoderm assemblages from nearshore rocky habitats for large-scale distribution patterns with specific emphasis on identifying latitudinal trends and large regional hotspots. Echinoderms were sampled from 76 globally-distributed sites within 12 ecoregions, following the standardized sampling protocol of the Census of Marine Life NaGISA project (www.nagisa.coml.org). Sample-based species richness was overall low (<1–5 species per site), with a total of 32 asteroid, 18 echinoid, 21 ophiuroid, and 15 holothuroid species. Abundance and species richness in intertidal assemblages sampled with visual methods (organisms >2 cm in 1 m2 quadrats) was highest in the Caribbean ecoregions and echinoids dominated these assemblages with an average of 5 ind m−2. In contrast, intertidal echinoderm assemblages collected from clearings of 0.0625 m2 quadrats had the highest abundance and richness in the Northeast Pacific ecoregions where asteroids and holothurians dominated with an average of 14 ind 0.0625 m−2. Distinct latitudinal trends existed for abundance and richness in intertidal assemblages with declines from peaks at high northern latitudes. No latitudinal trends were found for subtidal echinoderm assemblages with either sampling technique. Latitudinal gradients appear to be superseded by regional diversity hotspots. In these hotspots echinoderm assemblages may be driven by local and regional processes, such as overall productivity and evolutionary history. We also tested a set of 14 environmental variables (six natural and eight anthropogenic) as potential drivers of echinoderm assemblages by ecoregions. The natural variables of salinity, sea-surface temperature, chlorophyll a, and primary productivity were strongly correlated with echinoderm assemblages; the anthropogenic variables of inorganic pollution and nutrient contamination also contributed to correlations. Our results indicate that nearshore echinoderm assemblages appear to be shaped by a

  7. Large-Scale Spatial Distribution Patterns of Gastropod Assemblages in Rocky Shores

    PubMed Central

    Miloslavich, Patricia; Cruz-Motta, Juan José; Klein, Eduardo; Iken, Katrin; Weinberger, Vanessa; Konar, Brenda; Trott, Tom; Pohle, Gerhard; Bigatti, Gregorio; Benedetti-Cecchi, Lisandro; Shirayama, Yoshihisa; Mead, Angela; Palomo, Gabriela; Ortiz, Manuel; Gobin, Judith; Sardi, Adriana; Díaz, Juan Manuel; Knowlton, Ann; Wong, Melisa; Peralta, Ana C.

    2013-01-01

    Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal pattern of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs) followed by the Trochidae and the Columbellidae (6 LMEs). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results will generate more work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages. PMID

  8. A passive Distributed Temperature Sensing approach to large-scale soil moisture validation

    NASA Astrophysics Data System (ADS)

    Steele-Dunne, S. C.; Rutten, M. M.; Krzeminksa, D.; van de Giesen, N. C.; Bogaard, T. A.; Selker, J.; Sailhac, P.

    2009-04-01

    Global monitoring of soil moisture is key to quantifying and understanding the exchanges of water and energy between the land surface and the atmosphere. ESA's Soil Moisture and Ocean Salinity (SMOS) Mission represents the first dedicated space-borne mission to observe soil moisture. To validate the observations from SMOS, in-situ measurements must be made over a wide variety of soil and land cover types. In recent years, Distributed Temperature Sensing (DTS) has been used in a wide variety of applications, from estimating seepage in polders to measuring flow into streams. Active DTS, in which the cables observe the response to a heat pulse, has been successfully used to measure soil moisture in several studies. The objective of this study was to investigate the potential of passive Distributed Temperature Sensing as a relatively portable and inexpensive alternative approach to measuring soil moisture on a large scale. From June to September 2008, fibre-optic cables were used to monitor temperature at 5 cm and 10 cm depth at a field site at Monster in the Netherlands. Meteorological data, as well as independent soil temperature and soil moisture profile data, were also recorded. Through its impact on diffusivity, soil moisture influences heat transport between the cables. Here, we demonstrate how solving for the optimum parameters of the advection-diffusion equation can yield a time-series of 3-hourly soil moisture. We will also discuss the lessons learned from this experiment, and a new protocol for using this technique in future planned field experiments.
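    A classical way to relate paired-depth temperature records to soil thermal properties (through which soil moisture acts, via diffusivity) is the amplitude-damping solution of the 1-D heat diffusion equation: a diurnal wave decays with depth as A(z) = A0·exp(-z/d), with damping depth d = sqrt(2D/ω). Measuring the amplitude at two depths then yields the diffusivity D. The sketch below uses this textbook amplitude method with synthetic amplitudes at the study's 5 cm and 10 cm depths; it is not necessarily the inversion scheme the authors used.

```python
import math

omega = 2.0 * math.pi / 86400.0      # diurnal angular frequency (rad/s)
z1, z2 = 0.05, 0.10                  # cable depths (m), as in the field setup

# Synthetic "observed" diurnal amplitudes generated from a known diffusivity.
D_true = 5.0e-7                      # m^2/s, plausible for moist soil
d_true = math.sqrt(2.0 * D_true / omega)
A1 = 6.0 * math.exp(-z1 / d_true)    # diurnal temperature amplitude at z1 (K)
A2 = 6.0 * math.exp(-z2 / d_true)    # diurnal temperature amplitude at z2 (K)

# Invert the damping law: ln(A1/A2) = (z2 - z1)/d, so D = omega * d^2 / 2.
d_est = (z2 - z1) / math.log(A1 / A2)
D_est = omega * d_est**2 / 2.0
```

    A retrieved diffusivity time series can then be mapped to soil moisture through a soil-specific calibration, which is where the 3-hourly estimates in the study come from.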

  9. Large-scale spatial distribution patterns of gastropod assemblages in rocky shores.

    PubMed

    Miloslavich, Patricia; Cruz-Motta, Juan José; Klein, Eduardo; Iken, Katrin; Weinberger, Vanessa; Konar, Brenda; Trott, Tom; Pohle, Gerhard; Bigatti, Gregorio; Benedetti-Cecchi, Lisandro; Shirayama, Yoshihisa; Mead, Angela; Palomo, Gabriela; Ortiz, Manuel; Gobin, Judith; Sardi, Adriana; Díaz, Juan Manuel; Knowlton, Ann; Wong, Melisa; Peralta, Ana C

    2013-01-01

    Gastropod assemblages from nearshore rocky habitats were studied over large spatial scales to (1) describe broad-scale patterns in assemblage composition, including patterns by feeding modes, (2) identify latitudinal pattern of biodiversity, i.e., richness and abundance of gastropods and/or regional hotspots, and (3) identify potential environmental and anthropogenic drivers of these assemblages. Gastropods were sampled from 45 sites distributed within 12 Large Marine Ecosystem regions (LME) following the NaGISA (Natural Geography in Shore Areas) standard protocol (www.nagisa.coml.org). A total of 393 gastropod taxa from 87 families were collected. Eight of these families (9.2%) appeared in four or more different LMEs. Among these, the Littorinidae was the most widely distributed (8 LMEs) followed by the Trochidae and the Columbellidae (6 LMEs). In all regions, assemblages were dominated by few species, the most diverse and abundant of which were herbivores. No latitudinal gradients were evident in relation to species richness or densities among sampling sites. Highest diversity was found in the Mediterranean and in the Gulf of Alaska, while highest densities were found at different latitudes and represented by few species within one genus (e.g. Afrolittorina in the Agulhas Current, Littorina in the Scotian Shelf, and Lacuna in the Gulf of Alaska). No significant correlation was found between species composition and environmental variables (r≤0.355, p>0.05). Contributing variables to this low correlation included invasive species, inorganic pollution, SST anomalies, and chlorophyll-a anomalies. Despite data limitations in this study which restrict conclusions in a global context, this work represents the first effort to sample gastropod biodiversity on rocky shores using a standardized protocol across a wide scale. Our results will generate more work to build global databases allowing for large-scale diversity comparisons of rocky intertidal assemblages.

  10. Large-scale spatial distribution patterns of echinoderms in nearshore rocky habitats.

    PubMed

    Iken, Katrin; Konar, Brenda; Benedetti-Cecchi, Lisandro; Cruz-Motta, Juan José; Knowlton, Ann; Pohle, Gerhard; Mead, Angela; Miloslavich, Patricia; Wong, Melisa; Trott, Thomas; Mieszkowska, Nova; Riosmena-Rodriguez, Rafael; Airoldi, Laura; Kimani, Edward; Shirayama, Yoshihisa; Fraschetti, Simonetta; Ortiz-Touzet, Manuel; Silva, Angelica

    2010-11-05

    This study examined echinoderm assemblages from nearshore rocky habitats for large-scale distribution patterns with specific emphasis on identifying latitudinal trends and large regional hotspots. Echinoderms were sampled from 76 globally-distributed sites within 12 ecoregions, following the standardized sampling protocol of the Census of Marine Life NaGISA project (www.nagisa.coml.org). Sample-based species richness was overall low (<1-5 species per site), with a total of 32 asteroid, 18 echinoid, 21 ophiuroid, and 15 holothuroid species. Abundance and species richness in intertidal assemblages sampled with visual methods (organisms >2 cm in 1 m(2) quadrats) was highest in the Caribbean ecoregions and echinoids dominated these assemblages with an average of 5 ind m(-2). In contrast, intertidal echinoderm assemblages collected from clearings of 0.0625 m(2) quadrats had the highest abundance and richness in the Northeast Pacific ecoregions where asteroids and holothurians dominated with an average of 14 ind 0.0625 m(-2). Distinct latitudinal trends existed for abundance and richness in intertidal assemblages with declines from peaks at high northern latitudes. No latitudinal trends were found for subtidal echinoderm assemblages with either sampling technique. Latitudinal gradients appear to be superseded by regional diversity hotspots. In these hotspots echinoderm assemblages may be driven by local and regional processes, such as overall productivity and evolutionary history. We also tested a set of 14 environmental variables (six natural and eight anthropogenic) as potential drivers of echinoderm assemblages by ecoregions. The natural variables of salinity, sea-surface temperature, chlorophyll a, and primary productivity were strongly correlated with echinoderm assemblages; the anthropogenic variables of inorganic pollution and nutrient contamination also contributed to correlations. Our results indicate that nearshore echinoderm assemblages appear to be shaped by

  11. Ecological niche modeling as a new paradigm for large-scale investigations of diversity and distribution of birds

    Treesearch

    A. Townsend Peterson; Daniel A. Kluza

    2005-01-01

    Large-scale assessments of the distribution and diversity of birds have been challenged by the need for a robust methodology for summarizing or predicting species' geographic distributions (e.g. Beard et al. 1999, Manel et al. 1999, Saveraid et al. 2001). Methodologies used in such studies have at times been inappropriate, or even more frequently limited in their...

  12. Dictionaries and distributions: Combining expert knowledge and large scale textual data content analysis: Distributed dictionary representation.

    PubMed

    Garten, Justin; Hoover, Joe; Johnson, Kate M; Boghrati, Reihane; Iskiwitch, Carol; Dehghani, Morteza

    2017-03-31

    Theory-driven text analysis has made extensive use of psychological concept dictionaries, leading to a wide range of important results. These dictionaries have generally been applied through word count methods which have proven to be both simple and effective. In this paper, we introduce Distributed Dictionary Representations (DDR), a method that applies psychological dictionaries using semantic similarity rather than word counts. This allows for the measurement of the similarity between dictionaries and spans of text ranging from complete documents to individual words. We show how DDR enables dictionary authors to place greater emphasis on construct validity without sacrificing linguistic coverage. We further demonstrate the benefits of DDR on two real-world tasks and finally conduct an extensive study of the interaction between dictionary size and task performance. These studies allow us to examine how DDR and word count methods complement one another as tools for applying concept dictionaries and where each is best applied. Finally, we provide references to tools and resources to make this method both available and accessible to a broad psychological audience.
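
    The core DDR idea, representing a concept dictionary as the aggregate of its words' embedding vectors and scoring text by semantic similarity rather than word counts, can be sketched with toy vectors. Everything below (the two-dimensional vectors, the vocabulary, and the `ddr_score` helper) is illustrative, not the paper's implementation; a real application would use pretrained embeddings such as word2vec or GloVe.

```python
import math

# Toy two-dimensional word vectors (illustrative only).
VECTORS = {
    "happy":  [0.90, 0.10],
    "joyful": [0.80, 0.20],
    "glad":   [0.85, 0.15],
    "table":  [0.10, 0.90],
    "chair":  [0.15, 0.85],
}

def mean_vector(words):
    """Average the vectors of all in-vocabulary words."""
    vecs = [VECTORS[w] for w in words if w in VECTORS]
    n = len(vecs)
    return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def ddr_score(dictionary, text_words):
    """Similarity between a concept dictionary and a span of text:
    cosine between the dictionary centroid and the text centroid."""
    return cosine(mean_vector(dictionary), mean_vector(text_words))

positivity = ["happy", "joyful"]
print(ddr_score(positivity, ["glad"]))            # near 1: semantically close
print(ddr_score(positivity, ["table", "chair"]))  # much lower
```

    Note how "glad" scores highly against the positivity dictionary even though it never appears in it, which is exactly the coverage advantage over word-count methods that the abstract describes.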

  13. Design and implementation of a distributed large-scale spatial database system based on J2EE

    NASA Astrophysics Data System (ADS)

    Gong, Jianya; Chen, Nengcheng; Zhu, Xinyan; Zhang, Xia

    2003-03-01

    With the increasing maturity of distributed object technology, CORBA, .NET and EJB are universally used in the traditional IT field. However, theories and practices of distributed spatial databases need further improvement because of the contradictions between large-scale spatial data and limited network bandwidth, and between transitory sessions and long transaction processing. Differences and trends among CORBA, .NET and EJB are discussed in detail, after which the concept, architecture and characteristics of a distributed large-scale seamless spatial database system based on J2EE are provided, comprising a GIS client application, web server, GIS application server and spatial data server. Moreover, the design and implementation of the components are explained: the GIS client application based on JavaBeans, the GIS engine based on servlets, and the GIS application server based on GIS enterprise JavaBeans (containing session beans and entity beans). Besides, experiments on the relation between spatial data volume and response time under different conditions are conducted, which demonstrate that a distributed spatial database system based on J2EE can be used to manage, distribute and share large-scale spatial data on the Internet. Lastly, a distributed large-scale seamless image database based on the Internet is presented.

  14. An Alternative Way to Model Population Ability Distributions in Large-Scale Educational Surveys

    ERIC Educational Resources Information Center

    Wetzel, Eunike; Xu, Xueli; von Davier, Matthias

    2015-01-01

    In large-scale educational surveys, a latent regression model is used to compensate for the shortage of cognitive information. Conventionally, the covariates in the latent regression model are principal components extracted from background data. This operational method has several important disadvantages, such as the handling of missing data and…

  16. Geographic distribution, large-scale spatial structure and diversity of parasitoids of the seed-feeding beetle Acanthoscelides macrophthalmus.

    PubMed

    Wood, A; Haga, E B; Costa, V A; Rossi, M N

    2016-10-21

    Bruchine beetles are highly host-specific seed feeders during the larval stage. Although some specific parasitoid families have been recorded attacking bruchine beetles, most studies have been done at small spatial scales. Therefore, the current knowledge about the diversity and the geographic distribution of parasitoid species parasitizing bruchines is scarce, especially over a wide geographic area that extends over large distances through a latitudinal cline (i.e. large-scale spatial structure). The present study determined the species richness and evenness of parasitoids attacking the bruchine beetle Acanthoscelides macrophthalmus feeding on Leucaena leucocephala seeds, examined their geographic distribution, and characterized the large-scale spatial structure in parasitoid species composition. A total of 1420 parasitoids (all Hymenoptera) belonging to four families, five subfamilies and eight species were collected (genera: Horismenus, Paracrias, Urosigalphus, Stenocorse, Chryseida, Eupelmus). Most parasitoid species showed wide spatial distribution and high evenness in species abundance, and the species richness estimators were close to stabilization (approximately eight species). Overall, greater similarity was observed in the species composition of plant populations near to each other than those farther apart, revealing a large-scale spatial structure in parasitoid species composition.
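
    Species-richness estimators of the kind mentioned here can be illustrated with the classic Chao1 estimator. The abstract does not say which estimator was used, so Chao1 and the toy abundance vector below are assumptions for illustration only.

```python
def chao1(abundances):
    """Chao1 richness estimator: S_obs + F1^2 / (2 * F2), where F1 and
    F2 are the numbers of singleton and doubleton species observed."""
    s_obs = sum(1 for a in abundances if a > 0)
    f1 = sum(1 for a in abundances if a == 1)  # species seen exactly once
    f2 = sum(1 for a in abundances if a == 2)  # species seen exactly twice
    if f2 == 0:
        # Bias-corrected form when no doubletons are observed
        return s_obs + f1 * (f1 - 1) / 2.0
    return s_obs + f1 * f1 / (2.0 * f2)

# Eight observed species; one singleton and one doubleton (illustrative)
print(chao1([1, 2, 5, 8, 3, 7, 4, 6]))  # 8 + 1/2 = 8.5
```

    When the estimate barely exceeds the observed count, as here, the estimator has "stabilized" in the sense the abstract describes: sampling more is unlikely to reveal many additional species.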

  17. Responses of Cloud Type Distributions to the Large-Scale Dynamical Circulation: Water Budget-Related Dynamical Phase Space and Dynamical Regimes

    NASA Technical Reports Server (NTRS)

    Wong, Sun; Del Genio, Anthony; Wang, Tao; Kahn, Brian; Fetzer, Eric J.; L'Ecuyer, Tristan S.

    2015-01-01

    Goals: Water budget-related dynamical phase space; Connect large-scale dynamical conditions to atmospheric water budget (including precipitation); Connect atmospheric water budget to cloud type distributions.

  18. Large-scale genetic structuring of a widely distributed carnivore--the Eurasian lynx (Lynx lynx).

    PubMed

    Rueness, Eli K; Naidenko, Sergei; Trosvik, Pål; Stenseth, Nils Chr

    2014-01-01

    Over the last decades the phylogeography and genetic structure of a multitude of species inhabiting Europe and North America have been described. The flora and fauna of the vast landmasses of north-eastern Eurasia are still largely unexplored in this respect. The Eurasian lynx is a large felid that is relatively abundant over much of the Russian sub-continent and the adjoining countries. Analyzing 148 museum specimens collected throughout its range over the last 150 years, we have described the large-scale genetic structuring in this highly mobile species. We have investigated the spatial genetic patterns using mitochondrial DNA sequences (D-loop and cytochrome b) and 11 microsatellite loci, and describe three phylogenetic clades and a clear structuring along an east-west gradient. The most likely scenario is that the contemporary Eurasian lynx populations originated in central Asia and that parts of Europe were inhabited by lynx during the Pleistocene. After the Last Glacial Maximum (LGM), range expansions led to colonization of north-western Siberia and Scandinavia from the Caucasus and north-eastern Siberia from a refugium further east. No evidence of a Beringian refugium could be detected in our data. We observed restricted gene flow and suggest that future studies of the Eurasian lynx explore to what extent the contemporary population structure may be explained by ecological variables.

  19. Large-Scale Genetic Structuring of a Widely Distributed Carnivore - The Eurasian Lynx (Lynx lynx)

    PubMed Central

    Rueness, Eli K.; Naidenko, Sergei; Trosvik, Pål; Stenseth, Nils Chr.

    2014-01-01

    Over the last decades the phylogeography and genetic structure of a multitude of species inhabiting Europe and North America have been described. The flora and fauna of the vast landmasses of north-eastern Eurasia are still largely unexplored in this respect. The Eurasian lynx is a large felid that is relatively abundant over much of the Russian sub-continent and the adjoining countries. Analyzing 148 museum specimens collected throughout its range over the last 150 years, we have described the large-scale genetic structuring in this highly mobile species. We have investigated the spatial genetic patterns using mitochondrial DNA sequences (D-loop and cytochrome b) and 11 microsatellite loci, and describe three phylogenetic clades and a clear structuring along an east-west gradient. The most likely scenario is that the contemporary Eurasian lynx populations originated in central Asia and that parts of Europe were inhabited by lynx during the Pleistocene. After the Last Glacial Maximum (LGM), range expansions led to colonization of north-western Siberia and Scandinavia from the Caucasus and north-eastern Siberia from a refugium further east. No evidence of a Beringian refugium could be detected in our data. We observed restricted gene flow and suggest that future studies of the Eurasian lynx explore to what extent the contemporary population structure may be explained by ecological variables. PMID:24695745

  20. DC-DC Converter Topology Assessment for Large Scale Distributed Photovoltaic Plant Architectures

    SciTech Connect

    Agamy, Mohammed S; Harfman-Todorovic, Maja; Elasser, Ahmed; Sabate, Juan A; Steigerwald, Robert L; Jiang, Yan; Essakiappan, Somasundaram

    2011-07-01

    Distributed photovoltaic (PV) plant architectures are emerging as a replacement for the classical central inverter based systems. However, power converters of smaller ratings may have a negative impact on system efficiency, reliability and cost. Therefore, it is necessary to design converters with very high efficiency and simpler topologies in order not to offset the benefits gained by using distributed PV systems. In this paper an evaluation of the selection criteria for dc-dc converters for distributed PV systems is performed; this evaluation includes efficiency, simplicity of design, reliability and cost. Based on this evaluation, recommendations can be made as to which class of converters is best fit for this application.

  1. Relationship between the large scale structure of the universe and spatial distribution of GRBs

    NASA Astrophysics Data System (ADS)

    Rácz, István I.; Balázs, Lajos G.; Bagoly, Zsolt; Tóth, L. Viktor; Horváth, István

    2017-01-01

    We studied the distribution of star-forming galaxies in the classical Millennium Simulation (Springel et al. [1]) and the Horizon Runs (Kim et al. [2]) databases. De Lucia and Blaizot [3] used a semi-analytical model for galaxy genesis in Millennium I. We found a relationship between the distribution of the star-forming galaxies and the dark matter (DM), which we used as a transformation factor from Millennium I to Millennium XXL. We simulated a star-forming galaxy sample with the Markov Chain Monte Carlo (MCMC) method, and checked the relation between structures and the distribution of star-forming groups on various scales. We concluded that above the BAO scale we need a much larger sample than the current 407 GRBs to reveal the DM distribution. We obtained similar results applying our method to the Horizon Runs data.

  2. Tail-scope: Using friends to estimate heavy tails of degree distributions in large-scale complex networks

    PubMed Central

    Eom, Young-Ho; Jo, Hang-Hyun

    2015-01-01

    Many complex networks in natural and social phenomena have often been characterized by heavy-tailed degree distributions. However, due to the rapidly growing size of network data and privacy concerns about using these data, it becomes more difficult to analyze complete data sets. Thus, it is crucial to devise effective and efficient estimation methods for heavy tails of degree distributions in large-scale networks using only local information from a small fraction of sampled nodes. Here we propose a tail-scope method based on the local observational bias of the friendship paradox. We show that the tail-scope method outperforms uniform node sampling for estimating heavy tails of degree distributions, while the opposite tendency is observed in the range of small degrees. In order to take advantage of both sampling methods, we devise a hybrid method that successfully recovers the whole range of degree distributions. Our tail-scope method shows how structural heterogeneities of large-scale complex networks can be used to effectively reveal the network structure with only limited local information. PMID:25959097
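
    The local observational bias that tail-scope exploits is the friendship paradox: for any graph without isolated nodes, the mean degree of a randomly chosen neighbor is at least the mean degree of a randomly chosen node, so neighbor sampling over-represents hubs and sees more of the degree tail. A minimal sketch on a toy preferential-attachment graph (the generator and its parameters are illustrative, not the paper's code):

```python
import random

random.seed(42)

def build_graph(n, m=2):
    """Toy preferential-attachment graph (Barabasi-Albert style):
    each new node attaches to m existing nodes chosen with
    probability proportional to degree."""
    adj = {0: {1}, 1: {0}}
    targets = [0, 1]  # nodes repeated proportionally to their degree
    for v in range(2, n):
        adj[v] = set()
        while len(adj[v]) < min(m, v):
            u = random.choice(targets)
            if u not in adj[v]:
                adj[v].add(u)
                adj[u].add(v)
        targets.extend(adj[v])
        targets.extend([v] * len(adj[v]))
    return adj

adj = build_graph(2000)
n = len(adj)

# Expected degree of a uniformly random node...
mean_deg = sum(len(adj[v]) for v in adj) / n
# ...versus the expected degree of a random neighbor of a random node,
# which is biased toward high-degree nodes (the friendship paradox).
mean_nbr_deg = sum(sum(len(adj[u]) for u in adj[v]) / len(adj[v])
                   for v in adj) / n

print(mean_deg, mean_nbr_deg)  # the second is larger
```

    The inequality holds for every graph (deg(u)/deg(v) + deg(v)/deg(u) >= 2 summed over edges), which is why sampling through friends is a reliable way to probe the heavy tail with only local information.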

  3. Tail-scope: Using friends to estimate heavy tails of degree distributions in large-scale complex networks.

    PubMed

    Eom, Young-Ho; Jo, Hang-Hyun

    2015-05-11

    Many complex networks in natural and social phenomena have often been characterized by heavy-tailed degree distributions. However, due to the rapidly growing size of network data and privacy concerns about using these data, it becomes more difficult to analyze complete data sets. Thus, it is crucial to devise effective and efficient estimation methods for heavy tails of degree distributions in large-scale networks using only local information from a small fraction of sampled nodes. Here we propose a tail-scope method based on the local observational bias of the friendship paradox. We show that the tail-scope method outperforms uniform node sampling for estimating heavy tails of degree distributions, while the opposite tendency is observed in the range of small degrees. In order to take advantage of both sampling methods, we devise a hybrid method that successfully recovers the whole range of degree distributions. Our tail-scope method shows how structural heterogeneities of large-scale complex networks can be used to effectively reveal the network structure with only limited local information.

  4. Tail-scope: Using friends to estimate heavy tails of degree distributions in large-scale complex networks

    NASA Astrophysics Data System (ADS)

    Eom, Young-Ho; Jo, Hang-Hyun

    2015-05-01

    Many complex networks in natural and social phenomena have often been characterized by heavy-tailed degree distributions. However, due to the rapidly growing size of network data and privacy concerns about using these data, it becomes more difficult to analyze complete data sets. Thus, it is crucial to devise effective and efficient estimation methods for heavy tails of degree distributions in large-scale networks using only local information from a small fraction of sampled nodes. Here we propose a tail-scope method based on the local observational bias of the friendship paradox. We show that the tail-scope method outperforms uniform node sampling for estimating heavy tails of degree distributions, while the opposite tendency is observed in the range of small degrees. In order to take advantage of both sampling methods, we devise a hybrid method that successfully recovers the whole range of degree distributions. Our tail-scope method shows how structural heterogeneities of large-scale complex networks can be used to effectively reveal the network structure with only limited local information.

  5. Probing large scale homogeneity and periodicity in the LRG distribution using Shannon entropy

    NASA Astrophysics Data System (ADS)

    Pandey, Biswajit; Sarkar, Suman

    2016-08-01

    We quantify the degree of inhomogeneity in the Luminous Red Galaxy (LRG) distribution from the SDSS DR7 as a function of length scale by measuring the Shannon entropy in independent and regular cubic voxels of increasing grid sizes. We also analyse the data by carrying out measurements in overlapping spheres and find that this suppresses inhomogeneities by a factor of 5-10 on different length scales. Despite the differences observed in the degree of inhomogeneity, both methods show a decrease in inhomogeneity with increasing length scale which eventually settles down to a plateau at ˜150 h-1 Mpc. Considering the minuscule values of inhomogeneity at the plateaus and their expected variations, we conclude that the LRG distribution becomes homogeneous at 150 h-1 Mpc and beyond. We also use the Kullback-Leibler divergence as an alternative measure of inhomogeneity, which reaffirms our findings. We show that the method presented here can effectively capture the inhomogeneity in a truly inhomogeneous distribution at all length scales. We analyse a set of Monte Carlo simulations with certain periodicity in their spatial distributions and find periodic variations in their inhomogeneity, which helps us to identify the underlying regularities present in such distributions and quantify the scale of their periodicity. We do not find any underlying regularities in the LRG distribution within the length scales probed.
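
    The entropy-based inhomogeneity measure can be sketched as follows. The normalization by the maximum entropy and the toy voxel counts below are illustrative; the paper's exact definition may differ in detail.

```python
import math

def shannon_entropy(counts):
    """Shannon entropy of occupation probabilities p_i = n_i / N
    over voxels, where n_i is the galaxy count in voxel i."""
    total = sum(counts)
    h = 0.0
    for n in counts:
        if n > 0:
            p = n / total
            h -= p * math.log(p)
    return h

def inhomogeneity(counts):
    """1 - H / H_max: 0 for a perfectly uniform occupation of the
    voxels, 1 when all points fall into a single voxel."""
    h_max = math.log(len(counts))
    return 1.0 - shannon_entropy(counts) / h_max

uniform = [100] * 64            # equal counts in 64 voxels
clustered = [6400] + [0] * 63   # all points in one voxel

print(inhomogeneity(uniform))    # 0.0: homogeneous
print(inhomogeneity(clustered))  # 1.0: maximally inhomogeneous
```

    Repeating this measurement for coarser and coarser voxel grids traces out the scale dependence described in the abstract: a clustered distribution has high inhomogeneity on small scales that falls to a plateau near zero once the grid cells are larger than the clustering scale.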

  6. Empirical Distributions of FST from Large-Scale Human Polymorphism Data

    PubMed Central

    Elhaik, Eran

    2012-01-01

    Studies of the apportionment of human genetic variation have long established that most human variation is within population groups and that the additional variation between population groups is small but greatest when comparing different continental populations. These studies often used Wright's FST, which apportions the standardized variance in allele frequencies within and between population groups. Because local adaptations increase population differentiation, high FST may be found at closely linked loci under selection and used to identify genes undergoing directional or heterotic selection. We re-examined these processes using HapMap data. We analyzed 3 million SNPs on 602 samples from eight worldwide populations and a consensus subset of 1 million SNPs found in all populations. We identified four major features of the data: First, a hierarchical FST analysis showed that only a small portion (12%) of the total genetic variation is distributed between continental populations, and even less (1%) is found between intra-continental populations. Second, the global FST distribution closely follows an exponential distribution. Third, although the overall FST distribution is similarly shaped (inverse J), FST distributions vary markedly by allele frequency when divided into non-overlapping groups by allele frequency range. Because the mean allele frequency is a crude indicator of allele age, these distributions mark the time-dependent change in genetic differentiation. Finally, the change in mean FST of these groups is linear in allele frequency. These results suggest that investigating the extremes of the FST distribution for each allele frequency group is more efficient for detecting selection. Consequently, we demonstrate that such extreme SNPs are more clustered along the chromosomes than expected from linkage disequilibrium for each allele frequency group. These genomic regions are therefore likely candidates for natural selection. PMID
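
    Wright's FST for a single biallelic locus can be sketched directly from subpopulation allele frequencies. The `fst` helper below uses the simple heterozygosity form with equal subpopulation weights and is illustrative only, not the estimator used in the study:

```python
def fst(p_pops):
    """Wright's F_ST for one biallelic locus, from the allele
    frequency p in each subpopulation (equal sizes assumed):
    F_ST = (H_T - H_S) / H_T."""
    k = len(p_pops)
    p_bar = sum(p_pops) / k
    h_t = 2 * p_bar * (1 - p_bar)                    # total expected heterozygosity
    h_s = sum(2 * p * (1 - p) for p in p_pops) / k   # mean within-population
    return (h_t - h_s) / h_t if h_t > 0 else 0.0

print(fst([0.5, 0.5]))  # identical populations: 0.0
print(fst([0.2, 0.8]))  # strongly differentiated: 0.36
print(fst([0.0, 1.0]))  # fixed for opposite alleles: 1.0
```

    Computed per SNP and binned by mean allele frequency, values like these are what make up the empirical FST distributions analyzed in the abstract.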

  7. Empirical distributions of F(ST) from large-scale human polymorphism data.

    PubMed

    Elhaik, Eran

    2012-01-01

    Studies of the apportionment of human genetic variation have long established that most human variation is within population groups and that the additional variation between population groups is small but greatest when comparing different continental populations. These studies often used Wright's F(ST), which apportions the standardized variance in allele frequencies within and between population groups. Because local adaptations increase population differentiation, high F(ST) may be found at closely linked loci under selection and used to identify genes undergoing directional or heterotic selection. We re-examined these processes using HapMap data. We analyzed 3 million SNPs on 602 samples from eight worldwide populations and a consensus subset of 1 million SNPs found in all populations. We identified four major features of the data: First, a hierarchical F(ST) analysis showed that only a small portion (12%) of the total genetic variation is distributed between continental populations, and even less (1%) is found between intra-continental populations. Second, the global F(ST) distribution closely follows an exponential distribution. Third, although the overall F(ST) distribution is similarly shaped (inverse J), F(ST) distributions vary markedly by allele frequency when divided into non-overlapping groups by allele frequency range. Because the mean allele frequency is a crude indicator of allele age, these distributions mark the time-dependent change in genetic differentiation. Finally, the change in mean F(ST) of these groups is linear in allele frequency. These results suggest that investigating the extremes of the F(ST) distribution for each allele frequency group is more efficient for detecting selection. Consequently, we demonstrate that such extreme SNPs are more clustered along the chromosomes than expected from linkage disequilibrium for each allele frequency group. These genomic regions are therefore likely candidates for natural selection.

  8. Analysis of large-scale distributed knowledge sources via autonomous cooperative graph mining

    NASA Astrophysics Data System (ADS)

    Levchuk, Georgiy; Ortiz, Andres; Yan, Xifeng

    2014-05-01

    In this paper, we present a model for processing distributed relational data across multiple autonomous heterogeneous computing resources in environments with limited control, resource failures, and communication bottlenecks. Our model exploits dependencies in the data to enable collaborative distributed querying in noisy data. The collaboration policy for computational resources is efficiently constructed from the belief propagation algorithm. To scale to large data sizes, we employ a combination of priority-based filtering, incremental processing, and communication compression techniques. Our solution achieved high accuracy of analysis results and orders of magnitude improvements in computation time compared to the centralized graph matching solution.

  9. Anthropogenic aerosols and the distribution of past large-scale precipitation change.

    PubMed

    Wang, Chien

    2015-12-28

    The climate response of precipitation to the effects of anthropogenic aerosols is a critical but not yet fully understood aspect of climate science. Results of selected models that participated in the Coupled Model Intercomparison Project Phase 5 and data from the Twentieth Century Reanalysis Project suggest that, throughout the tropics and also in the extratropical Northern Hemisphere, aerosols have largely dominated the distribution of precipitation changes, in reference to the preindustrial era, in the second half of the last century. Aerosol-induced cooling has offset some of the warming caused by the greenhouse gases from the tropics to the Arctic and thus formed the gradients of surface temperature anomaly that enable the revealed precipitation change patterns to occur. Improved representation of aerosol-cloud interaction has been demonstrated to be the key factor for models to reproduce distributions of past precipitation change consistent with the reanalysis data.

  10. Anthropogenic aerosols and the distribution of past large-scale precipitation change

    DOE PAGES

    Wang, Chien

    2015-12-28

    In this paper, the climate response of precipitation to the effects of anthropogenic aerosols is treated as a critical but not yet fully understood aspect of climate science. Results of selected models that participated in the Coupled Model Intercomparison Project Phase 5 and data from the Twentieth Century Reanalysis Project suggest that, throughout the tropics and also in the extratropical Northern Hemisphere, aerosols have largely dominated the distribution of precipitation changes, in reference to the preindustrial era, in the second half of the last century. Aerosol-induced cooling has offset some of the warming caused by the greenhouse gases from the tropics to the Arctic and thus formed the gradients of surface temperature anomaly that enable the revealed precipitation change patterns to occur. Improved representation of aerosol-cloud interaction has been demonstrated to be the key factor for models to reproduce distributions of past precipitation change consistent with the reanalysis data.

  11. Large-Scale Control and Distributed Computing Systems under Stochastic Structural Perturbations

    DTIC Science & Technology

    1993-01-28

    Stability of Lotka-Volterra Model. Authors: G. S. Ladde and S. Sathananthan. Journal: Mathematics and Computer Modelling, Vol. 16, No. 3, pp. 99-107, 1992. ... non-hereditary control and distributed systems under randomly varying structural perturbations. Three principal areas of research, namely, (i

  12. Large scale patterns in vertical distribution and behaviour of mesopelagic scattering layers

    PubMed Central

    Klevjer, T. A.; Irigoien, X.; Røstad, A.; Fraile-Nuez, E.; Benítez-Barrios, V. M.; Kaartvedt, S.

    2016-01-01

    Recent studies suggest that previous estimates of mesopelagic biomasses are severely biased, with the new, higher estimates underlining the need to unveil behaviourally mediated coupling between shallow and deep ocean habitats. We analysed vertical distribution and diel vertical migration (DVM) of mesopelagic acoustic scattering layers (SLs) recorded at 38 kHz across oceanographic regimes encountered during the circumglobal Malaspina expedition. Mesopelagic SLs were observed in all areas covered, but vertical distributions and DVM patterns varied markedly. The distribution of mesopelagic backscatter was deepest in the southern Indian Ocean (weighted mean daytime depth: WMD 590 m) and shallowest at the oxygen minimum zone in the eastern Pacific (WMD 350 m). DVM was evident in all areas covered, on average ~50% of mesopelagic backscatter made daily excursions from mesopelagic depths to shallow waters. There were marked differences in migrating proportions between the regions, ranging from ~20% in the Indian Ocean to ~90% in the Eastern Pacific. Overall the data suggest strong spatial gradients in mesopelagic DVM patterns, with implied ecological and biogeochemical consequences. Our results suggest that parts of this spatial variability can be explained by horizontal patterns in physical-chemical properties of water masses, such as oxygen, temperature and turbidity. PMID:26813333
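
    The weighted mean daytime depth (WMD) used to summarize the scattering layers is a backscatter-weighted average over depth bins. A minimal sketch with illustrative depth bins and relative backscatter values (not data from the expedition):

```python
def weighted_mean_depth(depths, backscatter):
    """Backscatter-weighted mean depth of a scattering layer:
    sum(z_i * s_i) / sum(s_i) over depth bins."""
    total = sum(backscatter)
    return sum(d * s for d, s in zip(depths, backscatter)) / total

depths = [100, 300, 500, 700]  # bin centres in metres (illustrative)
sv = [0.1, 0.4, 0.4, 0.1]      # relative backscatter per bin
print(weighted_mean_depth(depths, sv))  # ~400 m
```

    Comparing the WMD of day and night profiles, or the fraction of total backscatter found above a threshold depth by day versus night, is how migrating proportions like the ~20-90% quoted in the abstract are typically derived.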

  13. Large-Scale Merging of Histograms using Distributed In-Memory Computing

    NASA Astrophysics Data System (ADS)

    Blomer, Jakob; Ganis, Gerardo

    2015-12-01

    Most high-energy physics analysis jobs are embarrassingly parallel except for the final merging of the output objects, which are typically histograms. Currently, the merging of output histograms scales badly. The running time for distributed merging depends not only on the overall number of bins but also on the number of partial histogram output files. This means that, while the time to analyze data decreases linearly with the number of worker nodes, the time to merge the histograms in fact increases with the number of worker nodes. On the grid, merging jobs that take a few hours are not unusual. In order to improve the situation, we present a distributed and decentralized merging algorithm whose running time is independent of the number of worker nodes. We exploit the full bisection bandwidth of local networks and we keep all intermediate results in memory. We present benchmarks from an implementation using the Parallel ROOT Facility (PROOF) and RAMCloud, a distributed key-value store that keeps all data in DRAM.
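
    The problem and the decentralized remedy can be sketched as a pairwise tree reduction: instead of one collector merging N partial histograms one after another (O(N) merge steps on a single node), workers merge in pairs over log2(N) rounds, so the critical path does not grow with the worker count. This toy sequential sketch only illustrates the reduction structure; the paper's implementation on PROOF and RAMCloud is far more involved.

```python
def merge_two(h1, h2):
    """Bin-wise sum of two histograms with identical binning."""
    return [a + b for a, b in zip(h1, h2)]

def tree_merge(histograms):
    """Pairwise (binary-tree) reduction over partial histograms.
    Each while-iteration is one round; in a distributed setting
    all merges within a round would run in parallel."""
    hs = list(histograms)
    while len(hs) > 1:
        nxt = [merge_two(hs[i], hs[i + 1]) for i in range(0, len(hs) - 1, 2)]
        if len(hs) % 2:          # odd one out advances unmerged
            nxt.append(hs[-1])
        hs = nxt
    return hs[0]

# Four workers' partial 3-bin histograms (illustrative counts)
partials = [[1, 0, 2], [0, 3, 1], [2, 2, 2], [1, 1, 1]]
print(tree_merge(partials))  # [4, 6, 6]
```

    With all intermediate results held in memory, as the abstract describes, each round costs roughly one histogram transfer per worker, independent of how many workers produced partials.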

  14. Large scale patterns in vertical distribution and behaviour of mesopelagic scattering layers.

    PubMed

    Klevjer, T A; Irigoien, X; Røstad, A; Fraile-Nuez, E; Benítez-Barrios, V M; Kaartvedt, S

    2016-01-27

    Recent studies suggest that previous estimates of mesopelagic biomasses are severely biased, with the new, higher estimates underlining the need to unveil behaviourally mediated coupling between shallow and deep ocean habitats. We analysed vertical distribution and diel vertical migration (DVM) of mesopelagic acoustic scattering layers (SLs) recorded at 38 kHz across oceanographic regimes encountered during the circumglobal Malaspina expedition. Mesopelagic SLs were observed in all areas covered, but vertical distributions and DVM patterns varied markedly. The distribution of mesopelagic backscatter was deepest in the southern Indian Ocean (weighted mean daytime depth: WMD 590 m) and shallowest at the oxygen minimum zone in the eastern Pacific (WMD 350 m). DVM was evident in all areas covered, on average ~50% of mesopelagic backscatter made daily excursions from mesopelagic depths to shallow waters. There were marked differences in migrating proportions between the regions, ranging from ~20% in the Indian Ocean to ~90% in the Eastern Pacific. Overall the data suggest strong spatial gradients in mesopelagic DVM patterns, with implied ecological and biogeochemical consequences. Our results suggest that parts of this spatial variability can be explained by horizontal patterns in physical-chemical properties of water masses, such as oxygen, temperature and turbidity.

  15. Large scale patterns in vertical distribution and behaviour of mesopelagic scattering layers

    NASA Astrophysics Data System (ADS)

    Klevjer, T. A.; Irigoien, X.; Røstad, A.; Fraile-Nuez, E.; Benítez-Barrios, V. M.; Kaartvedt, S.

    2016-01-01

    Recent studies suggest that previous estimates of mesopelagic biomasses are severely biased, with the new, higher estimates underlining the need to unveil behaviourally mediated coupling between shallow and deep ocean habitats. We analysed vertical distribution and diel vertical migration (DVM) of mesopelagic acoustic scattering layers (SLs) recorded at 38 kHz across oceanographic regimes encountered during the circumglobal Malaspina expedition. Mesopelagic SLs were observed in all areas covered, but vertical distributions and DVM patterns varied markedly. The distribution of mesopelagic backscatter was deepest in the southern Indian Ocean (weighted mean daytime depth: WMD 590 m) and shallowest at the oxygen minimum zone in the eastern Pacific (WMD 350 m). DVM was evident in all areas covered, on average ~50% of mesopelagic backscatter made daily excursions from mesopelagic depths to shallow waters. There were marked differences in migrating proportions between the regions, ranging from ~20% in the Indian Ocean to ~90% in the Eastern Pacific. Overall the data suggest strong spatial gradients in mesopelagic DVM patterns, with implied ecological and biogeochemical consequences. Our results suggest that parts of this spatial variability can be explained by horizontal patterns in physical-chemical properties of water masses, such as oxygen, temperature and turbidity.

  16. 'Oorja' in India: Assessing a large-scale commercial distribution of advanced biomass stoves to households.

    PubMed

    Thurber, Mark C; Phadke, Himani; Nagavarapu, Sriniketh; Shrimali, Gireesh; Zerriffi, Hisham

    2014-04-01

    Replacing traditional stoves with advanced alternatives that burn more cleanly has the potential to ameliorate major health problems associated with indoor air pollution in developing countries. With a few exceptions, large government and charitable programs to distribute advanced stoves have not had the desired impact. Commercially-based distributions that seek cost recovery and even profits might plausibly do better, both because they encourage distributors to supply and promote products that people want and because they are based around properly-incentivized supply chains that could be more scalable, sustainable, and replicable. The sale in India of over 400,000 "Oorja" stoves to households from 2006 onwards represents the largest commercially-based distribution of a gasification-type advanced biomass stove. BP's Emerging Consumer Markets (ECM) division and then successor company First Energy sold this stove and the pelletized biomass fuel on which it operates. We assess the success of this effort and the role its commercial aspect played in outcomes using a survey of 998 households in areas of Maharashtra and Karnataka where the stove was sold, as well as detailed interviews with BP and First Energy staff. Statistical models based on these data indicate that Oorja purchase rates were significantly influenced by the intensity of Oorja marketing in a region as well as by the pre-existing stove mix among households. The highest rate of adoption came from LPG-using households for which Oorja's pelletized biomass fuel reduced costs. Smoke- and health-related messages from Oorja marketing did not significantly influence the purchase decision, although they did appear to affect household perceptions about smoke. By the time of our survey, only 9% of households that purchased Oorja were still using the stove, the result in large part of difficulties First Energy encountered in developing a viable supply chain around low-cost procurement of "agricultural waste" to make

  17. Large-Scale CORBA-Distributed Software Framework for NIF Controls

    SciTech Connect

    Carey, R W; Fong, K W; Sanchez, R J; Tappero, J D; Woodruff, J P

    2001-10-16

    The Integrated Computer Control System (ICCS) is based on a scalable software framework that is distributed over some 325 computers throughout the NIF facility. The framework provides templates and services at multiple levels of abstraction for the construction of software applications that communicate via CORBA (Common Object Request Broker Architecture). Various forms of object-oriented software design patterns are implemented as templates to be extended by application software. Developers extend the framework base classes to model the numerous physical control points, thereby sharing the functionality defined by the base classes. About 56,000 software objects, each individually addressed through CORBA, are to be created in the complete ICCS. Most objects have a persistent state that is initialized at system start-up and stored in a database. Additional framework services are provided by centralized server programs that implement events, alerts, reservations, message logging, database/file persistence, name services, and process management. The ICCS software framework approach allows for efficient construction of a software system that supports a large number of distributed control points representing a complex control application.
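The template-and-extension approach described above can be illustrated in outline. The sketch below is a Python stand-in for the CORBA servant base classes, persistence, and name service; all class and method names are hypothetical, not the actual ICCS code:

```python
import json

class ControlPoint:
    """Framework base class: applications subclass this to model physical
    control points, inheriting naming and persistence (stand-ins for the
    CORBA name service and database persistence described above)."""
    registry = {}  # name-service stand-in: object name -> instance

    def __init__(self, name, state=None):
        self.name = name
        self.state = state if state is not None else {}
        ControlPoint.registry[name] = self

    def save(self):
        """Serialize persistent state (database stand-in)."""
        return json.dumps({"name": self.name, "state": self.state})

    @classmethod
    def restore(cls, blob):
        """Re-create an object from its persisted state at start-up."""
        d = json.loads(blob)
        return cls(d["name"], d["state"])

class ShutterController(ControlPoint):
    """Application-level extension of the framework base class."""
    def open_shutter(self):
        self.state["open"] = True

s = ShutterController("beamline1.shutter")
s.open_shutter()
restored = ControlPoint.restore(s.save())
```

The point of the pattern is that the subclass adds only device-specific behaviour; addressing and persistence come from the base class.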

  18. Cost-Efficient and Multi-Functional Secure Aggregation in Large Scale Distributed Application

    PubMed Central

    Zhang, Ping; Li, Wenjun; Sun, Hua

    2016-01-01

    Secure aggregation is an essential component of modern distributed applications and data mining platforms. Aggregated statistical results are typically adopted in constructing a data cube for data analysis at multiple abstraction levels in data warehouse platforms. Generating different types of statistical results efficiently at the same time (also referred to as multi-functional support) is a fundamental requirement in practice. However, most of the existing schemes support a very limited number of statistics. Securely obtaining typical statistical results simultaneously in a distributed system, without recovering the original data, is still an open problem. In this paper, we present SEDAR, a SEcure Data Aggregation scheme under the Range segmentation model. The range segmentation model is proposed to reduce communication cost by capturing the data characteristics, and each range uses a different aggregation strategy. SEDAR encodes raw data in the dominant range into well-defined vectors to provide value preservation and order preservation, and thus provides the basis for multi-functional aggregation. A homomorphic encryption scheme is used to achieve data privacy. We also present two enhanced versions: a Random-based SEDAR (REDAR) and a Compression-based SEDAR (CEDAR). Both significantly reduce communication cost, at the price of lower security and lower accuracy, respectively. Experimental evaluations, based on six different scenarios of real data, show that all of them have excellent performance on cost and accuracy. PMID:27551747
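SEDAR itself relies on homomorphic encryption; as a simpler illustration of the same goal (an aggregator learning the sum without learning individual values), the sketch below uses pairwise additive masking, in which random masks shared between node pairs cancel in the aggregate. The modulus and readings are arbitrary assumptions:

```python
import random

P = 2**61 - 1  # public modulus (an assumption for this sketch)

def mask_values(values):
    """Each pair of nodes (i, j) shares a random mask r: node i adds it,
    node j subtracts it, so masks cancel in the modular sum while each
    individual masked reading looks random to the aggregator."""
    n = len(values)
    masked = list(values)
    for i in range(n):
        for j in range(i + 1, n):
            r = random.randrange(P)
            masked[i] = (masked[i] + r) % P
            masked[j] = (masked[j] - r) % P
    return masked

readings = [12, 7, 30, 1]
masked = mask_values(readings)
total = sum(masked) % P        # equals sum(readings) mod P
```

Real schemes must additionally handle dropped nodes and collusion, which is where homomorphic encryption earns its keep.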

  19. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex, Hydrogeologic Systems

    NASA Astrophysics Data System (ADS)

    Wolfsberg, A.; Kang, Q.; Li, C.; Ruskauff, G.; Bhark, E.; Freeman, E.; Prothro, L.; Drellack, S.

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  20. Investigation of Homogeneity and Matter Distribution on Large Scales Using Large Quasar Groups

    NASA Astrophysics Data System (ADS)

    Li, Ming-Hua

    2015-12-01

    We use 20 large quasar group (LQG) samples in Park et al. (2015) to investigate the homogeneity of the 0.3 ≲ z ≲ 1.6 Universe (z denotes the redshift). For comparison, we also employ the 12 LQG samples at 0.5 ≲ z ≲ 2 in Komberg et al. (1996) in the analysis. We calculate the bias factor b and the two-point correlation function ξLQG for such groups for three different density profiles of the LQG dark matter halos, i.e. the isothermal profile, the Navarro-Frenk-White (NFW) profile, and the (gravitational) lensing profile. We consider the ΛCDM (cold dark matter plus a cosmological constant Λ) underlying matter power spectrum with Ωm = 0.28, ΩΛ = 0.72, and the Hubble constant H0 = 100 h·km·s-1·Mpc-1 with h = 0.72. Dividing the samples into three redshift bins, we find that the LQGs with higher redshift are more biased and correlated than those with lower redshift. The homogeneity scale RH of the LQG distribution is also deduced from theory. It is defined as the comoving radius of the sphere inside which the number of LQGs N(< r) is proportional to r3 within 1%, or equivalently above which the correlation dimension of the sample D2 is within 1% of D2 = 3. For Park et al.'s samples and the NFW dark matter halo profile, the homogeneity scales of the LQG distribution are RH ⋍ 247 h-1·Mpc for 0.2 < z ≤ 0.6, RH ⋍ 360 h-1·Mpc for 0.6 < z ≤ 1.2, and RH ⋍ 480 h-1·Mpc for 1.2 < z ≲ 1.6. The maximum extent of the LQG samples is beyond RH in each bin, showing that the LQG samples are not homogeneously distributed on such scales, i.e. a length range of ~500 h-1·Mpc and a mass scale of ~1014 M⊙. The possibility of a top-down structure formation process, as predicted by hot/warm dark matter (WDM) scenarios, and the redshift evolution of the bias factor b and correlation amplitude ξLQG of the LQGs as a consequence of cosmic expansion are both discussed. Different results were obtained based on the LQG sample in Komberg et al. (1996
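The operational definition of the homogeneity scale RH above — the radius beyond which the correlation dimension D2, i.e. the log-log slope of N(< r), stays within 1% of 3 — can be sketched numerically. The toy counts below add a constant clustering excess to the homogeneous r3 growth; they are illustrative, not the LQG data:

```python
import math

def homogeneity_scale(radii, counts, tol=0.03):
    """Return the first radius where the correlation dimension
    D2 = d ln N(<r) / d ln r, estimated by finite differences,
    is within `tol` of 3 (tol=0.03 corresponds to 1% of D2=3)."""
    for i in range(len(radii) - 1):
        d2 = (math.log(counts[i + 1] / counts[i])
              / math.log(radii[i + 1] / radii[i]))
        if abs(d2 - 3.0) <= tol:
            return radii[i + 1]
    return None  # sample never reaches homogeneity on these scales

# Toy counts-in-spheres: homogeneous r**3 growth plus a constant
# clustering excess (the 1000 is arbitrary, chosen for illustration).
radii = list(range(10, 101, 2))
counts = [r**3 + 1000 for r in radii]
rh = homogeneity_scale(radii, counts)
```

As the excess becomes negligible relative to r**3, the slope approaches 3 and the scan stops.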

  1. Large-Scale Transport Model Uncertainty and Sensitivity Analysis: Distributed Sources in Complex Hydrogeologic Systems

    SciTech Connect

    Drellack, Sig; Prothro, Lance

    2007-12-01

    The Underground Test Area (UGTA) Project of the U.S. Department of Energy, National Nuclear Security Administration Nevada Site Office is in the process of assessing and developing regulatory decision options based on modeling predictions of contaminant transport from underground testing of nuclear weapons at the Nevada Test Site (NTS). The UGTA Project is attempting to develop an effective modeling strategy that addresses and quantifies multiple components of uncertainty including natural variability, parameter uncertainty, conceptual/model uncertainty, and decision uncertainty in translating model results into regulatory requirements. The modeling task presents multiple unique challenges to the hydrological sciences as a result of the complex fractured and faulted hydrostratigraphy, the distributed locations of sources, the suite of reactive and non-reactive radionuclides, and uncertainty in conceptual models. Characterization of the hydrogeologic system is difficult and expensive because of deep groundwater in the arid desert setting and the large spatial setting of the NTS. Therefore, conceptual model uncertainty is partially addressed through the development of multiple alternative conceptual models of the hydrostratigraphic framework and multiple alternative models of recharge and discharge. Uncertainty in boundary conditions is assessed through development of alternative groundwater fluxes through multiple simulations using the regional groundwater flow model. Calibration of alternative models to heads and measured or inferred fluxes has not proven to provide clear measures of model quality. Therefore, model screening by comparison to independently-derived natural geochemical mixing targets through cluster analysis has also been invoked to evaluate differences between alternative conceptual models. Advancing multiple alternative flow models, sensitivity of transport predictions to parameter uncertainty is assessed through Monte Carlo simulations. The

  2. Large-scale malaria survey in Cambodia: novel insights on species distribution and risk factors.

    PubMed

    Incardona, Sandra; Vong, Sirenda; Chiv, Lim; Lim, Pharath; Nhem, Sina; Sem, Rithy; Khim, Nimol; Doung, Socheat; Mercereau-Puijalon, Odile; Fandeur, Thierry

    2007-03-27

    In Cambodia, estimates of the malaria burden rely on a public health information system that does not record cases occurring among remote populations, malaria cases treated in the private sector, or asymptomatic carriers. A global estimate of the current malaria situation and associated risk factors is, therefore, still lacking. A large cross-sectional survey was carried out in three areas of multidrug resistant malaria in Cambodia, enrolling 11,652 individuals. Fever and splenomegaly were recorded. Malaria prevalence, parasite densities and spatial distribution of infection were determined to identify parasitological profiles and the associated risk factors useful for improving malaria control programmes in the country. Malaria prevalence was 3.0%, 7.0% and 12.3% in the Sampovloun, Koh Kong and Preah Vihear areas, respectively. Prevalences and Plasmodium species were heterogeneously distributed, with higher Plasmodium vivax rates in areas of low transmission. Malaria-attributable fevers accounted for only 10-33% of malaria cases, and 23-33% of parasite carriers were febrile. Multivariate multilevel regression analysis identified adults and males, mostly involved in forest activities, as high-risk groups in Sampovloun, with additional risks for children in forest-fringe villages in the other areas, along with an increased risk with distance from health facilities. These observations point to a more complex malaria situation than suspected from official reports. A large asymptomatic reservoir was observed. The rates of P. vivax infections were higher than recorded in several areas. In remote areas, malaria prevalence was high. This indicates that additional health facilities should be implemented in areas at higher risk, such as remote rural and forested parts of the country, which are not adequately served by health services. Precise malaria risk mapping all over the country is needed to assess the extensive geographical heterogeneity of malaria endemicity and risk

  3. Large-scale malaria survey in Cambodia: Novel insights on species distribution and risk factors

    PubMed Central

    Incardona, Sandra; Vong, Sirenda; Chiv, Lim; Lim, Pharath; Nhem, Sina; Sem, Rithy; Khim, Nimol; Doung, Socheat; Mercereau-Puijalon, Odile; Fandeur, Thierry

    2007-01-01

    Background In Cambodia, estimates of the malaria burden rely on a public health information system that does not record cases occurring among remote populations, malaria cases treated in the private sector, or asymptomatic carriers. A global estimate of the current malaria situation and associated risk factors is, therefore, still lacking. Methods A large cross-sectional survey was carried out in three areas of multidrug resistant malaria in Cambodia, enrolling 11,652 individuals. Fever and splenomegaly were recorded. Malaria prevalence, parasite densities and spatial distribution of infection were determined to identify parasitological profiles and the associated risk factors useful for improving malaria control programmes in the country. Results Malaria prevalence was 3.0%, 7.0% and 12.3% in the Sampovloun, Koh Kong and Preah Vihear areas, respectively. Prevalences and Plasmodium species were heterogeneously distributed, with higher Plasmodium vivax rates in areas of low transmission. Malaria-attributable fevers accounted for only 10–33% of malaria cases, and 23–33% of parasite carriers were febrile. Multivariate multilevel regression analysis identified adults and males, mostly involved in forest activities, as high-risk groups in Sampovloun, with additional risks for children in forest-fringe villages in the other areas, along with an increased risk with distance from health facilities. Conclusion These observations point to a more complex malaria situation than suspected from official reports. A large asymptomatic reservoir was observed. The rates of P. vivax infections were higher than recorded in several areas. In remote areas, malaria prevalence was high. This indicates that additional health facilities should be implemented in areas at higher risk, such as remote rural and forested parts of the country, which are not adequately served by health services. Precise malaria risk mapping all over the country is needed to assess the extensive geographical heterogeneity

  4. 3D-Simulation Of Concentration Distributions Inside Large-Scale Circulating Fluidized Bed Combustors

    NASA Astrophysics Data System (ADS)

    Wischnewski, R.; Ratschow, L.; Hartge, E. U.; Werthe, J.

    With increasing size of modern CFB combustors the lateral mixing of fuels and secondary air gains more and more importance. Strong concentration gradients, which result from improper lateral mixing, can lead to operational problems, high flue gas emissions and lower boiler efficiencies. A 3D-model for the simulation of local gas and solids concentrations inside industrial-sized CFB boilers has been developed. The model is based on a macroscopic approach and considers all major mechanisms during fuel spreading and subsequent combustion of char and volatiles. Typical characteristics of modern boilers like staged combustion, a smaller cross-sectional area in the lower section of the combustion chamber and the co-combustion of additional fuels with coal can be considered. The 252 MWth combustor of Stadtwerke Duisburg AG is used for the validation of the model. A comprehensive picture of the local conditions inside the combustion chamber is achieved by the combination of local gas measurements and the three-dimensional simulation of concentration distributions.

  5. Reconstruction of air-shower parameters for large-scale radio detectors using the lateral distribution

    NASA Astrophysics Data System (ADS)

    Kostunin, D.; Bezyazeekov, P. A.; Hiller, R.; Schröder, F. G.; Lenok, V.; Levinson, E.

    2016-02-01

    We investigate features of the lateral distribution function (LDF) of the radio signal emitted by cosmic-ray air showers with primary energies Epr > 0.1 EeV and its connection to air-shower parameters such as energy and shower maximum, using CoREAS simulations made for the configuration of the Tunka-Rex antenna array. Taking into account all significant contributions to the total radio emission, such as the geomagnetic effect, the charge excess, and atmospheric refraction, we parameterize the radio LDF. This parameterization is two-dimensional and has several free parameters. The large number of free parameters is not suitable for sparse arrays operating at low SNR (signal-to-noise ratio). Thus, exploiting symmetries, we decrease the number of free parameters based on the shower geometry and reduce the LDF to a simple one-dimensional function. The remaining parameters can be fit with a small number of points, i.e. with signals from as few as three antennas above the detection threshold. Finally, we present a method for the reconstruction of air-shower parameters, in particular energy and Xmax (shower maximum), which can be reached with a theoretical accuracy of better than 15% and 30 g/cm2, respectively.
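As an illustration of fitting a one-dimensional LDF with only three antenna signals, the sketch below uses a Gaussian-in-log form, ln ε(r) = a·r² + b·r + c, which three (distance, amplitude) points determine exactly. This functional form and all numbers are assumptions for illustration, not the actual Tunka-Rex parameterization:

```python
import math

def det3(m):
    """Determinant of a 3x3 matrix (for Cramer's rule)."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_ldf(points):
    """Fit ln(amplitude) = a*r**2 + b*r + c exactly through three
    (distance, amplitude) measurements -- the minimum number of
    antennas mentioned above."""
    y = [math.log(e) for _, e in points]
    A = [[r * r, r, 1.0] for r, _ in points]
    d = det3(A)
    coeffs = []
    for k in range(3):           # Cramer's rule, column by column
        M = [row[:] for row in A]
        for i in range(3):
            M[i][k] = y[i]
        coeffs.append(det3(M) / d)
    return tuple(coeffs)         # (a, b, c)

# Synthetic amplitudes generated from known coefficients:
a0, b0, c0 = -1e-4, 0.01, 2.0
pts = [(r, math.exp(a0 * r * r + b0 * r + c0)) for r in (50.0, 120.0, 200.0)]
a, b, c = fit_ldf(pts)
```

With more than three antennas the same form would be fit by least squares instead of solved exactly.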

  6. Response Time Distributions in Rapid Chess: A Large-Scale Decision Making Experiment

    PubMed Central

    Sigman, Mariano; Etchemendy, Pablo; Slezak, Diego Fernández; Cecchi, Guillermo A.

    2010-01-01

    Rapid chess provides an unparalleled laboratory to understand decision making in a natural environment. In a chess game, players choose consecutively around 40 moves in a finite time budget. The goodness of each choice can be determined quantitatively since current chess algorithms estimate precisely the value of a position. Web-based chess produces vast amounts of data, millions of decisions per day, incommensurable with traditional psychological experiments. We generated a database of response times (RTs) and position values in rapid chess games. We measured robust emergent statistical observables: (1) RT distributions are long-tailed and show qualitatively distinct forms at different stages of the game; (2) RTs of successive moves are highly correlated, both for intra- and inter-player moves. These findings have theoretical implications since they contradict two basic assumptions of sequential decision making algorithms: RTs are not stationary and cannot be generated by a state-function. Our results also have practical implications. First, we characterized the capacity of blunders and score fluctuations to predict a player's strength, which is still an open problem in chess software. Second, we show that the winning likelihood can be reliably estimated from a weighted combination of remaining times and position evaluation. PMID:21031032
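The reported correlation between RTs of successive moves can be checked with a plain lag-1 Pearson correlation; a minimal sketch (the RT sequences in the test are synthetic, not data from the study):

```python
import math

def lag1_correlation(rts):
    """Pearson correlation between each response time and the next one,
    i.e. corr(rts[:-1], rts[1:]) -- a simple measure of the
    successive-move RT correlation described above."""
    x, y = rts[:-1], rts[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)
```

Values near +1 indicate the strong move-to-move dependence that rules out stationary, state-function RT models.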

  7. Large scale integration of CVD-graphene based NEMS with narrow distribution of resonance parameters

    NASA Astrophysics Data System (ADS)

    Arjmandi-Tash, Hadi; Allain, Adrien; Han, Zheng (Vitto); Bouchiat, Vincent

    2017-06-01

    We present a novel method for the fabrication of arrays of suspended micron-sized membranes based on monolayer pulsed-CVD graphene. Such devices enable efficient integration of graphene nano-electro-mechanical resonators, compatible with production at the wafer scale using standard photolithography and processing tools. As the graphene surface is continuously protected by the same polymer layer during the whole process, the suspended graphene membranes are clean and free of imperfections such as deposits, wrinkles and tears. Batch fabrication of 100 μm-long multi-connected suspended ribbons is presented. At room temperature, mechanical resonances of electrostatically-actuated devices show a narrow distribution of their characteristic parameters, with high quality factors and low effective masses and resonance frequencies, as expected for low-stress and adsorbate-free membranes. Upon cooling, a sharp increase of both resonance frequency and quality factor is observed, enabling extraction of the thermal expansion coefficient of CVD graphene. A comparison with state-of-the-art graphene NEMS is presented.

  8. Distribution of circular proteins in plants: large-scale mapping of cyclotides in the Violaceae.

    PubMed

    Burman, Robert; Yeshak, Mariamawit Y; Larsson, Sonny; Craik, David J; Rosengren, K Johan; Göransson, Ulf

    2015-01-01

    During the last decade there has been increasing interest in small circular proteins found in plants of the violet family (Violaceae). These so-called cyclotides consist of a circular chain of approximately 30 amino acids, including six cysteines forming three disulfide bonds, arranged in a cyclic cystine knot (CCK) motif. In this study we map the occurrence and distribution of cyclotides throughout the Violaceae. Plant material was obtained from herbarium sheets containing samples up to 200 years of age. Even the oldest specimens contained cyclotides in the preserved leaves, with no degradation products observable, confirming their place as one of the most stable proteins in nature. Over 200 samples covering 17 of the 23-31 genera in Violaceae were analyzed, and cyclotides were positively identified in 150 species. Each species contained a unique set of between one and 25 cyclotides, with many exclusive to individual plant species. We estimate the number of different cyclotides in the Violaceae to be 5000-25,000, and propose that cyclotides are ubiquitous among all Violaceae species. Twelve new cyclotides from six phylogenetically dispersed genera were sequenced. Furthermore, the first glycosylated derivatives of cyclotides were identified and characterized, further increasing the diversity and complexity of this unique protein family.

  9. Distribution of circular proteins in plants: large-scale mapping of cyclotides in the Violaceae

    PubMed Central

    Burman, Robert; Yeshak, Mariamawit Y.; Larsson, Sonny; Craik, David J.; Rosengren, K. Johan; Göransson, Ulf

    2015-01-01

    During the last decade there has been increasing interest in small circular proteins found in plants of the violet family (Violaceae). These so-called cyclotides consist of a circular chain of approximately 30 amino acids, including six cysteines forming three disulfide bonds, arranged in a cyclic cystine knot (CCK) motif. In this study we map the occurrence and distribution of cyclotides throughout the Violaceae. Plant material was obtained from herbarium sheets containing samples up to 200 years of age. Even the oldest specimens contained cyclotides in the preserved leaves, with no degradation products observable, confirming their place as one of the most stable proteins in nature. Over 200 samples covering 17 of the 23–31 genera in Violaceae were analyzed, and cyclotides were positively identified in 150 species. Each species contained a unique set of between one and 25 cyclotides, with many exclusive to individual plant species. We estimate the number of different cyclotides in the Violaceae to be 5000–25,000, and propose that cyclotides are ubiquitous among all Violaceae species. Twelve new cyclotides from six phylogenetically dispersed genera were sequenced. Furthermore, the first glycosylated derivatives of cyclotides were identified and characterized, further increasing the diversity and complexity of this unique protein family. PMID:26579135

  10. Cost of mass annual single dose diethylcarbamazine distribution for the large scale control of lymphatic filariasis.

    PubMed

    Krishnamoorthy, K; Ramu, K; Srividya, A; Appavoo, N C; Saxena, N B; Lal, S; Das, P K

    2000-03-01

    Economic analysis of the revised strategy to control lymphatic filariasis with mass annual single-dose diethylcarbamazine (DEC) at 6 mg/kg body weight, launched in one of the districts of Tamil Nadu in 1996, was carried out. This exploratory study, proposed for five years in 13 districts across 7 states on a pilot scale through the Department of Public Health, is an additional input to the existing National Filaria Control Programme in India. A retrospective costing exercise was undertaken systematically from the provider's perspective following the completion of the first round of drug distribution. The major activities and cost components were identified and an itemized cost menu was prepared to estimate the direct (financial) and indirect (opportunity) costs related to the implementation of the Programme. The total financial cost of this Programme to cover the 22.7 lakh population of the district was Rs. 22.05 lakhs. The opportunity cost of labour and capital investment was calculated to be Rs. 7.98 lakhs. The total per capita cost was Rs. 1.32, with Rs. 0.97 and Rs. 0.35 as financial and opportunity costs respectively. Based on these estimates, the implementation cost of the Programme at Primary Health Centre (PHC) level was calculated and projected for five years. The additional financial cost to the existing health care system is estimated to be Rs. 27,800 per PHC every year. DEC tablets (50 mg) were the major cost component, and sensitivity analysis showed that the cost of the Programme could be reduced by 20 per cent by switching over to 100 mg tablets. The analysis indicates that this Programme is a low-cost option, and the results are discussed in view of its operational feasibility and epidemiological impact.
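The per-capita figures in the abstract follow directly from the reported totals (1 lakh = 100,000, so the lakhs cancel in the ratios):

```python
# Reproducing the per-capita cost arithmetic from the reported totals.
population_lakh = 22.7    # district population, in lakhs
financial_lakh = 22.05    # total financial cost, Rs. lakhs
opportunity_lakh = 7.98   # total opportunity cost, Rs. lakhs

financial_per_capita = financial_lakh / population_lakh     # Rs. per person
opportunity_per_capita = opportunity_lakh / population_lakh
total_per_capita = financial_per_capita + opportunity_per_capita

# Matches the reported Rs. 0.97, Rs. 0.35, and Rs. 1.32:
assert round(financial_per_capita, 2) == 0.97
assert round(opportunity_per_capita, 2) == 0.35
assert round(total_per_capita, 2) == 1.32
```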

  11. An incremental and distributed inference method for large-scale ontologies based on MapReduce paradigm.

    PubMed

    Liu, Bo; Huang, Keman; Li, Jianqiang; Zhou, MengChu

    2015-01-01

    With the upcoming data deluge of semantic data, the fast growth of ontology bases has brought significant challenges in performing efficient and scalable reasoning. Traditional centralized reasoning methods are not sufficient to process large ontologies. Distributed reasoning methods are thus required to improve the scalability and performance of inference. This paper proposes an incremental and distributed inference method for large-scale ontologies using MapReduce, which realizes high-performance reasoning and runtime searching, especially for incremental knowledge bases. By constructing a transfer inference forest and effective assertional triples, storage is greatly reduced and the reasoning process is simplified and accelerated. Finally, a prototype system is implemented on a Hadoop framework and the experimental results validate the usability and effectiveness of the proposed approach.
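The core MapReduce join behind this kind of distributed RDFS reasoning — deriving x subClassOf y from x subClassOf k and k subClassOf y — can be sketched in a single map/reduce round. This is a generic illustration of the transitivity rule, not the paper's transfer-inference-forest encoding:

```python
from collections import defaultdict

def map_phase(triples):
    """Map: emit each subClassOf triple under both its subject and its
    object key, so that joinable pairs meet at the same reducer."""
    grouped = defaultdict(list)
    for s, p, o in triples:
        if p == "subClassOf":
            grouped[o].append(("left", s))   # s subClassOf <key>
            grouped[s].append(("right", o))  # <key> subClassOf o
    return grouped

def reduce_phase(grouped):
    """Reduce: join (x subClassOf k) with (k subClassOf y)
    to infer (x subClassOf y)."""
    inferred = set()
    for _, vals in grouped.items():
        lefts = [v for tag, v in vals if tag == "left"]
        rights = [v for tag, v in vals if tag == "right"]
        for x in lefts:
            for y in rights:
                if x != y:
                    inferred.add((x, "subClassOf", y))
    return inferred

triples = [("Dog", "subClassOf", "Mammal"),
           ("Mammal", "subClassOf", "Animal")]
new = reduce_phase(map_phase(triples))
```

In a real Hadoop job the grouping is done by the shuffle; iterating the round to a fixed point yields the full transitive closure.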

  12. Large-Scale Spatial Distribution of Virioplankton in the Adriatic Sea: Testing the Trophic State Control Hypothesis

    PubMed Central

    Corinaldesi, C.; Crevatin, E.; Del Negro, P.; Marini, M.; Russo, A.; Fonda-Umani, S.; Danovaro, R.

    2003-01-01

    Little is known concerning environmental factors that may control the distribution of virioplankton on large spatial scales. In previous studies workers reported high viral levels in eutrophic systems and suggested that the trophic state is a possible driving force controlling the spatial distribution of viruses. In order to test this hypothesis, we determined the distribution of viral abundance and bacterial abundance and the virus-to-bacterium ratio in a wide area covering the entire Adriatic basin (Mediterranean Sea). To gather additional information on factors controlling viral distribution on a large scale, functional microbial parameters (exoenzymatic activities, bacterial production and turnover) were related to trophic gradients. At large spatial scales, viral distribution was independent of autotrophic biomass and all other environmental parameters. We concluded that in contrast to what was previously hypothesized, changing trophic conditions do not directly affect virioplankton distribution. Since virus distribution was coupled with bacterial turnover times, our results suggest that viral abundance depends on bacterial activity and on host cell abundance. PMID:12732535

  13. Large-scale spatial distribution of virioplankton in the Adriatic Sea: testing the trophic state control hypothesis.

    PubMed

    Corinaldesi, C; Crevatin, E; Del Negro, P; Marini, M; Russo, A; Fonda-Umani, S; Danovaro, R

    2003-05-01

    Little is known concerning environmental factors that may control the distribution of virioplankton on large spatial scales. In previous studies workers reported high viral levels in eutrophic systems and suggested that the trophic state is a possible driving force controlling the spatial distribution of viruses. In order to test this hypothesis, we determined the distribution of viral abundance and bacterial abundance and the virus-to-bacterium ratio in a wide area covering the entire Adriatic basin (Mediterranean Sea). To gather additional information on factors controlling viral distribution on a large scale, functional microbial parameters (exoenzymatic activities, bacterial production and turnover) were related to trophic gradients. At large spatial scales, viral distribution was independent of autotrophic biomass and all other environmental parameters. We concluded that in contrast to what was previously hypothesized, changing trophic conditions do not directly affect virioplankton distribution. Since virus distribution was coupled with bacterial turnover times, our results suggest that viral abundance depends on bacterial activity and on host cell abundance.

  14. Using large scale surveys to investigate seasonal variations in seabird distribution and abundance. Part I: The North Western Mediterranean Sea

    NASA Astrophysics Data System (ADS)

    Pettex, Emeline; David, Léa; Authier, Matthieu; Blanck, Aurélie; Dorémus, Ghislain; Falchetto, Hélène; Laran, Sophie; Monestiez, Pascal; Van Canneyt, Olivier; Virgili, Auriane; Ridoux, Vincent

    2017-07-01

    Scientific investigations in offshore areas are logistically challenging and expensive; therefore, the available knowledge on seabird at-sea distribution and abundance, as well as their seasonal variations, remains limited. To investigate the seasonal variability in seabird distribution and abundance in the North-Western Mediterranean Sea (NWMS), we conducted two large-scale aerial surveys in winter 2011-12 and summer 2012, covering a 181,400 km2 area. Following a strip-transect method, observers recorded a total of 4141 seabird sightings in winter and 2334 in summer, along 32,213 km. Using geostatistical methods, we generated sightings density maps for both seasons, as well as estimates of density and abundance. Most taxa showed seasonal variations in their density and distribution patterns, as they used the area either for wintering or for breeding. The highest densities of seabirds were recorded during winter, although large-sized shearwaters, storm petrels and terns were more abundant during summer. Consequently, with nearly 170,000 seabirds estimated in winter, the total abundance was twice as high in winter. Coastal waters of the continental shelf were generally more exploited by seabirds, even though some species, such as Mediterranean gulls, black-headed gulls, little gulls and storm petrels, were found at high densities far offshore. Our results revealed areas highly exploited by the seabird community in the NWMS, such as the Gulf of Lion, the Tuscan region, and the area between Corsica and Sardinia. In addition, these large-scale surveys provide a baseline for monitoring seabird at-sea distribution, and could inform the EU Marine Strategy Framework Directive.
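In its simplest form, the strip-transect method mentioned above yields density as individuals per surveyed strip area, scaled to the study area for abundance. A minimal sketch; the strip half-width below is an assumed value (not the survey's), each sighting is treated as a single individual, and no geostatistical modelling is applied, so the result will not reproduce the paper's estimates:

```python
# Minimal strip-transect estimator: detections within the strip are assumed
# complete, so density = individuals / surveyed area, and abundance scales
# density up to the full study area.
def strip_transect_abundance(n_ind, transect_km, half_width_km, study_area_km2):
    surveyed_km2 = 2.0 * half_width_km * transect_km   # both sides of track
    density = n_ind / surveyed_km2                     # individuals per km^2
    return density, density * study_area_km2

# Winter effort figures from the abstract; half-width is an assumption.
density, abundance = strip_transect_abundance(
    n_ind=4141, transect_km=32213, half_width_km=0.1, study_area_km2=181400)
```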

  15. Assessing Impact of Large-Scale Distributed Residential HVAC Control Optimization on Electricity Grid Operation and Renewable Energy Integration

    NASA Astrophysics Data System (ADS)

    Corbin, Charles D.

    Demand management is an important component of the emerging Smart Grid, and a potential solution to the supply-demand imbalance occurring increasingly as intermittent renewable electricity is added to the generation mix. Model predictive control (MPC) has shown great promise for controlling HVAC demand in commercial buildings, making it an ideal solution to this problem. MPC is believed to hold similar promise for residential applications, yet very few examples exist in the literature despite a growing interest in residential demand management. This work explores the potential for residential buildings to shape electric demand at the distribution feeder level in order to reduce peak demand, reduce system ramping, and increase load factor using detailed sub-hourly simulations of thousands of buildings coupled to distribution power flow software. More generally, this work develops a methodology for the directed optimization of residential HVAC operation using a distributed but directed MPC scheme that can be applied to today's programmable thermostat technologies to address the increasing variability in electric supply and demand. Case studies incorporating varying levels of renewable energy generation demonstrate the approach and highlight important considerations for large-scale residential model predictive control.
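The receding-horizon logic at the core of MPC — optimize over a short horizon, apply only the first control action, then repeat — can be sketched with a toy on/off thermostat. The thermal model, parameters, and prices below are illustrative assumptions, not the dissertation's controller:

```python
from itertools import product

# Toy receding-horizon (MPC-style) thermostat: enumerate on/off schedules
# over a short horizon, score comfort deviation plus price-weighted energy,
# and apply only the first action of the best plan.
def mpc_step(T, T_out, prices, setpoint=20.0, a=0.2, b=1.5, horizon=4):
    best_cost, best_u0 = float("inf"), 0
    for plan in product((0, 1), repeat=horizon):
        t, cost = T, 0.0
        for u, price in zip(plan, prices[:horizon]):
            t = t + a * (T_out - t) + b * u        # simple thermal model
            cost += (t - setpoint) ** 2 + price * u
        if cost < best_cost:
            best_cost, best_u0 = cost, plan[0]
    return best_u0

# Cold outdoor air, flat prices: the controller should call for heat.
u = mpc_step(T=18.0, T_out=0.0, prices=[0.5] * 4)
```

In a real deployment the enumeration would be replaced by a proper optimizer and the cost would encode the feeder-level objectives (peak demand, ramping, load factor) the abstract describes.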

  16. A Capacity Design Method of Distributed Battery Storage for Controlling Power Variation with Large-Scale Photovoltaic Sources in Distribution Network

    NASA Astrophysics Data System (ADS)

    Kobayashi, Yasuhiro; Sawa, Toshiyuki; Gunji, Keiko; Yamazaki, Jun; Watanabe, Masahiro

    A design method for distributed battery storage capacity has been developed to evaluate the advantage of battery storage for demand-supply imbalance control in distribution systems to which large-scale home photovoltaic power sources are connected. The proposed method is based on a linear storage-capacity minimization model with design-basis demand load and photovoltaic output time series, subject to battery management constraints. The design method has been experimentally applied to a sample distribution system with substation storage and terminal-area storage. From the numerical results, the developed method successfully clarifies the charge-discharge control and stored power variation, satisfies the peak-cut requirement, and pinpoints the minimum distributed storage capacity.
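The sizing intuition behind such a capacity minimization can be sketched without the paper's linear program: if a battery must flatten a net-load profile to its mean, the minimum energy capacity equals the range of the cumulative deviation from that mean (ignoring power limits and losses, which the actual model's constraints would add):

```python
# Back-of-envelope storage sizing sketch (not the paper's linear model):
# the battery absorbs every deviation of net load from its mean, so the
# required energy capacity is the range of the cumulative imbalance.
def min_storage_capacity(net_load_kw, dt_h=1.0):
    mean = sum(net_load_kw) / len(net_load_kw)
    soc, lo, hi = 0.0, 0.0, 0.0
    for p in net_load_kw:
        soc += (p - mean) * dt_h       # battery absorbs the imbalance
        lo, hi = min(lo, soc), max(hi, soc)
    return hi - lo                     # kWh

# Demand minus PV output over six hours (illustrative values).
capacity = min_storage_capacity([3.0, 1.0, -2.0, -2.0, 1.0, 5.0])
```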

  17. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems.

    PubMed

    Albattat, Ali; Gruenwald, Benjamin C; Yucelen, Tansel

    2016-08-16

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems that consist of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections with each other. In addition to the theoretical findings, including rigorous system stability and boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches.
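The event-triggering idea itself (only the triggering rule, not the paper's adaptive laws) reduces to transmitting a module's state over the network when it has drifted from the last transmitted value by more than a user-defined threshold, which is how network utilization is reduced relative to periodic broadcasting:

```python
# Sketch of an event-triggered transmission rule: rebroadcast only when
# the state deviates from the last transmitted value by more than a
# threshold; the receiver holds the last transmitted value in between.
def event_triggered_trace(states, threshold):
    last_sent = states[0]
    sent = [states[0]]
    events = 1                          # initial transmission
    for x in states[1:]:
        if abs(x - last_sent) > threshold:
            last_sent = x
            events += 1                 # triggering event: transmit
        sent.append(last_sent)
    return sent, events

trace = [0.0, 0.05, 0.12, 0.13, 0.30, 0.31, 0.32]
sent, events = event_triggered_trace(trace, threshold=0.1)
```

Here 3 transmissions replace 7 periodic ones; the threshold trades network load against the state error the controller must tolerate, which is the effect the paper characterizes.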

  18. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill; Feiereisen, William (Technical Monitor)

    2000-01-01

    The term "Grid" refers to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks that will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g., determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientists who convert physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. This paper describes the current state of IPG (the operational testbed), the set of capabilities being put into place for the operational prototype IPG, as well as some of the longer term R&D tasks.

  19. On Event-Triggered Adaptive Architectures for Decentralized and Distributed Control of Large-Scale Modular Systems

    PubMed Central

    Albattat, Ali; Gruenwald, Benjamin C.; Yucelen, Tansel

    2016-01-01

    The last decade has witnessed an increased interest in physical systems controlled over wireless networks (networked control systems). These systems allow the computation of control signals via processors that are not attached to the physical systems, and the feedback loops are closed over wireless networks. The contribution of this paper is to design and analyze event-triggered decentralized and distributed adaptive control architectures for uncertain networked large-scale modular systems; that is, systems that consist of physically-interconnected modules controlled over wireless networks. Specifically, the proposed adaptive architectures guarantee overall system stability while reducing wireless network utilization and achieving a given system performance in the presence of system uncertainties that can result from modeling and degraded modes of operation of the modules and their interconnections with each other. In addition to the theoretical findings, including rigorous system stability and boundedness analysis of the closed-loop dynamical system, as well as the characterization of the effect of user-defined event-triggering thresholds and the design parameters of the proposed adaptive architectures on the overall system performance, an illustrative numerical example is provided to demonstrate the efficacy of the proposed decentralized and distributed control approaches. PMID:27537894

  20. Large-scale determinants of intestinal schistosomiasis and intermediate host snail distribution across Africa: does climate matter?

    PubMed

    Stensgaard, Anna-Sofie; Utzinger, Jürg; Vounatsou, Penelope; Hürlimann, Eveline; Schur, Nadine; Saarnak, Christopher F L; Simoonga, Christopher; Mubita, Patricia; Kabatereine, Narcis B; Tchuem Tchuenté, Louis-Albert; Rahbek, Carsten; Kristensen, Thomas K

    2013-11-01

    The geographical ranges of most species, including many infectious disease agents and their vectors and intermediate hosts, are assumed to be constrained by climatic tolerances, mainly temperature. It has been suggested that global warming will cause an expansion of the areas potentially suitable for infectious disease transmission. However, the transmission of infectious diseases is governed by a myriad of ecological, economic, evolutionary and social factors. Hence, a deeper understanding of the total disease system (pathogens, vectors and hosts) and its drivers is important for predicting responses to climate change. Here, we combine a growing degree day model for Schistosoma mansoni with species distribution models for the intermediate host snail (Biomphalaria spp.) to investigate large-scale environmental determinants of the distribution of the African S. mansoni-Biomphalaria system and potential impacts of climatic changes. Snail species distribution models included several combinations of climatic and habitat-related predictors; the latter divided into "natural" and "human-impacted" habitat variables to measure anthropogenic influence. The predictive performance of the combined snail-parasite model was evaluated against a comprehensive compilation of historical S. mansoni parasitological survey records, and then examined for two climate change scenarios of increasing severity for 2080. Future projections indicate that while the potential S. mansoni transmission area expands, the snail ranges are more likely to contract and/or move into cooler areas in the south and east. Importantly, we also note that even though climate per se matters, the impact of humans on habitat play a crucial role in determining the distribution of the intermediate host snails in Africa. Thus, a future contraction in the geographical range size of the intermediate host snails caused by climatic changes does not necessarily translate into a decrease or zero-sum change in human
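A growing degree day (GDD) model in its standard form accumulates daily mean temperature above a base threshold; the base temperature and temperature series below are illustrative, not the study's calibrated values for S. mansoni:

```python
# Standard growing degree day (GDD) accumulation: for each day, add the
# amount by which the daily mean temperature exceeds a base temperature.
def growing_degree_days(tmax, tmin, t_base):
    gdd = 0.0
    for hi, lo in zip(tmax, tmin):
        gdd += max(0.0, (hi + lo) / 2.0 - t_base)
    return gdd

# Four illustrative days; the last day's mean falls below the base and
# contributes nothing.
gdd = growing_degree_days(tmax=[30, 28, 22, 18], tmin=[20, 18, 12, 8],
                          t_base=15.0)
```

In a study like the one above, accumulated GDD would gate where parasite development is thermally possible, and the snail distribution models would constrain where transmission can actually occur.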

  1. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) A comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems.
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  2. Global direct pressures on biodiversity by large-scale metal mining: Spatial distribution and implications for conservation.

    PubMed

    Murguía, Diego I; Bringezu, Stefan; Schaldach, Rüdiger

    2016-09-15

    Biodiversity loss is widely recognized as a serious global environmental change process. While large-scale metal mining activities do not belong to the top drivers of such change, these operations exert or may intensify pressures on biodiversity by adversely changing habitats, directly and indirectly, at local and regional scales. So far, analyses of global spatial dynamics of mining and its burden on biodiversity focused on the overlap between mines and protected areas or areas of high value for conservation. However, it is less clear how operating metal mines are globally exerting pressure on zones of different biodiversity richness; a similar gap exists for unmined but known mineral deposits. By using vascular plants' diversity as a proxy to quantify overall biodiversity, this study provides a first examination of the global spatial distribution of mines and deposits for five key metals across different biodiversity zones. The results indicate that mines and deposits are not randomly distributed, but concentrated within intermediate and high diversity zones, especially bauxite and silver. In contrast, iron, gold, and copper mines and deposits are closer to a more proportional distribution while showing a high concentration in the intermediate biodiversity zone. Considering the five metals together, 63% and 61% of available mines and deposits, respectively, are located in intermediate diversity zones, comprising 52% of the global land terrestrial surface. 23% of mines and 20% of ore deposits are located in areas of high plant diversity, covering 17% of the land. 13% of mines and 19% of deposits are in areas of low plant diversity, comprising 31% of the land surface. Thus, there seems to be potential for opening new mines in areas of low biodiversity in the future. Copyright © 2016 Elsevier Ltd. All rights reserved.
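The abstract's concentration claim can be made concrete as a representation ratio: a zone's share of mines divided by its share of land area, where values above 1 indicate mines concentrated relative to the zone's extent. A sketch using the percentages quoted above:

```python
# Representation ratio per biodiversity zone, built from the shares quoted
# in the abstract (mines across the five metals vs. global land surface).
mine_share = {"low": 0.13, "intermediate": 0.63, "high": 0.23}
land_share = {"low": 0.31, "intermediate": 0.52, "high": 0.17}

# ratio > 1: mines over-represented relative to the zone's land area.
ratio = {zone: mine_share[zone] / land_share[zone] for zone in mine_share}
```

On these figures, mines are under-represented in low-diversity zones and over-represented in intermediate and high-diversity zones, which is the spatial pattern the paper reports.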

  3. The VVDS: Early Results on the Large Scale Structure Distribution of Galaxies out to z ˜ 1.5

    NASA Astrophysics Data System (ADS)

    Le Fèvre, O.; Vettolani, G.; Maccagni, D.; Picat, J. P.; Adami, C.; Arnaboldi, M.; Arnouts, S.; Bardelli, S.; Bolzonella, M.; Bondi, M.; Bottini, D.; Busarello, G.; Cappi, A.; Ciliegi, P.; Contini, T.; Charlot, S.; Foucaud, S.; Franzetti, P.; Garilli, B.; Gavignaud, I.; Guzzo, L.; Ilbert, O.; Iovino, A.; Le Brun, V.; Marano, B.; Marinoni, C.; McCracken, H. J.; Mathez, G.; Mazure, A.; Mellier, Y.; Meneux, B.; Merluzzi, P.; Merighi, R.; Paltani, S.; Pellò, R.; Pollo, A.; Pozzetti, L.; Radovich, M.; Rizzo, D.; Scaramella, R.; Scodeggio, M.; Tresse, L.; Zamorani, G.; Zanichelli, A.; Zucca, E.

    The VIMOS VLT Deep Survey (VVDS) is an on-going program to map the evolution of galaxies, large scale structures and AGNs from the redshift measurement of more than 100,000 objects down to a magnitude IAB=24, in combination with a multi-wavelength dataset from radio to X-rays. We present here the first results obtained from more than 20,000 spectra. Dedicated effort has been invested to successfully enter the "redshift desert" beyond z ~ 1.5. We present the redshift distribution of a well-controlled sample of faint galaxies, and show that significant clustering is detected out to z ~ 1.5.

  4. Suzaku Observation of A1689: Anisotropic Temperature and Entropy Distributions Associated with the Large-scale Structure

    NASA Astrophysics Data System (ADS)

    Kawaharada, Madoka; Okabe, Nobuhiro; Umetsu, Keiichi; Takizawa, Motokazu; Matsushita, Kyoko; Fukazawa, Yasushi; Hamana, Takashi; Miyazaki, Satoshi; Nakazawa, Kazuhiro; Ohashi, Takaya

    2010-05-01

    We present results of new, deep Suzaku X-ray observations (160 ks) of the intracluster medium (ICM) in A1689 out to its virial radius, combined with complementary data sets of the projected galaxy distribution obtained from the SDSS catalog and the projected mass distribution from our recent comprehensive weak and strong lensing analysis of Subaru/Suprime-Cam and Hubble Space Telescope/Advanced Camera for Surveys observations. Faint X-ray emission from the ICM around the virial radius (r_vir ~ 15.6') is detected at 4.0σ significance, thanks to the low and stable particle background of Suzaku. The Suzaku observations reveal anisotropic gas temperature and entropy distributions in cluster outskirts of r_500 ≲ r ≲ r_vir correlated with large-scale structure of galaxies in a photometric redshift slice around the cluster. The high temperature (~5.4 keV) and entropy region in the northeastern (NE) outskirts is apparently connected to an overdense filamentary structure of galaxies outside the cluster. The gas temperature and entropy profiles in the NE direction are in good agreement, out to the virial radius, with that expected from a recent XMM-Newton statistical study and with an accretion shock heating model of the ICM, respectively. On the contrary, the other outskirt regions in contact with low-density void environments have low gas temperatures (~1.7 keV) and entropies, deviating from hydrostatic equilibrium. These anisotropic ICM features associated with large-scale structure environments suggest that the thermalization of the ICM occurs faster along overdense filamentary structures than along low-density void regions. We find that the ICM density distribution is fairly isotropic, with a three-dimensional density slope of -2.29 ± 0.18 in the radial range r_2500 ≲ r ≲ r_500, and -1.24 (+0.23/-0.56) in r_500 ≲ r ≲ r_vir, which, however, is significantly shallower than the Navarro, Frenk, and White universal matter density profile in the outskirts,

  5. SUZAKU OBSERVATION OF A1689: ANISOTROPIC TEMPERATURE AND ENTROPY DISTRIBUTIONS ASSOCIATED WITH THE LARGE-SCALE STRUCTURE

    SciTech Connect

    Kawaharada, Madoka; Okabe, Nobuhiro; Umetsu, Keiichi; Takizawa, Motokazu; Matsushita, Kyoko; Fukazawa, Yasushi; Hamana, Takashi; Miyazaki, Satoshi; Nakazawa, Kazuhiro; Ohashi, Takaya

    2010-05-01

    We present results of new, deep Suzaku X-ray observations (160 ks) of the intracluster medium (ICM) in A1689 out to its virial radius, combined with complementary data sets of the projected galaxy distribution obtained from the SDSS catalog and the projected mass distribution from our recent comprehensive weak and strong lensing analysis of Subaru/Suprime-Cam and Hubble Space Telescope/Advanced Camera for Surveys observations. Faint X-ray emission from the ICM around the virial radius (r_vir ~ 15.6') is detected at 4.0σ significance, thanks to the low and stable particle background of Suzaku. The Suzaku observations reveal anisotropic gas temperature and entropy distributions in cluster outskirts of r_500 ≲ r ≲ r_vir correlated with large-scale structure of galaxies in a photometric redshift slice around the cluster. The high temperature (~5.4 keV) and entropy region in the northeastern (NE) outskirts is apparently connected to an overdense filamentary structure of galaxies outside the cluster. The gas temperature and entropy profiles in the NE direction are in good agreement, out to the virial radius, with that expected from a recent XMM-Newton statistical study and with an accretion shock heating model of the ICM, respectively. On the contrary, the other outskirt regions in contact with low-density void environments have low gas temperatures (~1.7 keV) and entropies, deviating from hydrostatic equilibrium. These anisotropic ICM features associated with large-scale structure environments suggest that the thermalization of the ICM occurs faster along overdense filamentary structures than along low-density void regions. We find that the ICM density distribution is fairly isotropic, with a three-dimensional density slope of -2.29 ± 0.18 in the radial range r_2500 ≲ r ≲ r_500, and -1.24 (+0.23/-0.56) in r_500 ≲ r ≲ r_vir, which

  6. Using Distributed Fiber Optic Sensing to Monitor Large Scale Permafrost Transitions: Preliminary Results from a Controlled Thaw Experiment

    NASA Astrophysics Data System (ADS)

    Ajo Franklin, J. B.; Wagner, A. M.; Lindsey, N.; Dou, S.; Bjella, K.; Daley, T. M.; Freifeld, B. M.; Ulrich, C.; Gelvin, A.; Morales, A.; James, S. R.; Saari, S.; Ekblaw, I.; Wood, T.; Robertson, M.; Martin, E. R.

    2016-12-01

    In a warming world, permafrost landscapes are being rapidly transformed by thaw, yielding surface subsidence and groundwater flow alteration. The same transformations pose a threat to arctic infrastructure and can induce catastrophic failure of the roads, runways, and pipelines on which human habitation depends. Scalable solutions to monitoring permafrost thaw dynamics are required both to quantitatively understand biogeochemical feedbacks and to protect built infrastructure from damage. Unfortunately, permafrost alteration happens over the time scale of climate change, years to decades, a decided challenge for testing new sensing technologies in a limited context. One solution is to engineer systems capable of rapidly thawing large permafrost units to allow short duration experiments targeting next-generation sensing approaches. We present preliminary results from a large-scale controlled permafrost thaw experiment designed to evaluate the utility of different geophysical approaches for tracking the cause, precursors, and early phases of thaw subsidence. We focus on the use of distributed fiber optic sensing for this challenge and deployed distributed temperature (DTS), strain (DSS), and acoustic (DAS) sensing systems in a 2D array to detect thaw signatures. A 10 x 15 x 1 m section of subsurface permafrost was heated using an array of 120 downhole heaters (60 W) at an experimental site near Fairbanks, AK. Ambient noise analysis of DAS datasets collected at the plot, coupled to shear wave inversion, was utilized to evaluate changes in shear wave velocity associated with heating and thaw. These measurements were confirmed by seismic surveys collected using a semi-permanent orbital seismic source activated on a daily basis. Fiber optic measurements were complemented by subsurface thermistor and thermocouple arrays, timelapse total station surveys, LIDAR, secondary seismic measurements (geophone and broadband recordings), timelapse ERT, borehole NMR, soil

  7. Coupling a distributed hydrological model with detailed forest structural information for large-scale global change impact assessment

    NASA Astrophysics Data System (ADS)

    Eisner, Stephanie; Huang, Shaochun; Majasalmi, Titta; Bright, Ryan; Astrup, Rasmus; Beldring, Stein

    2017-04-01

    Forests are recognized for their decisive effect on landscape water balance, with structural forest characteristics such as stand density or species composition determining energy partitioning and dominant flow paths. However, spatial and temporal variability in forest structure is often poorly represented in hydrological modeling frameworks, in particular in regional to large scale hydrological modeling and impact analysis. As a common practice, prescribed land cover classes (including different generic forest types) are linked to parameter values derived from the literature, or parameters are determined by calibration. While national forest inventory (NFI) data provide comprehensive, detailed information on hydrologically relevant forest characteristics, their potential to inform hydrological simulation over larger spatial domains is rarely exploited. In this study we present a modeling framework that couples the distributed hydrological model HBV with forest structural information derived from the Norwegian NFI and multi-source remote sensing data. The modeling framework, set up for the whole of continental Norway at 1 km spatial resolution, is explicitly designed to study the combined and isolated impacts of climate change, forest management and land use change on hydrological fluxes. We use a forest classification system based on forest structure rather than biomes, which allows us to implicitly account for impacts of forest management on forest structural attributes. In the hydrological model, different forest classes are represented by three parameters: leaf area index (LAI), mean tree height and surface albedo. Seasonal cycles of LAI and surface albedo are dynamically simulated to make the framework applicable under climate change conditions. Based on a hindcast for the pilot regions Nord-Trøndelag and Sør-Trøndelag, we show how forest management has affected regional hydrological fluxes during the second half of the 20th century as contrasted to climate variability.

  8. Occurrence, distribution and bioaccumulation behaviour of hydrophobic organic contaminants in a large-scale constructed wetland in Singapore.

    PubMed

    Wang, Qian; Kelly, Barry C

    2017-09-01

    This study involved a field-based investigation to assess the occurrence, distribution and bioaccumulation behaviour of hydrophobic organic contaminants in a large-scale constructed wetland. Samples of raw leachate, water and wetland plants, Typha angustifolia, were collected for chemical analysis. Target contaminants included polychlorinated biphenyls (PCBs), organochlorine pesticides (OCPs), as well as several halogenated flame retardants (HFRs) and personal care products (triclosan and synthetic musks). In addition to PCBs and OCPs, synthetic musks, triclosan (TCS) and dechlorane plus stereoisomers (syn- and anti-DPs) were frequently detected. Root concentration factors (log RCF, L/kg wet weight) of the various contaminants ranged between 3.0 and 7.9. Leaf concentration factors (log LCF, L/kg wet weight) ranged between 2.4 and 8.2. syn- and anti-DPs exhibited the greatest RCF and LCF values. A strong linear relationship was observed between log RCF and the octanol-water partition coefficient (log KOW). Translocation factors (log TFs) were negatively correlated with log KOW. The results demonstrate that more hydrophobic compounds exhibit higher degrees of partitioning into plant roots and are less effectively transported from roots to plant leaves. Methyl triclosan (MTCS) and 2,8-dichlorodibenzo-p-dioxin (DCDD), TCS degradation products, exhibited relatively high concentrations in roots and leaves, highlighting the importance of degradation/biotransformation. The results further suggest that Typha angustifolia in this constructed wetland can aid the removal of hydrophobic organic contaminants present in this landfill leachate. The findings will aid future investigations regarding the fate and bioaccumulation of hydrophobic organic contaminants in constructed wetlands. Copyright © 2017 Elsevier Ltd. All rights reserved.
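The reported linear relationship between log RCF and log KOW is, in essence, an ordinary least-squares fit in log space. A sketch with hypothetical values (the study's measurements are not reproduced here):

```python
# Least-squares line fit in log space, as in a log RCF vs log KOW
# relationship. All data points below are hypothetical illustrations.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx      # slope and intercept

log_kow = [4.0, 5.0, 6.0, 7.0]         # hypothetical partition coefficients
log_rcf = [3.2, 4.1, 5.3, 6.0]         # hypothetical root conc. factors
slope, intercept = fit_line(log_kow, log_rcf)
```

A positive slope corresponds to the finding that more hydrophobic compounds partition more strongly into roots; for translocation factors, the analogous fit would yield a negative slope.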

  9. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. 
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to metrological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation systems}; (3

  10. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. 
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3

  11. Implementation of Pilot Protection System for Large Scale Distribution System like The Future Renewable Electric Energy Distribution Management Project

    NASA Astrophysics Data System (ADS)

    Iigaya, Kiyohito

    A robust, fast, and accurate protection system based on the pilot protection concept was developed previously; a few alterations were made to that algorithm to make it faster and more reliable, and it was then applied to smart distribution grids to verify the results. The new 10-sample window method was adapted into the pilot protection program, and its performance for the test bed system operation was tabulated. The hardware results for the same algorithm were then compared with the simulation results. The development of the dual-slope percentage differential method, its comparison with the 10-sample average window pilot protection system, and the effects of CT saturation on the pilot protection system are also shown in this thesis. The 10-sample average window pilot protection system was implemented on multiple distribution grids, including Green Hub v4.3, IEEE 34, the LSSS loop, and the modified LSSS loop. Case studies of these multi-terminal models are presented, and the results are also shown in this thesis. The results obtained show that the new algorithm successfully identifies faults on the test bed, that the hardware and software simulation results match, and that the response time is less than approximately a quarter of a cycle, which is fast compared with present commercial protection systems and satisfies the FREEDM system requirement.
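
    The windowed differential comparison described above can be sketched in a few lines. This is a hedged illustration only: the pickup and slope thresholds, the restraint formula, and the single-quantity current samples are assumptions for demonstration, not the parameters or phase handling of the thesis's actual algorithm.

```python
from collections import deque

def make_pilot_relay(window=10, pickup=0.2, slope=0.3):
    """Current-differential trip check over an n-sample averaging window.

    The relay is fed synchronized per-sample currents measured at the two
    terminals of the protected zone (signs taken as flowing *into* the zone).
    It trips when the windowed average differential current exceeds a
    percentage-restraint characteristic (illustrative thresholds).
    """
    diffs = deque(maxlen=window)       # |i_local + i_remote| samples
    restraints = deque(maxlen=window)  # (|i_local| + |i_remote|) / 2 samples

    def step(i_local, i_remote):
        diffs.append(abs(i_local + i_remote))
        restraints.append((abs(i_local) + abs(i_remote)) / 2)
        if len(diffs) < window:
            return False               # wait until the window is full
        i_diff = sum(diffs) / window
        i_rest = sum(restraints) / window
        return i_diff > max(pickup, slope * i_rest)

    return step

relay = make_pilot_relay()
# Through-load: current enters one terminal and leaves the other,
# so the differential is ~0 and the relay stays quiet.
load = [relay(1.0, -1.0) for _ in range(12)]
# Internal fault: both terminals feed the fault (same sign), so the
# differential is large and the relay trips once the window fills.
fault_relay = make_pilot_relay()
fault = [fault_relay(2.0, 1.5) for _ in range(12)]
print(load[-1], fault[-1])  # False True
```

    With a 10-sample window at typical relaying sample rates, the decision arrives well within a power cycle, consistent with the sub-quarter-cycle response the abstract reports.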

  12. Novel probabilistic and distributed algorithms for guidance, control, and nonlinear estimation of large-scale multi-agent systems

    NASA Astrophysics Data System (ADS)

    Bandyopadhyay, Saptarshi

    Multi-agent systems are widely used for constructing a desired formation shape, exploring an area, surveillance, coverage, and other cooperative tasks. This dissertation introduces novel algorithms in the three main areas of shape formation, distributed estimation, and attitude control of large-scale multi-agent systems. In the first part of this dissertation, we address the problem of shape formation for thousands to millions of agents. Here, we present two novel algorithms for guiding a large-scale swarm of robotic systems into a desired formation shape in a distributed and scalable manner. These probabilistic swarm guidance algorithms adopt an Eulerian framework, where the physical space is partitioned into bins and the swarm's density distribution over each bin is controlled using tunable Markov chains. In the first algorithm - Probabilistic Swarm Guidance using Inhomogeneous Markov Chains (PSG-IMC) - each agent determines its bin transition probabilities using a time-inhomogeneous Markov chain that is constructed in real-time using feedback from the current swarm distribution. This PSG-IMC algorithm minimizes the expected cost of the transitions required to achieve and maintain the desired formation shape, even when agents are added to or removed from the swarm. The algorithm scales well with a large number of agents and complex formation shapes, and can also be adapted for area exploration applications. In the second algorithm - Probabilistic Swarm Guidance using Optimal Transport (PSG-OT) - each agent determines its bin transition probabilities by solving an optimal transport problem, which is recast as a linear program. In the presence of perfect feedback of the current swarm distribution, this algorithm minimizes the given cost function, guarantees faster convergence, reduces the number of transitions for achieving the desired formation, and is robust to disturbances or damages to the formation. 
We demonstrate the effectiveness of these two proposed swarm
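
    The Eulerian, Markov-chain idea behind these guidance algorithms can be illustrated with a toy sketch: partition space into bins, build a transition matrix whose stationary distribution equals the desired swarm density, and let every agent hop between bins independently. Here a standard Metropolis-Hastings construction stands in for the dissertation's feedback-driven, time-inhomogeneous chain; the bin count, adjacency, and target distribution are illustrative assumptions.

```python
import random

def metropolis_chain(pi, neighbors):
    """Transition matrix P with stationary distribution pi, built from a
    uniform random-walk proposal on the bin adjacency graph plus the
    Metropolis-Hastings acceptance rule."""
    n = len(pi)
    P = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in neighbors[i]:
            propose = 1.0 / len(neighbors[i])
            accept = min(1.0, (pi[j] / len(neighbors[j])) / (pi[i] / len(neighbors[i])))
            P[i][j] = propose * accept
        P[i][i] = 1.0 - sum(P[i])  # stay put on rejection
    return P

random.seed(0)
pi = [0.1, 0.2, 0.3, 0.4]   # desired fraction of the swarm in each bin
neighbors = [[j for j in range(4) if j != i] for i in range(4)]
P = metropolis_chain(pi, neighbors)

bins = [0] * 2000            # all agents start in bin 0
for _ in range(100):         # each agent transitions independently
    bins = [random.choices(range(4), weights=P[b])[0] for b in bins]
fractions = [bins.count(b) / len(bins) for b in range(4)]
print([round(f, 2) for f in fractions])  # approaches pi
```

    Because each agent only needs its own bin and the (shared) transition matrix, the scheme is distributed and scales with swarm size, which is the core appeal the abstract describes.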

  13. Robust scalable stabilisability conditions for large-scale heterogeneous multi-agent systems with uncertain nonlinear interactions: towards a distributed computing architecture

    NASA Astrophysics Data System (ADS)

    Manfredi, Sabato

    2016-06-01

    Large-scale dynamic systems are becoming highly pervasive, with applications ranging from systems biology and environmental monitoring to sensor networks and power systems. They are characterised by high dimensionality, complexity, and uncertainty in the node dynamics/interactions, and they require increasingly computationally demanding methods for analysis and control design as the network size and the node/interaction complexity grow. Finding scalable computational methods for the distributed control design of large-scale networks is therefore a challenging problem. In this paper, we investigate the robust distributed stabilisation problem of large-scale nonlinear multi-agent systems (briefly, MASs) composed of non-identical (heterogeneous) linear dynamical systems coupled by uncertain nonlinear time-varying interconnections. By employing Lyapunov stability theory and the linear matrix inequality (LMI) technique, new conditions are given for the distributed control design of large-scale MASs that can be easily solved with the MATLAB toolbox. Stabilisability of each node dynamic is a sufficient assumption for designing a globally stabilising distributed control. The proposed approach improves some of the existing LMI-based results on MASs by both overcoming their computational limits and extending the applicative scenario to large-scale nonlinear heterogeneous MASs. Additionally, the proposed LMI conditions are further reduced in computational requirements in the case of weakly heterogeneous MASs, a common scenario in real applications where the network nodes and links are affected by parameter uncertainties. 
One of the main advantages of the proposed approach is that it allows moving from a centralised towards a distributed computing architecture, so that the expensive computational workload spent solving LMIs may be shared among processors located at the networked nodes, thus increasing the scalability of the approach as the network
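
    The simplest member of the LMI family used here, the Lyapunov inequality AᵀP + PA ≺ 0, can be certified without a semidefinite-programming solver: pick Q ≻ 0, solve the Lyapunov equation AᵀP + PA = -Q, and check that P is positive definite. A minimal NumPy sketch (illustrative only; the paper's distributed MAS conditions are full LMIs, not plain Lyapunov equations):

```python
import numpy as np

def lyapunov_P(A, Q):
    """Solve A^T P + P A = -Q via Kronecker vectorization:
    (I (x) A^T + A^T (x) I) vec(P) = -vec(Q), column-major vec."""
    n = A.shape[0]
    I = np.eye(n)
    M = np.kron(I, A.T) + np.kron(A.T, I)
    vecP = np.linalg.solve(M, -Q.flatten(order="F"))
    return vecP.reshape((n, n), order="F")

A = np.array([[-1.0, 1.0],
              [0.0, -2.0]])           # Hurwitz: eigenvalues -1, -2
P = lyapunov_P(A, np.eye(2))
# A symmetric positive definite P certifies asymptotic stability of
# x' = A x, the single-node analogue of the paper's stabilisability assumption.
print(np.linalg.eigvalsh((P + P.T) / 2))  # all positive
```

    The identity used, vec(AXB) = (Bᵀ ⊗ A) vec(X), is also how LMI solvers internally flatten matrix inequalities, so the sketch hints at why the computational cost grows quickly with node dimension.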

  14. Which spatial discretization for distributed hydrological models? Proposition of a methodology and illustration for medium to large-scale catchments

    NASA Astrophysics Data System (ADS)

    Dehotin, J.; Braud, I.

    2008-05-01

    discretization). The first part of the paper presents a review of catchment discretization in hydrological models, from which we derive the principles of our general methodology. The second part of the paper focuses on the derivation of hydro-landscape units for medium- to large-scale catchments. For this sub-catchment discretization, we propose the use of principles borrowed from landscape classification. These principles are independent of catchment size and allow retaining the features required in the catchment description to fulfil a specific modelling objective. The method leads to unstructured, homogeneous areas within the sub-catchments, which can be used to derive modelling meshes. It avoids smoothing the map by suppressing the smallest units, whose role can be very important in hydrology, and provides a confidence map (the distance map) for the classification. The confidence map can be used for further uncertainty analysis of modelling results. The final discretization remains consistent with the resolution of the input data and that of the source maps. The last part of the paper illustrates the method using available data for the upper Saône catchment in France. The value of the method for efficiently representing landscape heterogeneity is illustrated by a comparison with more traditional mapping approaches. Examples of possible models that can be built on this spatial discretization are finally given as perspectives for the work.

  15. Impact of Distribution-Connected Large-Scale Wind Turbines on Transmission System Stability during Large Disturbances: Preprint

    SciTech Connect

    Zhang, Y.; Allen, A.; Hodge, B. M.

    2014-02-01

    This work examines the dynamic impacts of distributed utility-scale wind power during contingency events on both the distribution system and the transmission system. It is the first step toward investigating high penetrations of distribution-connected wind power's impact on both distribution and transmission stability.

  16. The typical MSW odorants identification and the spatial odorants distribution in a large-scale transfer station.

    PubMed

    Sun, Zhongtao; Cheng, Zhaowen; Wang, Luochun; Lou, Ziyang; Zhu, Nanwen; Zhou, Xuejun; Feng, Lili

    2017-03-01

    Odorants from municipal solid waste (MSW) are complex and variable, and screening the key offensive odorants is a prerequisite for the odor control process. In this study, spatial odor emissions and environmental impacts were investigated at a large-scale working waste transfer station (LSWTS) using a waste container system, and a comprehensive odor characterization method was developed and applied in terms of odor concentration (OC), theoretical odor concentration (TOC), total chemical concentration (TCC), and electronic nose (EN). The detected odor concentration ranged from 14 to 28 (dimensionless), and the MSW container showed the highest OC value of 28, EN of 78, and TCC of 35 ppm due to the accumulation of leachate and residual MSW. Ninety-two odorant species were identified; H2S, NH3, benzene, styrene, ethyl acetate, and dichloromethane were the main contributors in the container, while benzene, m,p-xylene, butanone, acetone, isopropanol, and ethyl acetate were predominant at the compression surface (CS) and compression plant (CP). The sides of roads (SR) and the unloading hall (UH) showed low odorous impact. Based on this odorant list, 20 odorous substances were screened for priority control through a synthetic evaluation method, considering odorant concentrations, toxicity, threshold values, detection frequency, saturated vapor pressure, and frequency of occurrence.

  17. Large-scale synchrony of gap dynamics and the distribution of understory tree species in maple-beech forests.

    PubMed

    Gravel, Dominique; Beaudet, Marilou; Messier, Christian

    2010-01-01

    Large-scale synchronous variations in community dynamics are well documented for a vast array of organisms, but are considerably less understood for forest trees. Because of temporal variations in canopy gap dynamics, forest communities, even old-growth ones, are never at equilibrium at the stand scale. This lack of equilibrium may also hold at the regional scale. Our objectives were to determine (1) whether nonequilibrium dynamics caused by temporal variations in the formation of canopy gaps are regionally synchronized, and (2) whether spatiotemporal variations in canopy gap formation affect the relative abundance of tree species in the understory. We examined these questions by analyzing variations in the suppression and release history of Acer saccharum Marsh. and Fagus grandifolia Ehrh. from 481 growth series of understory saplings taken from 34 mature stands. We observed that (1) the proportion of stems in release as a function of time exhibited a U-shaped pattern over the last 35 years, with the lowest levels occurring during 1975-1985, and (2) the response in terms of species composition was that A. saccharum became more abundant at sites that had the highest proportion of stems in release during 1975-1985. We conclude that understory dynamics, typically thought of as a stand-scale process, may be regionally synchronized.

  18. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  19. Geospatial analysis and distribution patterns of home nursing care in a metropolitan area - a large-scale analysis.

    PubMed

    Bauer, Jan; Reinhard, Julia; Boll, Michael; Groneberg, David

    2017-01-01

    This study focuses on the distribution of home nursing care in an urban setting in Germany, which faces a shortage of nursing care workforce. A geospatial analysis was performed to examine distribution patterns at the district level in Frankfurt, Germany (n = 46 districts), and factors influencing the location choice of home nursing care providers (n = 151) were analysed. Within the analysis we focused on the population aged over 65 years to model the demand for nursing care. The analysis revealed a tendency of home nursing care providers to be located near the city centre (a centripetal distribution pattern). The demand for care showed more inconsistent patterns, although a centripetal distribution pattern of demand could still be stated. Compared with the control groups (e.g. acute hospitals and pharmacies), similar geographical distribution patterns were present; however, the location of home nursing care providers was less influenced by demand than that of the control groups. The supply of nursing care was unevenly distributed in this metropolitan setting, but still matched the demand for nursing care. Due to rapidly changing health care environments, policy regulations must be critically (re-)evaluated to improve the management and delivery of nursing care provision. © 2016 John Wiley & Sons Ltd.

  20. An Efficient Framework for Large Scale Multimedia Content Distribution in P2P Network: I2NC

    PubMed Central

    Anandaraj, M.; Ganeshkumar, P.; Vijayakumar, K. P.; Selvaraj, K.

    2015-01-01

    Network coding (NC) makes content distribution more effective and easier in P2P content distribution networks and reduces the burden on the original seeder. It generalizes traditional network routing by allowing intermediate nodes to generate new coded packets by combining received packets. The randomization introduced by network coding makes all packets equally important, resolves the problem of locating the rarest block, and reduces traffic in the network. In this paper, we analyze the performance of traditional network coding in a P2P content distribution network using a mathematical model and prove that traffic reduction is not fully achieved in a P2P network using traditional network coding. This is due to the redundant transmission of non-innovative information blocks among the peers in the network. Hence, we propose a new framework, called I2NC (intelligent-peer selection and incremental-network coding), to eliminate the unnecessary flooding of non-innovative coded packets and thereby further improve the performance of network coding in P2P content distribution. A comparative study and analysis of the proposed system across various related implementations shows that traffic is reduced by 10-15% and that the average and maximum download times are improved by reducing the original seeder's workload. PMID:26605375
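
    The innovative/non-innovative distinction that motivates I2NC is easy to make concrete with a toy receiver for random linear network coding over GF(2): a coded packet helps only if its coefficient vector is linearly independent of those already held, and dependent packets are pure overhead. The sketch below is illustrative (deployed systems typically code over GF(2^8) within generations), not the paper's actual protocol:

```python
import random

class GF2Decoder:
    """Receiver for random linear network coding over GF(2).

    A coded packet is (mask, payload): `payload` is the XOR of the source
    blocks selected by the coefficient bitmask `mask`. Gaussian elimination
    keeps the received coefficient vectors in row-echelon form.
    """
    def __init__(self, n_blocks):
        self.n = n_blocks
        self.rows = {}  # pivot bit -> (mask, payload)

    def receive(self, mask, payload):
        """Return True iff the packet was innovative (rank increased)."""
        for pivot in sorted(self.rows, reverse=True):
            if mask & pivot:
                m, p = self.rows[pivot]
                mask ^= m
                payload ^= p
        if mask == 0:
            return False  # linearly dependent: pure overhead
        self.rows[1 << (mask.bit_length() - 1)] = (mask, payload)
        return True

    def decode(self):
        """Return the source blocks once full rank is reached, else None."""
        if len(self.rows) < self.n:
            return None
        rows = dict(self.rows)
        for pivot in sorted(rows, reverse=True):  # back-substitution
            m, p = rows[pivot]
            for q in rows:
                if q != pivot and rows[q][0] & pivot:
                    rows[q] = (rows[q][0] ^ m, rows[q][1] ^ p)
        return [rows[1 << i][1] for i in range(self.n)]

random.seed(1)
blocks = [0x11, 0x22, 0x33, 0x44]       # four one-byte source blocks
dec, sent, innovative = GF2Decoder(4), 0, 0
while dec.decode() is None:
    mask = random.randrange(1, 1 << 4)  # random nonzero coefficients
    payload = 0
    for i in range(4):
        if mask >> i & 1:
            payload ^= blocks[i]
    innovative += dec.receive(mask, payload)
    sent += 1
print(f"sent={sent}, innovative={innovative}")
```

    The gap between `sent` and `innovative` is exactly the redundant traffic the paper's mathematical model quantifies; a peer that checks innovativeness before forwarding is the intuition behind I2NC's incremental coding.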

  1. Breeding density, fine-scale tracking, and large-scale modeling reveal the regional distribution of four seabird species.

    PubMed

    Wakefield, Ewan D; Owen, Ellie; Baer, Julia; Carroll, Matthew J; Daunt, Francis; Dodd, Stephen G; Green, Jonathan A; Guilford, Tim; Mavor, Roddy A; Miller, Peter I; Newell, Mark A; Newton, Stephen F; Robertson, Gail S; Shoji, Akiko; Soanes, Louise M; Votier, Stephen C; Wanless, Sarah; Bolton, Mark

    2017-10-01

    Population-level estimates of species' distributions can reveal fundamental ecological processes and facilitate conservation. However, these may be difficult to obtain for mobile species, especially colonial central-place foragers (CCPFs; e.g., bats, corvids, social insects), because it is often impractical to determine the provenance of individuals observed beyond breeding sites. Moreover, some CCPFs, especially in the marine realm (e.g., pinnipeds, turtles, and seabirds), are difficult to observe because they range tens to tens of thousands of kilometers from their colonies. It is hypothesized that the distribution of CCPFs depends largely on habitat availability and intraspecific competition. Modeling these effects may therefore allow distributions to be estimated from samples of individual spatial usage. Such data can be obtained for an increasing number of species using tracking technology. However, techniques for estimating population-level distributions from telemetry data are poorly developed. This is of concern because many marine CCPFs, such as seabirds, are threatened by anthropogenic activities. Here, we aim to estimate the distribution at sea of four seabird species foraging from approximately 5,500 breeding sites in Britain and Ireland. To do so, we GPS-tracked a sample of 230 European Shags Phalacrocorax aristotelis, 464 Black-legged Kittiwakes Rissa tridactyla, 178 Common Murres Uria aalge, and 281 Razorbills Alca torda from 13, 20, 12, and 14 colonies, respectively. Using Poisson point process habitat use models, we show that distribution at sea depends on (1) density-dependent competition among sympatric conspecifics (all species) and parapatric conspecifics (Kittiwakes and Murres); (2) habitat accessibility and coastal geometry, such that birds travel further from colonies with limited access to the sea; and (3) regional habitat availability. Using these models, we predict space use by birds from unobserved colonies and thereby map the
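
    As a toy version of the habitat-use modelling step, a log-linear Poisson regression (the simplest relative of the Poisson point process models used here, fitted to gridded counts) can be estimated by iteratively reweighted least squares. Everything below, the covariate, coefficients, and data, is synthetic and purely illustrative:

```python
import numpy as np

def poisson_irls(X, y, iters=25):
    """Fit E[y] = exp(X @ beta) by Newton's method / IRLS.

    Each step solves (X^T W X) delta = X^T (y - mu) with W = diag(mu),
    the score and Fisher information of the Poisson log-likelihood.
    """
    beta = np.zeros(X.shape[1])
    beta[0] = np.log(y.mean() + 1e-9)    # standard stable initialization
    for _ in range(iters):
        mu = np.exp(X @ beta)
        XtW = X.T * mu                   # broadcasts mu across columns
        beta = beta + np.linalg.solve(XtW @ X, X.T @ (y - mu))
    return beta

rng = np.random.default_rng(42)
n = 5000
dist = rng.uniform(0.0, 5.0, n)          # e.g. distance from the colony
X = np.column_stack([np.ones(n), dist])
true_beta = np.array([2.0, -0.7])        # usage decays with distance
y = rng.poisson(np.exp(X @ true_beta))   # simulated per-cell counts
beta_hat = poisson_irls(X, y)
print(beta_hat)                          # close to [2.0, -0.7]
```

    The real models add density-dependent competition and accessibility terms as further covariates, but the fitting machinery is the same generalized-linear-model iteration.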

  2. Modelling the role of marine particle on large scale 231Pa, 230Th, Iron and Aluminium distributions

    NASA Astrophysics Data System (ADS)

    Dutay, J.-C.; Tagliabue, A.; Kriest, I.; van Hulten, M. M. P.

    2015-04-01

    The distribution of trace elements in the ocean is governed by the combined effects of various processes and by exchanges with external sources. Modelling these represents an opportunity to better understand and quantify the mechanisms that regulate oceanic tracer cycles. Observations collected during the GEOTRACES program provide an opportunity to improve our knowledge of the processes that should be considered in biogeochemical models to adequately represent the distributions of trace elements in the ocean. Here we present a synthesis of the state of the art in simulating selected trace elements in biogeochemical models: protactinium, thorium, iron, and aluminium. In this contribution we pay particular attention to the role of particles in the cycling of these tracers and to how they may provide additional constraints on the transfer of matter in the ocean.

  3. Prototyping a large-scale distributed system for the Great Observatories era - NASA Astrophysics Data System (ADS)

    NASA Technical Reports Server (NTRS)

    Shames, Peter

    1990-01-01

    The NASA Astrophysics Data System (ADS) is a distributed information system intended to support research in the Great Observatories era, to simplify access to data, and to enable simultaneous analyses of multispectral data sets. Here, the user agent and interface, its functions, and system components are examined, and the system architecture and infrastructure are addressed. The present status of the system and related future activities are also discussed.

  5. ‘Oorja’ in India: Assessing a large-scale commercial distribution of advanced biomass stoves to households

    PubMed Central

    Thurber, Mark C.; Phadke, Himani; Nagavarapu, Sriniketh; Shrimali, Gireesh; Zerriffi, Hisham

    2015-01-01

    Replacing traditional stoves with advanced alternatives that burn more cleanly has the potential to ameliorate major health problems associated with indoor air pollution in developing countries. With a few exceptions, large government and charitable programs to distribute advanced stoves have not had the desired impact. Commercially-based distributions that seek cost recovery and even profits might plausibly do better, both because they encourage distributors to supply and promote products that people want and because they are based around properly-incentivized supply chains that could be more scalable, sustainable, and replicable. The sale in India of over 400,000 “Oorja” stoves to households from 2006 onwards represents the largest commercially-based distribution of a gasification-type advanced biomass stove. BP's Emerging Consumer Markets (ECM) division and then successor company First Energy sold this stove and the pelletized biomass fuel on which it operates. We assess the success of this effort and the role its commercial aspect played in outcomes using a survey of 998 households in areas of Maharashtra and Karnataka where the stove was sold, as well as detailed interviews with BP and First Energy staff. Statistical models based on these data indicate that Oorja purchase rates were significantly influenced by the intensity of Oorja marketing in a region as well as by the pre-existing stove mix among households. The highest rate of adoption came from LPG-using households, for which Oorja's pelletized biomass fuel reduced costs. Smoke- and health-related messages from Oorja marketing did not significantly influence the purchase decision, although they did appear to affect household perceptions about smoke. By the time of our survey, only 9% of households that purchased Oorja were still using the stove, the result in large part of difficulties First Energy encountered in developing a viable supply chain around low-cost procurement of “agricultural waste” to

  6. FabSim: Facilitating computational research through automation on large-scale and distributed e-infrastructures

    NASA Astrophysics Data System (ADS)

    Groen, Derek; Bhati, Agastya P.; Suter, James; Hetherington, James; Zasada, Stefan J.; Coveney, Peter V.

    2016-10-01

    We present FabSim, a toolkit developed to simplify a range of computational tasks for researchers in diverse disciplines. FabSim is flexible, adaptable, and allows users to perform a wide range of tasks with ease. It also provides a systematic way to automate the use of resources, including HPC and distributed machines, and to make tasks easier to repeat by recording contextual information. To demonstrate this, we present three use cases where FabSim has enhanced our research productivity. These include simulating cerebrovascular bloodflow, modelling clay-polymer nanocomposites across multiple scales, and calculating ligand-protein binding affinities.

  7. Ownership and usage of mosquito nets after four years of large-scale free distribution in Papua New Guinea.

    PubMed

    Hetzel, Manuel W; Gideon, Gibson; Lote, Namarola; Makita, Leo; Siba, Peter M; Mueller, Ivo

    2012-06-10

    Papua New Guinea (PNG) is a highly malaria endemic country in the South-West Pacific with a population of approximately 6.6 million (2009). In 2004, the country intensified its malaria control activities with support from the Global Fund. With the aim of achieving 80% ownership and usage, a country-wide campaign distributed two million free long-lasting insecticide-treated nets (LLINs). In order to evaluate outcomes of the campaign against programme targets, a country-wide household survey based on stratified multi-stage random sampling was carried out in 17 of the 20 provinces after the campaign in 2008/09. In addition, a before-after assessment was carried out in six purposively selected sentinel sites. A structured questionnaire was administered to the heads of sampled households to elicit net ownership and usage information. After the campaign, 64.6% of households owned a LLIN, 80.1% any type of mosquito net. Overall usage by household members amounted to 32.5% for LLINs and 44.3% for nets in general. Amongst children under five years, 39.5% used a LLIN and 51.8% any type of net, whereas 41.3% of pregnant women used a LLIN and 56.1% any net. Accessibility of villages was the key determinant of net ownership, while usage was mainly determined by ownership. Most (99.5%) of the household members who did not sleep under a net did not have access to a (unused) net in their household. In the sentinel sites, LLIN ownership increased from 9.4% to 88.7%, ownership of any net from 52.7% to 94.1%. Usage of LLINs increased from 5.5% to 55.1%, usage of any net from 37.3% to 66.7%. Among children under five years, usage of LLINs and of nets in general increased from 8.2% to 67.0% and from 44.6% to 76.1%, respectively (all p ≤ 0.001). While a single round of free distribution of LLINs significantly increased net ownership, an insufficient number of nets coupled with a heterogeneous distribution led to overall low usage rates. Programme targets were missed mainly as a

  8. Large scale scientific computing

    SciTech Connect

    Deuflhard, P. ); Engquist, B. )

    1987-01-01

    This book presents papers on large scale scientific computing. It includes: Initial value problems of ODE's and parabolic PDE's; Boundary value problems of ODE's and elliptic PDE's; Hyperbolic PDE's; Inverse problems; Optimization and optimal control problems; and Algorithm adaptation on supercomputers.

  9. Household malaria knowledge and its association with bednet ownership in settings without large-scale distribution programs: Evidence from rural Madagascar.

    PubMed

    Krezanoski, Paul J; Tsai, Alexander C; Hamer, Davidson H; Comfort, Alison B; Bangsberg, David R

    2014-06-01

    Insecticide-treated bednets (ITNs) are effective at preventing malaria. This study focuses on household-level factors associated with bednet ownership in a rural area of Madagascar that had not been a recipient of large-scale ITN distribution. Data were gathered on individual and household characteristics, malaria knowledge, household assets, and bednet ownership. Principal components analysis was used to construct both a wealth index based on household assets and a malaria knowledge index based on responses to questions about malaria. Bivariate and multivariate regressions were used to determine predictors of household bednet ownership and malaria knowledge. Forty-seven of 560 households (8.4%) owned a bednet. In multivariate analysis, a higher level of malaria knowledge among household members was the only variable significantly associated with bednet ownership (odds ratio 3.72, P < 0.001). Among respondents, predictors of higher malaria knowledge included higher education levels, female sex, and reporting fever as the most frequent or dangerous illness in the community. Household wealth was not a significant predictor of bednet ownership or respondent malaria knowledge. In this setting of limited supply of affordable bednets, malaria knowledge was associated with an increased probability of household bednet ownership. Further studies should determine how such malaria knowledge evolves and whether malaria-specific education programs could help overcome barriers to bednet ownership among at-risk households living outside the reach of large-scale bednet distribution programs.
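
    The wealth-index construction mentioned above, taking the first principal component of household asset indicators, is a standard survey technique and fits in a few lines of NumPy. The data here are synthetic and the variable names are illustrative, not the study's actual survey items:

```python
import numpy as np

def pca_index(X):
    """First-principal-component scores of a standardized indicator matrix
    (rows = households, columns = asset ownership indicators)."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(Z, rowvar=False))
    return Z @ eigvecs[:, -1]  # loadings of the largest eigenvalue

rng = np.random.default_rng(0)
# Synthetic survey: 200 households, 5 binary asset indicators all driven
# by a latent wealth level, so the first PC should recover "wealth".
latent = rng.uniform(size=200)
assets = (rng.uniform(size=(200, 5)) < latent[:, None]).astype(float)
wealth_index = pca_index(assets)
# The index should track how many assets a household owns.
r = np.corrcoef(wealth_index, assets.sum(axis=1))[0, 1]
print(round(abs(r), 2))
```

    The sign of a principal component is arbitrary, so in practice the index is oriented so that higher scores mean more assets before splitting households into wealth quantiles.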

  10. Large-scale environment of z ˜ 5.7 C IV absorption systems - I. Projected distribution of galaxies

    NASA Astrophysics Data System (ADS)

    Díaz, C. Gonzalo; Koyama, Yusei; Ryan-Weber, Emma V.; Cooke, Jeff; Ouchi, Masami; Shimasaku, Kazuhiro; Nakata, Fumiaki

    2014-08-01

    Metal absorption systems are products of star formation. They are believed to be associated with massive star-forming galaxies, which have significantly enriched their surroundings. To test this idea with high column density C IV absorption systems at z ~ 5.7, we study the projected distribution of galaxies and characterize the environment of C IV systems in two independent quasar lines of sight: J103027.01+052455.0 and J113717.73+354956.9. Using wide-field photometry (~80 × 60 h^-1 comoving Mpc), we select bright (M_UV(1350 Å) ≲ -21.0 mag.) Lyman break galaxies (LBGs) at z ~ 5.7 in a redshift slice Δz ~ 0.2 and we compare their projected distribution with z ~ 5.7 narrow-band selected Lyman alpha emitters (LAEs, Δz ~ 0.08). We find that the C IV systems are located more than 10 h^-1 projected comoving Mpc from the main concentrations of LBGs and no candidate is closer than ~5 h^-1 projected comoving Mpc. In contrast, an excess of LAEs (lower mass galaxies) is found on scales of ~10 h^-1 comoving Mpc, suggesting that LAEs are the primary candidates for the source of the C IV systems. Furthermore, the closest object to the system in the field J1030+0524 is a faint LAE at a projected distance of 212 h^-1 physical kpc. However, this work cannot rule out undiscovered lower mass galaxies as the origin of these absorption systems. We conclude that, in contrast with lower redshift examples (z ≲ 3.5), strong C IV absorption systems at z ~ 5.7 trace low-to-intermediate density environments dominated by low-mass galaxies. Moreover, the excess of LAEs associated with high levels of ionizing flux agrees with the idea that faint galaxies dominate the ionizing photon budget at this redshift.

  11. On the large-scale distribution of magnetospheric currents and thermal plasma: Results from magnetic field models and observations

    SciTech Connect

    Spence, H.E.

    1989-01-01

    The author presents the results of studies using magnetic field models and observations to determine, as a function of magnetic activity, the distributions of plasma pressure in the Earth's magnetic tail and to characterize field-aligned currents globally. He first presents a brief history of magnetic field models of the Earth's magnetosphere. He then discusses related work on magnetotail plasma pressure. In the first study, he develops a technique for obtaining pressure gradients and anisotropies consistent with quasi-static equilibrium from recent empirical magnetic field models. He finds that the near-tail magnetic stresses can be balanced by a nearly isotropic plasma pressure with a realistic equatorial gradient. In the second study, he surveys plasma pressures observed near the midnight meridian. He finds that vertical pressure balance is maintained between lobe magnetic and plasma sheet plasma pressure and that observed and model-derived pressures are consistent. The combined model-derived and observed pressure profile falls off more slowly than it would if established by a two-dimensional, adiabatic, lossless convection model. He reassesses the convection model and finds that observed quiet-time pressure profiles can be reproduced so long as he accounts for the finite tail width. In the next main section, he presents studies on the distribution of field-aligned currents (FACs). First, empirical magnetic models are used to determine the average FACs flowing in the magnetosphere as a function of geomagnetic activity. When mapped to the ionosphere, FAC systems with region 1 polarity both on the day side (DR1) and the night side (NR1) can be identified; a low-level, region 2-sense system (NPC) flows poleward of the NR1 system.

  12. Determining organic carbon distributions in soil particle size fractions as a precondition of lateral carbon transport modeling at large scales

    NASA Astrophysics Data System (ADS)

    Schindewolf, Marcus; Seher, Wiebke; Pfeffer, Eduard; Schultze, Nico; Amorim, Ricardo S. S.; Schmidt, Jürgen

    2016-04-01

    The erosional transport of organic carbon affects the global carbon budget; however, it is uncertain whether erosion is a sink or a source of atmospheric carbon. Continuous erosion leads to a massive loss of top soils, including the loss of organic carbon historically accumulated in the soil humus fraction. The colluvial organic carbon can be protected from further degradation, depending on the depth of the colluvial cover and local decomposition conditions. Another part of the eroded soil and organic carbon enters surface water bodies and might be transported over long distances. The selective nature of soil erosion results in a preferential transport of fine particles, while coarser, carbon-poor particles remain on site. Consequently, organic carbon is enriched in the eroded sediment compared with the source soil. As a precondition of process-based lateral carbon flux modeling, the carbon distribution across soil particle size fractions has to be known. In this regard, the present study determines the organic carbon contents of soil particle size separates by a combined sieve-sedimentation method for different tropical and temperate soils. Our results suggest a strong influence of parent material and climatic conditions on the carbon distribution across soil particle separates. Applying these results in erosion modeling, a test slope was simulated with the EROSION 2D simulation software, covering certain land use and soil management scenarios for different rainfall events. These simulations give first insights into carbon loss and depletion in sediment delivery areas as well as carbon gains and enrichment in deposition areas at the landscape scale, and could be a step forward in landscape-scale carbon redistribution modeling.
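    The enrichment of organic carbon in eroded sediment described above follows directly from selective transport of fine, carbon-rich fractions. A minimal sketch, with entirely hypothetical fraction shares and carbon contents (not values from this study):

```python
# Sketch of the sediment OC enrichment ratio implied by selective erosion:
# fine, carbon-rich particles are preferentially transported, so the eroded
# sediment carries more organic carbon per unit mass than the source soil.
# All numbers below are illustrative, not measurements from the study.

def enrichment_ratio(fractions, transported_share):
    """OC enrichment ratio = OC content of sediment / OC content of source soil.

    fractions: dict name -> (mass share in soil, OC content in g/kg)
    transported_share: dict name -> share of that fraction's mass that is eroded
    """
    soil_oc = sum(m * oc for m, oc in fractions.values())          # bulk soil OC, g/kg
    sed_mass = sum(m * transported_share[n] for n, (m, oc) in fractions.items())
    sed_oc = sum(m * transported_share[n] * oc for n, (m, oc) in fractions.items())
    return (sed_oc / sed_mass) / soil_oc

fractions = {            # (mass share of soil, OC content in g/kg) -- hypothetical
    "clay": (0.20, 30.0),
    "silt": (0.40, 15.0),
    "sand": (0.40,  5.0),
}
transported = {"clay": 0.9, "silt": 0.5, "sand": 0.1}  # selective transport
print(round(enrichment_ratio(fractions, transported), 2))  # -> 1.46
```

A ratio above 1 reproduces the qualitative point of the abstract: the more selective the transport toward fine fractions, the stronger the enrichment.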

  13. Galaxy clustering on large scales.

    PubMed Central

    Efstathiou, G

    1993-01-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳ 20 h-1 Mpc, where the Hubble constant H0 = 100h km s-1 Mpc-1; 1 pc = 3.09 x 10^16 m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400

  14. Development of a low-power, low-cost front end electronics module for large scale distributed neutrino detectors

    SciTech Connect

    Beatty, James J.; Kass, Richard D.

    2008-03-08

    A number of concepts have been presented for distributed neutrino detectors formed of large numbers of autonomous detectors. Examples include the Antarctic Ross Ice Shelf Antenna Neutrino Array (ARIANNA) [Barwick 2006], as well as proposed radio extensions to the IceCube detector at South Pole Station such as AURA and IceRay [Besson 2008]. We have focused on key enabling technical developments required by this class of experiments. The radio Cherenkov signal, generated by the Askaryan mechanism [Askaryan 1962, 1965], is impulsive and coherent up to frequencies above 1 GHz. In the frequency domain, the impulsive character of the emission results in a simultaneous increase of the power detected in multiple frequency bands. This multiband triggering approach has proven fruitful, especially as anthropogenic interference often results from narrowband communications signals. A typical distributed experiment of this type consists of a station responsible for the readout of a cluster of antennas either near the surface of the ice or deployed in boreholes. Each antenna is instrumented with a broadband low-noise amplifier, followed by an array of filters to facilitate multi-band coincidence trigger schemes at the antenna level. The power in each band is detected at the output of each band filter, using either square-law diode detectors or log-power detectors developed for the cellular telephone market. The use of multiple antennas per station allows a local coincidence among antennas to be used as the next stage of the trigger. Station triggers can then be combined into an array trigger by comparing timestamps of triggers among stations and identifying space-time clusters of station triggers. Data from each station are buffered and can be requested from the individual stations when a multi-station coincidence occurs. This approach has been successfully used in distributed experiments such as the Pierre Auger Observatory [Abraham et al. 2004]. We identified the filters as being especially
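    The multi-band coincidence trigger described in this abstract can be sketched as a simple threshold test: fire only when several frequency bands show elevated power in the same time sample. This is an illustrative toy, not the experiment's firmware; band names, thresholds, and power samples are invented:

```python
# Toy multi-band coincidence trigger: an antenna channel triggers when the
# detected power in at least `min_bands` frequency bands exceeds its
# threshold in the same time sample. A broadband Askaryan-like impulse
# lights up all bands at once, while narrowband interference does not.

def band_trigger(power_by_band, thresholds, min_bands=3):
    """Return indices of time samples where >= min_bands bands exceed threshold."""
    n_samples = len(next(iter(power_by_band.values())))
    hits = []
    for t in range(n_samples):
        n_over = sum(
            1 for band, samples in power_by_band.items()
            if samples[t] > thresholds[band]
        )
        if n_over >= min_bands:
            hits.append(t)
    return hits

# Sample 2 is a broadband impulse (all bands over threshold); sample 4 is
# narrowband interference confined to one band, so it is rejected.
power = {
    "200-400MHz": [0.1, 0.2, 5.0, 0.1, 0.1],
    "400-600MHz": [0.1, 0.1, 4.0, 0.2, 6.0],
    "600-800MHz": [0.2, 0.1, 3.5, 0.1, 0.2],
}
thresholds = {band: 1.0 for band in power}
print(band_trigger(power, thresholds))  # -> [2]
```

The same coincidence idea then repeats at higher levels: among antennas within a station, and among stations via timestamp clustering.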

  15. Coronal mass ejection rate and the evolution of the large-scale K-coronal density distribution

    SciTech Connect

    Sime, D.G.

    1989-01-01

    Recently reported occurrence rates of coronal mass ejections (CMEs) are compared with the time scale for the long-term evolution of the global white-light coronal density distribution. This time scale is estimated from the synoptic observations of the corona made from Mauna Loa, Hawaii, by a series of K-coronameters. The data span a period of more than 20 years and show evolution rates which vary with time roughly in phase with the solar activity cycle. However, there are detailed differences between the sunspot number curve and the long-term behavior of this quantity. When the occurrence rates of CMEs observed from orbiting coronagraphs, available mainly during the descending phase of the activity cycle, are compared with this evolution time, it is found that the two quantities are inversely proportional. From energy considerations, it is unlikely that there is a causal relationship between CMEs and this coronal evolution. Rather, the result indicates that the processes which lead to the global evolution are intimately related to those which give rise to CMEs, a hypothesis consistent with current theories that CMEs arise from preexisting magnetic structures which become stressed by the global magnetic field rearrangement to the point of instability. Copyright American Geophysical Union 1989.

  16. A methodology for preserving channel flow networks and connectivity patterns in large-scale distributed hydrological models

    NASA Astrophysics Data System (ADS)

    Shaw, Dean A.; Martz, Lawrence W.; Pietroniro, Alain

    2005-01-01

    Physiographic data are often used to parameterize hydrological models and, in the past, physiographic parameters have often been derived manually. However, this can be a lengthy and unreliable process, particularly for application to a gridded hydrological or atmospheric model applied to large or continental-scale basins. An important attribute of gridded models is drainage direction. Current methods that determine drainage directions for large or continental-scale basins, as used by general circulation models (GCMs), route flow using lowest-neighbour algorithms. These methods, however, do not reflect the hydrology of the basin subunit. This paper proposes a method of parameterizing hydrological models with physiographic data using the ArcInfo macro language to create an interface between the Topographic Parameterization (TOPAZ) software and the WATFLOOD hydrological model. The interface uses output raster data created by TOPAZ (i.e. drainage identification) to supply the physiographic parameters required by WATFLOOD. The interface (WATPAZ) is an expert system based on a manual method of deriving parameters for the WATFLOOD distributed model. The WATPAZ interface uses grouped response units to subdivide the watershed. This allows large drainage basins to be subdivided at a scale that allows computational efficiency while preserving the hydrological variability of the watershed. To test whether the WATPAZ method improves on the current GCM methodology for determining drainage directions, WATPAZ is applied to a local basin (Wolf Creek), a regional-scale basin (Athabasca), and a continental-scale basin (Mackenzie). A comparison of flow directions derived from this new method with those from current GCM methods is carried out. The results indicate that a substantial improvement is made to flow routing within the basin by using the channel network to determine drainage directions for each segment.
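    The "lowest neighbour" routing that this paper contrasts with is the classic D8 steepest-descent rule: each grid cell drains toward whichever of its eight neighbours gives the steepest drop. A minimal sketch, with an illustrative elevation grid (not data from TOPAZ or WATFLOOD):

```python
# D8 steepest-descent drainage direction: for a cell, inspect the eight
# neighbours, compute drop per unit distance (diagonals are sqrt(2) away),
# and drain toward the steepest descent. Returns None for a pit (no lower
# neighbour). DEM values below are invented for illustration.

def d8_direction(dem, row, col):
    """Return (dr, dc) of the steepest-descent neighbour, or None for a pit."""
    best, best_drop = None, 0.0
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < len(dem) and 0 <= c < len(dem[0]):
                dist = (dr * dr + dc * dc) ** 0.5   # diagonal neighbours are farther
                drop = (dem[row][col] - dem[r][c]) / dist
                if drop > best_drop:
                    best, best_drop = (dr, dc), drop
    return best

dem = [
    [10.0, 9.0, 8.0],
    [ 9.0, 7.0, 6.0],
    [ 8.0, 6.0, 4.0],
]
print(d8_direction(dem, 1, 1))  # -> (1, 1): flow toward the low corner
```

The paper's point is that applying such a purely local rule on a coarse GCM grid can misroute flow, which is why WATPAZ instead derives drainage directions from the channel network within each basin segment.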

  17. Large-Scale Distributions of Tropospheric Nitric, Formic, and Acetic Acids Over the Western Pacific Basin During Wintertime

    NASA Technical Reports Server (NTRS)

    Talbot, R. W.; Dibb, J. E.; Lefer, B. L.; Scheuer, E. M.; Bradshaw, J. D.; Sandholm, S. T.; Smyth, S.; Blake, D. R.; Blake, N. J.; Sachse, G. W.; hide

    1997-01-01

    We report here measurements of the acidic gases nitric (HNO3), formic (HCOOH), and acetic (CH3COOH) over the western Pacific basin during the February-March 1994 Pacific Exploratory Mission-West (PEM-West B). These data were obtained aboard the NASA DC-8 research aircraft as it flew missions in the altitude range of 0.3 - 12.5 km over equatorial regions near Guam and then further westward encompassing the entire Pacific Rim arc. Aged marine air over the equatorial Pacific generally exhibited mixing ratios of acidic gases less than 100 parts per trillion by volume (pptv). Near the Asian continent, discrete plumes encountered below 6 km altitude contained up to 8 parts per billion by volume (ppbv) HNO3 and 10 ppbv HCOOH and CH3COOH. Overall there was a general correlation of mixing ratios of acidic gases with those of CO, C2H2, and C2Cl4, indicative of emissions from combustion and industrial sources. The latitudinal distributions of HNO3 and CO showed that the largest mixing ratios were centered around 15 deg N, while HCOOH, CH3COOH, and C2Cl4 peaked at 25 deg N. The mixing ratios of HCOOH and CH3COOH were highly correlated (r^2 = 0.87) below 6 km altitude, with a slope (0.89) characteristic of the nongrowing season at midlatitudes in the northern hemisphere. Above 6 km altitude, HCOOH and CH3COOH were marginally correlated (r^2 = 0.50), and plumes well defined by CO, C2H2, and C2Cl4 were depleted in acidic gases, most likely due to scavenging during vertical transport of air masses through convective cloud systems over the Asian continent. In stratospheric air masses, HNO3 mixing ratios were several parts per billion by volume (ppbv), yielding relationships with O3 and N2O consistent with those previously reported for NOy.

  18. Large-Scale Distributions of Tropospheric Nitric, Formic, and Acetic Acids Over the Western Pacific Basin During Wintertime

    NASA Technical Reports Server (NTRS)

    Talbot, R. W.; Dibb, J. E.; Lefer, B. L.; Scheuer, E. M.; Bradshaw, J. D.; Sandholm, S. T.; Smyth, S.; Blake, D. R.; Blake, N. J.; Sachse, G. W.; Collins, J. E.; Gregory, G. L.

    1997-01-01

    We report here measurements of the acidic gases nitric (HNO3), formic (HCOOH), and acetic (CH3COOH) over the western Pacific basin during the February-March 1994 Pacific Exploratory Mission-West (PEM-West B). These data were obtained aboard the NASA DC-8 research aircraft as it flew missions in the altitude range of 0.3 - 12.5 km over equatorial regions near Guam and then further westward encompassing the entire Pacific Rim arc. Aged marine air over the equatorial Pacific generally exhibited mixing ratios of acidic gases less than 100 parts per trillion by volume (pptv). Near the Asian continent, discrete plumes encountered below 6 km altitude contained up to 8 parts per billion by volume (ppbv) HNO3 and 10 ppbv HCOOH and CH3COOH. Overall there was a general correlation of mixing ratios of acidic gases with those of CO, C2H2, and C2Cl4, indicative of emissions from combustion and industrial sources. The latitudinal distributions of HNO3 and CO showed that the largest mixing ratios were centered around 15 deg N, while HCOOH, CH3COOH, and C2Cl4 peaked at 25 deg N. The mixing ratios of HCOOH and CH3COOH were highly correlated (r^2 = 0.87) below 6 km altitude, with a slope (0.89) characteristic of the nongrowing season at midlatitudes in the northern hemisphere. Above 6 km altitude, HCOOH and CH3COOH were marginally correlated (r^2 = 0.50), and plumes well defined by CO, C2H2, and C2Cl4 were depleted in acidic gases, most likely due to scavenging during vertical transport of air masses through convective cloud systems over the Asian continent. In stratospheric air masses, HNO3 mixing ratios were several parts per billion by volume (ppbv), yielding relationships with O3 and N2O consistent with those previously reported for NOy.

  19. Large-scale Distribution of Arrival Directions of Cosmic Rays Detected Above 10^18 eV at the Pierre Auger Observatory

    NASA Astrophysics Data System (ADS)

    Pierre Auger Collaboration; Abreu, P.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Antiči'c, T.; Aramo, C.; Arganda, E.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Badescu, A. M.; Balzer, M.; Barber, K. B.; Barbosa, A. F.; Bardenet, R.; Barroso, S. L. C.; Baughman, B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellétoile, A.; Bellido, J. A.; BenZvi, S.; Berat, C.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blanco, F.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Brogueira, P.; Brown, W. C.; Bruijn, R.; Buchholz, P.; Bueno, A.; Buroker, L.; Burton, R. E.; Caballero-Mora, K. S.; Caccianiga, B.; Caramete, L.; Caruso, R.; Castellina, A.; Catalano, O.; Cataldi, G.; Cazon, L.; Cester, R.; Chauvin, J.; Cheng, S. H.; Chiavassa, A.; Chinellato, J. A.; Chirinos Diaz, J.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cook, H.; Cooper, M. J.; Coppens, J.; Cordier, A.; Coutu, S.; Covault, C. E.; Creusot, A.; Criss, A.; Cronin, J.; Curutiu, A.; Dagoret-Campagne, S.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; De Donato, C.; de Jong, S. J.; De La Vega, G.; de Mello Junior, W. J. M.; de Mello Neto, J. R. T.; De Mitri, I.; de Souza, V.; de Vries, K. D.; del Peral, L.; del Río, M.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Díaz Castro, M. L.; Diep, P. N.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dong, P. N.; Dorofeev, A.; dos Anjos, J. C.; Dova, M. T.; D'Urso, D.; Dutan, I.; Ebr, J.; Engel, R.; Erdmann, M.; Escobar, C. 
O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fliescher, S.; Fracchiolla, C. E.; Fraenkel, E. D.; Fratu, O.; Fröhlich, U.; Fuchs, B.; Gaior, R.; Gamarra, R. F.; Gambetta, S.; García, B.; Garcia Roca, S. T.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gemmeke, H.; Ghia, P. L.; Giller, M.; Gitto, J.; Glass, H.; Gold, M. S.; Golup, G.; Gomez Albarracin, F.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; Gookin, B.; Gorgi, A.; Gouffon, P.; Grashorn, E.; Grebe, S.; Griffith, N.; Grillo, A. F.; Guardincerri, Y.; Guarino, F.; Guedes, G. P.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holmes, V. C.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Ionita, F.; Italiano, A.; Jansen, S.; Jarne, C.; Jiraskova, S.; Josebachuili, M.; Kadija, K.; Kampert, K. H.; Karhan, P.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kelley, J. L.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Knapp, J.; Koang, D.-H.; Kotera, K.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kulbartz, J. K.; Kunka, N.; La Rosa, G.; Lachaud, C.; LaHurd, D.; Latronico, L.; Lauer, R.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Lopez Agüera, A.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Lyberis, H.; Maccarone, M. C.; Macolino, C.; Maldera, S.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, J.; Marin, V.; Maris, I. C.; Marquez Falcon, H. R.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Matthews, J.; Matthews, J. 
A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mazur, P. O.; Medina-Tanco, G.; Melissas, M.; Melo, D.; Menichetti, E.; Menshikov, A.; Mertsch, P.; Messina, S.; Meurer, C.; Meyhandan, R.; Mi'canovi'c, S.; Micheletti, M. I.; Minaya, I. A.; Miramonti, L.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morales, B.; Morello, C.; Moreno, E.; Moreno, J. C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navarro, J. L.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nhung, P. T.; Niechciol, M.; Niemietz, L.; Nierstenhoefer, N.; Nitz, D.; Nosek, D.; Nožka, L.; Oehlschläger, J.; Olinto, A.; Ortiz, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Parente, G.; Parizot, E.; Parra, A.; Pastor, S.; Paul, T.; Pech, M.; Peķala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Pesce, R.; Petermann, E.; Petrera, S.; Petrolini, A.; Petrov, Y.; Pfendner, C.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Ponce, V. H.; Pontz, M.; Porcelli, A.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rivera, H.; Rizi, V.; Roberts, J.; Rodrigues de Carvalho, W.; Rodriguez, G.; Rodriguez Cabo, I.; Rodriguez Martino, J.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Rouillé-d'Orfeuil, B.; Roulet, E.; Rovero, A. C.; Rühle, C.; Saftoiu, A.; Salamida, F.; Salazar, H.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarkar, S.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, A.; Scholten, O.; Schoorlemmer, H.; Schovancova, J.; Schovánek, P.; Schröder, F.; Schuster, D.; Sciutto, S. J.; Scuderi, M.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Silva Lopez, H. 
H.; Sima, O.; 'Smiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Spinka, H.; Squartini, R.; Srivastava, Y. N.; Stanic, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Šuša, T.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Tapia, A.; Tartare, M.; Taşcău, O.; Tcaciuc, R.; Thao, N. T.; Thomas, D.; Tiffenberg, J.; Timmermans, C.; Tkaczyk, W.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Travnicek, P.; Tridapalli, D. B.; Tristram, G.; Trovato, E.; Tueros, M.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Wahlberg, H.; Wahrlich, P.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Westerhoff, S.; Whelan, B. J.; Widom, A.; Wieczorek, G.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Will, M.; Williams, C.; Winchen, T.; Wommer, M.; Wundheiler, B.; Yamamoto, T.; Yapici, T.; Younk, P.; Yuan, G.; Yushkov, A.; Zamorano Garcia, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zaw, I.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.

    2012-12-01

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10^18 eV at the Pierre Auger Observatory is presented. This search is performed as a function of both declination and right ascension in several energy ranges above 10^18 eV, and reported in terms of dipolar and quadrupolar coefficients. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Assuming that any cosmic-ray anisotropy is dominated by dipole and quadrupole moments in this energy range, upper limits on their amplitudes are derived. These upper limits allow us to test the origin of cosmic rays above 10^18 eV from stationary Galactic sources densely distributed in the Galactic disk and predominantly emitting light particles in all directions.
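    The classical estimator behind such right-ascension dipole searches is the first-harmonic (Rayleigh) analysis: fit the amplitude and phase of a cosine modulation in the arrival-direction distribution. The sketch below applies it to a synthetic event sample; the amplitudes and phases are not Auger results:

```python
# First-harmonic (Rayleigh) analysis in right ascension: estimate the
# amplitude r and phase of a dipolar modulation from a list of event RAs.
# The event sample below is synthetic, generated for illustration only.
import math
import random

def first_harmonic(ras_deg):
    """Return (amplitude r, phase in deg) of the first harmonic in right ascension."""
    n = len(ras_deg)
    a = 2.0 / n * sum(math.cos(math.radians(ra)) for ra in ras_deg)
    b = 2.0 / n * sum(math.sin(math.radians(ra)) for ra in ras_deg)
    r = math.hypot(a, b)
    phase = math.degrees(math.atan2(b, a)) % 360.0
    return r, phase

random.seed(1)
# Synthetic sky: isotropic background plus a weak excess concentrated near RA = 90 deg.
events = [random.uniform(0.0, 360.0) for _ in range(20000)]
events += [random.gauss(90.0, 40.0) % 360.0 for _ in range(1000)]
r, phase = first_harmonic(events)
print(f"amplitude ~ {r:.3f}, phase ~ {phase:.0f} deg")  # phase recovered near 90 deg
```

For a perfectly isotropic sample r fluctuates around zero, which is why such analyses quote upper limits when, as in this study, no significant deviation from isotropy is found.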

  20. LARGE-SCALE DISTRIBUTION OF ARRIVAL DIRECTIONS OF COSMIC RAYS DETECTED ABOVE 10^18 eV AT THE PIERRE AUGER OBSERVATORY

    SciTech Connect

    Abreu, P.; Andringa, S.; Aglietta, M.; Ahlers, M.; Ahn, E. J.; Albuquerque, I. F. M.; Allard, D.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muniz, J.; Alves Batista, R.; Ambrosio, M.; Aramo, C.; Aminaei, A.; Anchordoqui, L.; Antici'c, T.; Arganda, E.; Collaboration: Pierre Auger Collaboration; and others

    2012-12-15

    A thorough search for large-scale anisotropies in the distribution of arrival directions of cosmic rays detected above 10^18 eV at the Pierre Auger Observatory is presented. This search is performed as a function of both declination and right ascension in several energy ranges above 10^18 eV, and reported in terms of dipolar and quadrupolar coefficients. Within the systematic uncertainties, no significant deviation from isotropy is revealed. Assuming that any cosmic-ray anisotropy is dominated by dipole and quadrupole moments in this energy range, upper limits on their amplitudes are derived. These upper limits allow us to test the origin of cosmic rays above 10^18 eV from stationary Galactic sources densely distributed in the Galactic disk and predominantly emitting light particles in all directions.

  1. Using large scale surveys to investigate seasonal variations in seabird distribution and abundance. Part II: The Bay of Biscay and the English Channel

    NASA Astrophysics Data System (ADS)

    Pettex, Emeline; Laran, Sophie; Authier, Matthieu; Blanck, Aurélie; Dorémus, Ghislain; Falchetto, Hélène; Lambert, Charlotte; Monestiez, Pascal; Stéfan, Eric; Van Canneyt, Olivier; Ridoux, Vincent

    2017-07-01

    Seabird distributions and the associated seasonal variations remain challenging to investigate, especially in oceanic areas. Recent advances in telemetry have provided considerable information on seabird ecology, but still exclude small species, non-breeding birds and individuals from inaccessible colonies from any scientific survey. To overcome this issue and investigate seabird distribution and abundance in the eastern North Atlantic (ENA), large-scale aerial surveys were conducted in winter 2011-12 and summer 2012 over a 375,000 km2 area encompassing the English Channel (EC) and the Bay of Biscay (BoB). Seabird sightings, from 15 taxonomic groups, added up to 17,506 and 8263 sightings in winter and summer respectively, along 66,307 km. Using geostatistical methods, density maps were provided for both seasons. Abundance was estimated by strip transect sampling. Most taxa showed marked seasonal variations in their density and distribution. The highest densities were recorded during winter for most groups except shearwaters, storm-petrels, terns and large-sized gulls. Subsequently, the abundance in winter nearly reached one million individuals and was 2.5 times larger than in summer. The continental shelf and the slope in the BoB and the EC were identified as key areas for seabird conservation, especially during winter, as birds from northern Europe migrate southward after breeding. This large-scale study provided a synoptic view of the seabird community in the ENA, over two contrasting seasons. Our results highlight that oceanic areas harbour an abundant avifauna. Since most of the existing marine protected areas are restricted to the coastal fringe, the importance of oceanic areas in winter should be considered in future conservation plans. Our work will provide a baseline for the monitoring of seabird distribution at sea, and could inform the EU Marine Strategy Framework Directive.

  2. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  3. Large-Scale Distribution and Activity of Prokaryotes in Deep-Sea Surface Sediments of the Mediterranean Sea and the Adjacent Atlantic Ocean

    PubMed Central

    Giovannelli, Donato; Molari, Massimiliano; d’Errico, Giuseppe; Baldrighi, Elisa; Pala, Claudia; Manini, Elena

    2013-01-01

    The deep-sea represents a substantial portion of the biosphere and has a major influence on carbon cycling and global biogeochemistry. Benthic deep-sea prokaryotes have crucial roles in this ecosystem, with their recycling of organic matter from the photic zone. Despite this, little is known about the large-scale distribution of prokaryotes in the surface deep-sea sediments. To assess the influence of environmental and trophic variables on the large-scale distribution of prokaryotes, we investigated the prokaryotic assemblage composition (Bacteria to Archaea and Euryarchaeota to Crenarchaeota ratio) and activity in the surface deep-sea sediments of the Mediterranean Sea and the adjacent North Atlantic Ocean. Prokaryotic abundance and biomass did not vary significantly across the Mediterranean Sea; however, there were depth-related trends in all areas. The abundance of prokaryotes was positively correlated with the sedimentary concentration of protein, an indicator of the quality and bioavailability of organic matter. Moving eastwards, the Bacteria contribution to the total prokaryotes decreased, which appears to be linked to the more oligotrophic conditions of the Eastern Mediterranean basins. Despite the increased importance of Archaea, the contributions of Crenarchaeota Marine Group I to the total pool was relatively constant across the investigated stations, with the exception of Matapan-Vavilov Deep, in which Euryarchaeota Marine Group II dominated. Overall, our data suggest that deeper areas of the Mediterranean Sea share more similar communities with each other than with shallower sites. Freshness and quality of sedimentary organic matter were identified through Generalized Additive Model analysis as the major factors for describing the variation in the prokaryotic community structure and activity in the surface deep-sea sediments. 
Longitude was also important in explaining the observed variability, which suggests that the overlying water masses might have a

  4. Large-scale distribution and activity of prokaryotes in deep-sea surface sediments of the Mediterranean Sea and the adjacent Atlantic Ocean.

    PubMed

    Giovannelli, Donato; Molari, Massimiliano; d'Errico, Giuseppe; Baldrighi, Elisa; Pala, Claudia; Manini, Elena

    2013-01-01

    The deep-sea represents a substantial portion of the biosphere and has a major influence on carbon cycling and global biogeochemistry. Benthic deep-sea prokaryotes have crucial roles in this ecosystem, with their recycling of organic matter from the photic zone. Despite this, little is known about the large-scale distribution of prokaryotes in the surface deep-sea sediments. To assess the influence of environmental and trophic variables on the large-scale distribution of prokaryotes, we investigated the prokaryotic assemblage composition (Bacteria to Archaea and Euryarchaeota to Crenarchaeota ratio) and activity in the surface deep-sea sediments of the Mediterranean Sea and the adjacent North Atlantic Ocean. Prokaryotic abundance and biomass did not vary significantly across the Mediterranean Sea; however, there were depth-related trends in all areas. The abundance of prokaryotes was positively correlated with the sedimentary concentration of protein, an indicator of the quality and bioavailability of organic matter. Moving eastwards, the Bacteria contribution to the total prokaryotes decreased, which appears to be linked to the more oligotrophic conditions of the Eastern Mediterranean basins. Despite the increased importance of Archaea, the contributions of Crenarchaeota Marine Group I to the total pool was relatively constant across the investigated stations, with the exception of Matapan-Vavilov Deep, in which Euryarchaeota Marine Group II dominated. Overall, our data suggest that deeper areas of the Mediterranean Sea share more similar communities with each other than with shallower sites. Freshness and quality of sedimentary organic matter were identified through Generalized Additive Model analysis as the major factors for describing the variation in the prokaryotic community structure and activity in the surface deep-sea sediments. 
Longitude was also important in explaining the observed variability, which suggests that the overlying water masses might have a

  5. Integrating SMOS brightness temperatures with a new conceptual spatially distributed hydrological model for improving flood and drought predictions at large scale.

    NASA Astrophysics Data System (ADS)

    Hostache, Renaud; Rains, Dominik; Chini, Marco; Lievens, Hans; Verhoest, Niko E. C.; Matgen, Patrick

    2017-04-01

    Motivated by climate change and its impact on the scarcity or excess of water in many parts of the world, several agencies and research institutions have taken initiatives in monitoring and predicting the hydrologic cycle at a global scale. Such a monitoring/prediction effort is important for understanding the vulnerability to extreme hydrological events and for providing early warnings. This can be based on an optimal combination of hydro-meteorological models and remote sensing, in which satellite measurements can be used as forcing or calibration data or for regularly updating the model states or parameters. Many advances have been made in these domains, and the near future will bring new opportunities with respect to remote sensing as a result of the increasing number of spaceborne sensors enabling the large-scale monitoring of water resources. Besides these advances, there is currently a tendency to refine and further complicate physically based hydrologic models to better capture the hydrologic processes at hand. However, this may not necessarily be beneficial for large-scale hydrology, as computational efforts increase significantly. A novel thematic science question to be investigated is therefore whether a flexible conceptual model can match the performance of a complex physically based model for hydrologic simulations at large scale. In this context, the main objective of this study is to investigate how innovative techniques that allow for the estimation of soil moisture from satellite data can help in reducing errors and uncertainties in large-scale conceptual hydro-meteorological modelling. A spatially distributed conceptual hydrologic model has been set up based on recent developments of the SUPERFLEX modelling framework. As it requires limited computational effort, this model enables early warnings for large areas.
Using as forcings the ERA-Interim public dataset and coupled with the CMEM radiative transfer model
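The conceptual models referenced above are built from simple storage elements. A minimal sketch of one such element, a single linear reservoir that releases a fixed fraction of its storage each step, illustrates why such models are computationally cheap; this is only in the spirit of flexible frameworks like SUPERFLEX, and the function name and parameter values are hypothetical:

```python
def linear_reservoir(precip, k=0.2, storage0=0.0):
    """Route a precipitation series through one linear reservoir.

    Each step: storage gains the precipitation input, then releases
    a fixed fraction k of storage as discharge (Q = k * S).
    """
    storage = storage0
    discharge = []
    for p in precip:
        storage += p
        q = k * storage     # outflow proportional to current storage
        storage -= q
        discharge.append(q)
    return discharge

# A pulse of rain followed by dry steps yields a receding hydrograph.
flows = linear_reservoir([10.0, 0.0, 0.0, 5.0])
```

Real frameworks compose many such reservoirs in series and parallel, but each step is still just a handful of arithmetic operations per element, which is what keeps large-area simulation tractable.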

  6. Large Scale Nonlinear Programming.

    DTIC Science & Technology

    1978-06-15

    KEY WORDS: large scale optimization; applications of nonlinear programming. LARGE SCALE NONLINEAR PROGRAMMING by Garth P. McCormick. 1. Introduction. The general mathematical programming (optimization) problem can be stated in the following form... because the difficulty in solving a general nonlinear optimization problem has as much to do with the nature of the functions involved as it does with the

  7. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    PubMed

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit as well as the open source JOELib package. As an application, for this set of compounds, the agreement of log P and TPSA between the packages was compared. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
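The pattern described above, farming independent per-molecule property calculations out to workers and then comparing two implementations to flag outliers, can be sketched compactly. This is not the ChemStar code: Python's multiprocessing stands in for Java RMI, and the two "property" functions are toy stand-ins for the Marvin and JOELib calculators:

```python
from multiprocessing import Pool

def prop_a(mol_weight):
    # Hypothetical property as computed by package A.
    return 0.01 * mol_weight

def prop_b(mol_weight):
    # Same property under package B's (deliberately divergent) algorithm.
    return 0.01 * mol_weight + (0.5 if mol_weight > 400 else 0.0)

def flag_outliers(weights, tol=0.1):
    """Compute both properties in parallel; return inputs where they disagree."""
    with Pool(2) as pool:
        a = pool.map(prop_a, weights)
        b = pool.map(prop_b, weights)
    return [w for w, x, y in zip(weights, a, b) if abs(x - y) > tol]

if __name__ == "__main__":
    flagged = flag_outliers([100.0, 250.0, 450.0])
```

The design point is that each molecule is independent, so the work partitions trivially across clients; the disagreement filter at the end mirrors the log P/TPSA comparison in the record above.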

  8. A Large-Scale Distribution of Milk-Based Fortified Spreads: Evidence for a New Approach in Regions with High Burden of Acute Malnutrition

    PubMed Central

    Defourny, Isabelle; Minetti, Andrea; Harczi, Géza; Doyon, Stéphane; Shepherd, Susan; Tectonidis, Milton; Bradol, Jean-Hervé; Golden, Michael

    2009-01-01

    Background There are 146 million underweight children in the developing world, who contribute to up to half of the world's child deaths. In high burden regions for malnutrition, the treatment of individual children is limited by available resources. Here, we evaluate the effect of a large-scale distribution of a nutritional supplement on the prevention of wasting. Methods and Findings A new ready-to-use food (RUF) was developed as a diet supplement for children under three. The intervention consisted of six monthly distributions of RUF during the 2007 hunger gap in a district of the Maradi region, Niger, for approximately 60,000 children (length: 60–85 cm). At each distribution, all children over 65 cm had their Mid-Upper Arm Circumference (MUAC) recorded. Admission trends for severe wasting (WFH<70% NCHS) in Maradi, 2002–2005, show an increase every year during the hunger gap. In contrast, in 2007, throughout the period of the distribution, the incidence of severe acute malnutrition (MUAC<110 mm) remained at extremely low levels. Comparison of year-over-year admissions to the therapeutic feeding program shows that the 2007 blanket distribution had essentially the same flattening effect on the seasonal rise in admissions as the 2006 individualized treatment of almost 60,000 moderately wasted children. Conclusions These results demonstrate the potential for distribution of fortified spreads to reduce the incidence of severe wasting in a large population of children 6–36 months of age. Although further information is needed on the cost-effectiveness of such distributions, these results highlight the importance of re-evaluating current nutritional strategies and international recommendations for high burden areas of childhood malnutrition. PMID:19421316
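The screening rule quoted in this record is a single threshold: a mid-upper arm circumference below 110 mm classifies a child as severely acutely malnourished. A one-line sketch of that rule (the 110 mm cutoff is the one used in the study period; current WHO guidance uses different thresholds):

```python
def severe_acute_malnutrition(muac_mm):
    """Severe acute malnutrition per the study's MUAC cutoff (< 110 mm)."""
    return muac_mm < 110

# Example screening pass over measured circumferences (values illustrative).
screened = [105, 110, 132]
severe = [m for m in screened if severe_acute_malnutrition(m)]
```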

  9. A large-scale distribution of milk-based fortified spreads: evidence for a new approach in regions with high burden of acute malnutrition.

    PubMed

    Defourny, Isabelle; Minetti, Andrea; Harczi, Géza; Doyon, Stéphane; Shepherd, Susan; Tectonidis, Milton; Bradol, Jean-Hervé; Golden, Michael

    2009-01-01

    There are 146 million underweight children in the developing world, who contribute to up to half of the world's child deaths. In high burden regions for malnutrition, the treatment of individual children is limited by available resources. Here, we evaluate the effect of a large-scale distribution of a nutritional supplement on the prevention of wasting. A new ready-to-use food (RUF) was developed as a diet supplement for children under three. The intervention consisted of six monthly distributions of RUF during the 2007 hunger gap in a district of the Maradi region, Niger, for approximately 60,000 children (length: 60-85 cm). At each distribution, all children over 65 cm had their Mid-Upper Arm Circumference (MUAC) recorded. Admission trends for severe wasting (WFH<70% NCHS) in Maradi, 2002-2005, show an increase every year during the hunger gap. In contrast, in 2007, throughout the period of the distribution, the incidence of severe acute malnutrition (MUAC<110 mm) remained at extremely low levels. Comparison of year-over-year admissions to the therapeutic feeding program shows that the 2007 blanket distribution had essentially the same flattening effect on the seasonal rise in admissions as the 2006 individualized treatment of almost 60,000 moderately wasted children. These results demonstrate the potential for distribution of fortified spreads to reduce the incidence of severe wasting in a large population of children 6-36 months of age. Although further information is needed on the cost-effectiveness of such distributions, these results highlight the importance of re-evaluating current nutritional strategies and international recommendations for high burden areas of childhood malnutrition.

  10. Large-scale horizontally aligned ZnO microrod arrays with controlled orientation, periodic distribution as building blocks for chip-in piezo-phototronic LEDs.

    PubMed

    Guo, Zhen; Li, Haiwen; Zhou, Lianqun; Zhao, Dongxu; Wu, Yihui; Zhang, Zhiqiang; Zhang, Wei; Li, Chuanyu; Yao, Jia

    2015-01-27

    A novel method of fabricating large-scale horizontally aligned ZnO microrod arrays with controlled orientation and periodic distribution via combing technology is introduced. Horizontally aligned ZnO microrod arrays with uniform orientation and periodic distribution can be realized based on the conventional bottom-up method prepared vertically aligned ZnO microrod matrix via the combing method. When the combing parameters are changed, the orientation of horizontally aligned ZnO microrod arrays can be adjusted (θ = 90° or 45°) in a plane and a misalignment angle of the microrods (0.3° to 2.3°) with low-growth density can be obtained. To explore the potential applications based on the vertically and horizontally aligned ZnO microrods on p-GaN layer, piezo-phototronic devices such as heterojunction LEDs are built. Electroluminescence (EL) emission patterns can be adjusted for the vertically and horizontally aligned ZnO microrods/p-GaN heterojunction LEDs by applying forward bias. Moreover, the emission color from UV-blue to yellow-green can be tuned by investigating the piezoelectric properties of the materials. The EL emission mechanisms of the LEDs are discussed in terms of band diagrams of the heterojunctions and carrier recombination processes. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Large scale tracking algorithms

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination but, unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied to detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
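The combinatorial growth noted above is why single-hypothesis association is the usual baseline against which multi-hypothesis trackers are compared: each track greedily claims its nearest unassigned detection inside a gate, so cost stays near linear in the number of tracks. A minimal 1-D sketch (positions, gate size, and the function itself are illustrative, not from the report):

```python
def greedy_associate(tracks, detections, gate=2.0):
    """Return {track_index: detection_index} for gated nearest-neighbor matches."""
    assignments = {}
    free = set(range(len(detections)))        # detections not yet claimed
    for ti, t in enumerate(tracks):
        best = min(free, key=lambda di: abs(detections[di] - t), default=None)
        if best is not None and abs(detections[best] - t) <= gate:
            assignments[ti] = best
            free.remove(best)
    return assignments

# Two tracks, three detections; the far detection (50.0) stays unassigned.
pairs = greedy_associate([0.0, 10.0], [0.4, 9.1, 50.0])
```

A multi-hypothesis tracker would instead keep every gated track/detection pairing alive as a branching hypothesis tree, which is exactly the explosion the report describes.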

  12. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L. |; Rickert, M. |

    1997-04-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual persons' behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches depend both on the specific questions and on the prospective user community. The approaches range from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.
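The "particle" updates above are cheap because each vehicle follows simple cellular-automaton rules; the single-bit implementations mentioned are of this kind. A minimal sketch of one such rule set, in the style of the Nagel-Schreckenberg lane model (a deterministic `rng` is injected here so the example is reproducible; vmax, braking probability, and lane length are illustrative):

```python
import random

def nasch_step(pos, vel, length, vmax=5, p_brake=0.3, rng=random.random):
    """One parallel update of a circular single-lane road of `length` cells."""
    order = sorted(range(len(pos)), key=lambda i: pos[i])
    for idx, i in enumerate(order):
        ahead = order[(idx + 1) % len(order)]
        gap = (pos[ahead] - pos[i] - 1) % length   # free cells to next car
        v = min(vel[i] + 1, vmax, gap)             # accelerate, keep headway
        if v > 0 and rng() < p_brake:              # random slowdown
            v -= 1
        vel[i] = v
    for i in range(len(pos)):                      # move all cars in parallel
        pos[i] = (pos[i] + vel[i]) % length
    return pos, vel

# Three cars, no random braking (rng always >= p_brake): all accelerate by 1.
pos, vel = nasch_step([0, 10, 20], [0, 0, 0], length=30, rng=lambda: 1.0)
```

Each update is a few integer operations per vehicle, which is how million-vehicle real-time rates become feasible on parallel hardware.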

  13. The Geographic Distribution of Loa loa in Africa: Results of Large-Scale Implementation of the Rapid Assessment Procedure for Loiasis (RAPLOA)

    PubMed Central

    Zouré, Honorat Gustave Marie; Wanji, Samuel; Noma, Mounkaïla; Amazigo, Uche Veronica; Diggle, Peter J.; Tekle, Afework Hailemariam; Remme, Jan H. F.

    2011-01-01

    Background Loiasis is a major obstacle to ivermectin treatment for onchocerciasis control and lymphatic filariasis elimination in central Africa. In communities with a high level of loiasis endemicity, there is a significant risk of severe adverse reactions to ivermectin treatment. Information on the geographic distribution of loiasis in Africa is urgently needed but available information is limited. The African Programme for Onchocerciasis Control (APOC) undertook large scale mapping of loiasis in 11 potentially endemic countries using a rapid assessment procedure for loiasis (RAPLOA) that uses a simple questionnaire on the history of eye worm. Methodology/Principal Findings RAPLOA surveys were done in a spatial sample of 4798 villages covering an area of 2500×3000 km centred on the heartland of loiasis in Africa. The surveys showed high risk levels of loiasis in 10 countries where an estimated 14.4 million people live in high risk areas. There was a strong spatial correlation among RAPLOA data, and kriging was used to produce spatially smoothed contour maps of the interpolated prevalence of eye worm and the predictive probability that the prevalence exceeds 40%. Conclusion/Significance The contour map of eye worm prevalence provides the first global map of loiasis based on actual survey data. It shows a clear distribution with two zones of hyper endemicity, large areas that are free of loiasis and several borderline or intermediate zones. The surveys detected several previously unknown hyperendemic foci, clarified the distribution of loiasis in the Central African Republic and large parts of the Republic of Congo and the Democratic Republic of Congo for which hardly any information was available, and confirmed known loiasis foci. The new maps of the prevalence of eye worm and the probability that the prevalence exceeds the risk threshold of 40% provide critical information for ivermectin treatment programs among millions of people in Africa. PMID:21738809
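The contour maps above were produced by kriging, which requires fitting a variogram model to the survey data; as a simpler stand-in that shows the same interpolation idea (a spatially weighted average of village-level prevalences), here is an inverse-distance-weighting sketch. All coordinates and prevalence values are hypothetical:

```python
def idw(x, y, samples, power=2.0):
    """Interpolate prevalence at (x, y) from (x, y, prevalence) survey points."""
    num = den = 0.0
    for sx, sy, val in samples:
        d2 = (x - sx) ** 2 + (y - sy) ** 2
        if d2 == 0.0:
            return val                        # exactly on a surveyed village
        w = 1.0 / d2 ** (power / 2.0)         # closer villages weigh more
        num += w * val
        den += w
    return num / den

# Midway between a 40% and a 60% village, the estimate is 50%.
est = idw(0.5, 0.0, [(0.0, 0.0, 40.0), (1.0, 0.0, 60.0)])
```

Kriging differs in that its weights come from the fitted spatial covariance and it also yields the prediction uncertainty needed for the "probability that prevalence exceeds 40%" maps.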

  14. The large-scale distribution of ammonia oxidizers in paddy soils is driven by soil pH, geographic distance, and climatic factors

    PubMed Central

    Hu, Hang-Wei; Zhang, Li-Mei; Yuan, Chao-Lei; Zheng, Yong; Wang, Jun-Tao; Chen, Deli; He, Ji-Zheng

    2015-01-01

    Paddy soils distribute widely from temperate to tropical regions, and are characterized by intensive nitrogen fertilization practices in China. Mounting evidence has confirmed the functional importance of ammonia-oxidizing archaea (AOA) and bacteria (AOB) in soil nitrification, but little is known about their biogeographic distribution patterns in paddy ecosystems. Here, we used barcoded pyrosequencing to characterize the effects of climatic, geochemical and spatial factors on the distribution of ammonia oxidizers from 11 representative rice-growing regions (75–1945 km apart) of China. Potential nitrification rates varied greatly by more than three orders of magnitude, and were significantly correlated with the abundances of AOA and AOB. The community composition of ammonia oxidizers was affected by multiple factors, but changes in relative abundances of the major lineages could be best predicted by soil pH. The alpha diversity of AOA and AOB displayed contrasting trends over the gradients of latitude and atmospheric temperature, indicating a possible niche separation between AOA and AOB along the latitude. The Bray–Curtis dissimilarities in ammonia-oxidizing community structure significantly increased with increasing geographical distance, indicating that more geographically distant paddy fields tend to harbor more dissimilar ammonia oxidizers. Variation partitioning analysis revealed that spatial, geochemical and climatic factors could jointly explain the majority of the data variation, and were important drivers defining the ecological niches of AOA and AOB. Our findings suggest that both AOA and AOB are of functional importance in paddy soil nitrification, and ammonia oxidizers in paddy ecosystems exhibit large-scale biogeographic patterns shaped by soil pH, geographic distance, and climatic factors. PMID:26388866
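The Bray-Curtis dissimilarity used above to compare communities across sites has a simple closed form: the sum of absolute abundance differences divided by the total abundance, giving 0 for identical communities and 1 for communities with no shared taxa. A direct sketch (the abundance vectors are made up):

```python
def bray_curtis(a, b):
    """Bray-Curtis dissimilarity between two equal-length abundance vectors."""
    diff = sum(abs(x - y) for x, y in zip(a, b))
    total = sum(a) + sum(b)
    return diff / total

d_same = bray_curtis([5, 3, 2], [5, 3, 2])   # identical communities
d_disjoint = bray_curtis([5, 0], [0, 5])     # no shared taxa
```

The distance-decay result in the record corresponds to this value rising, on average, with the geographic separation of the paired paddy fields.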

  15. Large-Scale Analysis of the Prevalence and Geographic Distribution of HIV-1 Non-B Variants in the United States

    PubMed Central

    Hackett, John; Holzmayer, Vera; Hillyard, David R.

    2013-01-01

    The genetic diversity of human immunodeficiency virus type 1 (HIV-1) has significant implications for diagnosis, vaccine development, and clinical management of patients. Although HIV-1 subtype B is predominant in the United States, factors such as global travel, immigration, and military deployment have the potential to increase the proportion of non-subtype B infections. Limited data are available on the prevalence and distribution of non-B HIV-1 strains in the United States. We sought to retrospectively examine the prevalence, geographic distribution, diversity, and temporal trends of HIV-1 non-B infections in samples obtained by ARUP Laboratories, a national reference laboratory, from all regions of the United States. HIV-1 pol sequences from 24,386 specimens collected from 46 states between 2004 and September 2011 for drug resistance genotyping were analyzed using the REGA HIV-1 Subtyping Tool, version 2.0. Sequences refractory to subtype determination or reported as non-subtype B by this tool were analyzed by PHYLIP version 3.5 and Simplot version 3.5.1. Non-subtype B strains accounted for 3.27% (798/24,386) of specimens. The 798 non-B specimens were received from 37 states and included 5 subtypes, 23 different circulating recombinant forms (CRFs), and 39 unique recombinant forms (URFs). The non-subtype B prevalence varied from 0% in 2004 (0/54) to 4.12% in 2011 (201/4,884). This large-scale analysis reveals that the diversity of HIV-1 in the United States is high, with multiple subtypes, CRFs, and URFs circulating. Moreover, the geographic distribution of non-B variants is widespread. Data from HIV-1 drug resistance testing have the potential to significantly enhance the surveillance of HIV-1 variants in the United States. PMID:23761148

  16. The effects of Reynolds number, rotor incidence angle and surface roughness on the heat transfer distribution in a large-scale turbine rotor passage

    NASA Technical Reports Server (NTRS)

    Blair, M. F.

    1991-01-01

    A combined experimental and computational program was conducted to examine the heat transfer distribution in a turbine rotor passage geometrically similar to the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP). Heat transfer was measured and computed for both the full span suction and pressure surfaces of the rotor airfoil as well as for the hub endwall surface. The objective of the program was to provide a benchmark-quality database for the assessment of rotor heat transfer computational techniques. The experimental portion of the study was conducted in a large scale, ambient temperature, rotating turbine model. The computational portion consisted of the application of a well-posed parabolized Navier-Stokes analysis to the calculation of the three-dimensional viscous flow through ducts simulating a gas turbine passage. The results of this assessment indicate that the procedure has the potential to predict the aerodynamics and the heat transfer in a gas turbine passage and can be used to develop detailed three-dimensional turbulence models for the prediction of skin friction and heat transfer in complex three-dimensional flow passages.

  17. Towards a Scalable and Adaptive Application Support Platform for Large-Scale Distributed E-Sciences in High-Performance Network Environments

    SciTech Connect

    Wu, Chase Qishi; Zhu, Michelle Mengxia

    2016-06-06

    The advent of large-scale collaborative scientific applications has demonstrated the potential for broad scientific communities to pool globally distributed resources to produce unprecedented data acquisition, movement, and analysis. System resources including supercomputers, data repositories, computing facilities, network infrastructures, storage systems, and display devices have been increasingly deployed at national laboratories and academic institutes. These resources are typically shared by large communities of users over the Internet or dedicated networks and hence exhibit an inherent dynamic nature in their availability, accessibility, capacity, and stability. Scientific applications using either experimental facilities or computation-based simulations with various physical, chemical, climatic, and biological models feature diverse scientific workflows, as simple as linear pipelines or as complex as directed acyclic graphs, which must be executed and supported over wide-area networks with massively distributed resources. Application users oftentimes need to manually configure their computing tasks over networks in an ad hoc manner, significantly limiting the productivity of scientists and constraining the utilization of resources. The success of these large-scale distributed applications requires a highly adaptive and massively scalable workflow platform that provides automated and optimized computing and networking services. This project is to design and develop a generic Scientific Workflow Automation and Management Platform (SWAMP), which contains a web-based user interface specially tailored for a target application, a set of user libraries, and several easy-to-use computing and networking toolkits for application scientists to conveniently assemble, execute, monitor, and control complex computing workflows in heterogeneous high-performance network environments. SWAMP will enable the automation and management of the entire process of scientific

  18. Distribution of persistent organic pollutants, polycyclic aromatic hydrocarbons and trace elements in soil and vegetation following a large scale landfill fire in northern Greece.

    PubMed

    Chrysikou, Loukia; Gemenetzis, Panagiotis; Kouras, Athanasios; Manoli, Evangelia; Terzi, Eleni; Samara, Constantini

    2008-02-01

    Polycyclic aromatic hydrocarbons (PAHs), polychlorinated biphenyls (PCBs), organochlorine pesticides (OCPs), including hexachlorocyclohexanes (HCHs) and DDTs, as well as trace elements were determined in soil and vegetation samples collected from the surrounding area of the landfill "Tagarades", the biggest in northern Greece, following a large scale fire involving approximately 50,000 tons of municipal waste. High concentrations of total PAHs, PCBs and heavy metals were found inside the landfill (1475 microg kg(-1) dw, 399 microg kg(-1) dw and 29.8 mg kg(-1) dw, respectively), whereas concentrations in the surrounding soils were far lower, ranging between 11.2-28.1 microg kg(-1) dw for PAHs, 4.02-11.2 microg kg(-1) dw for PCBs and 575-1207 mg kg(-1) dw for heavy metals. The distributions of HCHs and DDTs were quite different, since certain soils exhibited equal or higher concentrations than the landfill. In vegetation, the concentrations of PAHs, PCBs, HCHs and DDTs ranged from 14.1-34.7, 3.64-25.9, 1.41-32.1 and 0.61-4.03 microg kg(-1) dw, respectively, while those of heavy metals ranged from 81 to 159 mg kg(-1) dw. The results of the study indicated soil and vegetation pollution levels in the surroundings of the landfill comparable to those reported for other Greek locations. The impact from the landfill fire was not evident, partially due to the presence of recent and past inputs from other activities (agriculture, vehicular transport, earlier landfill fires).

  19. The effects of Reynolds number, rotor incidence angle, and surface roughness on the heat transfer distribution in a large-scale turbine rotor passage

    NASA Technical Reports Server (NTRS)

    Blair, Michael F.; Anderson, Olof L.

    1989-01-01

    A combined experimental and computational program was conducted to examine the heat transfer distribution in a turbine rotor passage geometrically similar to the Space Shuttle Main Engine (SSME) High Pressure Fuel Turbopump (HPFTP). Heat transfer was measured and computed for both the full-span suction and pressure surfaces of the rotor airfoil as well as for the hub endwall surface. The primary objective of the program was to provide a benchmark-quality database for the assessment of rotor passage heat transfer computational procedures. The experimental portion of the study was conducted in a large-scale, ambient temperature, rotating turbine model. Heat transfer data were obtained using thermocouple and liquid-crystal techniques to measure temperature distributions on the thin, electrically-heated skin of the rotor passage model. Test data were obtained for various combinations of Reynolds number, rotor incidence angle and model surface roughness. The data are reported in the form of contour maps of Stanton number. These heat distribution maps revealed numerous local effects produced by the three-dimensional flows within the rotor passage. Of particular importance were regions of local enhancement produced on the airfoil suction surface by the main-passage and tip-leakage vortices and on the hub endwall by the leading-edge horseshoe vortex system. The computational portion consisted of the application of a well-posed parabolized Navier-Stokes analysis to the calculation of the three-dimensional viscous flow through ducts simulating a gas turbine passage. These cases include a 90 deg turning duct, a gas turbine cascade simulating a stator passage, and a gas turbine rotor passage including Coriolis forces. The calculated results were evaluated using experimental data of the three-dimensional velocity fields, wall static pressures, and wall heat transfer on the suction surface of the turbine airfoil and on the end wall. Particular attention was paid to an

  20. Impact of interannual changes of large scale circulation and hydrography on the spatial distribution of beaked redfish (Sebastes mentella) in the Irminger Sea

    NASA Astrophysics Data System (ADS)

    Núñez-Riboni, Ismael; Kristinsson, Kristján; Bernreuther, Matthias; van Aken, Hendrik M.; Stransky, Christoph; Cisewski, Boris; Rolskiy, Alexey

    2013-12-01

    This study provides evidence of the influence of hydrography and large scale ocean circulation on the geographical distribution of beaked redfish (Sebastes mentella) in the Irminger Sea on the interannual time scale, from 1992 to 2011. The results reveal the average relationship of adult pelagic redfish to their physical habitat from 100 to 800 m depth: the most preferred latitude, longitude, depth, temperature and salinity for redfish are approximately 58°N, 41°W, 557 m, 4.5 °C and 34.87, respectively. The redfish habitat corresponds in a temperature-salinity (TS) diagram to a mixing triangle between East Greenland Current Water (EGCW), Labrador Sea Water (LSW) and Irminger Current Water (ICW). The geographical centre of mass of the redfish distribution (as revealed by acoustic fish density) indicates displacements from year to year. Changes in hydrographic conditions were investigated in detail for possible reasons for these displacements. Empirical Orthogonal Analysis reveals that maximum variations of water mass volume on an interannual time-scale in the study region correspond to ICW and LSW changes, while EGCW remains comparatively stable. Indices of redfish geographical centroid, LSW volume, ICW temperature and Subpolar Gyre (SPG) intensity suggest that the geographical redfish displacements are closely related to interannual changes of ICW modulated by the SPG intensity with a lag of 1 or 2 years. In comparison, LSW seems to have no impact on the redfish distribution at the studied depth range. The time lag between ICW and redfish displacements indicates an indirect influence of temperature on redfish. Hence, changes of chlorophyll-a (from satellite imagery), as a proxy for primary production, were used in a first approach to study the role of food availability. The analysis is based on acoustic and trawl data from nine expeditions coordinated by the International Council for the Exploration of the Sea (ICES), around 71,000 hydrographic stations from the
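The Empirical Orthogonal Function (EOF) analysis used above to isolate the dominant modes of water-mass variability reduces to a singular value decomposition of the time-by-space anomaly matrix; the squared singular values give each mode's share of the variance. A sketch on synthetic data (the field here is a single coherent sine-driven mode, not the hydrographic data of the study):

```python
import numpy as np

def eof_modes(field):
    """Rows = time steps, columns = spatial points.

    Returns (spatial patterns, fraction of variance per mode).
    """
    anomalies = field - field.mean(axis=0)          # remove the time mean
    _, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    var_frac = s**2 / np.sum(s**2)
    return vt, var_frac

t = np.linspace(0, 2 * np.pi, 50)
field = np.outer(np.sin(t), [1.0, -1.0, 0.5])       # one coherent mode
patterns, var_frac = eof_modes(field)
```

For a field driven by one coherent oscillation, the leading mode captures essentially all of the variance; in the study, the leading modes correspond to ICW and LSW volume changes.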

  1. Cosmology with Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Ho, Shirley; Cuesta, A.; Ross, A.; Seo, H.; DePutter, R.; Padmanabhan, N.; White, M.; Myers, A.; Bovy, J.; Blanton, M.; Hernandez, C.; Mena, O.; Percival, W.; Prada, F.; Ross, N. P.; Saito, S.; Schneider, D.; Skibba, R.; Smith, K.; Slosar, A.; Strauss, M.; Verde, L.; Weinberg, D.; Bachall, N.; Brinkmann, J.; da Costa, L. A.

    2012-01-01

    The Sloan Digital Sky Survey I-III surveyed 14,000 square degrees and delivered over a trillion pixels of imaging data. I present cosmological results from this unprecedented data set, which contains over a million galaxies distributed between redshifts of 0.45 and 0.70. With such a large data set, high precision cosmological constraints can be obtained given careful control and understanding of observational systematics. I present a novel treatment of observational systematics and its application to the clustering signals from the data set. I will present cosmological constraints on the dark components of the Universe and the tightest constraints to date on the non-Gaussianity of the early Universe, utilizing Large Scale Structure.

  2. Overlap of Spoilage-Associated Microbiota between Meat and the Meat Processing Environment in Small-Scale and Large-Scale Retail Distributions

    PubMed Central

    Stellato, Giuseppina; La Storia, Antonietta; De Filippis, Francesca; Borriello, Giorgia; Villani, Francesco

    2016-01-01

    Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The aims of this study were to learn more about the possible influence of the meat processing environment on initial fresh meat contamination and to investigate the differences between small-scale retail distribution (SD) and large-scale retail distribution (LD) facilities. Samples were collected from butcheries (n = 20), including LD (n = 10) and SD (n = 10) facilities, over two sampling campaigns. Samples included fresh beef and pork cuts and swab samples from the knife, the chopping board, and the butcher's hand. The microbiota of both meat samples and environmental swabs were very complex, including more than 800 operational taxonomic units (OTUs) collapsed at the species level. The 16S rRNA sequencing analysis showed that core microbiota were shared by 80% of the samples and included Pseudomonas spp., Streptococcus spp., Brochothrix spp., Psychrobacter spp., and Acinetobacter spp. Hierarchical clustering of the samples based on the microbiota showed a certain separation between meat and environmental samples, with higher levels of Proteobacteria in meat. In particular, levels of Pseudomonas and several Enterobacteriaceae members were significantly higher in meat samples, while Brochothrix, Staphylococcus, lactic acid bacteria, and Psychrobacter prevailed in environmental swab samples. Consistent clustering was also observed when metabolic activities were considered by predictive metagenomic analysis of the samples. An increase in carbohydrate metabolism was predicted for the environmental swabs and was consistently linked to Firmicutes, while increases in pathways related to amino acid and lipid metabolism were predicted for the meat samples and were positively correlated with Proteobacteria. Our results highlighted the importance of the processing environment in contributing to the initial microbial levels of meat and clearly showed that the type

  3. Overlap of Spoilage-Associated Microbiota between Meat and the Meat Processing Environment in Small-Scale and Large-Scale Retail Distributions.

    PubMed

    Stellato, Giuseppina; La Storia, Antonietta; De Filippis, Francesca; Borriello, Giorgia; Villani, Francesco; Ercolini, Danilo

    2016-07-01

    Microbial contamination in food processing plants can play a fundamental role in food quality and safety. The aims of this study were to learn more about the possible influence of the meat processing environment on initial fresh meat contamination and to investigate the differences between small-scale retail distribution (SD) and large-scale retail distribution (LD) facilities. Samples were collected from butcheries (n = 20), including LD (n = 10) and SD (n = 10) facilities, over two sampling campaigns. Samples included fresh beef and pork cuts and swab samples from the knife, the chopping board, and the butcher's hand. The microbiota of both meat samples and environmental swabs were very complex, including more than 800 operational taxonomic units (OTUs) collapsed at the species level. The 16S rRNA sequencing analysis showed that core microbiota were shared by 80% of the samples and included Pseudomonas spp., Streptococcus spp., Brochothrix spp., Psychrobacter spp., and Acinetobacter spp. Hierarchical clustering of the samples based on the microbiota showed a certain separation between meat and environmental samples, with higher levels of Proteobacteria in meat. In particular, levels of Pseudomonas and several Enterobacteriaceae members were significantly higher in meat samples, while Brochothrix, Staphylococcus, lactic acid bacteria, and Psychrobacter prevailed in environmental swab samples. Consistent clustering was also observed when metabolic activities were considered by predictive metagenomic analysis of the samples. 
An increase in carbohydrate metabolism was predicted for the environmental swabs and was consistently linked to Firmicutes, while increases in pathways related to amino acid and lipid metabolism were predicted for the meat samples and were positively correlated with Proteobacteria. Our results highlighted the importance of the processing environment in contributing to the initial microbial levels of meat and clearly showed that the type of retail

  4. Large-scale cortical networks and cognition.

    PubMed

    Bressler, S L

    1995-03-01

    The well-known parcellation of the mammalian cerebral cortex into a large number of functionally distinct cytoarchitectonic areas presents a problem for understanding the complex cortical integrative functions that underlie cognition. How do cortical areas having unique individual functional properties cooperate to accomplish these complex operations? Do neurons distributed throughout the cerebral cortex act together in large-scale functional assemblages? This review examines the substantial body of evidence supporting the view that complex integrative functions are carried out by large-scale networks of cortical areas. Pathway tracing studies in non-human primates have revealed widely distributed networks of interconnected cortical areas, providing an anatomical substrate for large-scale parallel processing of information in the cerebral cortex. Functional coactivation of multiple cortical areas has been demonstrated by neurophysiological studies in non-human primates and several different cognitive functions have been shown to depend on multiple distributed areas by human neuropsychological studies. Electrophysiological studies on interareal synchronization have provided evidence that active neurons in different cortical areas may become not only coactive, but also functionally interdependent. The computational advantages of synchronization between cortical areas in large-scale networks have been elucidated by studies using artificial neural network models. Recent observations of time-varying multi-areal cortical synchronization suggest that the functional topology of a large-scale cortical network is dynamically reorganized during visuomotor behavior.

  5. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  6. On using large scale correlation of the Ly-α forest and redshifted 21-cm signal to probe HI distribution during the post reionization era

    SciTech Connect

    Sarkar, Tapomoy Guha; Datta, Kanan K. E-mail: kanan.physics@presiuniv.ac.in

    2015-08-01

    We investigate the possibility of detecting the 3D cross correlation power spectrum of the Ly-α forest and the HI 21 cm signal from the post reionization epoch. The cross-correlation signal is directly dependent on the dark matter power spectrum and is sensitive to the 21-cm brightness temperature and Ly-α forest biases; these bias parameters dictate the strength of anisotropy in redshift space. We find that the cross-correlation power spectrum can be detected on large scales using the linear bias model, at a peak SNR of 15 for a single field experiment at redshift z = 2.5, using 400 hrs of observation with SKA-mid (phase 1) and a futuristic BOSS-like experiment with a quasar (QSO) density of 30 deg{sup −2}. We also study the possibility of constraining various bias parameters using the cross power spectrum. We find that with the same experiment the 1σ (conditional) errors on the 21-cm linear redshift space distortion parameter β{sub T} and the corresponding Ly-α forest parameter β{sub F} are ∼ 2.7% and ∼ 1.4%, respectively, for 10 independent pointings of SKA-mid (phase 1). This prediction indicates a significant improvement over existing measurements. We claim that the detection of the 3D cross correlation power spectrum will not only ascertain the cosmological origin of the signal in the presence of astrophysical foregrounds but will also provide stringent constraints on large scale HI biases. This provides an independent probe towards understanding cosmological structure formation.

  7. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique, and then embody the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  8. Large-scale circuit simulation

    NASA Astrophysics Data System (ADS)

    Wei, Y. P.

    1982-12-01

    The simulation of VLSI (Very Large Scale Integration) circuits falls beyond the capabilities of conventional circuit simulators like SPICE. On the other hand, conventional logic simulators can only give results as logic levels 1 and 0, with the attendant loss of detail in the waveforms. The aim of developing large-scale circuit simulation is to bridge the gap between conventional circuit simulation and logic simulation. This research investigates new approaches for fast and relatively accurate time-domain simulation of MOS (Metal Oxide Semiconductor), LSI (Large Scale Integration) and VLSI circuits. New techniques and new algorithms are studied in the following areas: (1) analysis sequencing, (2) nonlinear iteration, (3) the modified Gauss-Seidel method, and (4) latency criteria and a timestep control scheme. The developed methods have been implemented in a simulation program, PREMOS, which can be used as a design verification tool for MOS circuits.
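    The Gauss-Seidel iteration named in the abstract can be illustrated with a minimal sketch (this is the textbook method applied to a toy nodal system, not the modified variant implemented in PREMOS; the matrix values are invented):

    ```python
    import numpy as np

    def gauss_seidel(A, b, x0=None, tol=1e-10, max_iter=500):
        """Solve A x = b by Gauss-Seidel sweeps.

        Converges for diagonally dominant matrices, such as the nodal
        conductance matrices that arise in circuit simulation.
        """
        n = len(b)
        x = np.zeros(n) if x0 is None else x0.astype(float).copy()
        for _ in range(max_iter):
            x_old = x.copy()
            for i in range(n):
                # Use already-updated values for j < i (the Gauss-Seidel idea)
                s = A[i, :i] @ x[:i] + A[i, i + 1:] @ x_old[i + 1:]
                x[i] = (b[i] - s) / A[i, i]
            if np.linalg.norm(x - x_old, ord=np.inf) < tol:
                break
        return x

    # Toy 3-node conductance system (diagonally dominant)
    A = np.array([[4.0, -1.0, 0.0],
                  [-1.0, 4.0, -1.0],
                  [0.0, -1.0, 4.0]])
    b = np.array([1.0, 2.0, 3.0])
    x = gauss_seidel(A, b)
    ```

    Because each sweep reuses values updated earlier in the same sweep, Gauss-Seidel typically converges faster than a plain Jacobi iteration, which is why relaxation schemes of this kind are attractive for large circuit matrices.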

  9. Large Scale Dynamos in Stars

    NASA Astrophysics Data System (ADS)

    Vishniac, Ethan T.

    2015-01-01

    We show that a differentially rotating conducting fluid automatically creates a magnetic helicity flux with components along the rotation axis and in the direction of the local vorticity. This drives a rapid growth in the local density of current helicity, which in turn drives a large scale dynamo. The dynamo growth rate derived from this process is not constant, but depends inversely on the large scale magnetic field strength. This dynamo saturates when buoyant losses of magnetic flux compete with the large scale dynamo, providing a simple prediction for magnetic field strength as a function of Rossby number in stars. Increasing anisotropy in the turbulence produces a decreasing magnetic helicity flux, which explains the flattening of the B/Rossby number relation at low Rossby numbers. We also show that the kinetic helicity is always a subdominant effect. There is no kinematic dynamo in real stars.

  10. Variations over time in latitudinal distribution of the large-scale magnetic fields in the solar atmosphere at heights from the photosphere to the source surface

    NASA Astrophysics Data System (ADS)

    Akhtemov, Z. S.; Andreyeva, O. A.; Rudenko, G. V.; Stepanian, N. N.; Fainshtein, V. G.

    2015-02-01

    Calculations of the magnetic field in the solar atmosphere, based on the "potential field-source surface" model, have been used to study time variations in several parameters of the large-scale magnetic field at various heights during the last four solar cycles. At ten heights from the solar surface (R = Ro) to the source surface (R = 2.5Ro), we have constructed synoptic charts (SC) of the radial component Br of the estimated magnetic field. For these SC, we have identified 10-degree latitudinal zones. Within these zones, we found values of Sp (positive Br values averaged within the latitudinal zone over latitude and longitude), Sm (averaged modulus of negative Br values) and S + fields (the part of the latitudinal zone area (in %) occupied by positive Br values). At lower latitudes, cyclic variations in the Sp + Sm parameter are demonstrated to be similar (but not in detail) to time variations in Wolf numbers. Latitudes of 55° and higher exhibited virtually no cyclic peculiarities of time variations in this parameter. The authors believe that this indicates the diverse nature of the large-scale magnetic field in the near-equatorial and polar regions of the solar atmosphere. At R = 2.5Ro, Sp + Sm cyclic variations are almost invisible at all latitudes and only slightly apparent near the equator. The analysis of S + fields variations revealed that at low latitudes at R = 2.5Ro during solar cycles 21, 22 and the ascending phase of cycle 23 there were almost no mixed-polarity periods. However, beginning from the maximum of cycle 23, mixed polarity was observed in the near-equatorial region until the end of the long solar activity minimum. An assumption has been made that this might have been one of the forerunners and manifestations of the prolonged minimum between cycles 23 and 24. It has been found that during solar activity minima, poleward motion of magnetic fields with polarity opposite to that of the field at the pole appears. We have estimated the velocity of such a

  11. Very Large Scale Integration (VLSI).

    ERIC Educational Resources Information Center

    Yeaman, Andrew R. J.

    Very Large Scale Integration (VLSI), the state-of-the-art production techniques for computer chips, promises such powerful, inexpensive computing that, in the future, people will be able to communicate with computer devices in natural language or even speech. However, before full-scale VLSI implementation can occur, certain salient factors must be…

  12. Large-scale homogeneously distributed Ag-NPs with sub-10 nm gaps assembled on a two-layered honeycomb-like TiO2 film as sensitive and reproducible SERS substrates.

    PubMed

    Hu, Xiaoye; Meng, Guowen; Huang, Qing; Xu, Wei; Han, Fangming; Sun, Kexi; Xu, Qiaoling; Wang, Zhaoming

    2012-09-28

    We present a surface-enhanced Raman scattering (SERS) substrate featured by large-scale homogeneously distributed Ag nanoparticles (Ag-NPs) with sub-10 nm gaps assembled on a two-layered honeycomb-like TiO(2) film. The two-layered honeycomb-like TiO(2) film was achieved by a two-step anodization of pure Ti foil, with its upper layer consisting of hexagonally arranged shallow nano-bowls of 160 nm in diameter, and the lower layer consisting of arrays of about fifty vertically aligned sub-20 nm diameter nanopores. The shallow nano-bowls in the upper layer divide the whole TiO(2) film into regularly arranged arrays of uniform hexagonal nano-cells, leading to a similar distribution pattern for the ion-sputtered Ag-NPs in each nano-cell. The lower layer with sub-20 nm diameter nanopores prevents the aggregation of the sputtered Ag-NPs, so that the Ag-NPs can get much closer with gaps in the sub-10 nm range. Therefore, large-scale high-density and quasi-ordered sub-10 nm gaps between the adjacent Ag-NPs were achieved, which ensures homogeneously distributed 'hot spots' over a large area for the SERS effect. Moreover, the honeycomb-like structure can also facilitate the capture of target analyte molecules. As expected, the SERS substrate exhibits an excellent SERS effect with high sensitivity and reproducibility. As an example, the SERS substrate was utilized to detect polychlorinated biphenyls (PCBs, a kind of persistent organic pollutants as global environmental hazard) such as 3,3',4,4'-pentachlorobiphenyl (PCB-77) with concentrations down to 10(-9) M. Therefore the large-scale Ag-NPs with sub-10 nm gaps assembled on the two-layered honeycomb-like TiO(2) film have potentials in SERS-based rapid trace detection of PCBs.

  13. Estimating the dopant distribution in Ca-doped α-SiAlON: statistical HAADF-STEM analysis and large-scale atomic modeling.

    PubMed

    Sakaguchi, Norihito; Yamaki, Fuuta; Saito, Genki; Kunisada, Yuji

    2016-10-01

    We investigated the dopant distribution in Ca-doped α-SiAlON by using high-angle annular dark-field scanning transmission electron microscopy and a multi-slice image simulation. Our results showed that the electron wave propagated by hopping to adjacent Si(Al) and N(O) columns. The image intensities of the Ca columns had wider dispersions than other columns. To estimate the Ca distribution in the bulk material, we performed a Monte Carlo atomic simulation of the α-SiAlON with Ca dopants. A model including a short-range Coulomb-like repulsive force between adjacent Ca atoms reproduced the dispersion of the intensity distribution of the Ca column in the experimental image.
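    The Monte Carlo model described above penalizes adjacent dopant pairs. A minimal Metropolis sketch of that idea follows; the 1D ring lattice, the energy scale `eps`, and all parameter values are illustrative assumptions, not the authors' simulation:

    ```python
    import math
    import random

    def metropolis_repulsion(n_sites=50, n_ca=10, eps=1.0, beta=2.0,
                             n_steps=20000, seed=0):
        """Distribute n_ca dopants over n_sites sites on a 1D ring, with an
        energy penalty eps for each nearest-neighbour dopant pair (a toy
        stand-in for a short-range Coulomb-like repulsion)."""
        rng = random.Random(seed)
        occ = [True] * n_ca + [False] * (n_sites - n_ca)
        rng.shuffle(occ)

        def energy(o):
            # Count occupied nearest-neighbour pairs around the ring
            return eps * sum(o[i] and o[(i + 1) % n_sites]
                             for i in range(n_sites))

        e = energy(occ)
        for _ in range(n_steps):
            i, j = rng.randrange(n_sites), rng.randrange(n_sites)
            if occ[i] == occ[j]:
                continue
            occ[i], occ[j] = occ[j], occ[i]        # propose a swap
            e_new = energy(occ)
            if e_new <= e or rng.random() < math.exp(-beta * (e_new - e)):
                e = e_new                          # accept
            else:
                occ[i], occ[j] = occ[j], occ[i]    # reject: swap back
        return occ, e

    occ, e = metropolis_repulsion()
    ```

    With the repulsive penalty switched on, equilibrium configurations spread the dopants apart, which is the qualitative mechanism the abstract invokes to explain the observed column-intensity dispersion.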

  14. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    SciTech Connect

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory, including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the E > 8 EeV energy bin, with an amplitude for the first harmonic in right ascension r1^α = (4.4 ± 1.0) × 10^-2, which has a chance probability P(≥ r1^α) = 6.4 × 10^-5, reinforcing the hint previously reported with vertical events alone.
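    The first-harmonic Rayleigh analysis quoted above can be sketched in its textbook unweighted form (the exposure weighting that Auger applies to real data is omitted here, and the mock angles are random):

    ```python
    import numpy as np

    def rayleigh_first_harmonic(alpha):
        """First-harmonic Rayleigh analysis of right-ascension angles (radians).

        Returns the amplitude r1, its phase, and the chance probability of
        obtaining an amplitude >= r1 from an isotropic sky.
        """
        n = len(alpha)
        a = 2.0 / n * np.sum(np.cos(alpha))
        b = 2.0 / n * np.sum(np.sin(alpha))
        r1 = np.hypot(a, b)
        phase = np.arctan2(b, a)
        p_chance = np.exp(-n * r1**2 / 4.0)  # Rayleigh test p-value
        return r1, phase, p_chance

    # Isotropic mock sky: amplitude should be small
    rng = np.random.default_rng(1)
    alpha = rng.uniform(0.0, 2.0 * np.pi, 20000)
    r1, phase, p = rayleigh_first_harmonic(alpha)
    ```

    For an isotropic sample of N events the expected amplitude scales as ~sqrt(π/N), so a measured r1 well above that, with a small p_chance, signals an anisotropy such as the one reported in the abstract.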

  15. Large-scale gene flow in the barnacle Jehlius cirratus and contrasts with other broadly-distributed taxa along the Chilean coast

    PubMed Central

    Guo, Baoying

    2017-01-01

    We evaluate the population genetic structure of the intertidal barnacle Jehlius cirratus across a broad portion of its geographic distribution using data from the mitochondrial cytochrome oxidase I (COI) gene region. Despite sampling diversity from over 3,000 km of the linear range of this species, there is only slight regional structure indicated, with an overall Φ_CT of 0.036 (p < 0.001) yet no support for isolation by distance. While these results suggest greater structure than previous studies of J. cirratus had indicated, the pattern of diversity is still far more subtle than in other similarly-distributed species with similar larval and life history traits. We compare these data and results with recent findings in four other intertidal species that have planktotrophic larvae. There are no clear patterns among these taxa that can be associated with intertidal depth or other known life history traits. PMID:28194316

  16. Large Scale Distribution of Ultra High Energy Cosmic Rays Detected at the Pierre Auger Observatory with Zenith Angles up to 80°

    NASA Astrophysics Data System (ADS)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allen, J.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Alves Batista, R.; Ambrosio, M.; Aminaei, A.; Anchordoqui, L.; Andringa, S.; Aramo, C.; Aranda, V. M.; Arqueros, F.; Asorey, H.; Assis, P.; Aublin, J.; Ave, M.; Avenier, M.; Avila, G.; Awal, N.; Badescu, A. M.; Barber, K. B.; Bäuml, J.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Blaess, S. G.; Blanco, M.; Bleve, C.; Blümer, H.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Bonino, R.; Borodai, N.; Brack, J.; Brancus, I.; Bridgeman, A.; Brogueira, P.; Brown, W. C.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Candusso, M.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Cilmo, M.; Clay, R. W.; Cocciolo, G.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Cordier, A.; Coutu, S.; Covault, C. E.; Cronin, J.; Curutiu, A.; Dallier, R.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; De Domenico, M.; de Jong, S. J.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; del Peral, L.; Deligny, O.; Dembinski, H.; Dhital, N.; Di Giulio, C.; Di Matteo, A.; Diaz, J. C.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; Docters, W.; D'Olivo, J. C.; Dorofeev, A.; Dorosti Hasankiadeh, Q.; Dova, M. T.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Facal San Luis, P.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Ferguson, A. P.; Fernandes, M.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fox, B. D.; Fratu, O.; Freire, M. 
M.; Fröhlich, U.; Fuchs, B.; Fujii, T.; Gaior, R.; García, B.; Garcia-Gamez, D.; Garcia-Pinto, D.; Garilli, G.; Gascon Bravo, A.; Gate, F.; Gemmeke, H.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Glaser, C.; Glass, H.; Gómez Berisso, M.; Gómez Vitale, P. F.; Gonçalves, P.; Gonzalez, J. G.; González, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grebe, S.; Griffith, N.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Hartmann, S.; Harton, J. L.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Hollon, N.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huber, D.; Huege, T.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Jarne, C.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Kégl, B.; Keilhauer, B.; Keivani, A.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Krömer, O.; Kruppke-Hansen, D.; Kuempel, D.; Kunka, N.; LaHurd, D.; Latronico, L.; Lauer, R.; Lauscher, M.; Lautridou, P.; Le Coz, S.; Leão, M. S. A. B.; Lebrun, D.; Lebrun, P.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; López, R.; Louedec, K.; Lozano Bahilo, J.; Lu, L.; Lucero, A.; Ludwig, M.; Malacari, M.; Maldera, S.; Mallamaci, M.; Maller, J.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Marin, V.; Mariş, I. C.; Marsella, G.; Martello, D.; Martin, L.; Martinez, H.; Martínez Bravo, O.; Martraire, D.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Maurel, D.; Maurizio, D.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Meissner, R.; Melissas, M.; Melo, D.; Menshikov, A.; Messina, S.; Meyhandan, R.; Mićanović, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. 
A.; Miramonti, L.; Mitrica, B.; Molina-Bueno, L.; Mollerach, S.; Monasor, M.; Monnier Ragaigne, D.; Montanet, F.; Morello, C.; Mostafá, M.; Moura, C. A.; Muller, M. A.; Müller, G.; Müller, S.; Münchmeyer, M.; Mussa, R.; Navarra, G.; Navas, S.; Necesal, P.; Nellen, L.; Nelles, A.; Neuser, J.; Nguyen, P. H.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, L.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Oliveira, M.; Pacheco, N.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Palmieri, N.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pȩkala, J.; Pelayo, R.; Pepe, I. M.; Perrone, L.; Petermann, E.; Peters, C.; Petrera, S.; Petrov, Y.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porcelli, A.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Purrello, V.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Rautenberg, J.; Ravel, O.; Ravignani, D.; Revenu, B.; Ridky, J.; Riggi, S.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Rogozin, D.; Ros, G.; Rosado, J.; Rossler, T.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salamida, F.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sánchez, F.; Sanchez-Lucas, P.; Santo, C. E.; Santos, E.; Santos, E. M.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sato, R.; Scharf, N.; Scherini, V.; Schieler, H.; Schiffer, P.; Schmidt, D.; Scholten, O.; Schoorlemmer, H.; Schovánek, P.; Schröder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sidelnik, I.; Sigl, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sorokin, J.; Squartini, R.; Srivastava, Y. N.; Stanič, S.; Stapleton, J.; Stasielak, J.; Stephan, M.; Stutz, A.; Suarez, F.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Szuba, M.; Taborda, O. 
A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Toma, G.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Travnicek, P.; Trovato, E.; Ulrich, R.; Unger, M.; Urban, M.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Velzen, S.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Videla, M.; Villaseñor, L.; Vlcek, B.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weidenhaupt, K.; Weindl, A.; Werner, F.; Widom, A.; Wiencke, L.; Wilczyńska, B.; Wilczyński, H.; Williams, C.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yamamoto, T.; Yapici, T.; Yuan, G.; Yushkov, A.; Zamorano, B.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zhou, J.; Zhu, Y.; Zimbres Silva, M.; Ziolkowski, M.; Zuccarello, F.

    2015-04-01

    We present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory, including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the E > 8 EeV energy bin, with an amplitude for the first harmonic in right ascension r1^α = (4.4 ± 1.0) × 10^-2, which has a chance probability P(≥ r1^α) = 6.4 × 10^-5, reinforcing the hint previously reported with vertical events alone.

  17. Large scale distribution of ultra high energy cosmic rays detected at the Pierre Auger Observatory with zenith angles up to 80°

    DOE PAGES

    Aab, Alexander

    2015-03-30

    In this study, we present the results of an analysis of the large angular scale distribution of the arrival directions of cosmic rays with energy above 4 EeV detected at the Pierre Auger Observatory, including for the first time events with zenith angle between 60° and 80°. We perform two Rayleigh analyses, one in the right ascension and one in the azimuth angle distributions, that are sensitive to modulations in right ascension and declination, respectively. The largest departure from isotropy appears in the E > 8 EeV energy bin, with an amplitude for the first harmonic in right ascension r1^α = (4.4 ± 1.0) × 10^-2, which has a chance probability P(≥ r1^α) = 6.4 × 10^-5, reinforcing the hint previously reported with vertical events alone.

  18. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation.

    PubMed

    Qin, Changbo; Jia, Yangwen; Su, Z; Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-07-29

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distribution within the Haihe basin is different. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems.
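    The core of an EKF assimilation step like the one described above is the analysis update that corrects the model state toward the observation. A minimal sketch follows; the two-state "soil moisture" example, the averaging observation operator, and all numbers are invented for illustration and are not from WEP-L or SEBS:

    ```python
    import numpy as np

    def ekf_update(x, P, y, R, h, H):
        """One extended-Kalman-filter analysis step.

        x, P : model state estimate and its error covariance
        y, R : observation (e.g. a retrieved ET value) and its error variance
        h    : observation operator; H is its Jacobian evaluated at x
        """
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
        x_new = x + K @ (y - h(x))                    # correct toward the obs
        P_new = (np.eye(len(x)) - K @ H) @ P          # shrink the uncertainty
        return x_new, P_new

    # Toy example: two model states, observed through their mean
    x = np.array([0.30, 0.20])
    P = np.eye(2) * 0.04
    H = np.array([[0.5, 0.5]])
    h = lambda s: H @ s
    y = np.array([0.35])          # "observed" ET proxy
    R = np.array([[0.01]])
    x_a, P_a = ekf_update(x, P, y, R, h, H)
    ```

    Because the observation (0.35) exceeds the model-predicted value h(x) = 0.25, both state components are nudged upward, and the analysis covariance P_a is smaller than the prior P, reflecting the information gained from the observation.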

  19. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    PubMed Central

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distribution within the Haihe basin is different. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems. PMID:27879946

  20. Spatial heterogeneity in zooplankton summer distribution in the eastern Chukchi Sea in 2012-2013 as a result of large-scale interactions of water masses

    NASA Astrophysics Data System (ADS)

    Pinchuk, Alexei I.; Eisner, Lisa B.

    2017-01-01

    Interest in the Arctic shelf ecosystems has increased in recent years as the climate has rapidly warmed and sea ice declined. These changing conditions prompted the broad-scale multidisciplinary Arctic Ecosystem integrated survey (Arctic Eis) aimed at systematic, comparative analyses of interannual variability of the shelf ecosystem. In this study, we compared zooplankton composition and geographical distribution in relation to water properties on the eastern Chukchi and northern Bering Sea shelves during the summers of 2012 and 2013. In 2012, waters of Pacific origin prevailed over the study area carrying expatriate oceanic species (e.g. copepods Neocalanus spp., Eucalanus bungii) from the Bering Sea outer shelf well onto the northeastern Chukchi shelf. In contrast, in 2013, zooplankton of Pacific origin was mainly distributed over the southern Chukchi shelf, suggesting a change of advection pathways into the Arctic. These changes also manifested in the emergence of large lipid-rich Arctic zooplankton (e.g. Calanus hyperboreus) on the northeastern Chukchi shelf in 2013. The predominant copepod Calanus glacialis was composed of two distinct populations originating from the Bering Sea and from the Arctic, with the Arctic population expanding over a broader range in 2013. The observed interannual variability in zooplankton distribution on the Chukchi Sea shelf may be explained by previously described systematic oceanographic patterns derived from long-term observations. Variability in oceanic circulation and related zooplankton distributions (e.g. changes in southwestward advection of C. hyperboreus) may impact keystone predators such as Arctic Cod (Boreogadus saida) that feed on energy-rich zooplankton.

  1. Large-scale distribution patterns of the mussel Mytilus edulis in the Wadden Sea of Schleswig-Holstein: Do storms structure the ecosystem?

    NASA Astrophysics Data System (ADS)

    Nehls, Georg; Thiel, Martin

    The distribution of mussel beds in the Wadden Sea of Schleswig-Holstein was mapped by aerial surveys from 1989 to 1991. The number of mussel beds decreased from 94 in 1989 to 49 in 1990, as a result of severe storms in early 1990. Thereafter only small changes were observed. The mussel beds that remained in 1990 were found only in the shelter of islands; all beds in exposed areas had disappeared between the surveys of 1989 and 1990, leaving large areas without mussel beds. Storms are thus identified as a major factor limiting the distribution of mussel beds to the sheltered parts of the Wadden Sea. Beds in the exposed parts of the Wadden Sea are highly dynamic, whereas beds in sheltered areas may persist over long periods. A comparison with distribution patterns of older surveys (from 1937, 1968 and 1978) revealed great similarities with the results of recent investigations, indicating a constant distribution pattern over a long period. The results are discussed in relation to eutrophication and the structure of the benthic communities of the Wadden Sea. It is concluded that any eutrophication-induced increase of the mussel population would be restricted to the sheltered parts of the Wadden Sea. Storms will largely determine whether the communities of a given area have to compete with mussels, which are the most important filter feeders of the ecosystem. As competition for food is a major factor structuring the benthic communities of the Wadden Sea, it is assumed that storms indirectly affect all other communities, giving deeper-burrowing, storm-tolerant species a competitive advantage in exposed areas where epibenthic mussels are excluded. The impact of mussel fisheries will be different for persisting and dynamic beds: fishing on persisting beds in sheltered areas may remove the crucial reserve which mussel-feeding birds such as eiders or oystercatchers need in times of low mussel populations.

  2. Stochastic properties of radiation-induced DSB: DSB distributions in large scale chromatin loops, the HPRT gene and within the visible volumes of DNA repair foci.

    PubMed

    Ponomarev, Artem L; Costes, Sylvain V; Cucinotta, Francis A

    2008-11-01

    We computed probabilities to have multiple double-strand breaks (DSB), which are produced in DNA on a regional scale, and not in close vicinity, in volumes matching the size of DNA damage foci, of a large chromatin loop, and in the physical volume of DNA containing the HPRT (human hypoxanthine phosphoribosyltransferase) locus. The model is based on a Monte Carlo description of DSB formation by heavy ions in the spatial context of the entire human genome contained within the cell nucleus, as well as at the gene sequence level. We showed that a finite physical volume corresponding to a visible DNA repair focus, believed to be associated with one DSB, can contain multiple DSB due to heavy ion track structure and the DNA supercoiled topography. A corrective distribution was introduced, which was a conditional probability to have excess DSB in a focus volume, given that there was already one present. The corrective distribution was calculated for 19.5 MeV/amu N ions, 3.77 MeV/amu alpha-particles, 1000 MeV/amu Fe ions, and X-rays. The corrected initial DSB yield from the experimental data on DNA repair foci was calculated. The DSB yield based on the corrective function converts the focus yield into the DSB yield, which is comparable with the DSB yield based on the earlier PFGE experiments. The distribution of DSB within the physical limits of the HPRT gene was analyzed by a similar method as well. This corrective procedure shows the applicability of the model and empowers the researcher with a tool to better analyze focus statistics. The model enables researchers to analyze the DSB yield based on focus statistics in real experimental situations that lack one-to-one focus-to-DSB correspondence.
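    The corrective idea can be illustrated under a simplifying Poisson assumption (an assumption of this sketch only; the paper derives the distribution from a Monte Carlo track-structure model of the whole genome): the correction is the conditional probability of excess breaks, P(N ≥ 2 | N ≥ 1), and the corrected yield follows from E[N | N ≥ 1].

```python
import math

# Hedged sketch: treat the DSB count N in a focus-sized volume as Poisson
# with mean lam, then compute the probability that a visible focus hides
# more than one break, and the focus-to-DSB yield conversion.

def p_excess_dsb(lam):
    p0 = math.exp(-lam)                    # P(N = 0)
    p1 = lam * math.exp(-lam)              # P(N = 1)
    return (1.0 - p0 - p1) / (1.0 - p0)    # P(N >= 2 | N >= 1)

def corrected_dsb_yield(focus_count, lam):
    # each visible focus carries E[N | N >= 1] = lam / (1 - exp(-lam)) breaks
    return focus_count * lam / (1.0 - math.exp(-lam))

# e.g. at lam = 0.5 roughly 23% of foci hide a second (or further) DSB,
# so 100 scored foci correspond to about 127 breaks
```

    The heavy-ion case in the paper is worse than Poisson suggests, since correlated breaks along a track cluster within single foci, which is exactly why a track-structure-based corrective distribution is needed.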

  3. Two distinct mtDNA lineages of the blue crab reveal large-scale population structure in its native Atlantic distribution

    NASA Astrophysics Data System (ADS)

    Alaniz Rodrigues, Marcos; Dumont, Luiz Felipe Cestari; dos Santos, Cléverson Rannieri Meira; D'Incao, Fernando; Weiss, Steven; Froufe, Elsa

    2017-10-01

    For the first time, a molecular approach was used to evaluate the phylogenetic structure of the disjunct native American distribution of the blue crab Callinectes sapidus. Population structure was investigated by sequencing 648 bp of the Cytochrome oxidase subunit 1 (COI), in a total of 138 sequences stemming from individual samples from both the northern and southern hemispheres of the Western Atlantic distribution of the species. A Bayesian approach was used to construct a phylogenetic tree for all samples, and a 95% confidence parsimony network was created to depict the relationship among haplotypes. Results revealed two highly distinct lineages, one containing all samples from the United States and some from Brazil (lineage 1) and the second restricted to Brazil (lineage 2). In addition, gene flow (at least for females) was detected among estuaries at local scales and there is evidence for shared haplotypes in the south. Furthermore, the findings of this investigation support the contemporary introduction of haplotypes that have apparently spread from the south to the north Atlantic.

  4. Size distribution of airborne particulate matter emitted by the front-end processing of municipal solid waste feed material for large-scale anaerobic digesters

    SciTech Connect

    Gerrish, H.P.; Narasimhan, R.; Daly, E.L. Jr.; Sengupta, S.; Nemerow, N.L.; Wong, K.V.

    1984-07-01

    A 100-ton/day proof-of-concept facility has been constructed in Pompano Beach, Florida, to examine the feasibility of producing methane-rich gas from the anaerobic digestion of municipal solid waste. One of the possible environmental impacts is from the particulate matter emitted into the atmosphere by the secondary shredding and conveying of light fraction feed material to the digesters. It has been found that the amount of particulate matter emitted into the atmosphere by the front-end processing is an order of magnitude higher when the plant is operating than when it is not, and that the particle size distribution is bimodal in both cases. Episodic particle concentrations found in July 1981 were four times the concentration found during normal plant operation.

  5. Column Store for GWAC: A High-cadence, High-density, Large-scale Astronomical Light Curve Pipeline and Distributed Shared-nothing Database

    NASA Astrophysics Data System (ADS)

    Wan, Meng; Wu, Chao; Wang, Jing; Qiu, Yulei; Xin, Liping; Mullender, Sjoerd; Mühleisen, Hannes; Scheers, Bart; Zhang, Ying; Nes, Niels; Kersten, Martin; Huang, Yongpan; Deng, Jinsong; Wei, Jianyan

    2016-11-01

    The ground-based wide-angle camera array (GWAC), a part of the SVOM space mission, will search for various types of optical transients by continuously imaging a field of view (FOV) of 5000 degrees2 every 15 s. Each exposure consists of 36 × 4k × 4k pixels, typically resulting in 36 × ∼175,600 extracted sources. For a modern time-domain astronomy project like GWAC, which produces massive amounts of data with a high cadence, it is challenging to search for short timescale transients in both real-time and archived data, and to build long-term light curves for variable sources. Here, we develop a high-cadence, high-density light curve pipeline (HCHDLP) to process the GWAC data in real-time, and design a distributed shared-nothing database to manage the massive amount of archived data which will be used to generate a source catalog with more than 100 billion records during 10 years of operation. First, we develop HCHDLP based on the column-store DBMS of MonetDB, taking advantage of MonetDB’s high performance when applied to massive data processing. To realize the real-time functionality of HCHDLP, we optimize the pipeline in its source association function, including both time and space complexity from outside the database (SQL semantic) and inside (RANGE-JOIN implementation), as well as in its strategy of building complex light curves. The optimized source association function is accelerated by three orders of magnitude. Second, we build a distributed database using a two-level time partitioning strategy via the MERGE TABLE and REMOTE TABLE technology of MonetDB. Intensive tests validate that our database architecture is able to achieve both linear scalability in response time and concurrent access by multiple users. In summary, our studies provide guidance for a solution to GWAC in real-time data processing and management of massive data.
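    The two-level time partitioning idea can be illustrated with a toy sketch. The class and keys below are hypothetical stand-ins for what MonetDB does inside the DBMS: REMOTE TABLE distributes coarse time bins across nodes (level 1) and MERGE TABLE unions fine-grained partitions on each node (level 2).

```python
from collections import defaultdict
from datetime import datetime

# Toy illustration of two-level time partitioning for a light-curve store
# (hypothetical, not the GWAC pipeline): level 1 routes a record to a node
# by month, level 2 to a per-day partition table on that node.

class PartitionedStore:
    def __init__(self):
        self.nodes = defaultdict(lambda: defaultdict(list))

    def insert(self, source_id, ts, mag):
        node_key = ts.strftime("%Y-%m")       # level 1: node per month
        table_key = ts.strftime("%Y-%m-%d")   # level 2: table per day
        self.nodes[node_key][table_key].append((source_id, ts, mag))

    def light_curve(self, source_id, t0, t1):
        # a range query touches only the partitions it overlaps, then merges
        # their results -- the effect a MERGE TABLE scan achieves in the DBMS
        out = []
        for tables in self.nodes.values():
            for key, records in tables.items():
                day = datetime.strptime(key, "%Y-%m-%d").date()
                if t0.date() <= day <= t1.date():
                    out.extend(r for r in records
                               if r[0] == source_id and t0 <= r[1] <= t1)
        return sorted(out, key=lambda r: r[1])

store = PartitionedStore()
store.insert("src_001", datetime(2016, 1, 1, 22, 0), 15.2)
store.insert("src_001", datetime(2016, 1, 2, 22, 0), 15.3)
store.insert("src_002", datetime(2016, 1, 1, 22, 0), 14.0)
curve = store.light_curve("src_001", datetime(2016, 1, 1), datetime(2016, 1, 31))
# curve holds the two src_001 points, in time order
```

    The payoff the paper reports follows from this layout: inserts always hit the newest partition, and both time-range queries and concurrent users scale because partitions can be scanned independently.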

  6. Changes in the distribution of the grey mangrove Avicennia marina (Forsk.) using large scale aerial color infrared photographs: are the changes related to habitat modification for mosquito control?

    NASA Astrophysics Data System (ADS)

    Jones, J.; Dale, P. E. R.; Chandica, A. L.; Breitfuss, M. J.

    2004-09-01

    Runnelling, a method of habitat modification used for mosquito management in intertidal saltmarshes in Australia, alters marsh hydrology. The objective of this research was to assess if runnelling had affected the distribution of the grey mangrove (Avicennia marina (Forsk.)) at a study site in southeast Queensland. Since runnelling is carried out in diverse marshes, a second aim was to assess differences in mangrove colonisation in the two main saltmarsh species in the area. These are marine couch [Sporobolus virginicus (L.) Kunth.] and samphire [Sarcocornia quinqueflora (Bunge ex Ung.-Stern.)]. Runnels at the study site were in an area dominated by Sporobolus. The mangrove area was measured by classifying digital color infrared (CIR) data obtained from aerial photographs acquired in 1982, which was 3 years before runnelling, and in 1987, 1991 and 1999, 2-14 years after. Changes in the spatial extent of A. marina were identified using difference images produced from post-classification change detection. The results showed that runnels did not significantly influence the distribution of A. marina at the study site. At a more detailed level, differences in A. marina establishment in the Sporobolus and Sarcocornia areas were determined from counts of trees on the aerial photographs. There was a greater proportion of mangroves in Sarcocornia than Sporobolus and this increased over time. This may be related to differences in density between the plant species, to grapsid crab activity or to other edaphic conditions. There may be implications for runnelling in Sarcocornia marshes. The large increase observed in A. marina in the area generally is likely to be related to factors such as catchment modification or tidal/sea-level changes. It is concluded that runnelling has not led to mangrove establishment in the Sporobolus dominated saltmarsh.
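    Post-classification change detection of the kind described reduces to comparing the two classified maps pixel by pixel. A minimal sketch, with hypothetical class codes (the study's actual class scheme is not reproduced here):

```python
import numpy as np

# Minimal post-classification change detection between two classified CIR
# images. Class codes are illustrative: 1 = mangrove, 0 = saltmarsh/other.

def change_map(class_t0, class_t1, target=1):
    gained = (class_t1 == target) & (class_t0 != target)  # newly colonised
    lost = (class_t0 == target) & (class_t1 != target)    # mangrove lost
    return gained, lost

t0 = np.array([[0, 1],
               [1, 0]])
t1 = np.array([[1, 1],
               [0, 0]])
gained, lost = change_map(t0, t1)
# one pixel gained (top-left), one pixel lost (bottom-left)
```

    Summing the two masks inside the runnelled and unrunnelled zones is then enough to test whether the treatment influenced mangrove extent.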

  7. Preparing for national school-based deworming in Kenya: the validation and large-scale distribution of school questionnaires with urinary schistosomiasis

    PubMed Central

    Kihara, Jimmy; Mwandawiro, Charles; Waweru, Beth; Gitonga, Caroline W; Brooker, Simon

    2011-01-01

    Objective School questionnaires of self-reported schistosomiasis provide a rapid and simple approach for identifying schools at high risk of Schistosoma haematobium and requiring mass treatment. This study investigates the reliability of school questionnaires to identify such schools and infected children within the context of a national school-based deworming programme in Kenya. Methods Between November 2008 and March 2009, 6182 children from 61 schools in Coast Province, Kenya were asked by an interviewer whether they had blood in urine or urinary schistosomiasis (kichocho), and their results were compared with results from microscopic examination of urine samples. Subsequently, in 2009, a school-based questionnaire survey for self-reported schistosomiasis was distributed by the Ministry of Education to all schools in Coast Province, and its results were compared against results from the parasitological survey. The questionnaire survey results were linked to a schools database and mapped. Results Prevalence of self-reported blood in urine was lower among girls than among boys at all ages. The use of a 30% threshold of reported blood in urine was both highly sensitive (91.7%) and specific (100%) in identifying high (>50%) prevalence schools in Coast Province. Questionnaires were, however, less reliable in diagnosing S. haematobium infection in individuals, particularly among young girls. Comparable levels of reliability were observed when the questionnaire was distributed through the existing education systems and administered by class teachers. Conclusions The results confirm that blood in urine questionnaires can be reliably used to target mass treatment with praziquantel at national scales. The mapped results of the Ministry of Education survey serve to describe the spatial variation of urinary schistosomiasis and identify schools requiring mass treatment. PMID:21767334
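    The screening statistics behind the 30% reporting threshold reduce to a 2×2 confusion table against the parasitological gold standard. The counts below are made-up illustrations, not the study's data; only the reported sensitivity (91.7%) and specificity (100%) come from the abstract.

```python
# Sensitivity/specificity of a school-level screening flag against a
# parasitological reference. All counts here are hypothetical.

def sensitivity_specificity(tp, fp, fn, tn):
    sensitivity = tp / (tp + fn)   # high-prevalence schools correctly flagged
    specificity = tn / (tn + fp)   # low-prevalence schools correctly passed
    return sensitivity, specificity

# e.g. 11 of 12 truly high-prevalence schools flagged, no false positives
sens, spec = sensitivity_specificity(tp=11, fp=0, fn=1, tn=49)
# sens == 11/12, i.e. about 0.917; spec == 1.0
```

    For programme targeting, a near-perfect specificity matters most: it keeps scarce praziquantel from being mass-distributed to schools that do not need it.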

  8. The role of fine material and grain size distribution on excess pore pressure dissipation and particle support mechanisms in granular deposits based on large-scale physical experiments

    NASA Astrophysics Data System (ADS)

    Palucis, M. C.; Kaitna, R.; Tewoldebrhan, B.; Hill, K. M.; Dietrich, W. E.

    2011-12-01

    The dominant mechanisms behind sustained mobilization in granular debris flows are poorly understood, and experiments are needed to determine the conditions under which the fluid can fully support the coarse fraction. However, field-scale studies are difficult to instrument and constrain and laboratory studies suffer from scaling issues. A 4-m rotating drum located at UC Berkeley's Richmond Field Station allowed us to perform reproducible experiments with materials similar to those in the field to explore mechanisms relevant to slow pore fluid pressure dissipation. Specifically, we performed a series of experiments to assess the role of fines and grain size distribution on the rate of pore fluid pressure dissipation upon deposition of a granular mass. For each experiment we kept the total mass of the gravel particles constant and varied the amount of fines (from no fines to amounts found in an actual debris flow deposit) and the gravel particle size distribution (from a single grain size to a range found in natural flows). We first rotated each mixture in the drum, during which we monitored fluid pressures at the base of the flows (near the wall of the drum and at the center). Then we stopped the drum and continued to monitor the fluid pressures. Immediately upon stopping, the pore fluid pressure was nearly hydrostatic for the gravel-water flows, and any elevated pore pressure quickly dissipated. On the other hand, the mixtures with fines contents close to those found in actual debris flows had elevated pore pressures indicating they were almost fully liquefied. Furthermore, the rate of pore pressure dissipation was an order of magnitude slower than when no fines were present; the grain size distribution of the coarse fraction did not strongly influence the dissipation rates in either case. 
We also placed a cobble upon a fines-rich mixture after cessation of motion above the center pressure sensor, and observed that the pore fluid pressure rose instantly, bearing

  9. Spatial and temporal distributions of contaminant body burden and disease in Gulf of Mexico oyster populations: The role of local and large-scale climatic controls

    NASA Astrophysics Data System (ADS)

    Wilson, E. A.; Powell, E. N.; Wade, T. L.; Taylor, R. J.; Presley, B. J.; Brooks, J. M.

    1992-06-01

    As part of NOAA's Status and Trends Program, oysters were sampled from 43 sites throughout the Gulf of Mexico, from Brownsville, Texas, to the Florida Everglades, from 1986 to 1989. Oysters were analysed for body burden of a suite of metals and petroleum aromatic hydrocarbons (PAHs), the prevalence and intensity of the oyster pathogen Perkinsus marinus, and condition index. The contaminants fell into two groups based on the spatial distribution of body burden throughout the Gulf. Arsenic, selenium, mercury and cadmium were characterized by a clinal reduction in similarity with distance reminiscent of that followed by mean monthly temperature and precipitation. Zinc, copper, PAHs and silver showed no consistent geographic trend. Within local regions, industrial and agricultural land use and P. marinus prevalence and infection intensity frequently correlated with body burden. Contaminants and biological attributes followed one of three temporal trends. Zinc, copper and PAHs showed concordant shifts over 4 years throughout the eastern and southern Gulf. Mercury and cadmium showed concordant shifts in the northwestern Gulf. Selenium, arsenic, length, condition index and P. marinus prevalence and infection intensity showed concordant shifts throughout most of the entire Gulf. Concordant shifts suggest that climatic factors, the El Niño/Southern Oscillation being one example, exert a strong influence on biological attributes and contaminant body burdens in the Gulf. Correlative factors are those that probably affect or indicate the rate of tissue turnover and the frequency of reproduction; namely, temperature, disease intensity, condition index and length.

  10. Study of plasma pressure distribution in the inner magnetosphere using low-altitude satellites and its importance for the large-scale magnetospheric dynamics

    NASA Astrophysics Data System (ADS)

    Stepanova, M.; Antonova, E. E.; Bosqued, J.-M.

    2006-01-01

    Plasma pressure distribution in the inner magnetosphere is one of the key parameters for understanding the main processes of magnetospheric dynamics, including geomagnetic substorms. However, the pressure profiles obtained from in situ measurements by high-altitude satellites do not allow tracking of the magnetospheric dynamics, because the time necessary to obtain these profiles (hours) generally exceeds the characteristic times of the main magnetospheric processes (minutes or less). On the contrary, the fast movement of low-altitude satellites makes it possible to obtain quasi-instantaneous profiles of plasma pressure along the satellite trajectory - radial or azimuthal, depending on the satellite orbit. Precipitating particle fluxes, measured by the low-altitude Aureol-3 satellite, were used. Mapping into the equatorial plane and determination of the volume of the magnetic flux tube were made using the Tsyganenko 96 and 01 geomagnetic field models. Study of radial plasma gradients showed that the inner magnetosphere is stable against the development of flute (interchange) instability. A modified interchange instability, related to azimuthal plasma pressure gradients and field-aligned currents in equilibrium, was proposed as a source of substorm expansion phase onset. It can develop when the density of the field-aligned current reaches a definite threshold value. The growth rate of the instability is higher in the region of upward field-aligned current, where the existing field-aligned potential drop leads to magnetosphere-ionosphere decoupling.

  11. ADHydro: A Parallel Implementation of a Large-scale High-Resolution Multi-Physics Distributed Water Resources Model Using the Charm++ Run Time System

    NASA Astrophysics Data System (ADS)

    Steinke, R. C.; Ogden, F. L.; Lai, W.; Moreno, H. A.; Pureza, L. G.

    2014-12-01

    Physics-based watershed models are useful tools for hydrologic studies, water resources management and economic analyses in the contexts of climate, land-use, and water-use changes. This poster presents a parallel implementation of a quasi 3-dimensional, physics-based, high-resolution, distributed water resources model suitable for simulating large watersheds in a massively parallel computing environment. Developing this model is one of the objectives of the NSF EPSCoR RII Track II CI-WATER project, a joint effort between the Wyoming and Utah EPSCoR jurisdictions. The model, which we call ADHydro, is aimed at simulating important processes in the Rocky Mountain west, including: rainfall and infiltration, snowfall and snowmelt in complex terrain, vegetation and evapotranspiration, soil heat flux and freezing, overland flow, channel flow, groundwater flow, water management and irrigation. Model forcing is provided by the Weather Research and Forecasting (WRF) model, and ADHydro is coupled with the NOAH-MP land-surface scheme for calculating fluxes between the land and atmosphere. The ADHydro implementation uses the Charm++ parallel run time system. Charm++ is based on location-transparent message passing between migratable C++ objects. Each object represents an entity in the model such as a mesh element. These objects can be migrated between processors or serialized to disk, allowing the Charm++ system to automatically provide capabilities such as load balancing and checkpointing. Objects interact with each other by passing messages that the Charm++ system routes to the correct destination object regardless of its current location. This poster discusses the algorithms, communication patterns, and caching strategies used to implement ADHydro with Charm++. The ADHydro model code will be released to the hydrologic community in late 2014.
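    The location-transparent messaging pattern the abstract describes can be mimicked in a toy Python sketch (not Charm++, and all names here are hypothetical): objects are addressed by a global ID, the runtime resolves the ID to the object's current processor on every send, and migration is invisible to senders.

```python
# Toy model of location-transparent message routing with migratable objects.

class Runtime:
    def __init__(self):
        self.location = {}   # object id -> processor id
        self.objects = {}    # object id -> object

    def register(self, obj_id, obj, proc):
        self.objects[obj_id] = obj
        self.location[obj_id] = proc

    def migrate(self, obj_id, new_proc):
        # e.g. triggered by a load balancer; senders keep using obj_id
        self.location[obj_id] = new_proc

    def send(self, obj_id, msg):
        proc = self.location[obj_id]   # runtime resolves the current location
        self.objects[obj_id].receive(msg, proc)

class MeshElement:
    def __init__(self):
        self.inbox = []

    def receive(self, msg, proc):
        self.inbox.append((proc, msg))

rt = Runtime()
elem = MeshElement()
rt.register("elem42", elem, proc=0)
rt.send("elem42", "flux update")       # delivered on processor 0
rt.migrate("elem42", new_proc=3)       # load balancing moves the element
rt.send("elem42", "flux update")       # same ID, now delivered on processor 3
```

    Because senders only ever hold the ID, the runtime is free to rebalance or checkpoint objects without touching application code, which is the property ADHydro relies on.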

  12. Large scale biomimetic membrane arrays.

    PubMed

    Hansen, Jesper S; Perry, Mark; Vogel, Jörg; Groth, Jesper S; Vissing, Thomas; Larsen, Marianne S; Geschke, Oliver; Emneús, Jenny; Bohr, Henrik; Nielsen, Claus H

    2009-10-01

    To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO(2) laser micro-structured 8 x 8 aperture partition arrays with average aperture diameters of 301 +/- 5 microm. We addressed the electro-physical properties of the lipid bilayers established across the micro-structured scaffold arrays by controllable reconstitution of biotechnological and physiological relevant membrane peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further developments of sensitive biosensor assays, and furthermore demonstrate that the design can conveniently be scaled up to support planar lipid bilayers in large square-centimeter partition arrays.

  13. Challenges for Large Scale Simulations

    NASA Astrophysics Data System (ADS)

    Troyer, Matthias

    2010-03-01

    With computational approaches becoming ubiquitous, the growing impact of large scale computing on research influences both theoretical and experimental work. I will review a few examples in condensed matter physics and quantum optics, including the impact of computer simulations in the search for supersolidity, thermometry in ultracold quantum gases, and the challenging search for novel phases in strongly correlated electron systems. While only a decade ago such simulations needed the fastest supercomputers, many simulations can now be performed on small workstation clusters or even a laptop: what was previously restricted to a few experts can now potentially be used by many. Only part of the gain in computational capabilities is due to Moore's law and improvement in hardware. Equally impressive is the performance gain due to new algorithms - as I will illustrate using some recently developed algorithms. At the same time, modern peta-scale supercomputers offer unprecedented computational power and allow us to tackle new problems and address questions that were impossible to solve numerically only a few years ago. While there is a roadmap for future hardware developments to exascale and beyond, the main challenges are on the algorithmic and software infrastructure side. Among the problems that face the computational physicist are: the development of new algorithms that scale to thousands of cores and beyond; a software infrastructure that lifts code development to a higher level and speeds up the development of new simulation programs for large scale computing machines; tools to analyze the large volume of data obtained from such simulations; and, as an emerging field, provenance-aware software that aims for reproducibility of the complete computational workflow from model parameters to the final figures.
Interdisciplinary collaborations and collective efforts will be required, in contrast to the cottage-industry culture currently present in many areas of computational

  14. The large-scale distribution and internal geometry of the fall 2000 Po River flood deposit: Evidence from digital X-radiography

    USGS Publications Warehouse

    Wheatcroft, R.A.; Stevens, A.W.; Hunt, L.M.; Milligan, T.G.

    2006-01-01

    Event-response coring on the Po River prodelta (northern Adriatic Sea) coupled with shipboard digital X-radiography, resistivity profiling, and grain-size analyses permitted documentation of the initial distribution and physical properties of the October 2000 flood deposit. The digital X-radiography system comprises a constant-potential X-ray source and an amorphous silicon imager with an active area of 29 × 42 cm and 12-bit depth resolution. Objective image segmentation algorithms based on bulk density (brightness), layer contacts (edge detection) and small-scale texture (fabric) were used to identify the flood deposit. Results indicate that the deposit formed in water depths of 6-29 m immediately adjacent to the three main distributary mouths of the Po (Pila, Tolle and Gnocca/Goro). Maximal thickness was 36 cm at a 20-m site off the main mouth (Pila), but many other sites had thicknesses >20 cm. The Po flood deposit has a complex internal stratigraphy, with multiple layers, a diverse suite of physical sedimentary structures (e.g., laminations, ripple cross bedding, lenticular bedding, soft-sediment deformation structures), and dramatic changes in grain size that imply rapid deposition and fluctuations in energy during emplacement. Based on the flood deposit volume and well-constrained measurements of deposit bulk density, the mass of the flood deposit was estimated to be 16 × 10(9) kg, which is about two-thirds of the estimated suspended sediment load delivered by the river during the event. The locus of deposition, overall thickness, and stratigraphic complexity of the flood deposit can best be explained by the relatively long sediment throughput times of the Po River, whereby sediment is delivered to the ocean during a range of conditions (i.e., the storm responsible for the precipitation is long gone), the majority of which are reflective of the fair-weather condition. Sediment is therefore deposited proximal to the river mouths, where it can form thick, but
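    The bulk-density (brightness) criterion from the segmentation described can be sketched on a single down-core brightness profile; this is illustrative only (the study combined brightness with edge detection and fabric texture, and the values and threshold here are hypothetical).

```python
import numpy as np

# Sketch of brightness-threshold segmentation of an X-radiograph column.

def segment_by_brightness(profile, threshold):
    """Mark samples brighter than threshold (denser sediment) as deposit."""
    return profile > threshold

def layer_contacts(mask):
    """Indices where the segmentation flips: candidate layer contacts."""
    return np.flatnonzero(np.diff(mask.astype(int)) != 0)

profile = np.array([0.2, 0.2, 0.8, 0.9, 0.8, 0.3])  # synthetic brightness column
mask = segment_by_brightness(profile, 0.5)
contacts = layer_contacts(mask)
# the deposit spans samples 2-4, with contacts after indices 1 and 4
```

    Locating the flips rather than just counting bright pixels is what lets the deposit's upper and lower contacts, and hence its thickness, be read off objectively.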

  15. Large-scale distribution and activity patterns of an extremely low-light-adapted population of green sulfur bacteria in the Black Sea.

    PubMed

    Marschall, Evelyn; Jogler, Mareike; Hessge, Uta; Overmann, Jörg

    2010-05-01

    The Black Sea chemocline represents the largest extant habitat of anoxygenic phototrophic bacteria and harbours a monospecific population of Chlorobium phylotype BS-1. High-sensitivity measurements of underwater irradiance and sulfide revealed that the optical properties of the overlying water column were similar across the Black Sea basin, whereas the vertical profiles of sulfide varied strongly between sampling sites and caused a dome-shaped three-dimensional distribution of the green sulfur bacteria. In the centres of the western and eastern basins the population of BS-1 reached upward to depths of 80 and 95 m, respectively, but was detected only at 145 m depth close to the shelf. Using highly concentrated chemocline samples from the centres of the western and eastern basins, the cells were found to be capable of anoxygenic photosynthesis under in situ light conditions and exhibited a photosynthesis-irradiance curve similar to low-light-adapted laboratory cultures of Chlorobium BS-1. Application of a highly specific RT-qPCR method which targets the internal transcribed spacer (ITS) region of the rrn operon of BS-1 demonstrated that only cells at the central station are physiologically active, in contrast to those at the Black Sea periphery. Based on the detection of ITS-DNA sequences in the flocculent surface layer of deep-sea sediments across the Black Sea, the population of BS-1 has occupied the major part of the basin for the last decade. The continued presence of intact but non-growing BS-1 cells at the periphery of the Black Sea indicates that the cells can survive long-distance transport and exhibit unusually low maintenance energy requirements. According to laboratory measurements, Chlorobium BS-1 has a maintenance energy requirement of approximately 1.6-4.9 × 10(-15) kJ cell(-1) day(-1), which is the lowest value determined for any bacterial culture so far.
Chlorobium BS-1 thus is particularly well adapted to survival under the extreme low-light conditions

  16. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  17. Large-scale PACS implementation.

    PubMed

    Carrino, J A; Unkel, P J; Miller, I D; Bowser, C L; Freckleton, M W; Johnson, T G

    1998-08-01

    The transition to filmless radiology is a much more formidable task than making the request for proposal to purchase a Picture Archiving and Communications System (PACS). The Department of Defense and the Veterans Administration have been pioneers in the transformation of medical diagnostic imaging to the electronic environment. Many civilian sites are expected to implement large-scale PACS in the next five to ten years. This presentation will relate the empirical insights gleaned at our institution from a large-scale PACS implementation. Our PACS integration was introduced into a fully operational department (not a new hospital) in which work flow had to continue with minimal impact. Impediments to user acceptance will be addressed. The critical components of this enormous task will be discussed. The topics covered during this session will include issues such as phased implementation, DICOM (digital imaging and communications in medicine) standard-based interaction of devices, hospital information system (HIS)/radiology information system (RIS) interface, user approval, networking, workstation deployment and backup procedures. The presentation will make specific suggestions regarding the implementation team, operating instructions, quality control (QC), training and education. The concept of identifying key functional areas is relevant to transitioning the facility to be entirely on line. Special attention must be paid to specific functional areas such as the operating rooms and trauma rooms where the clinical requirements may not match the PACS capabilities. The printing of films may be necessary in certain circumstances. The integration of teleradiology and remote clinics into a PACS is a salient topic with respect to the overall role of the radiologists providing rapid consultation. A Web-based server allows a clinician to review images and reports on a desk-top (personal) computer and thus reduce the number of dedicated PACS review workstations. 
This session

  18. An Adaptive Multiscale Finite Element Method for Large Scale Simulations

    DTIC Science & Technology

    2015-09-28

    Report AFRL-AFOSR-VA-TR-2015-0305: An Adaptive Multiscale Generalized Finite Element Method for Large Scale Simulations, Carlos Duarte, University of Illinois (DISTRIBUTION A: approved for public release; report dated 14-07-2015).

  19. Large-Scale Distributed Coalition Formation

    DTIC Science & Technology

    2009-09-01

    such as mobile devices, and constructing hierarchical networks to conform to dynamic constraints. Their system, Chordella, dynamically adjusts the... Proceedings of the IEEE/WIC/ACM International Conference on Intelligent Agent Technology, IAT, 162–168. China, September 2004. URL http... 16(3):617–627, June 2008. ISSN 1063-6692. 17. Cao, Y. Uny, Alex S. Fukunaga, and Andrew B. Kahng. “Cooperative Mobile Robotics: Antecedents and

  20. Large scale cluster computing workshop

    SciTech Connect

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors, used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes), and by implication to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  1. Large-Scale Sequence Comparison.

    PubMed

    Lal, Devi; Verma, Mansi

    2017-01-01

    There are millions of sequences deposited in genomic databases, and it is an important task to categorize them according to their structural and functional roles. Sequence comparison is a prerequisite for proper categorization of both DNA and protein sequences, and helps in assigning a putative or hypothetical structure and function to a given sequence. Various methods are available for comparing sequences, alignment being the first and foremost, both for sequences with a small number of base pairs and for large-scale genome comparison. Various tools are available for performing large-scale pairwise sequence comparison. The best known tools either perform global alignment or generate local alignments between the two sequences. In this chapter we first provide basic information regarding sequence comparison. This is followed by a description of the PAM and BLOSUM matrices that form the basis of sequence comparison. We also give a practical overview of currently available methods such as BLAST and FASTA, followed by a description and overview of tools available for genome comparison, including LAGAN, MUMmer, BLASTZ, and AVID.
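    As a concrete illustration of the global alignment this abstract describes, here is a minimal sketch of the Needleman-Wunsch dynamic program, the exact pairwise comparison that heuristic tools such as BLAST and FASTA approximate at scale. The flat match/mismatch/gap scores are illustrative only, not values from PAM or BLOSUM.

    ```python
    # Minimal global-alignment (Needleman-Wunsch) score; a sketch of exact
    # pairwise sequence comparison. Scoring values are illustrative only.
    def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
        n, m = len(a), len(b)
        # score[i][j] = best score aligning a[:i] against b[:j]
        score = [[0] * (m + 1) for _ in range(n + 1)]
        for i in range(1, n + 1):
            score[i][0] = i * gap            # a[:i] aligned entirely to gaps
        for j in range(1, m + 1):
            score[0][j] = j * gap            # gaps aligned against b[:j]
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
        return score[n][m]

    print(needleman_wunsch("GATTACA", "GCATGCU"))
    ```

    Real aligners substitute PAM or BLOSUM matrices for the flat scores and use affine rather than linear gap penalties.
    
    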

  2. Large Scale Homing in Honeybees

    PubMed Central

    Pahl, Mario; Zhu, Hong; Tautz, Jürgen; Zhang, Shaowu

    2011-01-01

    Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses and by integrating their outbound flight paths. In order to investigate the capacity of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large scale homing is facilitated by global landmarks acting as beacons, and possibly the entire skyline panorama. PMID:21602920

  3. Large Scale Magnetostrictive Valve Actuator

    NASA Technical Reports Server (NTRS)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

    Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware and testing is complete. This paper discusses the potential applications of the technology; gives an overview of the as-built actuator design; describes problems uncovered during development testing; reviews test data and evaluates weaknesses of the design; and discusses areas for improvement for future work. This actuator holds promise as a low power, high load, proportionally controlled actuator for valves requiring 440 to 1500 newtons load.

  4. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.
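    The convergence measures described above can be sketched numerically: in linear theory, the peculiar velocity induced at the origin by galaxies within radius R scales as the dipole sum of unit vectors over squared distances. The positions below are random mock data and all physical prefactors (H0, growth rate, mean density) are omitted, so this is a toy illustration, not the paper's analysis.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    pos = rng.uniform(-100.0, 100.0, size=(5000, 3))  # mock galaxy positions
    r = np.linalg.norm(pos, axis=1)

    def dipole(R):
        """Unnormalized v(R): sum of r_hat / r^2 over galaxies with r < R."""
        inside = r < R
        return (pos[inside] / r[inside, None] ** 3).sum(axis=0)

    # How |v(R)| changes with growing R probes how quickly the dipole converges
    for R in (25, 50, 100):
        print(R, np.linalg.norm(dipole(R)))
    ```
    
    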

  5. Large-scale Intelligent Transportation Systems simulation

    SciTech Connect

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and of Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles), and can provide two-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  7. Voids in the Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    El-Ad, Hagai; Piran, Tsvi

    1997-12-01

    Voids are the most prominent feature of the large-scale structure of the universe. Still, their incorporation into quantitative analyses has been relatively recent, owing essentially to the lack of an objective tool to identify and quantify them. To overcome this, we present here the VOID FINDER algorithm, a novel tool for objectively quantifying voids in the galaxy distribution. The algorithm first classifies galaxies as either wall galaxies or field galaxies. Then, it identifies voids in the wall-galaxy distribution. Voids are defined as continuous volumes that do not contain any wall galaxies. The voids must be thicker than an adjustable limit, which is refined in successive iterations. In this way, we identify the same regions that would be recognized as voids by eye. Small breaches in the walls are ignored, avoiding artificial connections between neighboring voids. We test the algorithm using Voronoi tessellations. By appropriate scaling of the parameters with the selection function, we apply it to two redshift surveys, the dense SSRS2 and the full-sky IRAS 1.2 Jy. Both surveys show similar properties: ~50% of the volume is filled by voids. The voids have a scale of at least 40 h-1 Mpc and an average underdensity of -0.9. Faint galaxies do not fill the voids, but they do populate them more than bright ones. These results suggest that both optically and IRAS-selected galaxies delineate the same large-scale structure. Comparison with the recovered mass distribution further suggests that the observed voids in the galaxy distribution correspond well to underdense regions in the mass distribution. This confirms the gravitational origin of the voids.
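    The wall/field classification step of a VOID FINDER-style algorithm can be sketched compactly: a galaxy counts as a wall galaxy if the distance to its k-th nearest neighbour falls below a threshold, and only wall galaxies are then used when searching for empty volumes. The neighbour count and threshold below are illustrative, not the paper's selection-function-scaled parameters.

    ```python
    import math

    def classify(galaxies, k=3, max_dist=1.0):
        """Split points into wall (dense) and field (isolated) galaxies."""
        walls, fields = [], []
        for g in galaxies:
            dists = sorted(math.dist(g, h) for h in galaxies if h != g)
            is_wall = len(dists) >= k and dists[k - 1] <= max_dist
            (walls if is_wall else fields).append(g)
        return walls, fields

    # A tight 2-D quartet plus one isolated galaxy
    walls, fields = classify([(0, 0), (0.5, 0), (0, 0.5), (0.5, 0.5), (5, 5)])
    print(len(walls), fields)  # → 4 [(5, 5)]
    ```
    
    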

  8. Methane emissions on large scales

    NASA Astrophysics Data System (ADS)

    Beswick, K. M.; Simpson, T. W.; Fowler, D.; Choularton, T. W.; Gallagher, M. W.; Hargreaves, K. J.; Sutton, M. A.; Kaye, A.

    with previous results from the area, indicating that this method of data analysis provided good estimates of large scale methane emissions.

  9. Large Scale Nanolaminate Deformable Mirror

    SciTech Connect

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.

  10. Measurement of the steady surface pressure distribution on a single rotation large scale advanced prop-fan blade at Mach numbers from 0.03 to 0.78

    NASA Technical Reports Server (NTRS)

    Bushnell, Peter

    1988-01-01

    The aerodynamic pressure distribution was determined on a rotating Prop-Fan blade at the S1-MA wind tunnel facility operated by the Office National d'Etudes et de Recherches Aerospatiales (ONERA) in Modane, France. The pressure distributions were measured at thirteen radial stations on a single rotation Large Scale Advanced Prop-Fan (LAP/SR7) blade, for a sequence of operating conditions including inflow Mach numbers ranging from 0.03 to 0.78. Pressure distributions for more than one power coefficient and/or advance ratio setting were measured for most of the inflow Mach numbers investigated. Due to facility power limitations, the Prop-Fan test installation was a two-bladed version of the eight-bladed design configuration. The power coefficient range investigated was therefore selected to cover the typical power loading per blade which occurs within the Prop-Fan operating envelope. The experimental results provide an extensive source of information on the aerodynamic behavior of the swept Prop-Fan blade, including details which elude current computational models and do not appear in two-dimensional airfoil data.

  11. Large scale study of tooth enamel

    SciTech Connect

    Bodart, F.; Deconninck, G.; Martin, M.Th.

    1981-04-01

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces on a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analysed using PIXE, backscattering and nuclear reaction techniques. The results were analysed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population.

  12. Organised convection embedded in a large-scale flow

    NASA Astrophysics Data System (ADS)

    Naumann, Ann Kristin; Stevens, Bjorn; Hohenegger, Cathy

    2017-04-01

    In idealised simulations of radiative convective equilibrium, convection aggregates spontaneously from randomly distributed convective cells into organised mesoscale convection despite homogeneous boundary conditions. Although these simulations apply very idealised setups, the process of self-aggregation is thought to be relevant for the development of tropical convective systems. One feature that idealised simulations usually neglect is the occurrence of a large-scale background flow. In the tropics, organised convection is embedded in a large-scale circulation system, which advects convection in the along-wind direction and alters near-surface convergence in the convective areas. A large-scale flow also modifies the surface fluxes, which are expected to be enhanced upwind of the convective area. Convective clusters that are embedded in a large-scale flow therefore experience an asymmetric component of the surface fluxes, which influences the development and pathway of a convective cluster. In this study, we use numerical simulations with explicit convection and add a large-scale flow to the established setup of radiative convective equilibrium. We then analyse how aggregated convection evolves when exposed to wind forcing. The simulations suggest that convective line structures are more prevalent if a large-scale flow is present, and that convective clusters move considerably slower than advection by the large-scale flow would suggest. We also study the asymmetric component of convective aggregation due to enhanced surface fluxes, and discuss the pathway and speed of convective clusters as a function of the large-scale wind speed.

  13. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  14. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  15. Large-Scale Reform Comes of Age

    ERIC Educational Resources Information Center

    Fullan, Michael

    2009-01-01

    This article reviews the history of large-scale education reform and makes the case that large-scale or whole-system reform policies and strategies are becoming increasingly evident. The review briefly addresses the pre-1997 period, concluding that while the pressure for reform was mounting, there were very few examples of deliberate or…

  16. Automating large-scale reactor systems

    SciTech Connect

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig.

  17. Large Scale EOF Analysis of Climate Data

    NASA Astrophysics Data System (ADS)

    Prabhat, M.; Gittens, A.; Kashinath, K.; Cavanaugh, N. R.; Mahoney, M.

    2016-12-01

    We present a distributed approach towards extracting EOFs from 3D climate data. We implement the method in Apache Spark, and process multi-TB sized datasets on O(1000-10,000) cores. We apply this method to latitude-weighted ocean temperature data from CFSR, a 2.2 terabyte-sized data set comprising ocean and subsurface reanalysis measurements collected at 41 levels in the ocean, at 6 hour intervals over 31 years. We extract the first 100 EOFs of this full data set and compare to the EOFs computed simply on the surface temperature field. Our analyses provide evidence of Kelvin and Rossby waves and components of large-scale modes of oscillation, including the ENSO and PDO, that are not visible in the usual SST EOFs. Further, they provide information on the most influential parts of the ocean, such as the thermocline, that exist below the surface. Work is ongoing to understand the factors determining the depth-varying spatial patterns observed in the EOFs. We will experiment with weighting schemes to appropriately account for the differing depths of the observations. We also plan to apply the same distributed approach to the analysis of 3D atmospheric climate data sets, including multiple variables. Because the atmosphere changes on a quicker time-scale than the ocean, we expect that the results will demonstrate an even greater advantage to computing 3D EOFs in lieu of 2D EOFs.
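    A single-node sketch of the EOF computation described above: latitude-weight the anomaly field, flatten it to a time-by-space matrix, and take a truncated SVD. The distributed Spark implementation in the abstract performs the same decomposition over thousands of cores; the data here are synthetic stand-ins.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    ntime, nlat, nlon = 120, 10, 20
    field = rng.standard_normal((ntime, nlat, nlon))   # synthetic "temperature"

    # Anomalies, weighted by sqrt(cos(latitude)) for equal-area contribution
    lats = np.linspace(-80.0, 80.0, nlat)
    weights = np.sqrt(np.cos(np.deg2rad(lats)))[:, None]
    anom = (field - field.mean(axis=0)) * weights

    X = anom.reshape(ntime, -1)                        # time x space matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    eofs = Vt[:3].reshape(3, nlat, nlon)               # leading spatial patterns
    pcs = U[:, :3] * s[:3]                             # corresponding time series
    explained = s**2 / (s**2).sum()                    # variance fractions
    ```
    
    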

  18. Statistical Measures of Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Vogeley, Michael; Geller, Margaret; Huchra, John; Park, Changbom; Gott, J. Richard

    1993-12-01

    To quantify clustering in the large-scale distribution of galaxies and to test theories for the formation of structure in the universe, we apply statistical measures to the CfA Redshift Survey. This survey is complete to m_B(0) = 15.5 over two contiguous regions which cover one-quarter of the sky and include ~11,000 galaxies. The salient features of these data are voids with diameter 30-50 h^-1 Mpc and coherent dense structures with a scale ~100 h^-1 Mpc. Comparison with N-body simulations rules out the "standard" CDM model (Omega = 1, b = 1.5, sigma_8 = 1) at the 99% confidence level because this model has insufficient power on scales lambda > 30 h^-1 Mpc. An unbiased open universe CDM model (Omega h = 0.2) and a biased CDM model with non-zero cosmological constant (Omega h = 0.24, lambda_0 = 0.6) match the observed power spectrum. The amplitude of the power spectrum depends on the luminosity of galaxies in the sample; bright (L > L*) galaxies are more strongly clustered than faint galaxies. The paucity of bright galaxies in low-density regions may explain this dependence. To measure the topology of large-scale structure, we compute the genus of isodensity surfaces of the smoothed density field. On scales in the "non-linear" regime, <= 10 h^-1 Mpc, the high- and low-density regions are multiply connected over a broad range of density threshold, as in a filamentary net. On smoothing scales > 10 h^-1 Mpc, the topology is consistent with statistics of a Gaussian random field. Simulations of CDM models fail to produce the observed coherence of structure on non-linear scales (>95% confidence level). The underdensity probability (the frequency of regions with density contrast delta rho/rho = -0.8) depends strongly on the luminosity of galaxies; underdense regions are significantly more common (>2 sigma) in bright (L > L*) galaxy samples than in samples which include fainter galaxies.

  19. Large Scale Metal Additive Techniques Review

    SciTech Connect

    Nycz, Andrzej; Adediran, Adeola I; Noakes, Mark W; Love, Lonnie J

    2016-01-01

    In recent years additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large-scale metal additive manufacturing has not yet reached parity with large-scale polymer. This paper is a review study of metal additive techniques in the context of building large structures. Current commercial devices are capable of printing metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. In order to follow the polymer progress path, several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post-processing, as well as potential applications. This paper surveys the current state of the art of large-scale metal additive technology, with an emphasis on expanding the geometric limits.

  20. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  1. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  2. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  5. The CLASSgal code for relativistic cosmological large scale structure

    NASA Astrophysics Data System (ADS)

    Di Dio, Enea; Montanari, Francesco; Lesgourgues, Julien; Durrer, Ruth

    2013-11-01

    We present accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum Cl(z1,z2) and the corresponding correlation function ξ(θ,z1,z2) of the matter density and the galaxy number fluctuations in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
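    The two observables named above are linked by a standard Legendre sum, ξ(θ, z1, z2) = Σ_l (2l+1)/(4π) C_l(z1, z2) P_l(cos θ). A minimal numerical sketch of that transform, using a toy power spectrum rather than actual CLASS output:

    ```python
    import numpy as np

    def xi_from_cl(cl, theta_rad):
        """Correlation function from an angular power spectrum via a Legendre series."""
        ell = np.arange(len(cl))
        coeffs = (2 * ell + 1) / (4 * np.pi) * cl     # weights of P_l(cos theta)
        return np.polynomial.legendre.legval(np.cos(theta_rad), coeffs)

    cl = np.zeros(100)
    cl[2:] = 1.0 / np.arange(2, 100) ** 2             # toy spectrum, not CLASS output
    print(xi_from_cl(cl, np.deg2rad(1.0)))
    ```

    At θ = 0 every P_l(1) = 1, so the sum collapses to Σ_l (2l+1) C_l / (4π), a quick sanity check on the implementation.
    
    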

  6. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

    We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models in the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density that we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.

  7. The Influence of Large-scale Environments on Galaxy Properties

    NASA Astrophysics Data System (ADS)

    Wei, Yu-qing; Wang, Lei; Dai, Cai-ping

    2017-07-01

    The star formation properties of galaxies and their dependence on environment play an important role in understanding the formation and evolution of galaxies. Using the galaxy sample of the Sloan Digital Sky Survey (SDSS), different research groups have studied the physical properties of galaxies and their large-scale environments. Here, using the filament catalog from Tempel et al. and the galaxy catalog of large-scale structure classification from Wang et al., and taking into consideration galaxy morphology, high/low local density environment, and central versus satellite status, we have found that the properties of galaxies are correlated with their residential large-scale environments: the SSFR (specific star formation rate) and SFR (star formation rate) strongly depend on the large-scale environment for spiral galaxies and satellite galaxies, but this dependence is very weak for elliptical galaxies and central galaxies, and galaxies in low-density regions are more sensitive to their large-scale environments than those in high-density regions. The above conclusions remain valid even for galaxies with the same mass. In addition, the SSFR distributions derived from the catalogs of Tempel et al. and Wang et al. are not entirely consistent.

  8. PKI security in large-scale healthcare networks.

    PubMed

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (public key infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network comprising multiple PKI domains.

  9. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    DTIC Science & Technology

    2014-09-30

    DISTRIBUTION STATEMENT A. Approved for public release; distribution is unlimited. Large Scale Density Estimation of Blue and Fin Whales … estimating blue and fin whale density that is effective over large spatial scales and is designed to cope with spatial variation in animal density, utilizing … a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse hydrophone arrays in the …

  10. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent in large-scale systems, such as power networks, communication networks, and economic or ecological systems, were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, a thorough investigation of decentralized control methods was found. Especially helpful was its classification of the different existing approaches to dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed differ somewhat from the ones reviewed in other surveys. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  11. Large-scale nanophotonic phased array.

    PubMed

    Sun, Jie; Timurdogan, Erman; Yaacobi, Ami; Hosseini, Ehsan Shah; Watts, Michael R

    2013-01-10

    Electromagnetic phased arrays at radio frequencies are well known and have enabled applications ranging from communications to radar, broadcasting and astronomy. The ability to generate arbitrary radiation patterns with large-scale phased arrays has long been pursued. Although it is extremely expensive and cumbersome to deploy large-scale radiofrequency phased arrays, optical phased arrays have a unique advantage in that the much shorter optical wavelength holds promise for large-scale integration. However, the short optical wavelength also imposes stringent requirements on fabrication. As a consequence, although optical phased arrays have been studied with various platforms and recently with chip-scale nanophotonics, all of the demonstrations so far are restricted to one-dimensional or small-scale two-dimensional arrays. Here we report the demonstration of a large-scale two-dimensional nanophotonic phased array (NPA), in which 64 × 64 (4,096) optical nanoantennas are densely integrated on a silicon chip within a footprint of 576 μm × 576 μm with all of the nanoantennas precisely balanced in power and aligned in phase to generate a designed, sophisticated radiation pattern in the far field. We also show that active phase tunability can be realized in the proposed NPA by demonstrating dynamic beam steering and shaping with an 8 × 8 array. This work demonstrates that a robust design, together with state-of-the-art complementary metal-oxide-semiconductor technology, allows large-scale NPAs to be implemented on compact and inexpensive nanophotonic chips. In turn, this enables arbitrary radiation pattern generation using NPAs and therefore extends the functionalities of phased arrays beyond conventional beam focusing and steering, opening up possibilities for large-scale deployment in applications such as communication, laser detection and ranging, three-dimensional holography and biomedical sciences, to name just a few.
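
    The beam steering described above comes down to imposing a linear phase gradient across the emitters so that their far fields add in phase along one chosen direction. A minimal sketch of that array-factor calculation (a 1-D, 8-element toy model, not the paper's 64 × 64 silicon device; all parameter values are illustrative):

    ```python
    import cmath, math

    def array_factor(phases, d_over_lambda, theta):
        """Far-field array factor of a uniform linear array of isotropic emitters."""
        k_d = 2 * math.pi * d_over_lambda  # phase advance per element spacing
        s = sum(cmath.exp(1j * (n * k_d * math.sin(theta) + phi))
                for n, phi in enumerate(phases))
        return abs(s)

    def steering_phases(n_elem, d_over_lambda, theta0):
        """Linear phase gradient that steers the main lobe to angle theta0."""
        k_d = 2 * math.pi * d_over_lambda
        return [-n * k_d * math.sin(theta0) for n in range(n_elem)]

    n_elem, d = 8, 0.5           # 8 emitters at half-wavelength pitch
    theta0 = math.radians(20)    # desired beam direction
    phases = steering_phases(n_elem, d, theta0)

    # scan the far field and locate the main lobe
    angles = [math.radians(a) for a in range(-90, 91)]
    peak = max(angles, key=lambda th: array_factor(phases, d, th))
    # peak lands at the steered angle, 20 degrees
    ```

    At half-wavelength pitch no grating lobes fit into the visible region, so the steered lobe is the unique maximum; the chip in the paper additionally has to balance per-antenna power and compensate fabrication phase errors.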

  12. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared through a decade marked by a rapid expansion of funds and manpower in its first half and an almost as rapid contraction in its second; and (2) how NASA combined central planning and control with decentralized project execution.

  13. Large-scale multimedia modeling applications

    SciTech Connect

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications.

  14. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  16. A Cloud Computing Platform for Large-Scale Forensic Computing

    NASA Astrophysics Data System (ADS)

    Roussev, Vassil; Wang, Liqiang; Richard, Golden; Marziale, Lodovico

    The timely processing of massive digital forensic collections demands the use of large-scale distributed computing resources and the flexibility to customize the processing performed on the collections. This paper describes MPI MapReduce (MMR), an open implementation of the MapReduce processing model that outperforms traditional forensic computing techniques. MMR provides linear scaling for CPU-intensive processing and super-linear scaling for indexing-related workloads.
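
    The MapReduce model that MMR implements splits a job into a map over independent evidence chunks, a shuffle that groups intermediate key-value pairs, and a reduce that aggregates each group. A single-process Python sketch of that programming model (MMR itself is MPI-based and parallel; the keyword list here is hypothetical):

    ```python
    from collections import defaultdict
    from itertools import chain

    def map_phase(chunk):
        """Map: emit (keyword, 1) for every hit in one slice of the evidence."""
        keywords = {"password", "invoice", "wallet"}
        return [(w, 1) for w in chunk.lower().split() if w in keywords]

    def shuffle(pairs):
        """Group intermediate pairs by key, as the MapReduce runtime would."""
        groups = defaultdict(list)
        for key, val in pairs:
            groups[key].append(val)
        return groups

    def reduce_phase(groups):
        """Reduce: sum the counts for each keyword."""
        return {key: sum(vals) for key, vals in groups.items()}

    chunks = ["Send the invoice and the password",
              "wallet backup and password hints",
              "no evidence here"]
    counts = reduce_phase(shuffle(chain.from_iterable(map(map_phase, chunks))))
    # counts == {"invoice": 1, "password": 2, "wallet": 1}
    ```

    Because each map call touches only its own chunk, the map phase parallelizes trivially across nodes, which is where the linear scaling for CPU-bound forensic workloads comes from.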

  17. Semantic Concept Discovery for Large Scale Zero Shot Event Detection

    DTIC Science & Technology

    2015-07-25

    Approved for public release; distribution is unlimited. Semantic Concept Discovery for Large-Scale Zero-Shot Event Detection. Army Research Office, P.O. Box 12211, Research Triangle Park, NC 27709-2211. Keywords: zero shot event detection, semantic concept discovery. Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA 15213-3815.

  18. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  19. Large-scale Advanced Propfan (LAP) program

    NASA Technical Reports Server (NTRS)

    Sagerser, D. A.; Ludemann, S. G.

    1985-01-01

    The propfan is an advanced propeller concept which maintains the high efficiencies traditionally associated with conventional propellers at the higher aircraft cruise speeds associated with jet transports. The large-scale advanced propfan (LAP) program extends the research done on 2 ft diameter propfan models to a 9 ft diameter article. The program includes design, fabrication, and testing of both an eight bladed, 9 ft diameter propfan, designated SR-7L, and a 2 ft diameter aeroelastically scaled model, SR-7A. The LAP program is complemented by the propfan test assessment (PTA) program, which takes the large-scale propfan and mates it with a gas generator and gearbox to form a propfan propulsion system and then flight tests this system on the wing of a Gulfstream 2 testbed aircraft.

  20. Large-scale fibre-array multiplexing

    SciTech Connect

    Cheremiskin, I V; Chekhlova, T K

    2001-05-31

    The possibility of creating a fibre multiplexer/demultiplexer with large-scale multiplexing without any basic restrictions on the number of channels and the spectral spacing between them is shown. The operating capacity of a fibre multiplexer based on a four-fibre array ensuring a spectral spacing of 0.7 pm (≈ 10 GHz) between channels is demonstrated. (laser applications and other topics in quantum electronics)

  1. Modeling Human Behavior at a Large Scale

    DTIC Science & Technology

    2012-01-01

    Modeling Human Behavior at a Large Scale, by Adam Sadilek. Submitted in partial fulfillment of the requirements for the degree Doctor of Philosophy. (The remainder of the indexed snippet consists of reference-list fragments, including "Discerning intentions in dynamic human action," Trends in Cognitive Sciences 5(4):171-178, 2001, and "Limits of predictability in human mobility," Science 327(5968):1018, 2010.)

  2. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2008-09-30

    … aerosol species up to six days in advance anywhere on the globe. NAAPS and COAMPS are particularly useful for forecasts of dust storms in areas … impact cloud processes globally. With increasing dust storms due to climate change and land use changes in desert regions, the impact of the … bacteria in large-scale dust storms is expected to significantly impact warm ice cloud formation, human health, and ecosystems globally. In Niemi et al …

  3. Large-scale instabilities of helical flows

    NASA Astrophysics Data System (ADS)

    Cameron, Alexandre; Alexakis, Alexandros; Brachet, Marc-Étienne

    2016-10-01

    Large-scale hydrodynamic instabilities of periodic helical flows of a given wave number K are investigated using three-dimensional Floquet numerical computations. In the Floquet formalism the unstable field is expanded in modes of different spatial periodicity. This allows us (i) to clearly distinguish large-scale from small-scale instabilities and (ii) to study modes of wave number q of arbitrarily large scale separation q ≪ K. Different flows are examined, including flows that exhibit small-scale turbulence. The growth rate σ of the most unstable mode is measured as a function of the scale separation q/K ≪ 1 and the Reynolds number Re. It is shown that the growth rate follows the scaling σ ∝ q if an AKA effect [Frisch et al., Physica D: Nonlinear Phenomena 28, 382 (1987), 10.1016/0167-2789(87)90026-1] is present, or a negative-eddy-viscosity scaling σ ∝ q² in its absence. This holds both in the Re ≪ 1 regime, where previously derived asymptotic results are verified, and for Re = O(1), beyond their range of validity. Furthermore, for values of Re above a critical value Re_Sc beyond which small-scale instabilities are present, the growth rate becomes independent of q and the energy of the perturbation at large scales decreases with scale separation. The nonlinear behavior of these large-scale instabilities is also examined in the nonlinear regime, where the largest scales of the system are found to be the most dominant energetically. These results are interpreted by low-order models.
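
    The two scalings can be told apart by fitting the slope of log σ against log q; a minimal sketch with synthetic growth-rate data (the coefficient 0.003 is made up for illustration):

    ```python
    import math

    def scaling_exponent(qs, sigmas):
        """Least-squares slope of log(sigma) vs log(q):
        ~1 indicates an AKA effect, ~2 negative-eddy-viscosity scaling."""
        xs = [math.log(q) for q in qs]
        ys = [math.log(s) for s in sigmas]
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den = sum((x - mx) ** 2 for x in xs)
        return num / den

    qs = [0.05, 0.1, 0.2, 0.4]
    sigmas = [0.003 * q ** 2 for q in qs]  # synthetic growth rates, sigma ∝ q^2
    alpha = scaling_exponent(qs, sigmas)   # ≈ 2
    ```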

  4. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.
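
    One of the fluid-property issues named above, ortho-para conversion, matters because the equilibrium para fraction rises from about 25% at room temperature to nearly 100% at liquefaction temperatures, and the conversion heat must be removed inside the plant. A sketch of the equilibrium fraction from the rotational partition function (assuming a rotational temperature of 87.6 K for H2):

    ```python
    import math

    THETA_ROT = 87.6  # rotational temperature of H2 in kelvin (assumed value)

    def equilibrium_para_fraction(T, j_max=20):
        """Equilibrium para fraction of H2: para states have even J
        (nuclear-spin weight 1), ortho states odd J (weight 3)."""
        para = sum((2 * j + 1) * math.exp(-j * (j + 1) * THETA_ROT / T)
                   for j in range(0, j_max, 2))
        ortho = 3 * sum((2 * j + 1) * math.exp(-j * (j + 1) * THETA_ROT / T)
                        for j in range(1, j_max, 2))
        return para / (para + ortho)

    # "normal" hydrogen at room temperature is ~25% para;
    # near the 20 K boiling point the equilibrium is almost pure para
    room = equilibrium_para_fraction(300.0)   # ≈ 0.25
    cold = equilibrium_para_fraction(20.0)    # ≈ 1.0
    ```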

  5. Large-Scale Visual Data Analysis

    NASA Astrophysics Data System (ADS)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods, and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high-performance visualization research challenges and opportunities.

  6. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  7. Large-scale biodiversity patterns in freshwater phytoplankton.

    PubMed

    Stomp, Maayke; Huisman, Jef; Mittelbach, Gary G; Litchman, Elena; Klausmeier, Christopher A

    2011-11-01

    Our planet shows striking gradients in the species richness of plants and animals, from high biodiversity in the tropics to low biodiversity in polar and high-mountain regions. Recently, similar patterns have been described for some groups of microorganisms, but the large-scale biogeographical distribution of freshwater phytoplankton diversity is still largely unknown. We examined the species diversity of freshwater phytoplankton sampled from 540 lakes and reservoirs distributed across the continental United States and found strong latitudinal, longitudinal, and altitudinal gradients in phytoplankton biodiversity, demonstrating that microorganisms can show substantial geographic variation in biodiversity. Detailed analysis using structural equation models indicated that these large-scale biodiversity gradients in freshwater phytoplankton diversity were mainly driven by local environmental factors, although there were residual direct effects of latitude, longitude, and altitude as well. Specifically, we found that phytoplankton species richness was an increasing saturating function of lake chlorophyll a concentration, increased with lake surface area and possibly increased with water temperature, resembling effects of productivity, habitat area, and temperature on diversity patterns commonly observed for macroorganisms. In turn, these local environmental factors varied along latitudinal, longitudinal, and altitudinal gradients. These results imply that changes in land use or climate that affect these local environmental factors are likely to have major impacts on large-scale biodiversity patterns of freshwater phytoplankton.

  8. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

    Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat.1-3 Multi-agent simulations in particular are now commonplace in many fields.4, 5 By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects where closed form solutions are difficult to solve or impossible to derive.6 To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact - for example, agent height is related to agent weight - they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application.7, 8 Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large scale social simulations.9-11 One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more
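
    Attribute-based agent generation of the kind described, with independent per-agent draws plus a behavioural type in the spirit of the Hats society, can be sketched as follows (all attribute names and mixture weights are illustrative, not taken from the Hats Simulator):

    ```python
    import random

    random.seed(7)  # reproducible population

    def make_hat(kind):
        """One agent: independent attribute draws plus a behavioural type."""
        return {"kind": kind,                          # benign / known / covert
                "morale": random.gauss(0.5, 0.15),     # illustrative attribute
                "acts_like_terrorist": kind in ("known", "covert")}

    def generate_population(n, p_known=0.01, p_covert=0.02):
        """Mostly benign hats, a few known terrorists, a few covert ones."""
        kinds = random.choices(
            ["benign", "known", "covert"],
            weights=[1 - p_known - p_covert, p_known, p_covert],
            k=n)
        return [make_hat(k) for k in kinds]

    hats = generate_population(1000)
    covert = [h for h in hats if h["kind"] == "covert"]
    # covert hats are indistinguishable from benign ones by attributes alone;
    # only their behaviour gives them away
    ```

    Relations between agents (the graph-theoretic layer the text discusses) would be generated in a second pass over this attribute population.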

  9. LARGE SCALE PURIFICATION OF PROTEINASES FROM CLOSTRIDIUM HISTOLYTICUM FILTRATES

    PubMed Central

    Conklin, David A.; Webster, Marion E.; Altieri, Patricia L.; Berman, Sanford; Lowenthal, Joseph P.; Gochenour, Raymond B.

    1961-01-01

    Conklin, David A. (Walter Reed Army Institute of Research, Washington, D. C.), Marion E. Webster, Patricia L. Altieri, Sanford Berman, Joseph P. Lowenthal, and Raymond B. Gochenour. Large scale purification of proteinases from Clostridium histolyticum filtrates. J. Bacteriol. 82:589–594. 1961.—A method for the large scale preparation and partial purification of Clostridium histolyticum proteinases by fractional precipitation with ammonium sulfate is described. Conditions for adequate separation and purification of the δ-proteinase and the gelatinase were obtained. Collagenase, on the other hand, was found distributed in four to five fractions and little increase in purity was achieved as compared to the crude ammonium sulfate precipitates. PMID:13880849

  10. The CLASSgal code for relativistic cosmological large scale structure

    SciTech Connect

    Di Dio, Enea; Montanari, Francesco; Durrer, Ruth; Lesgourgues, Julien

    2013-11-01

    We present accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum C_ℓ(z_1,z_2) and the corresponding correlation function ξ(θ,z_1,z_2) of the matter density and the galaxy number fluctuations in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
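
    The two observables are related by a Legendre expansion, ξ(θ) = Σ_ℓ (2ℓ+1)/(4π) C_ℓ P_ℓ(cos θ). A toy sketch of that sum, collapsed to a single redshift bin (this is not the CLASSgal code, which computes the full C_ℓ(z_1,z_2) with relativistic corrections):

    ```python
    import math

    def legendre(lmax, x):
        """P_0..P_lmax at x via the Bonnet recurrence."""
        p = [1.0, x]
        for l in range(1, lmax):
            p.append(((2 * l + 1) * x * p[l] - l * p[l - 1]) / (l + 1))
        return p[:lmax + 1]

    def correlation_from_cl(cl, theta):
        """xi(theta) = sum_l (2l+1)/(4 pi) * C_l * P_l(cos theta)."""
        lmax = len(cl) - 1
        p = legendre(lmax, math.cos(theta))
        return sum((2 * l + 1) * cl[l] * p[l]
                   for l in range(lmax + 1)) / (4 * math.pi)

    # toy spectrum: only the monopole, C_0 = 4*pi, gives xi(theta) = 1 everywhere
    cl = [4 * math.pi] + [0.0] * 10
    xi = correlation_from_cl(cl, math.radians(30))  # = 1.0
    ```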

  11. Large-Scale Optimization for Bayesian Inference in Complex Systems

    SciTech Connect

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their

  12. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  13. What is a large-scale dynamo?

    NASA Astrophysics Data System (ADS)

    Nigro, G.; Pongkitiwanichakul, P.; Cattaneo, F.; Tobias, S. M.

    2017-01-01

    We consider kinematic dynamo action in a sheared helical flow at moderate to high values of the magnetic Reynolds number (Rm). We find exponentially growing solutions which, for large enough shear, take the form of a coherent part embedded in incoherent fluctuations. We argue that at large Rm large-scale dynamo action should be identified by the presence of structures coherent in time, rather than those at large spatial scales. We further argue that although the growth rate is determined by small-scale processes, the period of the coherent structures is set by mean-field considerations.

  14. Large-scale brightenings associated with flares

    NASA Technical Reports Server (NTRS)

    Mandrini, Cristina H.; Machado, Marcos E.

    1992-01-01

    It is shown that large-scale brightenings (LSBs) associated with solar flares, similar to the 'giant arches' discovered by Svestka et al. (1982) in images obtained by the SMM HXIS hours after the onset of two-ribbon flares, can also occur in association with confined flares in complex active regions. For these events, the link between the LSB and the underlying flare is clearly evident from the active-region magnetic field topology. The implications of these findings are discussed within the framework of the interacting loops of flares and the giant arch phenomenology.

  15. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large-scale phononic metamaterials. These novel structures can have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they are serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite-difference time-domain method is used in our calculations to obtain the band structures of the proposed metamaterials.
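
    For the 1-D analogue of such a layered structure, the Bloch band structure at normal incidence follows from the classic Rytov transfer-matrix relation cos(q a) = cos(k1 d1) cos(k2 d2) - (1/2)(Z1/Z2 + Z2/Z1) sin(k1 d1) sin(k2 d2), where a band gap opens wherever the right-hand side exceeds 1 in magnitude. A sketch with assumed bulk properties for concrete and steel and illustrative 10 m layers (the paper itself uses FDTD on different geometries):

    ```python
    import math

    def dispersion_rhs(f, d1, c1, rho1, d2, c2, rho2):
        """Rytov relation for a 1-D bilayer: cos(q*a) = RHS(f).
        |RHS| > 1 means no real Bloch wavenumber q, i.e. a band gap."""
        w = 2 * math.pi * f
        k1d1, k2d2 = w * d1 / c1, w * d2 / c2
        z1, z2 = rho1 * c1, rho2 * c2          # acoustic impedances
        mismatch = 0.5 * (z1 / z2 + z2 / z1)   # impedance-contrast factor
        return (math.cos(k1d1) * math.cos(k2d2)
                - mismatch * math.sin(k1d1) * math.sin(k2d2))

    # assumed properties: concrete (rho=2400 kg/m^3, c=3500 m/s) and
    # steel (rho=7850 kg/m^3, c=5900 m/s); 10 m layers push the first
    # Bragg gap down to roughly 100 Hz
    gap_freqs = [f for f in range(1, 201)
                 if abs(dispersion_rhs(f, 10, 3500, 2400, 10, 5900, 7850)) > 1]
    ```

    The first gap is centred near the frequency where the acoustic path k1 d1 + k2 d2 reaches π; pushing gaps down into the few-hertz seismic band requires correspondingly larger unit cells, which is the scale challenge the paper addresses.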

  16. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  17. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPUs) are currently used as a cost-effective platform for computer simulations and big-data processing. Large-scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve excellent efficiency for applications in statistical mechanics, particle dynamics, and network analysis by using suitable memory access patterns and mechanisms such as CUDA streams and profiling tools. Similar concepts and techniques may also be applied to other problems, such as the solution of partial differential equations.

  18. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we have now securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  19. Large-scale Heterogeneous Network Data Analysis

    DTIC Science & Technology

    2012-07-31

    Data for Multi-Player Influence Maximization on Social Networks." KDD 2012 (Demo). Po-Tzu Chang, Yen-Chieh Huang, Cheng-Lun Yang, Shou-De Lin, Pu-Jen Cheng. "Learning-Based Time-Sensitive Re-Ranking for Web Search." SIGIR 2012 (poster). Hung-Che Lai, Cheng-Te Li, Yi-Chen Lo, and Shou-De Lin. "Exploiting and Evaluating MapReduce for Large-Scale Graph Mining." ASONAM 2012 (Full, 16% acceptance ratio). Hsun-Ping Hsieh, Cheng-Te Li, and Shou

  20. Internationalization Measures in Large Scale Research Projects

    NASA Astrophysics Data System (ADS)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Internationalization measures in Large Scale Research Projects. Large-scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities have little experience in how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools for improving the capacity of the project and the research location. There is a variety of measures suited to support universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for science mobility, and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups and identify interfaces at which project management, university administration, researchers, and international partners can work together, exchange information, and improve processes in order to recruit, support, and retain the brightest heads for a project.

  1. Large-scale Globally Propagating Coronal Waves.

    PubMed

    Warmuth, Alexander

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous spaced-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  2. [A large-scale accident in Alpine terrain].

    PubMed

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, time pressure, compounded by adverse weather conditions or darkness, is enormous. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For the evacuation of casualties, the use of a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to the optimization of rescue measures, regular training and exercises are advisable, as is the analysis of previous large-scale Alpine accidents.

  3. Reliability assessment for components of large scale photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have shifted significantly from independent power generation systems to large-scale, grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of the various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is considered in its various sequential and parallel fault combinations in order to find all realistic ways in which the top or undesired events can occur. Additionally, the method can identify areas on which planned maintenance should focus. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system but also to optimize the maintenance costs, the latter being achieved by informing the operators about the status of system components. Its flexibility in monitoring applications allows the approach to be used to help ensure secure operation of the system. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate additional system maintenance plans and diagnostic strategies.
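    The fault-tree evaluation with exponentially distributed component lifetimes can be sketched as follows. The failure rates and the series/parallel layout here are hypothetical illustrations, not values taken from the paper.

```python
import math

def reliability(rate, t):
    """Survival probability at time t for an exponential failure law."""
    return math.exp(-rate * t)

def series(*rels):
    """System fails if ANY component fails (OR gate on failure events)."""
    out = 1.0
    for r in rels:
        out *= r
    return out

def parallel(*rels):
    """System fails only if ALL redundant components fail (AND gate)."""
    q = 1.0
    for r in rels:
        q *= (1.0 - r)
    return 1.0 - q

# Hypothetical failure rates (per hour) for PV subsystem components.
t = 8760.0  # one year of operation
panel    = reliability(1e-6, t)
inverter = reliability(5e-6, t)
string_a = series(panel, inverter)       # one string: panel AND inverter must work
# Two redundant strings feeding one grid-tie component:
system = series(parallel(string_a, string_a), reliability(2e-6, t))
```

    Because each gate is a closed-form expression, the same tree also exposes which component dominates the top-event probability, which is how such an analysis points planned maintenance at the critical components.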

  4. Equivalent common path method in large-scale laser comparator

    NASA Astrophysics Data System (ADS)

    He, Mingzhao; Li, Jianshuang; Miao, Dongjing

    2015-02-01

    A large-scale laser comparator is the main standard device providing accurate, reliable, and traceable measurements for high-precision large-scale line and 3D measurement instruments. It is mainly composed of a guide rail, a motion control system, an environmental parameter monitoring system, and a displacement measurement system. In the laser comparator, the main error sources are the temperature distribution, the straightness of the guide rail, and the pitch and yaw of the measuring carriage. To minimize the measurement uncertainty, an equivalent common optical path scheme is proposed and implemented. Three laser interferometers are adjusted to be parallel to the guide rail. The displacement in an arbitrary virtual optical path is calculated using the three displacements, without knowledge of the carriage orientations at the start and end positions. The orientation of the air-floating carriage is calculated from the displacements of the three optical paths and the positions of the three retroreflectors, which are precisely measured by a laser tracker. A fourth laser interferometer is used in the virtual optical path as a reference to verify the compensation method. This paper also analyzes the effect of rail straightness on the displacement measurement. The proposed method, through experimental verification, can improve the measurement uncertainty of the large-scale laser comparator.
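    A minimal numeric sketch of the equivalent-common-path idea: for a rigid carriage with small pitch and yaw, the displacement along the rail varies affinely over the carriage cross-section, so three parallel interferometer readings determine the displacement of any virtual optical path. The retroreflector positions and readings below are illustrative values, not the paper's data.

```python
import numpy as np

# Transverse (x, y) positions of the three retroreflectors, e.g. as measured
# by a laser tracker (illustrative values, in meters).
pts = np.array([[0.0, 0.0],
                [0.2, 0.0],
                [0.0, 0.2]])
d = np.array([10.000e-3, 10.004e-3, 9.998e-3])  # interferometer readings (m)

# Rigid-body assumption: displacement along the rail is an affine function
# of transverse position, d(x, y) = a + b*x + c*y.
A = np.column_stack([np.ones(3), pts])
a, b, c = np.linalg.solve(A, d)

def virtual_displacement(x, y):
    """Displacement of an arbitrary virtual optical path at (x, y)."""
    return a + b * x + c * y

# Pitch and yaw follow from the plane gradients (small-angle approximation).
yaw, pitch = b, c
```

    A fourth interferometer placed at some (x, y) then verifies the scheme by comparing its direct reading against `virtual_displacement(x, y)`.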

  5. Large-scale flow generation by inhomogeneous helicity.

    PubMed

    Yokoi, N; Brandenburg, A

    2016-03-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with nonuniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of homogeneous turbulent helicity. This result confirms the validity of the inhomogeneous helicity effect in large-scale flow generation and suggests that a vortex dynamo is possible even in incompressible turbulence where there is no baroclinicity effect.

  6. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations quickly become unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have, for the time being, only conjectures backed by experimental observations.
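    As a small, self-contained illustration of the tree-based pruning these fast algorithms rely on, here is a single-tree nearest-neighbor search over a 2-d tree in pure Python; the multitree (dual-tree) algorithms of the chapter generalize this pruning from one query point to whole sets of queries at once.

```python
def build_kdtree(pts, depth=0):
    """Recursively build a 2-d tree: each node stores a point and splits
    the remaining points along alternating axes."""
    if not pts:
        return None
    axis = depth % 2
    pts = sorted(pts, key=lambda p: p[axis])
    mid = len(pts) // 2
    return (pts[mid],
            build_kdtree(pts[:mid], depth + 1),
            build_kdtree(pts[mid + 1:], depth + 1),
            axis)

def dist2(p, q):
    return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2

def nearest(node, q, best=None):
    """Branch-and-bound descent: a subtree is pruned when its splitting
    plane lies farther away than the current best candidate."""
    if node is None:
        return best
    point, left, right, axis = node
    if point != q and (best is None or dist2(point, q) < dist2(best, q)):
        best = point  # excluding q itself gives the AllNN-style query
    near, far = (left, right) if q[axis] < point[axis] else (right, left)
    best = nearest(near, q, best)
    if best is None or (q[axis] - point[axis]) ** 2 < dist2(best, q):
        best = nearest(far, q, best)
    return best
```

    Running `nearest` once per data point yields the AllNN result used in catalog cross-matching; the dual-tree versions amortize the descent across all queries simultaneously.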

  7. Efficient, large scale separation of coal macerals

    SciTech Connect

    Dyrkacz, G.R.; Bloomquist, C.A.A.

    1988-01-01

    The authors believe that the separation of macerals by continuous flow centrifugation offers a simple technique for the large scale separation of macerals. With relatively little cost (approximately $10K), it provides an opportunity for obtaining quite pure maceral fractions. Although they have not completely worked out all the nuances of this separation system, they believe that the problems they have indicated can be minimized to pose only minor inconvenience. It cannot be said that this system completely bypasses the disagreeable tedium or time involved in separating macerals, nor will it by itself overcome the mental inertia required to make maceral separation an accepted necessary fact in fundamental coal science. However, they find their particular brand of continuous flow centrifugation is considerably faster than sink/float separation, can provide a good quality product with even one separation cycle, and permits the handling of more material than a conventional sink/float centrifuge separation.

  8. Primer design for large scale sequencing.

    PubMed Central

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-01-01

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, make this program useful for the large scale design of primers, especially in large sequencing projects. PMID:9611248
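    PRIDE's fuzzy-logic quality system is not reproduced here; as a toy illustration only, the following sketch scores candidate primers by GC content and the Wallace-rule melting temperature, two of the standard criteria any primer design tool weighs. The scoring thresholds are hypothetical.

```python
def gc_content(seq):
    """Fraction of G and C bases in the primer sequence."""
    seq = seq.upper()
    return (seq.count("G") + seq.count("C")) / len(seq)

def wallace_tm(seq):
    """Wallace rule: Tm ~ 2*(A+T) + 4*(G+C), reasonable for short primers."""
    seq = seq.upper()
    at = seq.count("A") + seq.count("T")
    gc = seq.count("G") + seq.count("C")
    return 2 * at + 4 * gc

def primer_score(seq, tm_target=60.0):
    """Toy quality in [0, 1]: penalize deviation from a target Tm and
    extreme GC content. (PRIDE itself combines many more criteria with
    a fuzzy-logic system.)"""
    score = 1.0
    score -= min(1.0, abs(wallace_tm(seq) - tm_target) / 20.0)
    if not 0.4 <= gc_content(seq) <= 0.6:
        score -= 0.5
    return max(0.0, score)
```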

  9. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.

  10. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results give the experimentally-testable predictions for glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.

  11. Large-scale optimization of neuron arbors

    NASA Astrophysics Data System (ADS)

    Cherniak, Christopher; Changizi, Mark; Won Kang, Du

    1999-05-01

    At the global as well as local scales, some of the geometry of types of neuron arbors-both dendrites and axons-appears to be self-organizing: Their morphogenesis behaves like flowing water, that is, fluid dynamically; waterflow in branching networks in turn acts like a tree composed of cords under tension, that is, vector mechanically. Branch diameters and angles and junction sites conform significantly to this model. The result is that such neuron tree samples globally minimize their total volume-rather than, for example, surface area or branch length. In addition, the arbors perform well at generating the cheapest topology interconnecting their terminals: their large-scale layouts are among the best of all such possible connecting patterns, coming within about 5% of optimum. This model also applies comparably to arterial and river networks.
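    The "cheapest topology interconnecting the terminals" can be made concrete with a minimum spanning tree, the standard baseline when no extra branch points (Steiner points) are allowed. Below is a minimal Prim's-algorithm sketch for the total MST length over a set of terminal positions; this is an illustration of the benchmark concept, not the authors' analysis code.

```python
import math

def mst_length(points):
    """Total edge length of the Euclidean minimum spanning tree over the
    terminal points, computed with Prim's algorithm."""
    n = len(points)
    in_tree = [False] * n
    best = [math.inf] * n   # cheapest connection of each node to the tree
    best[0] = 0.0
    total = 0.0
    for _ in range(n):
        # Pull in the cheapest node not yet in the tree.
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best[i])
        in_tree[u] = True
        total += best[u]
        # Relax connection costs through the new node.
        for v in range(n):
            if not in_tree[v]:
                d = math.dist(points[u], points[v])
                if d < best[v]:
                    best[v] = d
    return total
```

    Comparing an arbor's actual wiring length against `mst_length` of its terminals gives a percent-of-optimum figure of the kind quoted above.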

  12. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving, has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has a great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of management of large scale systems are broadly covered here. Due to the focus and application of research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  13. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  14. Large scale preparation of pure phycobiliproteins.

    PubMed

    Padgett, M P; Krogmann, D W

    1987-01-01

    This paper describes simple procedures for the purification of large amounts of phycocyanin and allophycocyanin from the cyanobacterium Microcystis aeruginosa. A homogeneous natural bloom of this organism provided hundreds of kilograms of cells. Large samples of cells were broken by freezing and thawing. Repeated extraction of the broken cells with distilled water released phycocyanin first, then allophycocyanin, and provides supporting evidence for the current models of phycobilisome structure. The very low ionic strength of the aqueous extracts allowed allophycocyanin release in a particulate form so that this protein could be easily concentrated by centrifugation. Other proteins in the extract were enriched and concentrated by large scale membrane filtration. The biliproteins were purified to homogeneity by chromatography on DEAE cellulose. Purity was established by HPLC and by N-terminal amino acid sequence analysis. The proteins were examined for stability at various pHs and exposures to visible light.

  15. Primer design for large scale sequencing.

    PubMed

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-06-15

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, make this program useful for the large scale design of primers, especially in large sequencing projects.

  16. Large-scale synthesis of peptides.

    PubMed

    Andersson, L; Blomberg, L; Flegel, M; Lepsa, L; Nilsson, B; Verlander, M

    2000-01-01

    Recent advances in the areas of formulation and delivery have rekindled the interest of the pharmaceutical community in peptides as drug candidates, which, in turn, has provided a challenge to the peptide industry to develop efficient methods for the manufacture of relatively complex peptides on scales of up to metric tons per year. This article focuses on chemical synthesis approaches for peptides, and presents an overview of the methods available and in use currently, together with a discussion of scale-up strategies. Examples of the different methods are discussed, together with solutions to some specific problems encountered during scale-up development. Finally, an overview is presented of issues common to all manufacturing methods, i.e., methods used for the large-scale purification and isolation of final bulk products and regulatory considerations to be addressed during scale-up of processes to commercial levels. Copyright 2000 John Wiley & Sons, Inc. Biopolymers (Pept Sci) 55: 227-250, 2000

  17. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries collectively referred to as ``nuclear pasta'' are expected to naturally exist in the crust of neutron stars and in supernovae matter. Using a set of self-consistent microscopic nuclear energy density functionals we present the first results of large scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm^-3, proton fractions 0.05

  18. Jovian large-scale stratospheric circulation

    NASA Technical Reports Server (NTRS)

    West, R. A.; Friedson, A. J.; Appleby, J. F.

    1992-01-01

    An attempt is made to diagnose the annual-average mean meridional residual Jovian large-scale stratospheric circulation from observations of the temperature and reflected sunlight that reveal the morphology of the aerosol heating. The annual mean solar heating, total radiative flux divergence, mass stream function, and Eliassen-Palm flux divergence are shown. The stratospheric radiative flux divergence is dominated at high latitudes by aerosol absorption. Between the 270 and 100 mbar pressure levels, where there is no aerosol heating in the model, the structure of the circulation at low- to midlatitudes is governed by the meridional variation of infrared cooling in association with the variation of zonal mean temperatures observed by IRIS. The principal features of the vertical velocity profile found by Gierasch et al. (1986) are recovered in the present calculation.

  19. Large-scale parametric survival analysis.

    PubMed

    Mittal, Sushil; Madigan, David; Cheng, Jerry Q; Burd, Randall S

    2013-10-15

    Survival analysis has been a topic of active statistical research in the past few decades, with applications spread across several areas. Traditional applications usually consider data with only a small number of predictors and a few hundred or thousand observations. Recent advances in data acquisition techniques and computational power have led to considerable interest in analyzing very high-dimensional data, where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that applying regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models.
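    The core update of cyclic coordinate descent is easy to sketch. The toy below applies it to L1-regularized least squares rather than the parametric survival likelihood of the paper, since the coordinate-wise soft-thresholding structure is the same; this is a didactic sketch, not the authors' tool.

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for least squares with an L1 penalty.
    One coordinate is updated at a time via soft-thresholding, which is
    why the method scales to very high-dimensional problems."""
    n, p = X.shape
    beta = np.zeros(p)
    col_norm2 = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual: remove coordinate j's own contribution.
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r
            # Soft-thresholding update for coordinate j.
            beta[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_norm2[j]
    return beta
```

    On noiseless data with a sparse true coefficient vector, the penalty drives the irrelevant coordinates to exactly zero, which is the overfitting-avoidance behavior described above.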

  20. Large-Scale Parametric Survival Analysis†

    PubMed Central

    Mittal, Sushil; Madigan, David; Cheng, Jerry; Burd, Randall S.

    2013-01-01

    Survival analysis has been a topic of active statistical research in the past few decades, with applications spread across several areas. Traditional applications usually consider data with only a small number of predictors and a few hundred or thousand observations. Recent advances in data acquisition techniques and computational power have led to considerable interest in analyzing very high-dimensional data, where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that applying regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models. PMID:23625862

  1. The challenge of large-scale structure

    NASA Astrophysics Data System (ADS)

    Gregory, S. A.

    1996-03-01

    The tasks that I have assumed for myself in this presentation include three separate parts. The first, appropriate to the particular setting of this meeting, is to review the basic work of the founding of this field; the appropriateness comes from the fact that W. G. Tifft made immense contributions that are not often realized by the astronomical community. The second task is to outline the general tone of the observational evidence for large scale structures. (Here, in particular, I cannot claim to be complete. I beg forgiveness from any workers who are left out by my oversight for lack of space and time.) The third task is to point out some of the major aspects of the field that may represent the clues by which some brilliant sleuth will ultimately figure out how galaxies formed.

  2. Modeling the Internet's large-scale topology

    PubMed Central

    Yook, Soon-Hyung; Jeong, Hawoong; Barabási, Albert-László

    2002-01-01

    Network generators that capture the Internet's large-scale topology are crucial for the development of efficient routing protocols and modeling Internet traffic. Our ability to design realistic generators is limited by the incomplete understanding of the fundamental driving forces that affect the Internet's evolution. By combining several independent databases capturing the time evolution, topology, and physical layout of the Internet, we identify the universal mechanisms that shape the Internet's router and autonomous system level topology. We find that the physical layout of nodes forms a fractal set, determined by population density patterns around the globe. The placement of links is driven by competition between preferential attachment and linear distance dependence, a marked departure from the currently used exponential laws. The universal parameters that we extract significantly restrict the class of potentially correct Internet models and indicate that the networks created by all available topology generators are fundamentally different from the current Internet. PMID:12368484
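    A toy generator illustrating the competition between preferential attachment and linear distance dependence might look like the following; the attachment kernel (degree divided by distance) and all parameters are assumptions for illustration, not the authors' fitted model.

```python
import math
import random

def grow_network(n, m=2, seed=1):
    """Grow a spatial network: each new node links to m existing nodes with
    probability proportional to degree / distance, i.e. preferential
    attachment competing with a linear distance penalty."""
    rng = random.Random(seed)
    pos = [(rng.random(), rng.random()) for _ in range(n)]  # node layout
    deg = [0] * n
    edges = [(0, 1)]          # seed the growth with one edge
    deg[0] = deg[1] = 1
    for i in range(2, n):
        weights = []
        for j in range(i):
            d = math.dist(pos[i], pos[j]) + 1e-9  # avoid division by zero
            weights.append(deg[j] / d)
        targets = set()
        while len(targets) < min(m, i):
            targets.add(rng.choices(range(i), weights=weights)[0])
        for j in targets:
            edges.append((i, j))
            deg[i] += 1
            deg[j] += 1
    return pos, deg, edges
```

    Varying the distance exponent in the kernel interpolates between pure preferential attachment (scale-free graphs) and purely geometric wiring, which is exactly the trade-off the measured universal parameters pin down.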

  3. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
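    The QP subproblem at the heart of an SQP iteration can be illustrated on a tiny equality-constrained example. This sketch solves the full dense KKT system directly with NumPy, whereas the algorithm described above maintains only a reduced-Hessian approximation and sparse structures for large-scale problems; the example problem is my own, not from the thesis.

```python
import numpy as np

def sqp_step(grad_f, hess_f, c, jac_c, x):
    """One SQP iteration for min f(x) s.t. c(x) = 0: solve the QP subproblem
        [H  A^T][p]      [-g]
        [A   0 ][lam] =  [-c]
    where H is the (quasi-)Newton Hessian and A the constraint Jacobian."""
    g, H = grad_f(x), hess_f(x)
    A = np.atleast_2d(jac_c(x))
    m = A.shape[0]
    KKT = np.block([[H, A.T], [A, np.zeros((m, m))]])
    rhs = np.concatenate([-g, -np.atleast_1d(c(x))])
    sol = np.linalg.solve(KKT, rhs)
    return x + sol[:len(x)]  # step p; sol[len(x):] are the multipliers

# Example: minimize x0^2 + x1^2 subject to x0 + x1 - 1 = 0.
f_grad  = lambda x: 2 * x
f_hess  = lambda x: 2 * np.eye(2)
con     = lambda x: x[0] + x[1] - 1.0
con_jac = lambda x: np.array([1.0, 1.0])

x = np.array([3.0, -1.0])
for _ in range(5):
    x = sqp_step(f_grad, f_hess, con, con_jac, x)
# converges to (0.5, 0.5)
```

    For a quadratic objective and linear constraint, a single step already lands on the solution; the superlinear convergence discussed above concerns how quasi-Newton reduced-Hessian estimates recover this behavior on general nonlinear problems.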

  4. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  5. Improving Recent Large-Scale Pulsar Surveys

    NASA Astrophysics Data System (ADS)

    Cardoso, Rogerio Fernando; Ransom, S.

    2011-01-01

Pulsars are unique in that they act as celestial laboratories for precise tests of gravity and other extreme physics (Kramer 2004). There are approximately 2000 known pulsars today, which is less than ten percent of the pulsars in the Milky Way according to theoretical models (Lorimer 2004). Of these 2000 known pulsars, approximately ten percent are millisecond pulsars, objects used for their period stability in detailed physics tests and searches for gravitational radiation (Lorimer 2008). As the field and instrumentation progress, pulsar astronomers attempt to overcome observational biases and detect new pulsars, consequently discovering new millisecond pulsars. We attempt to improve large-scale pulsar surveys by examining three recent pulsar surveys. The first, the Green Bank Telescope 350 MHz Drift Scan, a low-frequency isotropic survey of the northern sky, has yielded a large number of candidates that were visually inspected and identified, resulting in over 34,000 candidates viewed, dozens of detections of known pulsars, and the discovery of a new low-flux pulsar, PSR J1911+22. The second, the PALFA survey, is a high-frequency survey of the galactic plane with the Arecibo telescope. We created a processing pipeline for the PALFA survey at the National Radio Astronomy Observatory in Charlottesville, VA, in addition to making needed modifications upon advice from the PALFA consortium. The third survey examined is a new GBT 820 MHz survey devoted to finding new millisecond pulsars by observing the target-rich environment of unidentified sources in the Fermi LAT catalogue. By approaching these three pulsar surveys at different stages, we seek to improve the success rates of large-scale surveys, and hence the possibility for ground-breaking work in both basic physics and astrophysics.

  6. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

Education reform initiatives tend to promise higher effectiveness in classrooms, especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology using technology-supported innovation. The present paper describes this innovation scheme, which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high-scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: one out of three registered users (36 %) shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  7. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: 1. Several commercial DBMS systems have demonstrated storage and ad-hoc query access to terabyte data sets. 2. Several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme. 3. Several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data. 4. In some cases, performance is a moot issue; this is true in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  8. Large-scale intermittency in the atmospheric boundary layer.

    PubMed

    Kholmyansky, M; Moriconi, L; Tsinober, A

    2007-08-01

    We find actual evidence, relying upon vorticity time series taken in a high-Reynolds-number atmospheric experiment, that to a very good approximation the surface boundary layer flow may be described, in a statistical sense and under certain regimes, as an advected ensemble of homogeneous turbulent systems, characterized by a log-normal distribution of fluctuating intensities. Our analysis suggests that the usual direct numerical simulations of homogeneous and isotropic turbulence, performed at moderate Reynolds numbers, may play an important role in the study of turbulent boundary layer flows, if supplemented with appropriate statistical information concerned with the structure of large-scale fluctuations.
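
The statistical picture described above can be sketched numerically. The following is a minimal illustration (synthetic data, not the paper's atmospheric measurements): a signal built as an advected ensemble of locally homogeneous Gaussian "patches" whose fluctuation intensity is log-normally distributed. Such a compound distribution is intermittent: its flatness exceeds the Gaussian value of 3 by a factor exp(4s²), where s is the log-standard deviation of the patch intensity.

```python
import numpy as np

# Synthetic sketch: Gaussian patches with log-normally distributed intensity.
# The resulting compound signal has flatness 3*exp(4*s^2) > 3 (intermittency).

rng = np.random.default_rng(0)
s = 0.3                                  # assumed log-std of patch intensity
n_patches, patch_len = 2000, 100
sigma = rng.lognormal(mean=0.0, sigma=s, size=n_patches)  # patch intensities
signal = np.concatenate(
    [sig * rng.standard_normal(patch_len) for sig in sigma]
)

flatness = np.mean(signal**4) / np.mean(signal**2) ** 2
print(round(flatness, 2), round(3 * np.exp(4 * s**2), 2))  # sample vs theory
```

The excess of the measured flatness over 3 is exactly the kind of signature that distinguishes such a log-normal ensemble from a single homogeneous Gaussian flow.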

  9. Artificial intelligence and large scale computation: A physics perspective

    NASA Astrophysics Data System (ADS)

    Hogg, Tad; Huberman, B. A.

    1987-12-01

    We study the macroscopic behavior of computation and examine both emergent collective phenomena and dynamical aspects with an emphasis on software issues, which are at the core of large scale distributed computation and artificial intelligence systems. By considering large systems, we exhibit novel phenomena which cannot be foreseen from examination of their smaller counterparts. We review both the symbolic and connectionist views of artificial intelligence, provide a number of examples which display these phenomena, and resort to statistical mechanics, dynamical systems theory and the theory of random graphs to elicit the range of possible behaviors.

  10. Large-scale sodium spray fire code validation (SOFICOV) test

    SciTech Connect

    Jeppson, D.W.; Muhlestein, L.D.

    1985-01-01

A large-scale sodium spray fire code validation test was performed in the HEDL 850-m³ Containment System Test Facility (CSTF) as part of the Sodium Spray Fire Code Validation (SOFICOV) program. Six hundred fifty-eight kilograms of sodium was sprayed into an air atmosphere over a period of 2400 s. The sodium spray droplet sizes and spray pattern distribution were estimated. The containment atmosphere temperature and pressure response, containment wall temperature response and sodium reaction rate with oxygen were measured. These results are compared to post-test predictions using the SPRAY and NACOM computer codes.

  11. Statistical analysis of large-scale neuronal recording data

    PubMed Central

    Reed, Jamie L.; Kaas, Jon H.

    2010-01-01

    Relating stimulus properties to the response properties of individual neurons and neuronal networks is a major goal of sensory research. Many investigators implant electrode arrays in multiple brain areas and record from chronically implanted electrodes over time to answer a variety of questions. Technical challenges related to analyzing large-scale neuronal recording data are not trivial. Several analysis methods traditionally used by neurophysiologists do not account for dependencies in the data that are inherent in multi-electrode recordings. In addition, when neurophysiological data are not best modeled by the normal distribution and when the variables of interest may not be linearly related, extensions of the linear modeling techniques are recommended. A variety of methods exist to analyze correlated data, even when data are not normally distributed and the relationships are nonlinear. Here we review expansions of the Generalized Linear Model designed to address these data properties. Such methods are used in other research fields, and the application to large-scale neuronal recording data will enable investigators to determine the variable properties that convincingly contribute to the variances in the observed neuronal measures. Standard measures of neuron properties such as response magnitudes can be analyzed using these methods, and measures of neuronal network activity such as spike timing correlations can be analyzed as well. We have done just that in recordings from 100-electrode arrays implanted in the primary somatosensory cortex of owl monkeys. Here we illustrate how one example method, Generalized Estimating Equations analysis, is a useful method to apply to large-scale neuronal recordings. PMID:20472395
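
The Generalized-Linear-Model machinery that the review builds on can be sketched compactly. The example below is a hypothetical illustration with synthetic counts, not neuronal recordings: a Poisson regression (log link) fitted by iteratively reweighted least squares (IRLS). Generalized Estimating Equations extend exactly this update with a working correlation structure across the repeated measurements from each electrode.

```python
import numpy as np

# Synthetic sketch of a GLM fit: Poisson regression with log link via IRLS.
# GEE would generalize the weighting step with a working correlation matrix.

rng = np.random.default_rng(1)
n = 5000
X = np.column_stack([np.ones(n), rng.standard_normal(n)])  # intercept + covariate
beta_true = np.array([0.5, 0.3])
y = rng.poisson(np.exp(X @ beta_true))                     # simulated counts

beta = np.zeros(2)
for _ in range(25):                          # IRLS iterations
    mu = np.exp(X @ beta)                    # mean under the log link
    z = X @ beta + (y - mu) / mu             # working response
    W = mu                                   # working weights
    beta_new = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    if np.max(np.abs(beta_new - beta)) < 1e-10:
        beta = beta_new
        break
    beta = beta_new

print(beta)   # close to beta_true = [0.5, 0.3]
```

The fitted coefficients recover the generating values; with correlated multi-electrode data, the same point estimates remain consistent but their standard errors require the GEE sandwich correction.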

  12. Management of large-scale multimedia conferencing

    NASA Astrophysics Data System (ADS)

    Cidon, Israel; Nachum, Youval

    1998-12-01

    The goal of this work is to explore management strategies and algorithms for large-scale multimedia conferencing over a communication network. Since the use of multimedia conferencing is still limited, the management of such systems has not yet been studied in depth. A well organized and human friendly multimedia conference management should utilize efficiently and fairly its limited resources as well as take into account the requirements of the conference participants. The ability of the management to enforce fair policies and to quickly take into account the participants preferences may even lead to a conference environment that is more pleasant and more effective than a similar face to face meeting. We suggest several principles for defining and solving resource sharing problems in this context. The conference resources which are addressed in this paper are the bandwidth (conference network capacity), time (participants' scheduling) and limitations of audio and visual equipment. The participants' requirements for these resources are defined and translated in terms of Quality of Service requirements and the fairness criteria.

  13. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and the design of innovative structural responses. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  14. Large-scale tides in general relativity

    NASA Astrophysics Data System (ADS)

    Ip, Hiu Yan; Schmidt, Fabian

    2017-02-01

Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the "separate universe" paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.
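
For orientation, the isotropic "separate universe" statement can be written compactly; the relations below are the standard linear-order result, stated here as background rather than quoted from the paper. A long-wavelength overdensity δ(t) evolves locally like an FLRW patch with modified scale factor and expansion rate,

```latex
\[
a_W(t) = a(t)\left[1 - \tfrac{1}{3}\,\delta(t)\right],
\qquad
H_W(t) = H(t) - \tfrac{1}{3}\,\dot{\delta}(t),
\]
```

whereas a pure tidal (anisotropic) perturbation admits no analogous closed FLRW mapping, which is precisely why the paper works in the conformal Fermi frame.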

  15. Large scale mechanical metamaterials as seismic shields

    NASA Astrophysics Data System (ADS)

    Miniaci, Marco; Krushynska, Anastasiia; Bosia, Federico; Pugno, Nicola M.

    2016-08-01

Earthquakes represent one of the most catastrophic natural events affecting mankind. At present, a universally accepted risk mitigation strategy for seismic events remains to be proposed. Most approaches are based on vibration isolation of structures rather than on the remote shielding of incoming waves. In this work, we propose a novel approach to the problem and discuss the feasibility of a passive isolation strategy for seismic waves based on large-scale mechanical metamaterials, including, for the first time, numerical analysis of both surface and guided waves, soil dissipation effects, and full 3D simulations. The study focuses on realistic structures that can be effective in frequency ranges of interest for seismic waves, and optimal design criteria are provided, exploring different metamaterial configurations, combining phononic crystals and locally resonant structures and different ranges of mechanical properties. Dispersion analysis and full-scale 3D transient wave transmission simulations are carried out on finite-size systems to assess the seismic wave amplitude attenuation in realistic conditions. Results reveal that both surface and bulk seismic waves can be considerably attenuated, making this strategy viable for the protection of civil structures against seismic risk. The proposed remote shielding approach could open up new perspectives in the field of seismology and in related areas of low-frequency vibration damping or blast protection.
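
The dispersion analysis mentioned above can be illustrated in its simplest 1D form. The sketch below uses the classical Rytov relation for an infinite periodic bilayer (a 1D phononic crystal); frequencies where |cos(qa)| > 1 admit no real Bloch wavenumber q and therefore lie in a band gap. The material values are illustrative placeholders, not parameters from the paper.

```python
import numpy as np

# Rytov dispersion relation for a 1D two-layer phononic crystal:
# cos(q a) = cos(k1 d1) cos(k2 d2)
#            - 0.5 (Z1/Z2 + Z2/Z1) sin(k1 d1) sin(k2 d2).
# |cos(q a)| > 1 means no propagating Bloch mode: a band gap.

rho1, c1, d1 = 2700.0, 6000.0, 1.0      # stiff layer (aluminium-like, assumed)
rho2, c2, d2 = 1200.0, 1500.0, 1.0      # soft layer (polymer-like, assumed)
Z1, Z2 = rho1 * c1, rho2 * c2           # acoustic impedances

f = np.linspace(1.0, 2000.0, 4000)      # frequency sweep in Hz
w = 2 * np.pi * f
k1, k2 = w / c1, w / c2
cos_qa = (np.cos(k1 * d1) * np.cos(k2 * d2)
          - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(k1 * d1) * np.sin(k2 * d2))

in_gap = np.abs(cos_qa) > 1.0           # Bloch condition violated -> band gap
print(f"{in_gap.mean():.0%} of the sweep lies in band gaps")
```

The large impedance mismatch between the layers opens wide gaps around the Bragg frequencies; seismic metamaterial design scales the same mechanism down to the hertz range with meter-scale unit cells.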

  16. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large-scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show that up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security for the local populations.

  17. Large scale structure of the sun's corona

    NASA Astrophysics Data System (ADS)

    Kundu, Mukul R.

    Results concerning the large-scale structure of the solar corona obtained by observations at meter-decameter wavelengths are reviewed. Coronal holes observed on the disk at multiple frequencies show the radial and azimuthal geometry of the hole. At the base of the hole there is good correspondence to the chromospheric signature in He I 10,830 A, but at greater heights the hole may show departures from symmetry. Two-dimensional imaging of weak-type III bursts simultaneously with the HAO SMM coronagraph/polarimeter measurements indicate that these bursts occur along elongated features emanating from the quiet sun, corresponding in position angle to the bright coronal streamers. It is shown that the densest regions of streamers and the regions of maximum intensity of type II bursts coincide closely. Non-flare-associated type II/type IV bursts associated with coronal streamer disruption events are studied along with correlated type II burst emissions originating from distant centers on the sun.

  18. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large-scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers, which was indicative of significant oxidation. Footprints of downwind dissemination of the fire-released fibers were measured to 19.1 km from the fire.

  19. Large-scale clustering of cosmic voids

    NASA Astrophysics Data System (ADS)

    Chan, Kwan Chuen; Hamaus, Nico; Desjacques, Vincent

    2014-11-01

We study the clustering of voids using N-body simulations and simple theoretical models. The excursion-set formalism describes fairly well the abundance of voids identified with the watershed algorithm, although the void formation threshold required is quite different from the spherical collapse value. The void cross bias bc is measured and its large-scale value is found to be consistent with the peak-background split results. A simple fitting formula for bc is found. We model the void auto-power spectrum taking into account the void biasing and exclusion effect. A good fit to the simulation data is obtained for voids with radii ≳30 Mpc h-1, especially when the void biasing model is extended to 1-loop order. However, the best-fit bias parameters do not agree well with the peak-background results. Being able to fit the void auto-power spectrum is particularly important not only because it is the direct observable in galaxy surveys, but also because our method enables us to treat the bias parameters as nuisance parameters, which are sensitive to the techniques used to identify voids.

  20. Large-scale autostereoscopic outdoor display

    NASA Astrophysics Data System (ADS)

    Reitterer, Jörg; Fidler, Franz; Saint Julien-Wallsee, Ferdinand; Schmid, Gerhard; Gartner, Wolfgang; Leeb, Walter; Schmid, Ulrich

    2013-03-01

    State-of-the-art autostereoscopic displays are often limited in size, effective brightness, number of 3D viewing zones, and maximum 3D viewing distances, all of which are mandatory requirements for large-scale outdoor displays. Conventional autostereoscopic indoor concepts like lenticular lenses or parallax barriers cannot simply be adapted for these screens due to the inherent loss of effective resolution and brightness, which would reduce both image quality and sunlight readability. We have developed a modular autostereoscopic multi-view laser display concept with sunlight readable effective brightness, theoretically up to several thousand 3D viewing zones, and maximum 3D viewing distances of up to 60 meters. For proof-of-concept purposes a prototype display with two pixels was realized. Due to various manufacturing tolerances each individual pixel has slightly different optical properties, and hence the 3D image quality of the display has to be calculated stochastically. In this paper we present the corresponding stochastic model, we evaluate the simulation and measurement results of the prototype display, and we calculate the achievable autostereoscopic image quality to be expected for our concept.

  1. Numerical Modeling for Large Scale Hydrothermal System

    NASA Astrophysics Data System (ADS)

    Sohrabi, Reza; Jansen, Gunnar; Malvoisin, Benjamin; Mazzini, Adriano; Miller, Stephen A.

    2017-04-01

Moderate-to-high enthalpy systems are driven by multiphase and multicomponent processes, fluid and rock mechanics, and heat transport processes, all of which present challenges in developing realistic numerical models of the underlying physics. The objective of this work is to present an approach, and some initial results, for modeling and understanding the dynamics of the birth of large-scale hydrothermal systems. Numerical modeling of such complex systems must take into account a variety of coupled thermal, hydraulic, mechanical and chemical processes, which is numerically challenging. To provide first estimates of the behavior of these deep, complex systems, geological structures must be constrained, and the fluid dynamics, mechanics and heat transport need to be investigated in three dimensions. Modeling these processes numerically at adequate resolution and reasonable computation times requires a suite of tools that we are developing and/or utilizing to investigate such systems. Our long-term goal is to develop 3D numerical models, based on geological models, which couple mechanics with the hydraulic and thermal processes driving hydrothermal systems. Our first results from the Lusi hydrothermal system in East Java, Indonesia provide a basis for more sophisticated studies, eventually in 3D, and we introduce a workflow necessary to achieve these objectives. Future work focuses on parallelization suitable for High Performance Computing (HPC). Such developments are necessary to achieve high-resolution simulations and to more fully understand the complex dynamics of hydrothermal systems.

  2. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high-quality atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software, to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome-wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large-scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  3. Large-scale implementation of disease control programmes: a cost-effectiveness analysis of long-lasting insecticide-treated bed net distribution channels in a malaria-endemic area of western Kenya—a study protocol

    PubMed Central

    Gama, Elvis; Were, Vincent; Ouma, Peter; Desai, Meghna; Niessen, Louis; Buff, Ann M; Kariuki, Simon

    2016-01-01

    Introduction Historically, Kenya has used various distribution models for long-lasting insecticide-treated bed nets (LLINs) with variable results in population coverage. The models presently vary widely in scale, target population and strategy. There is limited information to determine the best combination of distribution models, which will lead to sustained high coverage and are operationally efficient and cost-effective. Standardised cost information is needed in combination with programme effectiveness estimates to judge the efficiency of LLIN distribution models and options for improvement in implementing malaria control programmes. The study aims to address the information gap, estimating distribution cost and the effectiveness of different LLIN distribution models, and comparing them in an economic evaluation. Methods and analysis Evaluation of cost and coverage will be determined for 5 different distribution models in Busia County, an area of perennial malaria transmission in western Kenya. Cost data will be collected retrospectively from health facilities, the Ministry of Health, donors and distributors. Programme-effectiveness data, defined as the number of people with access to an LLIN per 1000 population, will be collected through triangulation of data from a nationally representative, cross-sectional malaria survey, a cross-sectional survey administered to a subsample of beneficiaries in Busia County and LLIN distributors’ records. Descriptive statistics and regression analysis will be used for the evaluation. A cost-effectiveness analysis will be performed from a health-systems perspective, and cost-effectiveness ratios will be calculated using bootstrapping techniques. Ethics and dissemination The study has been evaluated and approved by Kenya Medical Research Institute, Scientific and Ethical Review Unit (SERU number 2997). All participants will provide written informed consent. The findings of this economic evaluation will be disseminated through
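
The bootstrapping step described in the protocol can be sketched in a few lines. The numbers below are entirely synthetic placeholders (not study data): per-cluster costs and effectiveness for two hypothetical distribution models are resampled with replacement, an incremental cost-effectiveness ratio (ICER) is formed for each resample, and a percentile confidence interval is read off.

```python
import numpy as np

# Synthetic sketch of a bootstrapped incremental cost-effectiveness ratio.
# Effectiveness is measured, as in the protocol, as people with access to
# an LLIN per 1000 population; all values are invented for illustration.

rng = np.random.default_rng(42)
n = 60                                        # hypothetical clusters per arm
cost_a = rng.normal(5.0, 1.0, n)              # cost per person, model A
cost_b = rng.normal(3.5, 1.0, n)              # cost per person, model B
eff_a = rng.normal(800.0, 60.0, n)            # LLIN access per 1000, model A
eff_b = rng.normal(700.0, 60.0, n)            # LLIN access per 1000, model B

icers = []
for _ in range(2000):                         # bootstrap resamples
    ia = rng.integers(0, n, n)
    ib = rng.integers(0, n, n)
    d_cost = cost_a[ia].mean() - cost_b[ib].mean()
    d_eff = eff_a[ia].mean() - eff_b[ib].mean()
    icers.append(d_cost / d_eff)

lo, hi = np.percentile(icers, [2.5, 97.5])
print(f"ICER 95% CI: ({lo:.4f}, {hi:.4f}) cost units per extra person covered")
```

Percentile bootstrap intervals of this kind avoid the distributional problems of ratio statistics, which is why the protocol specifies bootstrapping rather than a normal-theory interval for the cost-effectiveness ratios.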

  4. Dark energy from large-scale structure lensing information

    SciTech Connect

Lu, Tingting; Pen, Ue-Li; Doré, Olivier

    2010-06-15

Wide area large-scale structure (LSS) surveys are planning to map a substantial fraction of the visible Universe to quantify dark energy through baryon acoustic oscillations. At increasing redshift, for example that probed by proposed 21-cm intensity mapping surveys, gravitational lensing potentially limits the fidelity (Hui et al., 2007) because it distorts the apparent matter distribution. In this paper we show that these distortions can be reconstructed, and actually used to map the distribution of intervening dark matter. The lensing information for sources at z = 1-3 allows accurate reconstruction of the gravitational potential on large scales, l ≲ 100, which is well matched for integrated Sachs-Wolfe effect measurements of dark energy and its sound speed, and a strong constraint for modified gravity models of dark energy. We built an optimal quadratic lensing estimator for non-Gaussian sources, which is necessary for LSS. The phenomenon of 'information saturation' (Rimes and Hamilton, 2005) saturates reconstruction at mildly nonlinear scales, where the linear source power spectrum Δ² ≈ 0.2-0.5, depending on the power spectrum slope. Naive Gaussian estimators with a nonlinear cutoff can be tuned to reproduce the optimal non-Gaussian errors within a factor of 2. We compute the effective number densities of independent lensing sources for LSS lensing, and find that they increase rapidly with redshift. For LSS/21-cm sources at z ≈ 2-4, the lensing reconstruction is limited by cosmic variance at l ≲ 100.

  5. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

Sensitivity analysis is critically important to numerous analysis algorithms, including large-scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing code and, equally important, the work was focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real-time performance is achieved using novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first

  6. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in the chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in the number of moles of gas. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to the surroundings during the longer, slower downward burns. One heat loss mechanism involved mounting a heat exchanger directly above the burning sample, in the path of the plume, to act as a heat sink and more efficiently dissipate the heat from the combustion event. This proved an effective means of chamber overpressure mitigation for the tests producing the most total heat release and thus was determined to be a feasible mitigation
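The ideal-gas reasoning in this abstract (pressure rise dominated by heating of the gas, with a much smaller contribution from the change in moles) can be sketched numerically. All values below are illustrative assumptions, not data from the SFSDP tests.

```python
# Sketch: why chamber pressure rise is dominated by heating rather than by
# the change in moles of gas (ideal-gas law). All numbers are invented for
# illustration only.
R = 8.314  # universal gas constant, J/(mol K)

def chamber_pressure(n_mol, T_kelvin, volume_m3):
    """Ideal-gas pressure P = nRT/V, in pascals."""
    return n_mol * R * T_kelvin / volume_m3

V = 0.1                # assumed chamber volume, m^3
n0, T0 = 4.0, 300.0    # assumed initial moles and temperature
P0 = chamber_pressure(n0, T0, V)

# Combustion heats the gas a lot and increases the mole count only slightly.
P_heat_only  = chamber_pressure(n0, 450.0, V)      # +50% gas temperature
P_moles_only = chamber_pressure(n0 * 1.02, T0, V)  # +2% moles of gas

print(P_heat_only / P0)   # 1.5  -> dominant contribution
print(P_moles_only / P0)  # 1.02 -> minor contribution
```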

  7. Synchronization of coupled large-scale Boolean networks

    SciTech Connect

    Li, Fangfei

    2014-03-15

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm for large-scale Boolean networks is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
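As a minimal illustration of what complete synchronization of a drive-response pair of Boolean networks means, here is a sketch with two hypothetical 2-node networks (invented for illustration, not the example from the paper):

```python
# Two small coupled Boolean networks: a drive network and a response network
# driven by it. Complete synchronization: every pair of initial states
# eventually reaches identical states. The update rules are hypothetical.
from itertools import product

def step_drive(x):
    # x = (x1, x2): a 2-node Boolean network
    x1, x2 = x
    return (x2, x1 and x2)

def step_response(y, x):
    # response network, coupled to the drive network's state x
    y1, y2 = y
    x1, x2 = x
    return (x2, y1 and x2)

def synchronizes(x0, y0, horizon=10):
    x, y = x0, y0
    for _ in range(horizon):
        x, y = step_drive(x), step_response(y, x)  # simultaneous update
        if x == y:
            return True
    return False

# Check complete synchronization over all 16 initial state pairs.
print(all(synchronizes(x0, y0)
          for x0 in product([0, 1], repeat=2)
          for y0 in product([0, 1], repeat=2)))  # True
```

For this particular coupling the two networks agree within two steps from any initial pair, which is the drive-response analogue of the complete synchronization studied in the paper.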

  8. The School Principal's Role in Large-Scale Assessment

    ERIC Educational Resources Information Center

    Newton, Paul; Tunison, Scott; Viczko, Melody

    2010-01-01

    This paper reports on an interpretive study in which 25 elementary principals were asked about their assessment knowledge, the use of large-scale assessments in their schools, and principals' perceptions on their roles with respect to large-scale assessments. Principals in this study suggested that the current context of large-scale assessment and…

  9. Synchronization of coupled large-scale Boolean networks

    NASA Astrophysics Data System (ADS)

    Li, Fangfei

    2014-03-01

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm towards large-scale Boolean network is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.

  10. Scalable Parallel Distance Field Construction for Large-Scale Applications.

    PubMed

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan-Liu; Kolla, Hemanth; Chen, Jacqueline H

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial locations. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. Our work greatly extends the usability of distance fields for demanding applications.
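As a point of reference for what is being computed, the following is a brute-force sketch of a distance field: the nearest-surface distance at every grid point. The paper's parallel distance tree exists precisely to avoid this O(grid × surface) cost on massive data; the tiny grid and single-point "surface" below are made up for illustration.

```python
# Brute-force 3D distance field: for every grid point, the Euclidean distance
# to the nearest point sampled on the surface of interest.
import numpy as np

def distance_field(grid_points, surface_points):
    """Distance from each grid point to the nearest surface point."""
    # pairwise distances, shape (n_grid, n_surface)
    d = np.linalg.norm(grid_points[:, None, :] - surface_points[None, :, :],
                       axis=-1)
    return d.min(axis=1)

# 5x5x5 grid over the unit cube; "surface" is a single point at the origin.
axis = np.linspace(0.0, 1.0, 5)
grid = np.stack(np.meshgrid(axis, axis, axis, indexing="ij"),
                axis=-1).reshape(-1, 3)
surface = np.array([[0.0, 0.0, 0.0]])

field = distance_field(grid, surface)
print(field.min())  # 0.0: the grid point lying on the surface itself
print(field.max())  # sqrt(3): the corner (1,1,1) farthest from the origin
```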

  11. Large-scale quantum networks based on graphs

    NASA Astrophysics Data System (ADS)

    Epping, Michael; Kampermann, Hermann; Bruß, Dagmar

    2016-05-01

    Society increasingly relies on information exchange and communication. In the quantum world, security and privacy are built-in features of information processing. The essential ingredient for exploiting these quantum advantages is the resource of entanglement, which can be shared between two or more parties. The distribution of entanglement over large distances constitutes a key challenge for current research and development. Due to losses of the transmitted quantum particles, which typically scale exponentially with the distance, intermediate quantum repeater stations are needed. Here we show how to generalise the quantum repeater concept to the multipartite case, by describing large-scale quantum networks, i.e. network nodes and their long-distance links, consistently in the language of graphs and graph states. This unifying approach comprises both the distribution of multipartite entanglement across the network, and the protection against errors via encoding. The correspondence to graph states also provides a tool for optimising the architecture of quantum networks.

  12. Large scale structure of the globular cluster population in Coma

    NASA Astrophysics Data System (ADS)

    Gagliano, Alexander T.; O'Neill, Conor; Madrid, Juan P.

    2016-01-01

    A search for globular cluster candidates in the Coma Cluster was carried out using Hubble Space Telescope data taken with the Advanced Camera for Surveys. We combine different observing programs, including the Coma Treasury Survey, in order to obtain the large-scale distribution of globular clusters in Coma. Globular cluster candidates were selected through careful morphological inspection and a detailed analysis of their magnitudes and colors in the two available wavebands, F475W (Sloan g) and F814W (I). Color magnitude diagrams, radial density plots and density maps were then created to characterize the globular cluster population in Coma. Preliminary results show the structure of the intergalactic globular cluster system throughout Coma, drawn from one of the largest globular cluster catalogues to date. The spatial distribution of globular clusters shows clear overdensities, or bridges, between Coma galaxies. It also becomes evident that galaxies of similar luminosity have vastly different numbers of associated globular clusters.

  13. Scalable parallel distance field construction for large-scale applications

    SciTech Connect

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan -Liu; Kolla, Hemanth; Chen, Jacqueline H.

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial locations. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.

  14. Large scale dynamics of protoplanetary discs

    NASA Astrophysics Data System (ADS)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the magnetic field and the gas is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences on planetary formation. As a second step, I study the launching of disk winds via a global model of stratified disk embedded in a warm atmosphere. 
This model is the first to compute non-ideal effects from

  15. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  16. Large scale simulations of Brownian suspensions

    NASA Astrophysics Data System (ADS)

    Viera, Marc Nathaniel

    Particle suspensions occur in a wide variety of natural and engineering materials. Some examples are colloids, polymers, paints, and slurries. These materials exhibit complex behavior owing to the forces which act among the particles and are transmitted through the fluid medium. Depending on the application, particle sizes range from large macroscopic molecules of 100 μm to smaller colloidal particles in the range of 10 nm to 1 μm. Particles of this size interact through interparticle forces such as electrostatic and van der Waals, as well as hydrodynamic forces transmitted through the fluid medium. Additionally, the particles are subjected to random thermal fluctuations in the fluid giving rise to Brownian motion. The central objective of our research is to develop efficient numerical algorithms for the large scale dynamic simulation of particle suspensions. While previous methods have incurred a computational cost of O(N^3), where N is the number of particles, we have developed a novel algorithm capable of solving this problem in O(N ln N) operations. This has allowed us to perform dynamic simulations with up to 64,000 particles and Monte Carlo realizations of up to 1 million particles. Our algorithm follows a Stokesian dynamics formulation by evaluating many-body hydrodynamic interactions using a far-field multipole expansion combined with a near-field lubrication correction. The breakthrough O(N ln N) scaling is obtained by employing a Particle-Mesh-Ewald (PME) approach whereby near-field interactions are evaluated directly and far-field interactions are evaluated using a grid-based velocity computed with FFTs. This approach is readily extended to include the effects of Brownian motion. For interacting particles, the fluctuation-dissipation theorem requires that the individual Brownian forces satisfy a correlation based on the N-body resistance tensor R. The accurate modeling of these forces requires the computation of a matrix square root R^{1/2} for matrices up
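The fluctuation-dissipation constraint described above can be sketched directly: correlated Brownian forces are generated from a matrix square root of the resistance tensor R. A dense Cholesky factor, applied below to a small made-up SPD matrix, costs O(N^3), which is exactly why large-scale simulation needs cheaper approximations of R^{1/2}.

```python
# Sketch: Brownian forces with covariance equal to the resistance tensor R,
# generated via a Cholesky factor L with R = L L^T. R here is a small,
# invented symmetric positive-definite matrix, not a physical resistance tensor.
import numpy as np

rng = np.random.default_rng(0)

R = np.array([[2.0, 0.5, 0.1],
              [0.5, 1.5, 0.2],
              [0.1, 0.2, 1.0]])
L = np.linalg.cholesky(R)   # one valid "matrix square root": R = L @ L.T

# Correlated forces F = L z with z ~ N(0, I), so <F F^T> = L L^T = R.
samples = L @ rng.standard_normal((3, 200_000))
cov = samples @ samples.T / samples.shape[1]

print(np.allclose(cov, R, atol=0.05))  # empirical covariance matches R
```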

  17. Ecohydrological modeling for large-scale environmental impact assessment.

    PubMed

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach level accuracy similar to regional-scale models thereby allowing for impacts assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model.

  18. The combustion behavior of large scale lithium titanate battery

    PubMed Central

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

    Safety problems are a major obstacle for lithium batteries on the path to large-scale application, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large-scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated until they caught fire. The flame size variation is depicted to analyze the combustion behavior directly. The mass loss rate, temperature and heat release rate are used to analyze the combustion behavior in greater depth. Based on these observations, the combustion process is divided into three basic stages, becoming more complicated at higher SOC, with sudden smoke flows ejected. The reason is a phase change in the Li(NixCoyMnz)O2 material from a layer structure to a spinel structure. The critical ignition temperatures are 112–121°C on the anode tab and 139–147°C on the upper surface for all cells, but the heating time and combustion time become shorter as SOC increases. The results indicate that the battery fire hazard increases with SOC. Analysis suggests that internal shorting and the Li+ distribution are the main causes of this difference. PMID:25586064

  19. Large-scale forcing on lightning in Portugal

    NASA Astrophysics Data System (ADS)

    Santos, J. A.; Sousa, J.; Reis, M. A.; Leite, S. M.; Correia, S.; Fraga, H.; Fragoso, M.

    2012-04-01

    An overview of the large-scale atmospheric forcing on the occurrence of cloud-to-ground lightning activity over Portugal is presented here. A dataset generated by a network of nine sensors, maintained by the Portuguese Meteorological Institute (four sensors) and by the Spanish Meteorological Agency (five sensors), with available data over the 2003-2009 time period (7 years), is used for this purpose. For the same time period, a state-of-the-art high-resolution reanalysis dataset on a 1.0° latitude × 1.0° longitude grid (Modern Era Retrospective-Analysis for Research and Applications; MERRA300) is also considered in order to assess the atmospheric large-scale features over the target region. Three lightning regimes of the atmospheric general circulation within the Euro-Atlantic sector can be clearly detected. These regimes are characterized according to their underlying dynamical conditions (sea surface pressure, 500 hPa geopotential height and air temperature, streamlines of the 10 m wind vectors, and best 4-layer lifted index at 500 hPa). The spatial distribution of lightning activity in Portugal (patterns of the density of atmospheric electrical discharges) is also analyzed for each regime separately. Considerations regarding seasonality, flash polarity and daily cycles in the lightning activity are also given for each lightning regime.

  20. Semantic overlay network for large-scale spatial information indexing

    NASA Astrophysics Data System (ADS)

    Zou, Zhiqiang; Wang, Yue; Cao, Kai; Qu, Tianshan; Wang, Zhongmin

    2013-08-01

    The increased demand for online services of spatial information poses new challenges to the combined field of Computer Science and Geographic Information Science. Amongst others, these include fast indexing of spatial data in distributed networks. In this paper we propose a novel semantic overlay network for large-scale multi-dimensional spatial information indexing, called SON_LSII, which has a hybrid structure integrating a semantic quad-tree and a Chord ring. SON_LSII is a small-world overlay network that achieves a very competitive trade-off between indexing efficiency and maintenance overhead. To create SON_LSII, we use an effective semantic clustering strategy that considers two aspects, i.e., the semantics of the spatial information that a peer holds in the overlay network and physical network performance. Based on SON_LSII, a mapping method is used to reduce the multi-dimensional features to a single dimension, and an efficient indexing algorithm is presented to support complex range queries of the spatial information with a massive number of concurrent users. The results from extensive experiments demonstrate that SON_LSII is superior to existing overlay networks in various respects, including scalability, maintenance, rate of indexing hits, indexing logical hops, and adaptability. Thus, the proposed SON_LSII can be used for large-scale spatial information indexing.
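The abstract does not specify which dimension-reduction mapping SON_LSII uses. A Z-order (Morton) curve, sketched here for two dimensions, is one standard way to fold multi-dimensional keys into the one-dimensional key space of a Chord-style ring while keeping nearby cells close:

```python
# Morton (Z-order) code: interleave the bits of the x and y coordinates into
# one integer key, so spatially close cells tend to get close 1-D keys.
# This is an illustrative, commonly used mapping, not necessarily SON_LSII's.
def interleave_bits(x, y, bits=16):
    """Return the Morton key for integer cell coordinates (x, y)."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)       # x bits at even positions
        key |= ((y >> i) & 1) << (2 * i + 1)   # y bits at odd positions
    return key

print(interleave_bits(0, 0))  # 0
print(interleave_bits(3, 3))  # 15: binary 1111
print(interleave_bits(5, 0))  # 17: x=101 spread over even bit positions
```

The resulting single-dimensional keys can then be hashed onto the Chord ring, and a range query over a spatial region decomposes into a small set of Morton-key intervals.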

  1. The combustion behavior of large scale lithium titanate battery.

    PubMed

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-14

    Safety problems are a major obstacle for lithium batteries on the path to large-scale application, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large-scale lithium batteries, three 50 Ah Li(Ni(x)Co(y)Mn(z))O2/Li(4)Ti(5)O(12) batteries at different states of charge (SOC) were heated until they caught fire. The flame size variation is depicted to analyze the combustion behavior directly. The mass loss rate, temperature and heat release rate are used to analyze the combustion behavior in greater depth. Based on these observations, the combustion process is divided into three basic stages, becoming more complicated at higher SOC, with sudden smoke flows ejected. The reason is a phase change in the Li(Ni(x)Co(y)Mn(z))O2 material from a layer structure to a spinel structure. The critical ignition temperatures are 112-121 °C on the anode tab and 139-147 °C on the upper surface for all cells, but the heating time and combustion time become shorter as SOC increases. The results indicate that the battery fire hazard increases with SOC. Analysis suggests that internal shorting and the Li(+) distribution are the main causes of this difference.

  2. Large Scale Land Acquisition as a driver of slope instability

    NASA Astrophysics Data System (ADS)

    Danilo Chiarelli, Davide; Rulli, Maria Cristina; Davis, Kyle F.; D'Odorico, Paolo

    2017-04-01

    Forests play a key role in preventing shallow landslides, and deforestation has been identified as one of the main causes of increased mass wasting on hillslopes undergoing land cover change. In the last few years vast tracts of land have been acquired by foreign investors to satisfy an increasing demand for agricultural products. Large Scale Land Acquisitions (LSLA) often entail the conversion of forested landscapes into agricultural fields. Mozambique has been a major target of LSLAs and there is evidence that much of the acquired land has recently undergone forest clearing. The Zambezia Province in Mozambique lost more than 500,000 ha of forest between 2000 and 2014; 25.4% of this loss occurred in areas acquired by large-scale land investors. According to Land Matrix, an open-source database of reported land deals, there are currently 123 intended and confirmed deals in Mozambique; collectively, they account for 2.34 million ha, the majority of which are located in forested areas. This study analyses the relationship between deforestation taking place inside LSLA areas (usually for agricultural purposes) and the likelihood of landslide occurrence in the Zambezia Province in Mozambique. To this aim we use a spatially distributed and physically based model that couples slope stability analysis with a hillslope-scale hydrological model, and we compare the change in slope stability associated with the forest loss documented by satellite imagery.

  3. Systematic renormalization of the effective theory of Large Scale Structure

    SciTech Connect

    Abolhasani, Ali Akbar; Mirbabayi, Mehrdad; Pajer, Enrico

    2016-05-31

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contributions of short-distance perturbations to the large-scale density contrast δ and momentum density π(k) scale as k^2 and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.

  4. Tracing the origin of green macroalgal blooms based on the large scale spatio-temporal distribution of Ulva microscopic propagules and settled mature Ulva vegetative thalli in coastal regions of the Yellow Sea, China.

    PubMed

    Huo, Yuanzi; Han, Hongbin; Hua, Liang; Wei, Zhangliang; Yu, Kefeng; Shi, Honghua; Kim, Jang Kyun; Yarish, Charles; He, Peimin

    2016-11-01

    From 2008 to 2016, massive floating green macroalgal blooms occurred annually during the summer months in the Yellow Sea. The original source of these blooms was traced based on the spatio-temporal distribution and species composition of Ulva microscopic propagules and settled Ulva vegetative thalli monthly from December 2012 to May 2013 in the Yellow Sea. High quantities of Ulva microscopic propagules in both the water column and sediments were found in the Pyropia aquaculture area along the Jiangsu coast before a green macroalgal bloom appeared in the Yellow Sea. The abundance of Ulva microscopic propagules was significantly lower in outer areas compared to in Pyropia aquaculture areas. A molecular phylogenetic analysis suggested that Ulva prolifera microscopic propagules were the dominant microscopic propagules present during the study period. The extremely low biomass of settled Ulva vegetative thalli along the coast indicated that somatic cells of settled Ulva vegetative thalli did not provide a propagule bank for the green macroalgal blooms in the Yellow Sea. The results of this study provide further supporting evidence that the floating green macroalgal blooms originate from green macroalgae attached to Pyropia aquaculture rafts along the Jiangsu coastline of the southern Yellow Sea.

  5. Detecting the Spatio-temporal Distribution of Soil Salinity and Its Relationship to Crop Growth in a Large-scale Arid Irrigation District Based on Sampling Experiment and Remote Sensing

    NASA Astrophysics Data System (ADS)

    Ren, D.; Huang, G., Sr.; Xu, X.; Huang, Q., Sr.; Xiong, Y.

    2016-12-01

    Soil salinity analysis on a regional scale is of great significance for protecting agricultural production and maintaining eco-environmental health in arid and semi-arid irrigated areas. In this study, the Hetao Irrigation District (Hetao) in the Inner Mongolia Autonomous Region, which has long suffered from soil salinization, was selected as the case study area. Field sampling experiments and investigations related to soil salt contents, crop growth and yields were carried out across the whole area from April to August 2015. Soil salinity characteristics in space and time were systematically analyzed for Hetao, as well as the corresponding impacts on crops. A remotely sensed map of soil salinity distribution for surface soil was also derived from Landsat OLI data at 30 m resolution. The results elaborate the temporal and spatial dynamics of soil salinity and their relationships with irrigation, groundwater depth and crop water consumption in Hetao. In addition, the strong spatial variability of salinization is clearly presented by the remotely sensed map of soil salinity. Further, the relationship between soil salinity and crop growth was analyzed, and the degree of impact of soil salinization on cropping pattern, leaf area index, plant height and crop yield was preliminarily revealed. Overall, this study can provide very useful information for salinization control and guide future agricultural production and soil-water management for arid irrigation districts analogous to Hetao.

  6. Autonomic Computing Paradigm For Large Scale Scientific And Engineering Applications

    NASA Astrophysics Data System (ADS)

    Hariri, S.; Yang, J.; Zhang, Y.

    2005-12-01

    Large-scale distributed scientific applications are highly adaptive and heterogeneous in terms of their computational requirements. The computational complexity associated with each computational region or domain varies continuously and dramatically, both in space and time, throughout the whole life cycle of the application execution. Furthermore, the underlying distributed computing environment is similarly complex and dynamic in the availability and capacity of its computing resources. Together, these challenges make the current paradigms, which are based on passive components and static compositions, ineffectual. The Autonomic Computing paradigm is an approach that efficiently addresses the complexity and dynamism of large-scale scientific and engineering applications and realizes their self-management. In this presentation, we present an Autonomic Runtime Manager (ARM) that supports the development of autonomic applications. The ARM includes two modules: an online monitoring and analysis module and an autonomic planning and scheduling module. The ARM behaves as a closed-loop control system that dynamically controls and manages the execution of the applications at runtime. It regularly senses the state changes of both the applications and the underlying computing resources. It then uses this runtime information and prior knowledge about the application behavior and its physics to identify the appropriate solution methods as well as the required computing and storage resources. Consequently this approach enables us to develop autonomic applications, which are capable of self-management and self-optimization. We have developed and implemented the autonomic computing paradigms for several large scale applications such as wild fire simulations, simulations of flow through variably saturated geologic formations, and life sciences. The distributed wildfire simulation models the wildfire spread behavior by considering such factors as fuel
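The closed-loop behavior described (sense the state of applications and resources, then plan and execute a new allocation) can be caricatured in a few lines. The load values and the rebalancing rule below are hypothetical placeholders, not the ARM's actual policy:

```python
# Toy closed-loop control cycle in the style of an autonomic runtime:
# monitor -> analyze -> plan -> execute, repeated each iteration.
def sense(loads):
    """Monitoring step: identify the most and least loaded resources."""
    busiest = max(loads, key=loads.get)
    idlest = min(loads, key=loads.get)
    return busiest, idlest

def plan_and_execute(loads, chunk=1.0):
    """Planning step: shift a chunk of work from busiest to idlest resource."""
    busiest, idlest = sense(loads)
    moved = min(chunk, loads[busiest])
    loads[busiest] -= moved
    loads[idlest] += moved
    return loads

loads = {"node0": 9.0, "node1": 2.0, "node2": 4.0}  # invented load values
for _ in range(3):          # three iterations of the control loop
    plan_and_execute(loads)
print(loads)                # load has flowed from node0 toward idle nodes
```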

  7. Planck data versus large scale structure: Methods to quantify discordance

    NASA Astrophysics Data System (ADS)

    Charnock, Tom; Battye, Richard A.; Moss, Adam

    2017-06-01

    Discordance in the Λ cold dark matter cosmological model can be seen by comparing parameters constrained by cosmic microwave background (CMB) measurements to those inferred by probes of large scale structure. Recent improvements in observations, including final data releases from both Planck and SDSS-III BOSS, as well as improved astrophysical uncertainty analysis of CFHTLenS, allow for an update in the quantification of any tension between large and small scales. This paper is intended, primarily, as a discussion of methods for quantifying discordance when comparing the parameter constraints of a model given two different data sets. We consider the Kullback-Leibler divergence, comparison of Bayesian evidences, and other statistics that are sensitive to the mean, variance and shape of the distributions. However, as a byproduct, we present an update to the similar analysis in [R. A. Battye, T. Charnock, and A. Moss, Phys. Rev. D 91, 103508 (2015), 10.1103/PhysRevD.91.103508], where we find that, considering new data and treatment of priors, the constraints from the CMB and from a combination of large scale structure (LSS) probes are in greater agreement and any tension only persists to a minor degree. In particular, we find that the parameter constraints from the combination of LSS probes most discrepant with the Planck 2015 +Pol +BAO parameter distributions can be quantified as a ~2.55σ tension using the method introduced in the same reference. If instead we use the distributions constrained by the combination of LSS probes in greatest agreement with those from Planck 2015 +Pol +BAO, this tension is only 0.76σ.
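    As an illustration of one of the discordance measures discussed, the Kullback-Leibler divergence, the sketch below evaluates its closed form for two one-dimensional Gaussian posteriors. The numbers are illustrative assumptions in the spirit of a σ8-like parameter, not values from the paper.

```python
import numpy as np

def kl_gaussian(mu_p, sig_p, mu_q, sig_q):
    """KL divergence D(P || Q) between 1D Gaussians P = N(mu_p, sig_p^2)
    and Q = N(mu_q, sig_q^2), in nats."""
    return (np.log(sig_q / sig_p)
            + (sig_p**2 + (mu_p - mu_q)**2) / (2.0 * sig_q**2) - 0.5)

# Hypothetical CMB-like vs. LSS-like constraints on one parameter (assumed)
d = kl_gaussian(0.83, 0.01, 0.76, 0.03)
```

Unlike a simple difference of means in units of σ, the divergence is sensitive to both the offset and the widths of the two posteriors.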

  8. Nivicolous Stemonitales from the Austral Andes: analysis of morphological variability, distribution and phenology as a first step toward testing the large-scale coherence of species and biogeographical properties.

    PubMed

    Ronikier, Anna; Lado, Carlos

    2015-01-01

    Nivicolous myxomycetes occur at the edge of spring-melting snow in mountainous areas. They are mostly considered cosmopolitan species, morphologically and ecologically uniform across their entire distribution ranges. Thus, long-distance dispersal has been suggested to be the main mechanism shaping their ranges and geographical variability patterns. To test this hypothesis we conducted the first detailed analysis of morphological variability, occurrence frequency and phenology of nivicolous myxomycetes collected in the hitherto unexplored Austral Andes of South America (southern hemisphere = SH) in the comparative context of data from the northern hemisphere (NH). We used Stemonitales, the most representative and numerous taxonomic order in nivicolous myxomycetes, as a model. A total of 131 South American collections represented 13 species or morphotypes. One of them, Lamproderma andinum, is new to science and described here. Several others, L. aeneum, L. album, L. pulveratum, "Meriderma aff. aggregatum ad. int.", M. carestiae and "M. spinulosporum ad. int.", were previously unknown from the SH. Lamproderma ovoideum is reported for the first time from South America and Collaria nigricapillitia is new for Argentina. The fine-scale morphological analysis of all species from the study area and reference NH material demonstrated a high intraspecific variability in most of them. This suggests isolation and independent evolutionary processes among remote populations. On the other hand, the uniform morphology of a few species indicates that long-distance dispersal is also an effective mechanism, although not as universal as usually assumed, in some nivicolous myxomycetes. Analysis of nivicolous species assemblages also showed significant differences among major geographic regions in that the Stemonitales were significantly less common in the SH than in the NH. Furthermore, the occurrence of nivicolous species in summer and autumn, out of the typical phenological season, is

  9. Large Scale Hierarchical K-Means Based Image Retrieval With MapReduce

    DTIC Science & Technology

    2014-03-27

    LARGE SCALE HIERARCHICAL K-MEANS BASED IMAGE RETRIEVAL WITH MAPREDUCE. Thesis by William E. Murphy, Second Lieutenant, USAF, AFIT-ENG-14-M-56. DEPARTMENT...RELEASE; DISTRIBUTION UNLIMITED. The views expressed in this thesis are those of the author and do not reflect the official policy or position of the...subject to copyright protection in the United States.

  10. Large-scale Fractal Motion of Clouds

    NASA Image and Video Library

    2017-09-27

    waters surrounding the island.) The “swallowed” gulps of clear island air get carried along within the vortices, but these are soon mixed into the surrounding clouds. Landsat is unique in its ability not only to image the small-scale eddies that mix clear and cloudy air, down to the 30 meter pixel size of Landsat, but also to provide a wide enough field-of-view, 180 km, to reveal the connection of the turbulence to large-scale flows such as the subtropical oceanic gyres. Landsat 7, with its new onboard digital recorder, has extended this capability away from the few Landsat ground stations to remote areas such as Alejandro Island, and thus is gradually providing a global dynamic picture of evolving human-scale phenomena. For more details on von Karman vortices, refer to climate.gsfc.nasa.gov/~cahalan. Image and caption courtesy Bob Cahalan, NASA GSFC Instrument: Landsat 7 - ETM+ Credit: NASA/GSFC/Landsat NASA Goddard Space Flight Center enables NASA’s mission through four scientific endeavors: Earth Science, Heliophysics, Solar System Exploration, and Astrophysics. Goddard plays a leading role in NASA’s accomplishments by contributing compelling scientific knowledge to advance the Agency’s mission.

  11. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention in large area and low cost color reflective displays. This invention is inspired by the heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  12. Modeling Failure Propagation in Large-Scale Engineering Networks

    NASA Astrophysics Data System (ADS)

    Schläpfer, Markus; Shapiro, Jonathan L.

    The simultaneous unavailability of several technical components within large-scale engineering systems can lead to high stress, rendering them prone to cascading events. In order to gain qualitative insights into the failure propagation mechanisms resulting from independent outages, we adopt a minimalistic model representing the components and their interdependencies by an undirected, unweighted network. The failure dynamics are modeled by an anticipated accelerated “wearout” process being dependent on the initial degree of a node and on the number of failed nearest neighbors. The results of the stochastic simulations imply that the influence of the network topology on the speed of the cascade highly depends on how the number of failed nearest neighbors shortens the life expectancy of a node. As a formal description of the decaying networks we propose a continuous-time mean field approximation, estimating the average failure rate of the nearest neighbors of a node based on the degree-degree distribution.
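    A minimal stochastic sketch of the wearout dynamics described above, assuming a discrete-time approximation in which a surviving node's per-step failure probability grows linearly with its number of failed nearest neighbors. The rate constants and the ring topology are arbitrary choices for illustration, not the paper's parameterization.

```python
import random

def simulate_cascade(adj, base_rate=0.01, boost=0.05, steps=200, seed=1):
    """Discrete-time wearout sketch on an undirected, unweighted network.

    adj: dict mapping node -> list of neighbor nodes.
    A node fails in a step with probability base_rate + boost * (failed neighbors).
    Returns the cumulative count of failed nodes per step."""
    rng = random.Random(seed)
    failed, history = set(), []
    for _ in range(steps):
        for node, neigh in adj.items():
            if node in failed:
                continue
            k_failed = sum(1 for n in neigh if n in failed)
            if rng.random() < base_rate + boost * k_failed:
                failed.add(node)
        history.append(len(failed))
    return history

# Small ring network as a stand-in for an engineering topology (assumed)
n = 20
adj = {i: [(i - 1) % n, (i + 1) % n] for i in range(n)}
history = simulate_cascade(adj)
```

Varying `boost` relative to `base_rate` mimics how strongly failed neighbors shorten a node's life expectancy, the quantity the paper identifies as controlling cascade speed.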

  13. Large-scale and global features of complex genomic signals

    NASA Astrophysics Data System (ADS)

    Cristea, Paul D.

    2003-10-01

    The paper briefly reviews the methodology of converting symbolic nucleic sequences into genomic signals and presents large scale and global features of the resulting genomic signals. Whole chromosomes or whole genomes are converted into complex signals and phase analysis is performed. The phase, cumulated phase and unwrapped phase of genomic signals are studied as tools for revealing important features of the first- and second-order statistics of nucleotide distribution along DNA strands. It is shown that the unwrapped phase displays an almost linear variation along whole chromosomes. The property holds for all the investigated genomes, being shared by both prokaryotes and eukaryotes, while the magnitude and sign of the unwrapped phase slope is specific for each taxon and chromosome. The comparison between the behavior of the cumulated phase and of the unwrapped phase across the putative origins and termini of the replichores suggests a model of the 'patchy' structure of the chromosomes.
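    A sketch of the conversion and phase analysis described above. The quadrantal nucleotide-to-complex mapping below is one commonly used choice in genomic-signal work; treat the mapping and the toy sequence as illustrative assumptions, not the paper's exact convention.

```python
import numpy as np

# Quadrantal mapping of nucleotides onto the complex plane (assumed convention)
MAP = {'A': 1 + 1j, 'C': -1 - 1j, 'G': -1 + 1j, 'T': 1 - 1j}

def unwrapped_phase(seq):
    """Unwrapped phase of the complex genomic signal for a DNA string."""
    z = np.array([MAP[base] for base in seq])
    return np.unwrap(np.angle(z))

phase = unwrapped_phase("ATGCGTACGTTAGC")   # toy sequence, not real data
# The near-linear trend reported for whole chromosomes corresponds to the
# slope of a linear fit to the unwrapped phase:
slope = np.polyfit(np.arange(len(phase)), phase, 1)[0]
```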

  14. Galaxies and large scale structure at high redshifts

    PubMed Central

    Steidel, Charles C.

    1998-01-01

    It is now straightforward to assemble large samples of very high redshift (z ∼ 3) field galaxies selected by their pronounced spectral discontinuity at the rest frame Lyman limit of hydrogen (at 912 Å). This makes possible both statistical analyses of the properties of the galaxies and the first direct glimpse of the progression of the growth of their large-scale distribution at such an early epoch. Here I present a summary of the progress made in these areas to date and some preliminary results of and future plans for a targeted redshift survey at z = 2.7–3.4. Also discussed is how the same discovery method may be used to obtain a “census” of star formation in the high redshift Universe, and the current implications for the history of galaxy formation as a function of cosmic epoch. PMID:9419319

  15. Testing Inflation with Large Scale Structure: Connecting Hopes with Reality

    SciTech Connect

    Alvarez, Marcello; Baldauf, T.; Bond, J. Richard; Dalal, N.; Putter, R. D.; Dore, O.; Green, Daniel; Hirata, Chris; Huang, Zhiqi; Huterer, Dragan; Jeong, Donghui; Johnson, Matthew C.; Krause, Elisabeth; Loverde, Marilena; Meyers, Joel; Meeburg, Daniel; Senatore, Leonardo; Shandera, Sarah; Silverstein, Eva; Slosar, Anze; Smith, Kendrick; Zaldarriaga, Matias; Assassi, Valentin; Braden, Jonathan; Hajian, Amir; Kobayashi, Takeshi; Stein, George; Engelen, Alexander van

    2014-12-15

    The statistics of primordial curvature fluctuations are our window into the period of inflation, when these fluctuations were generated. To date, the cosmic microwave background has been the dominant source of information about these perturbations. Large-scale structure is, however, where drastic improvements should originate. In this paper, we explain the theoretical motivations for pursuing such measurements and the challenges that lie ahead. In particular, we discuss and identify theoretical targets regarding the measurement of primordial non-Gaussianity. We argue that when quantified in terms of the local (equilateral) template amplitude f_NL^loc (f_NL^eq), natural target levels of sensitivity are Δf_NL^{loc,eq} ≃ 1. We highlight that such levels are within reach of future surveys by measuring 2-, 3- and 4-point statistics of the galaxy spatial distribution. This paper summarizes a workshop held at CITA (University of Toronto) on October 23-24, 2014.

  16. Statistics of Caustics in Large-Scale Structure Formation

    NASA Astrophysics Data System (ADS)

    Feldbrugge, Job L.; Hidding, Johan; van de Weygaert, Rien

    2016-10-01

    The cosmic web is a complex spatial pattern of walls, filaments, cluster nodes and underdense void regions. It emerged through gravitational amplification from the Gaussian primordial density field. Here we infer analytical expressions for the spatial statistics of caustics in the evolving large-scale mass distribution. In our analysis, following the quasi-linear Zel'dovich formalism and confined to the 1D and 2D situations, we compute number density and correlation properties of caustics in cosmic density fields that evolve from Gaussian primordial conditions. The analysis can be straightforwardly extended to the 3D situation. Moreover, we are currently extending the approach to the non-linear regime of structure formation by including higher order Lagrangian approximations and Lagrangian effective field theory.
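    In the 1D Zel'dovich formalism used here, a particle at Lagrangian coordinate q moves to x(q) = q + D·ψ(q), and a caustic forms where the Jacobian dx/dq = 1 + D·ψ'(q) vanishes. The sketch below counts caustic crossings numerically for an assumed toy displacement field; it illustrates the mechanism only, not the paper's statistical calculation.

```python
import numpy as np

# 1D Zel'dovich sketch: Eulerian position x(q, D) = q + D * psi(q).
q = np.linspace(0.0, 2.0 * np.pi, 2000)   # Lagrangian coordinates
psi = -np.sin(q)                          # toy displacement field (assumed)
D = 1.2                                   # growth factor past first shell crossing

x = q + D * psi
jac = np.gradient(x, q)                   # dx/dq = 1 + D * psi'(q)

# Caustics appear where the Jacobian changes sign (shell crossing):
caustics = int(np.sum((jac[:-1] > 0) != (jac[1:] > 0)))
```

For D < 1 this toy field has no caustics; pushing D past 1 produces a pair of them, mirroring the onset of shell crossing in the quasi-linear regime.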

  17. Large-scale structure non-Gaussianities with modal methods

    NASA Astrophysics Data System (ADS)

    Schmittfull, Marcel

    2016-10-01

    Relying on a separable modal expansion of the bispectrum, the implementation of a fast estimator for the full bispectrum of a 3d particle distribution is presented. The computational cost of accurate bispectrum estimation is negligible relative to simulation evolution, so the bispectrum can be used as a standard diagnostic whenever the power spectrum is evaluated. As an application, the time evolution of gravitational and primordial dark matter bispectra was measured in a large suite of N-body simulations. The bispectrum shape changes characteristically when the cosmic web becomes dominated by filaments and halos, therefore providing a quantitative probe of 3d structure formation. Our measured bispectra are determined by ~ 50 coefficients, which can be used as fitting formulae in the nonlinear regime and for non-Gaussian initial conditions. We also compare the measured bispectra with predictions from the Effective Field Theory of Large Scale Structures (EFTofLSS).
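    The idea of direct bispectrum estimation from a gridded density field can be illustrated in 1D (the paper's estimator works on 3D particle distributions with a separable modal compression). The function below measures a single triangle configuration B(k1, k2) = Re[δ(k1) δ(k2) δ(−k1−k2)] for a periodic field; the quadratic (local-type) non-Gaussian test field and all normalizations are illustrative assumptions.

```python
import numpy as np

def bispectrum_1d(field, k1, k2):
    """Single-configuration bispectrum estimate for a 1D periodic field:
    Re[ d(k1) d(k2) d(-k1-k2) ] / n  (no binning; illustrative only)."""
    n = len(field)
    dk = np.fft.fft(field)
    return (dk[k1] * dk[k2] * dk[(-k1 - k2) % n]).real / n

rng = np.random.default_rng(0)
gauss = rng.normal(size=1024)             # Gaussian field: bispectrum ~ 0 on average
ng = gauss + 0.5 * (gauss**2 - 1.0)       # quadratic (local-type) non-Gaussianity
b_g = bispectrum_1d(gauss, 3, 5)
b_ng = bispectrum_1d(ng, 3, 5)
```

A single realization is noisy; in practice (as in the paper's N-body suite) the estimator is averaged over many modes or realizations, where the non-Gaussian field shows a systematically positive signal.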

  18. Investigation of flow fields within large scale hypersonic inlet models

    NASA Technical Reports Server (NTRS)

    Gnos, A. V.; Watson, E. C.; Seebaugh, W. R.; Sanator, R. J.; Decarlo, J. P.

    1973-01-01

    Analytical and experimental investigations were conducted to determine the internal flow characteristics in model passages representative of hypersonic inlets for use at Mach numbers to about 12. The passages were large enough to permit measurements to be made in both the core flow and boundary layers. The analytical techniques for designing the internal contours and predicting the internal flow-field development accounted for coupling between the boundary layers and inviscid flow fields by means of a displacement-thickness correction. Three large-scale inlet models, each having a different internal compression ratio, were designed to provide high internal performance with an approximately uniform static-pressure distribution at the throat station. The models were tested in the Ames 3.5-Foot Hypersonic Wind Tunnel at a nominal free-stream Mach number of 7.4 and a unit free-stream Reynolds number of 8.86 X one million per meter.

  19. Unfolding large-scale online collaborative human dynamics

    PubMed Central

    Zha, Yilong; Zhou, Tao; Zhou, Changsong

    2016-01-01

    Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanism is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double–power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal “simplicity” beyond complex interacting human activities. PMID:27911766
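    The first two modules of the model can be caricatured in a toy event-time simulation, assuming Poissonian initiations and Pareto-distributed (power-law) response delays; module (iii), population growth, is omitted, and all rates and exponents are arbitrary illustrative values, not fits to the Wikipedia data.

```python
import random

def simulate_updates(t_max=1000.0, lam=0.2, p_respond=0.5, alpha=1.5, seed=7):
    """Toy version of the collaborative-activity model:
    (i) initiations arrive as a Poisson process with rate lam;
    (ii) each initiation may trigger one response after a power-law
        (Pareto, exponent alpha) waiting time, with probability p_respond.
    Returns the sorted inter-event intervals."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < t_max:
        t += rng.expovariate(lam)            # (i) Poissonian initiation
        events.append(t)
        if rng.random() < p_respond:         # (ii) cascading response
            events.append(t + rng.paretovariate(alpha))
    events.sort()
    return [b - a for a, b in zip(events, events[1:])]

intervals = simulate_updates()
```

Superposing the exponential initiation gaps with heavy-tailed response delays is what produces the double regime in the interval distribution that the empirical analysis reports.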

  20. Design boundaries of large-scale falling particle receivers

    NASA Astrophysics Data System (ADS)

    Kim, Jin-Soo; Kumar, Apurv; Corsi, Clotilde

    2017-06-01

    A free falling particle receiver has been studied to investigate the design boundaries of large-scale falling particle receivers. Preliminary receiver geometry and the condition of the falling particle curtain were scoped according to the nominal receiver capacity (135 MWth), receiver outlet temperature (800 °C) and temperature difference (147 °C) recommended by the research program. Particle volume fraction and solar energy absorptivity were analyzed for two particle sizes (280 µm and 697 µm) over a range of flow rates. The results were then converted to part-load efficiency of the receiver. Ray tracing with a scoped receiver design provided the amount of spillage and the overall performance of the receiver, which comprises multiple cavities with different solar energy inputs. The study revealed and quantified some inherent problems in designing falling particle receivers, such as transmission energy loss caused by low solar energy absorption, efficiency decrease in part-load operation, and uneven temperature distribution across the falling particle curtain.

  1. Modelling large-scale halo bias using the bispectrum

    NASA Astrophysics Data System (ADS)

    Pollack, Jennifer E.; Smith, Robert E.; Porciani, Cristiano

    2012-03-01

    We study the relation between the density distribution of tracers for large-scale structure and the underlying matter distribution - commonly termed bias - in the Λ cold dark matter framework. In particular, we examine the validity of the local model of biasing at quadratic order in the matter density. This model is characterized by parameters b1 and b2. Using an ensemble of N-body simulations, we apply several statistical methods to estimate the parameters. We measure halo and matter fluctuations smoothed on various scales. We find that, whilst the fits are reasonably good, the parameters vary with smoothing scale. We argue that, for real-space measurements, owing to the mixing of wavemodes, no smoothing scale can be found for which the parameters are independent of smoothing. However, this is not the case in Fourier space. We measure halo and halo-mass power spectra and from these construct estimates of the effective large-scale bias as a guide for b1. We measure the configuration dependence of the halo bispectra Bhhh and reduced bispectra Qhhh for very large-scale k-space triangles. From these data, we constrain b1 and b2, taking into account the full bispectrum covariance matrix. Using the lowest order perturbation theory, we find that for Bhhh the best-fitting parameters are in reasonable agreement with one another as the triangle scale is varied, although the fits become poor as smaller scales are included. The same is true for Qhhh. The best-fitting values were found to depend on the discreteness correction. This led us to consider halo-mass cross-bispectra. The results from these statistics supported our earlier findings. We then developed a test to explore whether the inconsistency in the recovered bias parameters could be attributed to missing higher order corrections in the models. We prove that low-order expansions are not sufficiently accurate to model the data, even on scales k1 ~ 0.04 h Mpc^-1. If robust inferences concerning bias are to be drawn
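    The quadratic local bias model examined here writes the smoothed halo overdensity as δ_h = b1·δ + (b2/2)·δ². Below is a least-squares sketch of recovering b1 and b2 from mock smoothed fields; the data are synthetic (assumed Gaussian matter overdensities with injected "true" bias values), not the paper's simulations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Mock smoothed matter overdensity field and a halo field built from the
# quadratic local bias model with assumed true values b1 = 1.8, b2 = -0.4.
delta = rng.normal(0.0, 0.3, size=5000)
b1_true, b2_true = 1.8, -0.4
delta_h = (b1_true * delta + 0.5 * b2_true * delta**2
           + rng.normal(0.0, 0.02, delta.size))   # small stochastic scatter

# Least-squares fit of delta_h = b1 * delta + (b2 / 2) * delta**2
A = np.column_stack([delta, 0.5 * delta**2])
(b1_fit, b2_fit), *_ = np.linalg.lstsq(A, delta_h, rcond=None)
```

In the real measurement the difficulty is not the fit itself but that, in real space, the recovered (b1, b2) shift with the smoothing scale, which is the paper's central point.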

  2. Investigations of Large Scale Storm Systems.

    DTIC Science & Technology

    1982-06-08

    Barnes, A.A., Jr. (1980) Cirrus Particle Distribution Study, Part 7, AFGL-TR-80-0324, AD A100269. B2. CONTRACTOR REPORTS Belsky, L. E., Francis, M. W...Data, AFGL-TR-78-0170, AD A064781. Belsky, L. W., Lally, J. P., Roberts, K., and O'Toole, T. (DPSI) (1981) Development and Applications of Techniques to

  3. Investigation of Coronal Large Scale Structures Utilizing Spartan 201 Data

    NASA Technical Reports Server (NTRS)

    Guhathakurta, Madhulika

    1998-01-01

    Spartan 201, a small satellite carrying two telescopes, was launched from the Space Shuttle on April 8, 1993, September 8, 1994, September 7, 1995, and November 20, 1997. The main objective of the mission was to answer some of the most fundamental unanswered questions of solar physics: What accelerates the solar wind and what heats the corona? The two telescopes are 1) the Ultraviolet Coronal Spectrometer (UVCS), provided by the Smithsonian Astrophysical Observatory, which uses ultraviolet emissions from neutral hydrogen and ions in the corona to determine velocities of the coronal plasma within the solar wind source region, and the temperature and density distributions of protons, and 2) the White Light Coronagraph (WLC), provided by NASA's Goddard Space Flight Center, which measures visible light to determine the density distribution of coronal electrons within the same region. The PI has had the primary responsibility for the development and application of computer codes necessary for scientific data analysis activities and for instrument calibration of the white-light coronagraph for the entire Spartan mission, and was responsible for the science output from the WLC instrument. The PI has also been involved in the investigation of coronal density distributions in large-scale structures by use of numerical models which are (mathematically) sufficient to reproduce the details of the observed brightness and polarized brightness distributions found in SPARTAN 201 data.

  4. Large-scale spatial population databases in infectious disease research

    PubMed Central

    2012-01-01

    Modelling studies on the spatial distribution and spread of infectious diseases are becoming increasingly detailed and sophisticated, with global risk mapping and epidemic modelling studies now popular. Yet, in deriving estimates of populations at risk of disease, these spatial models must rely on existing global and regional datasets on population distribution, which are often based on outdated and coarse-resolution data. Moreover, a variety of different methods have been used to model population distribution at large spatial scales. In this review we describe the main global gridded population datasets that are freely available to health researchers, compare their construction methods, and highlight the uncertainties inherent in these population datasets. We review their application in past studies on disease risk and dynamics, and discuss how the choice of dataset can affect results. Moreover, we highlight how the lack of contemporary, detailed and reliable data on human population distribution in low income countries is proving a barrier to obtaining accurate large-scale estimates of population at risk and constructing reliable models of disease spread, and suggest research directions required to further reduce these barriers. PMID:22433126

  5. Locating inefficient links in a large-scale transportation network

    NASA Astrophysics Data System (ADS)

    Sun, Li; Liu, Like; Xu, Zhongzhi; Jie, Yang; Wei, Dong; Wang, Pu

    2015-02-01

    Based on data from a geographical information system (GIS) and daily commuting origin-destination (OD) matrices, we estimated the distribution of traffic flow in the San Francisco road network and studied Braess's paradox in a large-scale transportation network with realistic travel demand. We measured the variation of total travel time ΔT when a road segment is closed, and found that |ΔT| follows a power-law distribution whether ΔT < 0 or ΔT > 0. This implies that most roads have a negligible effect on the efficiency of the road network, while the failure of a few crucial links would result in severe travel delays, and closure of a few inefficient links would counter-intuitively reduce travel costs considerably. Generating three theoretical networks, we discovered that the heterogeneously distributed travel demand may be the origin of the observed power-law distributions of |ΔT|. Finally, a genetic algorithm was used to pinpoint inefficient link clusters in the road network. We found that closing specific road clusters would further improve the transportation efficiency.
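    Measuring ΔT for a closed link reduces to re-solving shortest paths and re-summing demand-weighted travel times. The sketch below does this on a toy network with fixed link costs; note that the paradoxical ΔT < 0 case requires congestion-dependent costs, which the study uses but which are omitted here for brevity. All node names and numbers are illustrative assumptions.

```python
import heapq

def dijkstra(graph, src):
    """Shortest-path distances from src; graph: node -> [(neighbor, cost)]."""
    dist, pq = {src: 0.0}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float('inf')):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def total_travel_time(graph, od_pairs):
    """Sum of demand-weighted shortest-path times over all OD pairs."""
    return sum(demand * dijkstra(graph, o)[d]
               for (o, d), demand in od_pairs.items())

def close_edge(graph, u, v):
    """Return a copy of the graph with the undirected link u-v removed."""
    return {n: [(m, w) for m, w in es if (n, m) not in ((u, v), (v, u))]
            for n, es in graph.items()}

# Toy network with fixed (flow-independent) link costs
graph = {'a': [('b', 1), ('c', 4)], 'b': [('c', 1), ('d', 5)],
         'c': [('d', 1)], 'd': []}
od = {('a', 'd'): 10}   # 10 units of demand from a to d
dT = total_travel_time(close_edge(graph, 'b', 'c'), od) - total_travel_time(graph, od)
```

Here closing b-c lengthens every a-to-d trip, so ΔT > 0; reproducing the paper's ΔT < 0 cases would require link costs that grow with flow.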

  6. Double inflation: A possible resolution of the large-scale structure problem

    SciTech Connect

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of approximately 100 Mpc, while the small-scale structure over scales of 10 Mpc or less resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs.

  7. Double inflation - A possible resolution of the large-scale structure problem

    NASA Technical Reports Server (NTRS)

    Turner, Michael S.; Villumsen, Jens V.; Vittorio, Nicola; Silk, Joseph; Juszkiewicz, Roman

    1987-01-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Omega = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of about 100 Mpc, while the small-scale structure over less than about 10 Mpc resembles that in a low-density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations.

  8. Interaction of a cumulus cloud ensemble with the large-scale environment. IV - The discrete model

    NASA Technical Reports Server (NTRS)

    Lord, S. J.; Chao, W. C.; Arakawa, A.

    1982-01-01

    The Arakawa-Schubert (1974) parameterization is applied to a prognostic model of large-scale atmospheric circulations and used to analyze data in a general circulation model (GCM). The vertical structure of the large-scale model and the solution for the cloud subensemble thermodynamical properties are examined to choose cloud levels and representative regions. A mass flux distribution equation is adapted to formulate algorithms for calculating the large-scale forcing and the mass flux kernel, using either direct solution or linear programming. Finally, the feedback of the cumulus ensemble on the large-scale environment for a given subensemble mass flux is calculated. All cloud subensemble properties were determined from the conservation of mass, moist static energy, and total water.

  9. Large Scale Self-Organizing Information Distribution System

    DTIC Science & Technology

    2005-12-31

    record-breaking experiment utilizes Abilene (the Internet2 backbone) and the trans-Atlantic high-energy physics network between the Chicago PoP and CERN in...Jim Pool, CACR, Caltech * Les Cottrell of SLAC, Stanford University * Guy Almes of Internet2 (now NSF) * David Lapsley of Haystack Observatory, MIT

  10. Managing large-scale multi-voltage distribution system analysis

    SciTech Connect

    Walton, C.M.

    1994-12-31

    The challenge for electricity utilities in the 1990s, to deliver ever more reliable service at reduced cost and with fewer technical staff, is the driver towards the next generation of automated network analysis tools. The paper discusses the application of an Automatic Loss Minimiser (ALM) and Fault Study Package (FSP), under the control of a Sequence Processor, to an existing high resolution graphics network analysis package. Automated, sorted management summaries enable limited resources and system automation to be directed at those networks with the poorest performance and/or the largest potential savings from reduced system losses. The impact that regular automatic monitoring has on the quality of the database, and the scope for integration of the modules with other System Automation initiatives, is also considered.

  11. Software architecture for large scale, distributed, data-intensive systems

    NASA Technical Reports Server (NTRS)

    Mattmann, Chris A.; Medvidovic, Nenad; Ramirez, Paul M.

    2004-01-01

    This paper presents our experience with OODT, a novel software architectural style and middleware-based implementation for data-intensive systems. To date, OODT has been successfully evaluated in several different science domains, including cancer research with the National Cancer Institute (NCI) and planetary science with NASA's Planetary Data System (PDS).

  12. Large-Scale Structures of Planetary Systems

    NASA Astrophysics Data System (ADS)

    Murray-Clay, Ruth; Rogers, Leslie A.

    2015-12-01

    A class of solar system analogs has yet to be identified among the large crop of planetary systems now observed. However, since most observed worlds are more easily detectable than direct analogs of the Sun's planets, the frequency of systems with structures similar to our own remains unknown. Identifying the range of possible planetary system architectures is complicated by the large number of physical processes that affect the formation and dynamical evolution of planets. I will present two ways of organizing planetary system structures. First, I will suggest that relatively few physical parameters are likely to differentiate the qualitative architectures of different systems. Solid mass in a protoplanetary disk is perhaps the most obvious possible controlling parameter, and I will give predictions for correlations between planetary system properties that we would expect to be present if this is the case. In particular, I will suggest that the solar system's structure is representative of low-metallicity systems that nevertheless host giant planets. Second, the disk structures produced as young stars are fed by their host clouds may play a crucial role. Using the observed distribution of RV giant planets as a function of stellar mass, I will demonstrate that invoking ice lines to determine where gas giants can form requires fine tuning. I will suggest that instead, disk structures built during early accretion have lasting impacts on giant planet distributions, and disk clean-up differentially affects the orbital distributions of giant and lower-mass planets. These two organizational hypotheses have different implications for the solar system's context, and I will suggest observational tests that may allow them to be validated or falsified.

  13. Evolution of baryons in cosmic large scale structure

    NASA Astrophysics Data System (ADS)

    Snedden, Ali

    We introduce a new self-consistent structure finding algorithm that parses large scale cosmological structure into clusters, filaments and voids. This algorithm probes the structure at multiple scales and classifies the appropriate regions with the most probable structure type and size. We use this structure finding algorithm to parse and follow the evolution of poor clusters, filaments and voids in large scale simulations. We trace the complete evolution of the baryons in the gas phase and the star formation history within each structure. We vary the structure measure threshold to probe the complex inner structure of star forming regions in poor clusters, filaments and voids. We find the majority of star formation occurs in cold condensed gas in filaments at all redshifts and that it peaks at intermediate redshifts (z ~ 3). We also show that much of the star formation above a redshift z = 3 occurs in low contrast regions of filaments, but as the density contrast increases at lower redshift, star formation switches to high contrast regions or the inner parts of filaments. Since filaments bridge between void and cluster regions, this suggests that the majority of star formation occurs in galaxies in intermediate density regions prior to the accretion onto poor clusters. We find that at the present epoch, the gas phase distribution is 43.1%, 30.0%, 24.7% and 2.2% in the diffuse, WHIM, hot halo and condensed phases, respectively. Most of the WHIM is found to be in filamentary structures. Moreover 8.77%, 79.1%, 2.11% and 9.98% of the gas is located in poor clusters, filaments, voids and unassigned regions respectively. We find that both filaments and poor clusters are multiphase environments at redshift z = 0.

  14. High Fidelity Simulations of Large-Scale Wireless Networks

    SciTech Connect

    Onunkwo, Uzoma; Benz, Zachary

    2015-11-01

    The worldwide proliferation of wireless connected devices continues to accelerate. There are 10s of billions of wireless links across the planet with an additional explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications but are also extremely cost-effective to deploy. Thus, this trend towards wireless connectivity will only continue and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively long turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES approaches, which fail to scale (e.g., the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute computation workload while mitigating communication overhead associated with synchronizations. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia’s simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia’s current highly-regarded capabilities in large-scale emulations have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static and (b) the nodes have fixed locations.

  15. Thermal activation of dislocations in large scale obstacle bypass

    NASA Astrophysics Data System (ADS)

    Sobie, Cameron; Capolungo, Laurent; McDowell, David L.; Martinez, Enrique

    2017-08-01

    Dislocation dynamics simulations have been used extensively to predict hardening caused by dislocation-obstacle interactions, including irradiation defect hardening in the athermal case. Incorporating the role of thermal energy on these interactions is possible with a framework provided by harmonic transition state theory (HTST) enabling direct access to thermally activated reaction rates using the Arrhenius equation, including rates of dislocation-obstacle bypass processes. Moving beyond unit dislocation-defect reactions to a representative environment containing a large number of defects requires coarse-graining the activation energy barriers of a population of obstacles into an effective energy barrier that accurately represents the large scale collective process. The work presented here investigates the relationship between unit dislocation-defect bypass processes and the distribution of activation energy barriers calculated for ensemble bypass processes. A significant difference between these cases is observed, which is attributed to the inherent cooperative nature of dislocation bypass processes. In addition to the dislocation-defect interaction, the morphology of the dislocation segments pinned to the defects plays an important role in the activation energies for bypass. A phenomenological model for the stress dependence of the activation energy is shown to describe the effect of a distribution of activation energies well, and a probabilistic activation energy model incorporating the stress distribution in a material is presented.
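As a minimal illustration of the HTST framework the abstract invokes, the sketch below evaluates Arrhenius rates for a population of activation energy barriers and coarse-grains them into a single effective barrier by matching the ensemble-average rate. All numerical values (attempt frequency, barrier heights, temperature) are illustrative assumptions, not values from the paper:

```python
import math

# HTST: the rate of a thermally activated bypass event follows the
# Arrhenius equation  rate = nu0 * exp(-dE / (kB * T)),
# where dE is the activation energy barrier.
KB_EV = 8.617333262e-5  # Boltzmann constant, eV/K

def arrhenius_rate(delta_e_ev, temperature_k, attempt_freq_hz=1e13):
    """Thermally activated event rate (per second)."""
    return attempt_freq_hz * math.exp(-delta_e_ev / (KB_EV * temperature_k))

def effective_barrier(barriers_ev, temperature_k, attempt_freq_hz=1e13):
    """Coarse-grain a population of barriers into one effective barrier
    by matching the mean Arrhenius rate of the ensemble."""
    mean_rate = sum(arrhenius_rate(e, temperature_k, attempt_freq_hz)
                    for e in barriers_ev) / len(barriers_ev)
    return -KB_EV * temperature_k * math.log(mean_rate / attempt_freq_hz)

barriers = [0.4, 0.5, 0.6, 0.7]          # hypothetical barrier heights, eV
e_eff = effective_barrier(barriers, 300.0)
# The mean rate is dominated by the lowest (fastest) barrier, so the
# effective barrier sits close to min(barriers) rather than their average.
```

This rate-averaged coarse-graining is one simple choice; the paper's point is precisely that cooperative bypass makes the ensemble barrier distribution differ from such unit-process combinations.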

  16. Episodic memory in aspects of large-scale brain networks

    PubMed Central

    Jeong, Woorim; Chung, Chun Kee; Kim, June Sic

    2015-01-01

    Understanding human episodic memory in aspects of large-scale brain networks has become one of the central themes in neuroscience over the last decade. Traditionally, episodic memory was regarded as mostly relying on medial temporal lobe (MTL) structures. However, recent studies have suggested the involvement of a more widely distributed cortical network and the importance of its interactive roles in the memory process. Both direct and indirect neuromodulation of the memory network have been tried in experimental treatments of memory disorders. In this review, we focus on the functional organization of the MTL and other neocortical areas in episodic memory. Task-related neuroimaging studies together with lesion studies suggested that specific sub-regions of the MTL are responsible for specific components of memory. However, recent studies have emphasized that connectivity within MTL structures and even their network dynamics with other cortical areas are essential in the memory process. Resting-state functional network studies also have revealed that memory function is subserved by not only the MTL system but also a distributed network, particularly the default-mode network (DMN). Furthermore, researchers have begun to investigate memory networks throughout the entire brain not restricted to the specific resting-state network (RSN). Altered patterns of functional connectivity (FC) among distributed brain regions were observed in patients with memory impairments. Recently, studies have shown that brain stimulation may impact memory through modulating functional networks, carrying future implications of a novel interventional therapy for memory impairment. PMID:26321939

  17. Response of deep and shallow tropical maritime cumuli to large-scale processes

    NASA Technical Reports Server (NTRS)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the relative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  18. Large-scale Stratospheric Transport Processes

    NASA Technical Reports Server (NTRS)

    Plumb, R. Alan

    2003-01-01

    The PI has undertaken a theoretical analysis of the existence and nature of compact tracer-tracer relationships of the kind observed in the stratosphere, augmented with three-dimensional model simulations of stratospheric tracers (the latter being an extension of modeling work the group did during the SOLVE experiment). This work achieves a rigorous theoretical basis for the existence and shape of these relationships, as well as a quantitative theory of their width and evolution, in terms of the joint tracer-tracer PDF. A paper on this work is almost complete and will soon be submitted to Rev. Geophys. We have analyzed lower stratospheric water in simulations with an isentropic-coordinate version of the MATCH transport model which we recently helped to develop. The three-dimensional structure of lower stratospheric water, in particular, attracted our attention: dry air is, below about 400 K potential temperature, localized in the regions of the west Pacific and equatorial South America. We have been analyzing air trajectories to determine how air passes through the tropopause cold trap. This work is now being completed, and a paper will be submitted to Geophys. Res. Lett. before the end of summer. We are continuing to perform experiments with the 'MATCH' CTM, in both sigma- and entropy-coordinate forms. We earlier found (in collaboration with Dr Natalie Mahowald, and as part of an NSF-funded project) that switching to isentropic coordinates made a substantial improvement to the simulation of the age of stratospheric air. We are now running experiments with near-tropopause sources in both versions of the model, to see if and to what extent the simulation of stratosphere-troposphere transport is dependent on the model coordinate. Personnel: Research is supervised by the PI, Prof. Alan Plumb. Mr William Heres conducts the tracer modeling work and performs other modeling tasks. Two graduate students, Ms Irene Lee and Mr Michael Ring, have also been participating.

  19. Star formation associated with a large-scale infrared bubble

    NASA Astrophysics Data System (ADS)

    Xu, Jin-Long; Ju, Bing-Gang

    2014-09-01

    Aims: To investigate how a large-scale infrared bubble centered at l = 53.9° and b = 0.2° forms, and to study if star formation is taking place at the periphery of the bubble, we performed a multiwavelength study. Methods: Using the data from the Galactic Ring Survey (GRS) and Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (GLIMPSE), we performed a study of a large-scale infrared bubble with a size of about 16 pc at a distance of 2.0 kpc. We present the 12CO J = 1-0, 13CO J = 1-0, and C18O J = 1-0 observations of H II region G53.54-0.01 (Sh2-82) obtained at the Purple Mountain Observatory (PMO) 13.7 m radio telescope to investigate the detailed distribution of associated molecular material. In addition, we also used radio recombination line and VLA data. To select young stellar objects (YSOs) associated with this region, we used the GLIMPSE I catalog. Results: The large-scale infrared bubble shows a half-shell morphology at 8 μm. The H II regions G53.54-0.01, G53.64+0.24, and G54.09-0.06 are situated on the bubble. Comparing the radio recombination line velocities and associated 13CO J = 1-0 components of the three H II regions, we found that the 8 μm emission associated with H II region G53.54-0.01 should belong to the foreground emission, and only overlaps with the large-scale infrared bubble along the line of sight. Three extended green objects (EGOs, candidate massive young stellar objects), as well as three H II regions and two small-scale bubbles, are found located in the G54.09-0.06 complex, indicating an active massive star-forming region. Emission from C18O at J = 1-0 presents four cloud clumps on the northeastern border of H II region G53.54-0.01. By comparing the spectral profiles of 12CO J = 1-0, 13CO J = 1-0, and C18O J = 1-0 at the peak position of each clump, we found collected gas in three of the clumps, the exception being the clump coinciding with a massive YSO (IRAS 19282+1814). Using the evolutive model of the H II region, we derived that

  20. Using Web-Based Testing for Large-Scale Assessment.

    ERIC Educational Resources Information Center

    Hamilton, Laura S.; Klein, Stephen P.; Lorie, William

    This paper describes an approach to large-scale assessment that uses tests that are delivered to students over the Internet and that are tailored (adapted) to each student's own level of proficiency. A brief background on large-scale assessment is followed by a description of this new technology and an example. Issues that need to be investigated…

  1. Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Liu, Wei; Li, Xiaodong; Yang, Fan; Gao, Peng; Jia, Zhenyuan

    2015-10-01

    Large-scale triangulation scanning measurement systems are widely used to measure the three-dimensional profile of large-scale components and parts. The accuracy and speed of the laser stripe center extraction are essential for guaranteeing the accuracy and efficiency of the measuring system. However, in the process of large-scale measurement, multiple factors can cause deviation of the laser stripe center, including the spatial light intensity distribution, material reflectivity characteristics, and spatial transmission characteristics. A center extraction method is proposed for improving the accuracy of the laser stripe center extraction based on image evaluation of Gaussian fitting structural similarity and analysis of the multiple source factors. First, according to the features of the gray distribution of the laser stripe, evaluation of the Gaussian fitting structural similarity is estimated to provide a threshold value for center compensation. Then using the relationships between the gray distribution of the laser stripe and the multiple source factors, a compensation method of center extraction is presented. Finally, measurement experiments for a large-scale aviation composite component are carried out. The experimental results for this specific implementation verify the feasibility of the proposed center extraction method and the improved accuracy for large-scale triangulation scanning measurements.
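A standard building block for laser stripe center extraction is a sub-pixel peak estimate of the gray distribution, which the abstract describes as approximately Gaussian. The three-point log-parabola fit below is one common way to compute it; this is a generic sketch, not the authors' compensation method, and the synthetic stripe is a hypothetical example:

```python
import numpy as np

def stripe_center_gaussian(column_intensity):
    """Sub-pixel stripe center from one image column.

    Fits the log-intensity around the brightest pixel with a parabola,
    which is exact when the gray distribution is Gaussian.
    """
    i = np.asarray(column_intensity, dtype=float)
    p = int(np.argmax(i))
    if p == 0 or p == len(i) - 1:
        return float(p)  # peak at the border: no three-point fit possible
    lm, l0, lp = np.log(i[p - 1]), np.log(i[p]), np.log(i[p + 1])
    denom = lm - 2.0 * l0 + lp
    if denom >= 0:  # degenerate (flat or non-peaked) profile
        return float(p)
    return p + 0.5 * (lm - lp) / denom

# Synthetic Gaussian stripe column centered at 10.3 pixels, width 2 pixels
x = np.arange(21)
col = np.exp(-0.5 * ((x - 10.3) / 2.0) ** 2)
center = stripe_center_gaussian(col)  # recovers 10.3 for a clean Gaussian
```

In a real system this per-column estimate is what the deviation sources named in the abstract (light intensity distribution, reflectivity, transmission) perturb, and what the proposed compensation corrects.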

  2. Statistical analysis of mesoscale rainfall: Dependence of a random cascade generator on large-scale forcing

    NASA Technical Reports Server (NTRS)

    Over, Thomas, M.; Gupta, Vijay K.

    1994-01-01

    Under the theory of independent and identically distributed random cascades, the probability distribution of the cascade generator determines the spatial and the ensemble properties of spatial rainfall. Three sets of radar-derived rainfall data in space and time are analyzed to estimate the probability distribution of the generator. A detailed comparison between instantaneous scans of spatial rainfall and simulated cascades using the scaling properties of the marginal moments is carried out. This comparison highlights important similarities and differences between the data and the random cascade theory. Differences are quantified and measured for the three datasets. Evidence is presented to show that the scaling properties of the rainfall can be captured to the first order by a random cascade with a single parameter. The dependence of this parameter on forcing by the large-scale meteorological conditions, as measured by the large-scale spatial average rain rate, is investigated for these three datasets. The data show that this dependence can be captured by a one-to-one function. Since the large-scale average rain rate can be diagnosed from the large-scale dynamics, this relationship demonstrates an important linkage between the large-scale atmospheric dynamics and the statistical cascade theory of mesoscale rainfall. Potential application of this research to parameterization of runoff from the land surface and regional flood frequency analysis is briefly discussed, and open problems for further research are presented.
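A discrete random cascade with a single-parameter generator can be sketched as follows, here using the "beta model" generator (W = b^β with probability b^(−β), else 0, so E[W] = 1). The parameter value and grid size are illustrative assumptions; this is not the specific generator estimated from the radar data:

```python
import numpy as np

rng = np.random.default_rng(42)

def beta_cascade(levels, beta, branching=4):
    """2D discrete random cascade with a one-parameter beta-model
    generator. Each cell splits into a 2x2 block (branching number 4)
    and each subcell is multiplied by an i.i.d. W with
    W = b**beta with probability b**(-beta), else 0, so E[W] = 1
    (rain mass is conserved on average across levels)."""
    field = np.ones((1, 1))
    for _ in range(levels):
        field = np.kron(field, np.ones((2, 2)))  # subdivide every cell
        w = np.where(rng.random(field.shape) < branching ** (-beta),
                     branching ** beta, 0.0)
        field = field * w
    return field

rain = beta_cascade(levels=6, beta=0.3)  # 64 x 64 field
```

The dependence on large-scale forcing described in the abstract would enter by making the generator parameter (here `beta`) a function of the large-scale average rain rate.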

  3. Scalable parallel distance field construction for large-scale applications

    DOE PAGES

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan -Liu; ...

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial location. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.
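For context, the quantity being computed can be sketched with a brute-force serial distance field on a small 2D grid; the paper's contribution is making this computation scalable with a parallel distance tree, which is not reproduced here:

```python
import numpy as np

def distance_field(shape, surface_points):
    """Brute-force Euclidean distance field on a regular grid: the
    distance from every grid location to the nearest surface point.
    O(N * M) in grid points N and surface points M, so it only
    illustrates what a scalable method must compute, not how."""
    grid = np.stack(np.meshgrid(*[np.arange(s) for s in shape],
                                indexing="ij"), axis=-1).astype(float)
    pts = np.asarray(surface_points, dtype=float)
    # pairwise distances (grid point -> surface point), then min over surface
    diff = grid[..., None, :] - pts[None, None, :, :]
    return np.sqrt((diff ** 2).sum(-1)).min(-1)

# Toy "surface": two points at opposite corners of an 8x8 grid
d = distance_field((8, 8), [(0, 0), (7, 7)])
```

Fast serial alternatives (e.g. exact Euclidean distance transforms) exist in common libraries; the distributed-memory, time-varying setting is what motivates the paper's data structure.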

  4. White-Light Polarization and Large-Scale Coronal Structures

    NASA Astrophysics Data System (ADS)

    Badalyan, O. G.; Livshits, M. A.; Sýkora, J.

    1997-06-01

    The results of the white-light polarization measurements performed during three solar eclipses (1973, 1980, 1991) are presented. The eclipse images were processed and analysed by the same technique and method and, consequently, the distributions of the polarization and coronal intensity around the Sun were obtained in unified form for all three solar eclipses. Mutual comparison of our results, and their comparison with the distributions found by other authors, allowed us to estimate the real accuracy of current measurements of white-light corona polarization, which is not worse than ±5%. We have investigated the behaviour of the polarization as a function of heliocentric distance in helmet streamers and coronal holes. Simultaneous interpretation of the data on polarization and intensity in white-light helmet streamers is only possible if a considerable concentration of coronal matter (plasma) towards the plane of the sky is assumed. The values obtained for the coronal hole regions can be understood within the framework of a spherically symmetrical model of the low-density solar atmosphere. A tendency towards increasing polarization in coronal holes, connected with the decrease of the hole's size and with the transition from the minimum to the maximum of the solar cycle, was noticed. The problem of how the peculiarities of the large-scale coronal structures are related to the orientation of the global (dipole) solar magnetic field and to the degree of corrugation ('goffering') of the coronal and interplanetary current sheet is discussed briefly.

  5. SALSA - a Sectional Aerosol module for Large Scale Applications

    NASA Astrophysics Data System (ADS)

    Kokkola, H.; Korhonen, H.; Lehtinen, K. E. J.; Makkonen, R.; Asmi, A.; Järvenoja, S.; Anttila, T.; Partanen, A.-I.; Kulmala, M.; Järvinen, H.; Laaksonen, A.; Kerminen, V.-M.

    2007-12-01

    The sectional aerosol module SALSA is introduced. The model has been designed to be implemented in large-scale climate models, which require both accuracy and computational efficiency. We have used multiple methods to reduce the computational burden of different aerosol processes to optimize the model performance without losing the physical features relevant to climate. The optimizations include limiting the chemical compounds and physical processes available in different size sections of aerosol particles; dividing the size distribution into size sections of variable width depending on the sensitivity of microphysical processing to particle size; keeping the total number of size sections used to describe the size distribution to a minimum; and calculating only the relevant microphysical processes affecting each size section. The ability of the module to describe different microphysical processes was evaluated against explicit microphysical models and several microphysical models used in air quality models. The results from the current module show good consistency when compared to more explicit models. Also, the module was used to simulate a new particle formation event typical of highly polluted conditions, with results comparable to a more explicit model setup.

  6. SALSA - a Sectional Aerosol module for Large Scale Applications

    NASA Astrophysics Data System (ADS)

    Kokkola, H.; Korhonen, H.; Lehtinen, K. E. J.; Makkonen, R.; Asmi, A.; Järvenoja, S.; Anttila, T.; Partanen, A.-I.; Kulmala, M.; Järvinen, H.; Laaksonen, A.; Kerminen, V.-M.

    2008-05-01

    The sectional aerosol module SALSA is introduced. The model has been designed to be implemented in large-scale climate models, which require both accuracy and computational efficiency. We have used multiple methods to reduce the computational burden of different aerosol processes to optimize the model performance without losing the physical features relevant to climate. The optimizations include limiting the chemical compounds and physical processes available in different size sections of aerosol particles; dividing the size distribution into size sections of variable width depending on the sensitivity of microphysical processing to particle size; keeping the total number of size sections used to describe the size distribution to a minimum; and calculating only the relevant microphysical processes affecting each size section. The ability of the module to describe different microphysical processes was evaluated against explicit microphysical models and several microphysical models used in air quality models. The results from the current module show good consistency when compared to more explicit models. Also, the module was used to simulate a new particle formation event typical of highly polluted conditions, with results comparable to a more explicit model setup.

  7. Large Scale Relationship between Aquatic Insect Traits and Climate.

    PubMed

    Bhowmik, Avit Kumar; Schäfer, Ralf B

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution pattern under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of the temperature preference grouping feature, which are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution pattern. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperature showed the highest response, and the insects preferring cold and trichopterans preferring moderate temperature showed the highest potential for changing distribution. We showed that seasonal radiation and moisture are the most influential bioclimatic aspects, and thus changes in these aspects may affect the most responsive traits and orders and drive a change in their spatial distribution pattern. Our findings support the development of trait-based metrics to predict and detect climate

  8. Large Scale Relationship between Aquatic Insect Traits and Climate

    PubMed Central

    Bhowmik, Avit Kumar; Schäfer, Ralf B.

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution pattern under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of the temperature preference grouping feature, which are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution pattern. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperature showed the highest response, and the insects preferring cold and trichopterans preferring moderate temperature showed the highest potential for changing distribution. We showed that seasonal radiation and moisture are the most influential bioclimatic aspects, and thus changes in these aspects may affect the most responsive traits and orders and drive a change in their spatial distribution pattern. Our findings support the development of trait-based metrics to predict and detect climate

  9. Dominant modes of variability in large-scale Birkeland currents

    NASA Astrophysics Data System (ADS)

    Cousins, E. D. P.; Matsuo, Tomoko; Richmond, A. D.; Anderson, B. J.

    2015-08-01

    Properties of variability in large-scale Birkeland currents are investigated through empirical orthogonal function (EOF) analysis of 1 week of data from the Active Magnetosphere and Planetary Electrodynamics Response Experiment (AMPERE). Mean distributions and dominant modes of variability are identified for both the Northern and Southern Hemispheres. Differences in the results from the two hemispheres are observed, which are attributed to seasonal differences in conductivity (the study period occurred near solstice). A universal mean and set of dominant modes of variability are obtained through combining the hemispheric results, and it is found that the mean and first three modes of variability (EOFs) account for 38% of the total observed squared magnetic perturbations (δB²) from both hemispheres. The mean distribution represents a standard Region 1/Region 2 (R1/R2) morphology of currents and EOF 1 captures the strengthening/weakening of the average distribution and is well correlated with the north-south component of the interplanetary magnetic field (IMF). EOF 2 captures a mixture of effects including the expansion/contraction and rotation of the R1/R2 currents; this mode correlates only weakly with possible external driving parameters. EOF 3 captures changes in the morphology of the currents in the dayside cusp region and is well correlated with the dawn-dusk component of the IMF. The higher-order EOFs capture more complex, smaller-scale variations in the Birkeland currents and appear generally uncorrelated with external driving parameters. The results of the EOF analysis described here are used for describing error covariance in a data assimilation procedure utilizing AMPERE data, as described in a companion paper.
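EOF analysis of this kind is, in essence, principal component analysis of the space-time data matrix: remove the time mean, take an SVD of the anomalies, and read off spatial patterns and explained-variance fractions. A minimal sketch on synthetic data (not AMPERE observations) is:

```python
import numpy as np

rng = np.random.default_rng(0)

def eof_analysis(data):
    """EOF analysis of a data matrix with shape (n_times, n_locations).

    Returns the time-mean field, the EOF spatial patterns (rows of vt),
    and the fraction of anomaly variance each mode explains."""
    mean = data.mean(axis=0)
    anomalies = data - mean
    u, s, vt = np.linalg.svd(anomalies, full_matrices=False)
    variance_fraction = s ** 2 / np.sum(s ** 2)
    return mean, vt, variance_fraction

# Toy data: one dominant oscillating spatial mode plus weak noise
t = np.linspace(0, 2 * np.pi, 200)
pattern = np.sin(np.linspace(0, np.pi, 50))
data = np.outer(np.sin(3 * t), pattern) + 0.05 * rng.standard_normal((200, 50))
mean, eofs, frac = eof_analysis(data)
# frac[0] is close to 1: almost all variability sits in the first EOF
```

In the study's setting, `frac` corresponds to the quoted 38% of squared magnetic perturbations captured by the mean plus the first three EOFs.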

  10. Periodic cells for large-scale problem initialization

    NASA Astrophysics Data System (ADS)

    Ciantia, Matteo O.; Arroyo, Marcos; Zhang, Ningning; Emam, Sacha

    2017-06-01

    In geotechnical applications the success of the discrete element method (DEM) in simulating fundamental aspects of soil behaviour has increased the interest in applications for direct simulation of engineering-scale boundary value problems (BVPs). The main problem is that the method remains relatively expensive in terms of computational cost. A non-negligible part of that cost is related to specimen creation and initialization. As the response of soil is strongly dependent on its initial state (stress and porosity), attaining a specified initial state is a crucial part of a DEM model. Different procedures for controlled sample generation are available. However, applying the existing REV-oriented initialization procedures to such models is inefficient in terms of computational cost and challenging in terms of sample homogeneity. In this work a simple but efficient procedure to initialize large-scale DEM models is presented. Periodic cells are first generated with a sufficient number of particles matching a desired particle size distribution (PSD). The cells are then equilibrated at low-level isotropic stress at the target porosity. Once the cell is in equilibrium, it is replicated in space in order to fill the model domain. After the domain is thus filled, a few mechanical cycles are needed to re-equilibrate the large domain. The result is a large, homogeneous sample, equilibrated under prescribed stress at the desired porosity. The method is applicable to both isotropic and anisotropic initial stress states, with stress magnitude varying in space.
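The replication step described above (tile an equilibrated periodic cell to fill the model domain) can be sketched as below. Particle count, cell size, and replication counts are illustrative assumptions, and the DEM generation and equilibration stages are not modeled:

```python
import numpy as np

rng = np.random.default_rng(1)

def replicate_periodic_cell(cell_xyz, cell_size, reps):
    """Fill a large domain by tiling copies of one periodic cell.

    cell_xyz : (n, 3) particle positions inside a cubic periodic cell
    cell_size: edge length of the cell
    reps     : (nx, ny, nz) number of copies along each axis
    Periodicity guarantees copies join seamlessly at cell boundaries.
    """
    offsets = np.array([(i, j, k)
                        for i in range(reps[0])
                        for j in range(reps[1])
                        for k in range(reps[2])], dtype=float) * cell_size
    # every particle is copied once per cell image, shifted by its origin
    return (cell_xyz[None, :, :] + offsets[:, None, :]).reshape(-1, 3)

cell = rng.random((100, 3)) * 0.01        # 100 particles in a 0.01 m cell
domain = replicate_periodic_cell(cell, 0.01, (4, 4, 4))  # 6400 particles
```

In a real DEM workflow the tiled positions (and radii) would be handed back to the solver for the few re-equilibration cycles the abstract mentions.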

  11. Large Scale Applications of HTS in New Zealand

    NASA Astrophysics Data System (ADS)

    Wimbush, Stuart C.

    New Zealand has one of the longest-running and most consistently funded (relative to GDP) programmes in high temperature superconductor (HTS) development and application worldwide. As a consequence, it has a sustained breadth of involvement in HTS technology development stretching from the materials discovery right through to burgeoning commercial exploitation. This review paper outlines the present large scale projects of the research team at the newly-established Robinson Research Institute of Victoria University of Wellington. These include the construction and grid-based testing of a three-phase 1 MVA 2G HTS distribution transformer utilizing Roebel cable for its high-current secondary windings and the development of a cryogen-free conduction-cooled 1.5 T YBCO-based human extremity magnetic resonance imaging system. Ongoing activities supporting applications development such as low-temperature full-current characterization of commercial superconducting wires and the implementation of inductive flux-pump technologies for efficient brushless coil excitation in superconducting magnets and rotating machines are also described.

  12. Methods for Ranking and Selection in Large-Scale Inference

    NASA Astrophysics Data System (ADS)

    Henderson, Nicholas C.

    This thesis addresses two distinct problems: one related to ranking and selection for large-scale inference and another related to latent class modeling of longitudinal count data. The first part of the thesis focuses on the problem of identifying leading measurement units from a large collection with a focus on settings with differing levels of estimation precision across measurement units. The main approach presented is a Bayesian ranking procedure that populates the list of top units in a way that maximizes the expected overlap between the true and reported top lists for all list sizes. This procedure relates unit-specific posterior upper tail probabilities with their empirical distribution to yield a ranking variable. It discounts high-variance units less than other common methods and thus achieves improved operating characteristics in the models considered. In the second part of the thesis, we introduce and describe a finite mixture model for longitudinal count data where, conditional on the class label, the subject-specific observations are assumed to arise from a discrete autoregressive process. This approach offers notable computational advantages over related methods due to the within-class closed form of the likelihood function and, as we describe, has a within-class correlation structure which improves model identifiability. We also outline computational strategies for estimating model parameters, and we describe a novel measure of the underlying separation between latent classes and discuss its relation to posterior classification.
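The core ranking idea, ordering units by posterior upper-tail probability rather than by point estimates so that high-variance units are discounted appropriately, can be illustrated under the simplifying assumption of normal posteriors. This is an assumed toy version, not the thesis's full list-overlap optimization; the function names and threshold c are hypothetical.

```python
from math import erf, sqrt

def tail_prob(mu, sigma, c):
    """P(theta > c) under a normal posterior N(mu, sigma^2)."""
    return 0.5 * (1.0 - erf((c - mu) / (sigma * sqrt(2.0))))

def rank_units(posteriors, c):
    """Order unit ids by posterior upper-tail probability, largest first.

    posteriors : dict mapping unit id -> (posterior mean, posterior sd)
    """
    scores = {u: tail_prob(mu, s, c) for u, (mu, s) in posteriors.items()}
    return sorted(scores, key=scores.get, reverse=True)
```

Note how two units with the same posterior mean above the threshold are ordered: the lower-variance unit has more of its posterior mass in the upper tail and therefore ranks higher, which is the discounting behaviour the abstract describes.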

  13. MULTIPLE TESTING VIA FDRL FOR LARGE SCALE IMAGING DATA

    PubMed Central

    Zhang, Chunming; Fan, Jianqing; Yu, Tao

    2010-01-01

    The multiple testing procedure plays an important role in detecting the presence of spatial signals for large scale imaging data. Typically, the spatial signals are sparse but clustered. This paper provides empirical evidence that for a range of commonly used control levels, the conventional FDR procedure can lack the ability to detect statistical significance, even if the p-values under the true null hypotheses are independent and uniformly distributed; more generally, ignoring the neighboring information of spatially structured data will tend to diminish the detection effectiveness of the FDR procedure. This paper first introduces a scalar quantity to characterize the extent to which the “lack of identification phenomenon” (LIP) of the FDR procedure occurs. Second, we propose a new multiple comparison procedure, called FDRL, to accommodate the spatial information of neighboring p-values, via a local aggregation of p-values. Theoretical properties of the FDRL procedure are investigated under weak dependence of p-values. It is shown that the FDRL procedure alleviates the LIP of the FDR procedure, thus substantially facilitating the selection of more stringent control levels. Simulation evaluations indicate that the FDRL procedure improves the detection sensitivity of the FDR procedure with little loss in detection specificity. The computational simplicity and detection effectiveness of the FDRL procedure are illustrated through a real brain fMRI dataset. PMID:21643445

  14. Large-scale clustering of CAGE tag expression data

    PubMed Central

    Shimokawa, Kazuro; Okamura-Oho, Yuko; Kurita, Takio; Frith, Martin C; Kawai, Jun; Carninci, Piero; Hayashizaki, Yoshihide

    2007-01-01

    Background Recent analyses have suggested that many genes possess multiple transcription start sites (TSSs) that are differentially utilized in different tissues and cell lines. We have identified a huge number of TSSs mapped onto the mouse genome using the cap analysis of gene expression (CAGE) method. The standard hierarchical clustering algorithm, which gives us easily understandable graphical tree images, has difficulties in processing such huge amounts of TSS data and a better method to calculate and display the results is needed. Results We use a combination of hierarchical and non-hierarchical clustering to cluster expression profiles of TSSs based on a large amount of CAGE data to profit from the best of both methods. We processed the genome-wide expression data, including 159,075 TSSs derived from 127 RNA samples of various organs of mouse, and succeeded in categorizing them into 70–100 clusters. The clusters exhibited intriguing biological features: a cluster supergroup with a ubiquitous expression profile, tissue-specific patterns, a distinct distribution of non-coding RNA and functional TSS groups. Conclusion Our approach succeeded in greatly reducing the calculation cost, and is an appropriate solution for analyzing large-scale TSS usage data. PMID:17517134
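The hybrid strategy, a fast non-hierarchical pass to reduce the data to a manageable number of centroids followed by hierarchical clustering of those centroids, can be sketched as below. This is an assumed, generic combination for illustration, not the authors' exact pipeline; all names are hypothetical.

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Plain Lloyd's k-means; returns (centroids, labels)."""
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        lab = d.argmin(1)
        for j in range(k):
            if np.any(lab == j):          # keep old centroid if cluster empties
                C[j] = X[lab == j].mean(0)
    return C, lab

def tree_on_centroids(C):
    """Single-linkage merge order over the k centroids (list of merged pairs)."""
    clusters = {i: [i] for i in range(len(C))}
    merges = []
    while len(clusters) > 1:
        keys = list(clusters)
        best = None
        for a in range(len(keys)):
            for b in range(a + 1, len(keys)):
                dmin = min(np.linalg.norm(C[p] - C[q])
                           for p in clusters[keys[a]]
                           for q in clusters[keys[b]])
                if best is None or dmin < best[0]:
                    best = (dmin, keys[a], keys[b])
        _, ka, kb = best
        merges.append((ka, kb))
        clusters[ka] += clusters.pop(kb)
    return merges
```

The expensive pairwise work is done only on the k centroids rather than on all profiles, which is how this kind of two-stage scheme reduces the calculation cost for very large numbers of TSSs.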

  15. Large-scale analysis of microRNA evolution

    PubMed Central

    2012-01-01

    Background In animals, microRNAs (miRNA) are important genetic regulators. Animal miRNAs appear to have expanded in conjunction with an escalation in complexity during early bilaterian evolution. Their small size and high-degree of similarity makes them challenging for phylogenetic approaches. Furthermore, genomic locations encoding miRNAs are not clearly defined in many species. A number of studies have looked at the evolution of individual miRNA families. However, we currently lack resources for large-scale analysis of miRNA evolution. Results We addressed some of these issues in order to analyse the evolution of miRNAs. We perform syntenic and phylogenetic analysis for miRNAs from 80 animal species. We present synteny maps, phylogenies and functional data for miRNAs across these species. These data represent the basis of our analyses and also act as a resource for the community. Conclusions We use these data to explore the distribution of miRNAs across phylogenetic space, characterise their birth and death, and examine functional relationships between miRNAs and other genes. These data confirm a number of previously reported findings on a larger scale and also offer novel insights into the evolution of the miRNA repertoire in animals, and it’s genomic organization. PMID:22672736

  16. Parallel Index and Query for Large Scale Data Analysis

    SciTech Connect

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver; Howison, Mark; Qiang, Ji; Prabhat,; Austin, Brian; Bethel, E. Wes; Ryne, Rob D.; Shoshani, Arie

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize the underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that addresses these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to the processing of a massive 50TB dataset generated by a large scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
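The bitmap indexing underlying FastBit can be illustrated with a toy binned index: each bin owns a bitmap whose bit i is set when record i falls in that bin, and a range query reduces to OR-ing the bitmaps of the covered bins. A real implementation also compresses the bitmaps (e.g. WAH encoding) and candidate-checks records in the two boundary bins; both are omitted from this sketch, and all names here are hypothetical.

```python
import numpy as np

def build_bitmap_index(values, bin_edges):
    """One boolean bitmap per bin: bit i is set iff record i falls in the bin."""
    bins = np.digitize(values, bin_edges)          # bin id per record
    nbins = len(bin_edges) + 1
    return np.stack([bins == b for b in range(nbins)]), bins

def query_range(bitmaps, bin_edges, lo, hi):
    """Records whose bin lies entirely inside [lo, hi): OR the covered bitmaps.

    A full implementation would also scan the two boundary bins (the
    "candidate check"); this sketch returns only the fully covered bins.
    """
    nbins = len(bin_edges) + 1
    edges = np.concatenate(([-np.inf], bin_edges, [np.inf]))
    covered = [b for b in range(nbins)
               if edges[b] >= lo and edges[b + 1] <= hi]
    mask = np.zeros(bitmaps.shape[1], dtype=bool)
    for b in covered:
        mask |= bitmaps[b]
    return mask
```

Because the per-query work is a handful of bitwise ORs over precomputed bitmaps rather than a scan of the raw data, this style of index is what makes interactive selection over very large datasets feasible.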

  17. Large-scale physical activity data reveal worldwide activity inequality.

    PubMed

    Althoff, Tim; Sosič, Rok; Hicks, Jennifer L; King, Abby C; Delp, Scott L; Leskovec, Jure

    2017-07-20

    To be able to curb the global pandemic of physical inactivity and the associated 5.3 million deaths per year, we need to understand the basic principles that govern physical activity. However, there is a lack of large-scale measurements of physical activity patterns across free-living populations worldwide. Here we leverage the wide usage of smartphones with built-in accelerometry to measure physical activity at the global scale. We study a dataset consisting of 68 million days of physical activity for 717,527 people, giving us a window into activity in 111 countries across the globe. We find inequality in how activity is distributed within countries and that this inequality is a better predictor of obesity prevalence in the population than average activity volume. Reduced activity in females contributes to a large portion of the observed activity inequality. Aspects of the built environment, such as the walkability of a city, are associated with a smaller gender gap in activity and lower activity inequality. In more walkable cities, activity is greater throughout the day and throughout the week, across age, gender, and body mass index (BMI) groups, with the greatest increases in activity found for females. Our findings have implications for global public health policy and urban planning and highlight the role of activity inequality and the built environment in improving physical activity and health.
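Inequality in how activity is distributed across a population is commonly summarized by the Gini coefficient (0 for perfect equality, approaching 1 for maximal inequality). A minimal sketch of the index, with a hypothetical function name, applied to per-person activity totals:

```python
import numpy as np

def gini(x):
    """Gini coefficient of a non-negative sample (0 = perfect equality)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    total = x.sum()
    i = np.arange(1, n + 1)
    # standard rank-based formula: G = 2*sum(i*x_i)/(n*sum(x)) - (n+1)/n
    return 2.0 * np.sum(i * x) / (n * total) - (n + 1.0) / n
```

For example, four people with equal step counts give G = 0, while all activity concentrated in one person out of four gives G = 0.75.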

  18. Large scale stochastic spatio-temporal modelling with PCRaster

    NASA Astrophysics Data System (ADS)

    Karssenberg, Derek; Drost, Niels; Schmitz, Oliver; de Jong, Kor; Bierkens, Marc F. P.

    2013-04-01

    software from the eScience Technology Platform (eSTeP), developed at the Netherlands eScience Center. This will allow us to scale up to hundreds of machines, with thousands of compute cores. A key requirement is not to change the user experience of the software. PCRaster operations and the use of the Python framework classes should work in a similar manner on machines ranging from a laptop to a supercomputer. This enables a seamless transfer of models from small machines, where model development is done, to large machines used for large-scale model runs. Domain specialists from a large range of disciplines, including hydrology, ecology, sedimentology, and land use change studies, currently use the PCRaster Python software within research projects. Applications include global scale hydrological modelling and error propagation in large-scale land use change models. The software runs on MS Windows, Linux operating systems, and OS X.

  19. Local and Regional Impacts of Large Scale Wind Energy Deployment

    NASA Astrophysics Data System (ADS)

    Michalakes, J.; Hammond, S.; Lundquist, J. K.; Moriarty, P.; Robinson, M.

    2010-12-01

    resources and upscaling large scale wind farm impact on local and regional climate. It will bridge localized and larger scale interactions of renewable energy generation with energy resource and grid management system control. By 2030, when 20 percent wind energy penetration is planned and exascale computing resources have become commonplace, we envision such a system spanning the entire mesoscale to sub-millimeter range of scales to provide a real-time computational and systems control capability to optimize renewable-based generation and grid distribution for efficiency while minimizing environmental impact.

  20. Large-Scale Spray Releases: Additional Aerosol Test Results

    SciTech Connect

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  1. Cloud-based large-scale air traffic flow optimization

    NASA Astrophysics Data System (ADS)

    Cao, Yi

    The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. Firstly, a new aggregate model, called the Link Transmission Model (LTM), is proposed, which models the nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of the distribution of historical flight records, and the mode is estimated by using Kernel Density Estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of LTM is validated against recorded traffic data. Secondly, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on LTM. The optimization problem aims at alleviating traffic congestion with minimal global delays. This problem is intractable due to millions of variables. A dual decomposition method is applied to decompose the large-scale problem such that the subproblems are solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that pursues integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to a linear runtime increase as the problem size grows. To address the computational efficiency problem, a parallel computing framework is designed which accommodates concurrent executions via multithreaded programming. The multithreaded version is compared with its monolithic version to show decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability. This framework is an "off-the-shelf" parallel computing model.
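The mode-of-distribution estimate via Kernel Density Estimation mentioned above can be sketched as follows (Gaussian kernel, Silverman's bandwidth rule, grid search for the maximum). The bandwidth rule and grid size are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def kde_mode(samples, bandwidth=None, grid_size=512):
    """Mode of a Gaussian kernel density estimate of 1-D samples."""
    x = np.asarray(samples, dtype=float)
    if bandwidth is None:
        # Silverman's rule of thumb (an assumed default)
        bandwidth = 1.06 * x.std() * x.size ** (-1 / 5)
    grid = np.linspace(x.min(), x.max(), grid_size)
    # unnormalized KDE evaluated on the grid; the argmax gives the mode
    density = np.exp(-0.5 * ((grid[:, None] - x[None, :]) / bandwidth) ** 2).sum(1)
    return grid[np.argmax(density)]
```

Applied to historical traversal times of a route, the returned mode serves as the single representative traversal time the aggregate model needs, robust to a minority of delayed flights in the record.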

  2. Formation of large-scale structure from cosmic strings and massive neutrinos

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.; Melott, Adrian L.; Bertschinger, Edmund

    1989-01-01

    Numerical simulations of large-scale structure formation from cosmic strings and massive neutrinos are described. The linear power spectrum in this model resembles the cold-dark-matter power spectrum. Galaxy formation begins early, and the final distribution consists of isolated density peaks embedded in a smooth background, leading to a natural bias in the distribution of luminous matter. The distribution of clustered matter has a filamentary appearance with large voids.

  4. Large-scale simulations of layered double hydroxide nanocomposite materials

    NASA Astrophysics Data System (ADS)

    Thyveetil, Mary-Ann

    Layered double hydroxides (LDHs) have the ability to intercalate a multitude of anionic species. Atomistic simulation techniques such as molecular dynamics have provided considerable insight into the behaviour of these materials. We review these techniques and recent algorithmic advances which considerably improve the performance of MD applications. In particular, we discuss how the advent of high performance computing and computational grids has allowed us to explore large scale models with considerable ease. Our simulations have been heavily reliant on computational resources on the UK's NGS (National Grid Service), the US TeraGrid and the Distributed European Infrastructure for Supercomputing Applications (DEISA). In order to utilise computational grids we rely on grid middleware to launch, computationally steer and visualise our simulations. We have integrated the RealityGrid steering library into the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), which has enabled us to perform remote computational steering and visualisation of molecular dynamics simulations on grid infrastructures. We also use the Application Hosting Environment (AHE) in order to launch simulations on remote supercomputing resources, and we show that data transfer rates between local clusters and supercomputing resources can be considerably enhanced by using optically switched networks. We perform large scale molecular dynamics simulations of MgAl-LDHs intercalated with either chloride ions or a mixture of DNA and chloride ions. The systems exhibit undulatory modes, which are suppressed in smaller scale simulations, caused by the collective thermal motion of atoms in the LDH layers. Thermal undulations provide elastic properties of the system including the bending modulus, Young's moduli and Poisson's ratios. To explore the interaction between LDHs and DNA, we use molecular dynamics techniques to perform simulations of double stranded, linear and plasmid DNA up

  5. Modified gravity and large scale flows, a review

    NASA Astrophysics Data System (ADS)

    Mould, Jeremy

    2017-02-01

    Large scale flows have been a challenging feature of cosmography ever since galaxy scaling relations came on the scene 40 years ago. The next generation of surveys will offer a serious test of the standard cosmology.

  6. Learning networks for sustainable, large-scale improvement.

    PubMed

    McCannon, C Joseph; Perla, Rocco J

    2009-05-01

    Large-scale improvement efforts known as improvement networks offer structured opportunities for exchange of information and insights into the adaptation of clinical protocols to a variety of settings.

  7. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    NASA Astrophysics Data System (ADS)

    Kumar, Rohit; Verma, Mahendra K.

    2017-09-01

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  8. Large-scale environments of narrow-line Seyfert 1 galaxies

    NASA Astrophysics Data System (ADS)

    Järvelä, E.; Lähteenmäki, A.; Lietzen, H.; Poudel, A.; Heinämäki, P.; Einasto, M.

    2017-09-01

    Studying large-scale environments of narrow-line Seyfert 1 (NLS1) galaxies gives a new perspective on their properties, particularly their radio loudness. The large-scale environment is believed to have an impact on the evolution and intrinsic properties of galaxies; however, NLS1 sources have not been studied in this context before. We have a large and diverse sample of 1341 NLS1 galaxies and three separate environment data sets constructed using the Sloan Digital Sky Survey. We use various statistical methods to investigate how the properties of NLS1 galaxies are connected to the large-scale environment, and compare the large-scale environments of NLS1 galaxies with other active galactic nuclei (AGN) classes, for example, other jetted AGN and broad-line Seyfert 1 (BLS1) galaxies, to study how they are related. NLS1 galaxies reside in less dense environments than any of the comparison samples, thus confirming their young age. The average large-scale environment density and environmental distribution of NLS1 sources is clearly different compared to BLS1 galaxies; thus it is improbable that they could be the parent population of NLS1 galaxies and unified by orientation. Within the NLS1 class there is a trend of increasing radio loudness with increasing large-scale environment density, indicating that the large-scale environment affects their intrinsic properties. Our results suggest that the NLS1 class of sources is not homogeneous, and furthermore, that a considerable fraction of them are misclassified. We further support a published proposal to replace the traditional classification into radio-loud, and radio-quiet or radio-silent sources with a division into jetted and non-jetted sources.

  9. Modeling Booklet Effects for Nonequivalent Group Designs in Large-Scale Assessment

    ERIC Educational Resources Information Center

    Hecht, Martin; Weirich, Sebastian; Siegle, Thilo; Frey, Andreas

    2015-01-01

    Multiple matrix designs are commonly used in large-scale assessments to distribute test items to students. These designs comprise several booklets, each containing a subset of the complete item pool. Besides reducing the test burden of individual students, using various booklets allows aligning the difficulty of the presented items to the assumed…

  10. Analytical model of the statistical properties of contrast of large-scale ionospheric inhomogeneities.

    NASA Astrophysics Data System (ADS)

    Vsekhsvyatskaya, I. S.; Evstratova, E. A.; Kalinin, Yu. K.; Romanchuk, A. A.

    1989-08-01

    A new analytical model is proposed for the distribution of variations of the relative electron-density contrast of large-scale ionospheric inhomogeneities. The model is characterized by nonzero skewness and kurtosis. It is shown that the model is applicable over the interval of horizontal dimensions of inhomogeneities from hundreds to thousands of kilometers.
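For reference, the sample skewness and excess kurtosis that distinguish such a non-Gaussian model from a Gaussian (for which both are zero) can be computed from data as follows; the function names are hypothetical.

```python
import numpy as np

def skewness(x):
    """Sample skewness (third standardized central moment)."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return (d ** 3).mean() / (d ** 2).mean() ** 1.5

def excess_kurtosis(x):
    """Fourth standardized central moment minus 3 (0 for a Gaussian)."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    return (d ** 4).mean() / (d ** 2).mean() ** 2 - 3.0
```

A symmetric sample has zero skewness, while a distribution with a long right tail, such as a contrast distribution dominated by occasional strong inhomogeneities, has positive skewness.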

  11. The Large-Scale Structure of Semantic Networks: Statistical Analyses and a Model of Semantic Growth

    ERIC Educational Resources Information Center

    Steyvers, Mark; Tenenbaum, Joshua B.

    2005-01-01

    We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget's Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of…

  12. Finite Mixture Multilevel Multidimensional Ordinal IRT Models for Large Scale Cross-Cultural Research

    ERIC Educational Resources Information Center

    de Jong, Martijn G.; Steenkamp, Jan-Benedict E. M.

    2010-01-01

    We present a class of finite mixture multilevel multidimensional ordinal IRT models for large scale cross-cultural research. Our model is proposed for confirmatory research settings. Our prior for item parameters is a mixture distribution to accommodate situations where different groups of countries have different measurement operations, while…

  13. Strategic Leadership for Large-Scale Reform: The Case of England's National Literacy and Numeracy Strategy

    ERIC Educational Resources Information Center

    Leithwood, Kenneth; Jantzi, Doris; Earl, Lorna; Watson, Nancy; Levin, Benjamin; Fullan, Michael

    2004-01-01

    Both 'strategic' and 'distributed' forms of leadership are considered promising responses to the demands placed on school systems by large-scale reform initiatives. Using observation, interview and survey data collected as part of a larger evaluation of England's National Literacy and Numeracy Strategies, this study inquired about sources of…

  16. Large-scale studies of marked birds in North America

    USGS Publications Warehouse

    Tautin, J.; Metras, L.; Smith, G.

    1999-01-01

    The first large-scale, co-operative, studies of marked birds in North America were attempted in the 1950s. Operation Recovery, which linked numerous ringing stations along the east coast in a study of autumn migration of passerines, and the Preseason Duck Ringing Programme in prairie states and provinces, conclusively demonstrated the feasibility of large-scale projects. The subsequent development of powerful analytical models and computing capabilities expanded the quantitative potential for further large-scale projects. Monitoring Avian Productivity and Survivorship, and Adaptive Harvest Management are current examples of truly large-scale programmes. Their exemplary success and the availability of versatile analytical tools are driving changes in the North American bird ringing programme. Both the US and Canadian ringing offices are modifying operations to collect more and better data to facilitate large-scale studies and promote a more project-oriented ringing programme. New large-scale programmes such as the Cornell Nest Box Network are on the horizon.

  17. A study of MLFMA for large-scale scattering problems

    NASA Astrophysics Data System (ADS)

    Hastriter, Michael Larkin

    This research is centered in computational electromagnetics with a focus on solving large-scale problems accurately and in a timely fashion using first-principles physics. Error control of the translation operator in 3-D is shown. A parallel implementation of the multilevel fast multipole algorithm (MLFMA) was studied in terms of parallel efficiency and scaling. The large-scale scattering program (LSSP), based on the ScaleME library, was used to solve ultra-large-scale problems including a 200λ sphere with 20 million unknowns. As these large-scale problems were solved, techniques were developed to accurately estimate the memory requirements. Careful memory management is needed in order to solve these massive problems. The study of MLFMA in large-scale problems revealed significant errors that stemmed from inconsistencies in constants used by different parts of the algorithm. These were fixed to produce the most accurate data possible for large-scale surface scattering problems. Data was calculated on a missile-like target using both high frequency methods and MLFMA. This data was compared and analyzed to determine possible strategies to increase data acquisition speed and accuracy through hybridization of the computation methods.

  18. Satellite measurements of large-scale air pollution: Methods

    SciTech Connect

    Kaufman, Y.J.; Fraser, R.S.; Ferrare, R.A.

    1990-06-20

    A method is presented for simultaneous determination of the aerosol optical thickness (τa), particle size (rm, the geometric mean mass radius for a lognormal distribution) and the single scattering albedo (ω0, the ratio between scattering and scattering + absorption) from satellite imagery. The method is based on satellite images of the surface (land and water) in the visible and near-IR bands and is applied here to the first two channels of the Advanced Very High Resolution Radiometer (AVHRR) sensor. The aerosol characteristics are obtained from the difference in the upward radiances, detected by the satellite, between a clear and a hazy day. Therefore the method is mainly useful for remote sensing of large-scale air pollution (e.g., smoke from a large fire or concentrated anthropogenic pollution), which introduces dense aerosol into the atmosphere (aerosol optical thickness ≥ 0.4) on top of an existing aerosol. The method is very sensitive to the stability of the surface reflectance between the clear day and the hazy day. It also requires accurate satellite calibration (preferably not more than 5% error) and stable calibration with good relative values between the two bands used in the analysis. With these requirements, the aerosol optical thickness can be derived with an error of Δτa = 0.08-0.15. For an assumed lognormal size distribution, the particle geometric mean mass radius rm can be derived (if good calibration is available) with an error of Δrm = ±(0.10-0.20) μm, and ω0 with Δω0 = ±0.03 for ω0 close to 1 and Δω0 = ±(0.03-0.07) for ω0 about 0.8. The method was applied to AVHRR images of forest fire smoke.

  19. Large scale cardiac modeling on the Blue Gene supercomputer.

    PubMed

    Reumann, Matthias; Fitch, Blake G; Rayshubskiy, Aleksandr; Keller, David U; Weiss, Daniel L; Seemann, Gunnar; Dössel, Olaf; Pitman, Michael C; Rice, John J

    2008-01-01

    Multi-scale, multi-physical heart models have not yet been able to include a high degree of accuracy and resolution with respect to model detail and spatial resolution due to computational limitations of current systems. We propose a framework to compute large scale cardiac models. Decomposition of anatomical data in segments to be distributed on a parallel computer is carried out by optimal recursive bisection (ORB). The algorithm takes into account a computational load parameter which has to be adjusted according to the cell models used. The diffusion term is realized by the monodomain equations. The anatomical data-set was given by both ventricles of the Visible Female data-set in a 0.2 mm resolution. Heterogeneous anisotropy was included in the computation. Model weights as input for the decomposition and load balancing were set to (a) 1 for tissue and 0 for non-tissue elements; (b) 10 for tissue and 1 for non-tissue elements. Scaling results for 512, 1024, 2048, 4096 and 8192 computational nodes were obtained for 10 ms simulation time. The simulations were carried out on an IBM Blue Gene/L parallel computer. A 1 s simulation was then carried out on 2048 nodes for the optimal model load. Load balances did not differ significantly across computational nodes even if the number of data elements distributed to each node differed greatly. Since the ORB algorithm did not take into account computational load due to communication cycles, the speedup is close to optimal for the computation time but not optimal overall due to the communication overhead. However, the simulation times were reduced from 87 minutes on 512 to 11 minutes on 8192 nodes. This work demonstrates that it is possible to run simulations of the presented detailed cardiac model within hours for the simulation of a heart beat.
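Recursive bisection of the kind described above can be sketched for weighted point data: recursively split along the axis of longest extent at the weight median, so that each half carries roughly equal computational load. This is a generic illustration (power-of-two part counts, reasonably uniform weights, no communication-cost term), not the authors' code.

```python
import numpy as np

def orb_partition(points, weights, n_parts):
    """Recursive bisection: split along the longest axis at the weight median.

    n_parts must be a power of two. Returns an integer part id per point.
    """
    points = np.asarray(points, dtype=float)
    weights = np.asarray(weights, dtype=float)
    parts = np.zeros(len(points), dtype=int)

    def split(idx, first, count):
        if count == 1:
            parts[idx] = first
            return
        axis = np.ptp(points[idx], axis=0).argmax()   # axis of longest extent
        order = idx[np.argsort(points[idx, axis])]
        cum = np.cumsum(weights[order])
        cut = np.searchsorted(cum, cum[-1] / 2.0)     # weight median
        split(order[:cut + 1], first, count // 2)
        split(order[cut + 1:], first + count // 2, count // 2)

    split(np.arange(len(points)), 0, n_parts)
    return parts
```

Elements with weight 10 (tissue) versus 1 (non-tissue) would, under this scheme, produce parts with very different element counts but near-equal total weight, matching the load-balance behaviour the abstract reports.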

  20. Inflationary tensor fossils in large-scale structure

    SciTech Connect

Dimastrogiovanni, Emanuela; Fasiello, Matteo; Jeong, Donghui; Kamionkowski, Marc

    2014-12-01

    Inflation models make specific predictions for a tensor-scalar-scalar three-point correlation, or bispectrum, between one gravitational-wave (tensor) mode and two density-perturbation (scalar) modes. This tensor-scalar-scalar correlation leads to a local power quadrupole, an apparent departure from statistical isotropy in our Universe, as well as characteristic four-point correlations in the current mass distribution in the Universe. So far, the predictions for these observables have been worked out only for single-clock models in which certain consistency conditions between the tensor-scalar-scalar correlation and tensor and scalar power spectra are satisfied. Here we review the requirements on inflation models for these consistency conditions to be satisfied. We then consider several examples of inflation models, such as non-attractor and solid-inflation models, in which these conditions are put to the test. In solid inflation the simplest consistency conditions are already violated whilst in the non-attractor model we find that, contrary to the standard scenario, the tensor-scalar-scalar correlator probes directly relevant model-dependent information. We work out the predictions for observables in these models. For non-attractor inflation we find an apparent local quadrupolar departure from statistical isotropy in large-scale structure but that this power quadrupole decreases very rapidly at smaller scales. The consistency of the CMB quadrupole with statistical isotropy then constrains the distance scale that corresponds to the transition from the non-attractor to attractor phase of inflation to be larger than the currently observable horizon. Solid inflation predicts clustering fossils signatures in the current galaxy distribution that may be large enough to be detectable with forthcoming, and possibly even current, galaxy surveys.

  1. Chronic, Wireless Recordings of Large Scale Brain Activity in Freely Moving Rhesus Monkeys

    PubMed Central

    Schwarz, David A.; Lebedev, Mikhail A.; Hanson, Timothy L.; Dimitrov, Dragan F.; Lehew, Gary; Meloy, Jim; Rajangam, Sankaranarayani; Subramanian, Vivek; Ifft, Peter J.; Li, Zheng; Ramakrishnan, Arjun; Tate, Andrew; Zhuang, Katie; Nicolelis, Miguel A.L.

    2014-01-01

    Advances in techniques for recording large-scale brain activity contribute to both the elucidation of neurophysiological principles and the development of brain-machine interfaces (BMIs). Here we describe a neurophysiological paradigm for performing tethered and wireless large-scale recordings based on movable volumetric three-dimensional (3D) multielectrode implants. This approach allowed us to isolate up to 1,800 units per animal and simultaneously record the extracellular activity of close to 500 cortical neurons, distributed across multiple cortical areas, in freely behaving rhesus monkeys. The method is expandable, in principle, to thousands of simultaneously recorded channels. It also allows increased recording longevity (5 consecutive years), and recording of a broad range of behaviors, e.g. social interactions, and BMI paradigms in freely moving primates. We propose that wireless large-scale recordings could have a profound impact on basic primate neurophysiology research, while providing a framework for the development and testing of clinically relevant neuroprostheses. PMID:24776634

  2. Modification in drag of turbulent boundary layers resulting from manipulation of large-scale structures

    NASA Technical Reports Server (NTRS)

    Corke, T. C.; Guezennec, Y.; Nagib, H. M.

    1981-01-01

    The effects of placing a parallel-plate turbulence manipulator in a boundary layer are documented through flow visualization and hot wire measurements. The boundary layer manipulator was designed to manage the large scale structures of turbulence leading to a reduction in surface drag. The differences in the turbulent structure of the boundary layer are summarized to demonstrate differences in various flow properties. The manipulator inhibited the intermittent large scale structure of the turbulent boundary layer for at least 70 boundary layer thicknesses downstream. With the removal of the large scale, the streamwise turbulence intensity levels near the wall were reduced. The downstream distribution of the skin friction was also altered by the introduction of the manipulator.

  3. Recursive architecture for large-scale adaptive system

    NASA Astrophysics Data System (ADS)

    Hanahara, Kazuyuki; Sugiyama, Yoshihiko

    1994-09-01

'Large scale' is a major trend in recent engineering research and development, especially in the field of aerospace structural systems. The term refers not only to the overall size of an artifact but usually also to the large number of components that make it up. A large scale system deployed in remote space or the deep sea should be adaptive as well as robust, because its remoteness makes control and maintenance by human operators difficult. One approach to realizing such a large scale, adaptive and robust system is to build it as an assemblage of components that are each adaptive by themselves. The robustness of the system can then be achieved by using a large number of such components together with suitable adaptation and maintenance strategies. Systems of this kind have attracted considerable research interest, and studies of decentralized motion control, configuration algorithms and the characteristics of structural elements have been reported. In this article, a recursive architecture concept is developed and discussed towards the realization of a large scale system consisting of a number of uniform adaptive components. We propose an adaptation strategy based on this architecture and its implementation by means of hierarchically connected processing units. The robustness of the system and its restoration after degeneration of a processing unit are also discussed. Two- and three-dimensional adaptive truss structures are conceptually designed based on the recursive architecture.

  4. Local Large-Scale Structure and the Assumption of Homogeneity

    NASA Astrophysics Data System (ADS)

    Keenan, Ryan C.; Barger, Amy J.; Cowie, Lennox L.

    2016-10-01

Our recent estimates of galaxy counts and the luminosity density in the near-infrared (Keenan et al. 2010, 2012) indicated that the local universe may be under-dense on radial scales of several hundred megaparsecs. Such a large-scale local under-density could introduce significant biases in the measurement and interpretation of cosmological observables, such as the inferred effects of dark energy on the rate of expansion. In Keenan et al. (2013), we measured the K-band luminosity density as a function of distance from us to test for such a local under-density. We made this measurement over the redshift range 0.01 < z < 0.2 (radial distances D ~ 50 - 800 h_70^-1 Mpc). We found that the shape of the K-band luminosity function is relatively constant as a function of distance and environment. We derive a local (z < 0.07, D < 300 h_70^-1 Mpc) K-band luminosity density that agrees well with previously published studies. At z > 0.07, we measure an increasing luminosity density that by z ~ 0.1 reaches a value ~ 1.5 times higher than that measured locally. This implies that the stellar mass density follows a similar trend. Assuming that the underlying dark matter distribution is traced by this luminous matter, this suggests that the local mass density may be lower than the global mass density of the universe at an amplitude and on a scale that is sufficient to introduce significant biases into the measurement of basic cosmological observables. At least one study has shown that an under-density of roughly this amplitude and scale could resolve the apparent tension between direct local measurements of the Hubble constant and those inferred by the Planck team. Other theoretical studies have concluded that such an under-density could account for what looks like an accelerating expansion, even when no dark energy is present.

  5. KA-SB: from data integration to large scale reasoning

    PubMed Central

    Roldán-García, María del Mar; Navas-Delgado, Ismael; Kerzazi, Amine; Chniber, Othmane; Molina-Castro, Joaquín; Aldana-Montes, José F

    2009-01-01

Background The analysis of information in the biological domain is usually focused on the analysis of data from single on-line data sources. Unfortunately, studying a biological process requires having access to disperse, heterogeneous, autonomous data sources. In this context, an analysis of the information is not possible without the integration of such data. Methods KA-SB is a querying and analysis system for final users based on combining a data integration solution with a reasoner. Thus, the tool has been created with a process divided into two steps: 1) KOMF, the Khaos Ontology-based Mediator Framework, is used to retrieve information from heterogeneous and distributed databases; 2) the integrated information is crystallized in a (persistent and high performance) reasoner (DBOWL). This information can then be further analyzed (by means of querying and reasoning). Results In this paper we present a novel system that combines the use of a mediation system with the reasoning capabilities of a large scale reasoner to provide a way of finding new knowledge and of analyzing the integrated information from different databases, which is retrieved as a set of ontology instances. This tool provides a graphical query interface that shows a graphical representation of the ontology and allows users to build queries easily by clicking on the ontology concepts. Conclusion These kinds of systems (based on KOMF) will provide users with very large amounts of information (interpreted as ontology instances once retrieved), which cannot be managed using traditional main memory-based reasoners. We propose a process for creating persistent and scalable knowledge bases from sets of OWL instances obtained by integrating heterogeneous data sources with KOMF. This process has been applied to develop a demo tool, which uses the BioPax Level 3 ontology as the integration schema, and integrates the UNIPROT, KEGG, CHEBI, BRENDA and SABIORK databases. PMID:19796402

  6. Testing gravity using large-scale redshift-space distortions

    NASA Astrophysics Data System (ADS)

    Raccanelli, Alvise; Bertacca, Daniele; Pietrobon, Davide; Schmidt, Fabian; Samushia, Lado; Bartolo, Nicola; Doré, Olivier; Matarrese, Sabino; Percival, Will J.

    2013-11-01

    We use luminous red galaxies from the Sloan Digital Sky Survey (SDSS) II to test the cosmological structure growth in two alternatives to the standard Λ cold dark matter (ΛCDM)+general relativity (GR) cosmological model. We compare observed three-dimensional clustering in SDSS Data Release 7 (DR7) with theoretical predictions for the standard vanilla ΛCDM+GR model, unified dark matter (UDM) cosmologies and the normal branch Dvali-Gabadadze-Porrati (nDGP). In computing the expected correlations in UDM cosmologies, we derive a parametrized formula for the growth factor in these models. For our analysis we apply the methodology tested in Raccanelli et al. and use the measurements of Samushia et al. that account for survey geometry, non-linear and wide-angle effects and the distribution of pair orientation. We show that the estimate of the growth rate is potentially degenerate with wide-angle effects, meaning that extremely accurate measurements of the growth rate on large scales will need to take such effects into account. We use measurements of the zeroth and second-order moments of the correlation function from SDSS DR7 data and the Large Suite of Dark Matter Simulations (LasDamas), and perform a likelihood analysis to constrain the parameters of the models. Using information on the clustering up to rmax = 120 h-1 Mpc, and after marginalizing over the bias, we find, for UDM models, a speed of sound c∞ ≤ 6.1e-4, and, for the nDGP model, a cross-over scale rc ≥ 340 Mpc, at 95 per cent confidence level.

  7. Toward Improved Support for Loosely Coupled Large Scale Simulation Workflows

    SciTech Connect

    Boehm, Swen; Elwasif, Wael R; Naughton, III, Thomas J; Vallee, Geoffroy R

    2014-01-01

High-performance computing (HPC) workloads are increasingly leveraging loosely coupled large scale simulations. Unfortunately, most large-scale HPC platforms, including Cray/ALPS environments, are designed for the execution of long-running jobs based on coarse-grained launch capabilities (e.g., one MPI rank per core on all allocated compute nodes). This assumption limits capability-class workload campaigns that require large numbers of discrete or loosely coupled simulations, and where time-to-solution is an untenable pacing issue. This paper describes the challenges related to the support of fine-grained launch capabilities that are necessary for the execution of loosely coupled large scale simulations on Cray/ALPS platforms. More precisely, we present the details of an enhanced runtime system to support this use case, and report on initial results from early testing on systems at Oak Ridge National Laboratory.

  8. Seismic safety in conducting large-scale blasts

    NASA Astrophysics Data System (ADS)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter, with recording on a laptop. Records of surface seismic vibrations from more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented, and the maximum velocities of the Earth's surface vibrations are determined. Seismic safety was evaluated against the permissible vibration velocity. Where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.
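
The safety evaluation step above can be sketched as a simple comparison of a measured peak vibration velocity against a permissible limit; the 20 mm/s limit and the readings below are placeholder assumptions, not values from the study.

```python
# Minimal sketch of a blast seismic-safety check: find the peak ground
# vibration velocity in a series of readings and compare it with a
# permissible limit. The limit value is an assumed placeholder.

def seismic_safety_check(velocities_mm_s, permissible_mm_s=20.0):
    """Return (peak, is_safe) for a series of ground-vibration velocity
    readings, all in mm/s."""
    peak = max(velocities_mm_s)
    return peak, peak <= permissible_mm_s

# Example readings from one (hypothetical) monitored blast.
peak, ok = seismic_safety_check([3.2, 7.8, 12.5, 6.1])
```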

  9. Acoustic Studies of the Large Scale Ocean Circulation

    NASA Technical Reports Server (NTRS)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  10. Large-scale velocity structures in turbulent thermal convection.

    PubMed

    Qiu, X L; Tong, P

    2001-09-01

    A systematic study of large-scale velocity structures in turbulent thermal convection is carried out in three different aspect-ratio cells filled with water. Laser Doppler velocimetry is used to measure the velocity profiles and statistics over varying Rayleigh numbers Ra and at various spatial positions across the whole convection cell. Large velocity fluctuations are found both in the central region and near the cell boundary. Despite the large velocity fluctuations, the flow field still maintains a large-scale quasi-two-dimensional structure, which rotates in a coherent manner. This coherent single-roll structure scales with Ra and can be divided into three regions in the rotation plane: (1) a thin viscous boundary layer, (2) a fully mixed central core region with a constant mean velocity gradient, and (3) an intermediate plume-dominated buffer region. The experiment reveals a unique driving mechanism for the large-scale coherent rotation in turbulent convection.

  11. Large-scale simulations of complex physical systems

    NASA Astrophysics Data System (ADS)

    Belić, A.

    2007-04-01

    Scientific computing has become a tool as vital as experimentation and theory for dealing with scientific challenges of the twenty-first century. Large scale simulations and modelling serve as heuristic tools in a broad problem-solving process. High-performance computing facilities make possible the first step in this process - a view of new and previously inaccessible domains in science and the building up of intuition regarding the new phenomenology. The final goal of this process is to translate this newly found intuition into better algorithms and new analytical results. In this presentation we give an outline of the research themes pursued at the Scientific Computing Laboratory of the Institute of Physics in Belgrade regarding large-scale simulations of complex classical and quantum physical systems, and present recent results obtained in the large-scale simulations of granular materials and path integrals.

  12. Large-scale simulations of complex physical systems

    SciTech Connect

    Belic, A.

    2007-04-23

Scientific computing has become a tool as vital as experimentation and theory for dealing with scientific challenges of the twenty-first century. Large scale simulations and modelling serve as heuristic tools in a broad problem-solving process. High-performance computing facilities make possible the first step in this process - a view of new and previously inaccessible domains in science and the building up of intuition regarding the new phenomenology. The final goal of this process is to translate this newly found intuition into better algorithms and new analytical results. In this presentation we give an outline of the research themes pursued at the Scientific Computing Laboratory of the Institute of Physics in Belgrade regarding large-scale simulations of complex classical and quantum physical systems, and present recent results obtained in the large-scale simulations of granular materials and path integrals.

  13. A relativistic signature in large-scale structure

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales, even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so that there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether the observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  14. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    NASA Astrophysics Data System (ADS)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation of and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.

  15. Large Scale Processes and Extreme Floods in Brazil

    NASA Astrophysics Data System (ADS)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in the last years as a new tool to improve the traditional, stationary based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activities. We investigate for individual sites the exceedance probability in which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
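
The dimensionality-reduction step mentioned above can be illustrated with plain (unsupervised) kernel PCA in NumPy; the study uses a supervised variant, and the synthetic data, function name, and RBF kernel choice here are illustrative assumptions.

```python
# Sketch of kernel PCA: project high-dimensional fields (here, synthetic
# stand-ins for moisture-flux features) into a low-dimensional space.
import numpy as np

def kernel_pca(X, n_components=2, gamma=1.0):
    """Project rows of X onto the top principal components computed in an
    RBF-kernel feature space."""
    # Pairwise squared Euclidean distances and the RBF (Gaussian) kernel.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    K = np.exp(-gamma * d2)
    # Double-center the kernel matrix (mean removal in feature space).
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one
    # Eigendecompose; np.linalg.eigh returns eigenvalues in ascending order.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    # Scale eigenvectors by sqrt(eigenvalue) to obtain the projections.
    return vecs[:, idx] * np.sqrt(np.clip(vals[idx], 0, None))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 8))   # 50 days x 8 flux features (synthetic)
Z = kernel_pca(X, n_components=2)
```

In the reduced space produced this way, standard clustering can then separate, e.g., locally driven from large-scale-driven flood days, as the abstract describes.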

  16. Large-scale motions in the universe: Using clusters of galaxies as tracers

    NASA Technical Reports Server (NTRS)

    Gramann, Mirt; Bahcall, Neta A.; Cen, Renyue; Gott, J. Richard

    1995-01-01

    Can clusters of galaxies be used to trace the large-scale peculiar velocity field of the universe? We answer this question by using large-scale cosmological simulations to compare the motions of rich clusters of galaxies with the motion of the underlying matter distribution. Three models are investigated: Omega = 1 and Omega = 0.3 cold dark matter (CDM), and Omega = 0.3 primeval baryonic isocurvature (PBI) models, all normalized to the Cosmic Background Explorer (COBE) background fluctuations. We compare the cluster and mass distribution of peculiar velocities, bulk motions, velocity dispersions, and Mach numbers as a function of scale for R greater than or = 50/h Mpc. We also present the large-scale velocity and potential maps of clusters and of the matter. We find that clusters of galaxies trace well the large-scale velocity field and can serve as an efficient tool to constrain cosmological models. The recently reported bulk motion of clusters 689 +/- 178 km/s on approximately 150/h Mpc scale (Lauer & Postman 1994) is larger than expected in any of the models studied (less than or = 190 +/- 78 km/s).
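
As a rough illustration of the bulk-motion and velocity-dispersion statistics compared in the abstract, one can average tracer peculiar velocities within a sphere of radius R; the toy data, function name, and 100 km/s input flow below are illustrative assumptions, not survey or simulation values.

```python
# Sketch: bulk velocity (vector mean) and 1-D velocity dispersion of
# cluster-like tracers inside a sphere, as functions of scale R.
import numpy as np

def bulk_motion(positions, velocities, radius):
    """Return (bulk, sigma): the mean velocity vector and the 1-D velocity
    dispersion of tracers within `radius` of the origin."""
    r = np.linalg.norm(positions, axis=1)
    v = velocities[r <= radius]
    bulk = v.mean(axis=0)  # km/s, per component
    # 1-D dispersion about the bulk flow, averaged over the 3 components.
    sigma = np.sqrt(((v - bulk) ** 2).sum(axis=1).mean() / 3.0)
    return bulk, sigma

rng = np.random.default_rng(1)
pos = rng.uniform(-150, 150, size=(200, 3))            # h^-1 Mpc (toy)
vel = rng.normal(0, 300, size=(200, 3)) + [100, 0, 0]  # km/s, 100 km/s flow
bulk, sigma = bulk_motion(pos, vel, radius=150.0)
```

Repeating the measurement for increasing radii yields the bulk motion as a function of scale, the quantity the abstract compares between cluster and matter distributions.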

  18. [Issues of large scale tissue culture of medicinal plant].

    PubMed

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzed the status, problems and countermeasures of large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of producing medicinal plants, problems remain, such as the stability of the material, the safety of transgenic medicinal plants and the optimization of culture conditions. Establishing a sound evaluation system suited to the characteristics of each medicinal plant is the key measure to assure the sustainable development of large-scale tissue culture of medicinal plants.

  19. Corridors Increase Plant Species Richness at Large Scales

    SciTech Connect

    Damschen, Ellen I.; Haddad, Nick M.; Orrock,John L.; Tewksbury, Joshua J.; Levey, Douglas J.

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  20. Large-Scale Graph Processing Analysis using Supercomputer Cluster

    NASA Astrophysics Data System (ADS)

    Vildario, Alfrido; Fitriyani; Nugraha Nurkahfi, Galih

    2017-01-01

Graphs are widely used in various sectors such as automotive, traffic, image processing and many more. These applications produce large-scale graphs whose processing requires long computation times and high-specification resources. This research addresses the analysis of large-scale graph processing using a supercomputer cluster. We implemented graph processing with the Breadth-First Search (BFS) algorithm for the single-destination shortest path problem. The parallel BFS implementation with the Message Passing Interface (MPI) used the supercomputer cluster at the High Performance Computing Laboratory, Computational Science, Telkom University, and the Stanford Large Network Dataset Collection. The results showed that the implementation gives an average speedup of more than 30 times and an efficiency of almost 90%.
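
The BFS single-destination shortest-path kernel that the study parallelizes with MPI can be sketched serially; the adjacency-dict representation and the example graph below are illustrative assumptions.

```python
# Minimal serial BFS for the single-destination shortest-path problem on an
# unweighted graph; the parallel version in the study distributes the
# frontier across MPI ranks, but the core traversal logic is the same.
from collections import deque

def bfs_shortest_path(adj, source, dest):
    """Return the shortest path from source to dest in an unweighted graph
    given as an adjacency dict, or None if dest is unreachable."""
    parent = {source: None}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        if u == dest:
            # Walk parents back to the source to recover the path.
            path = []
            while u is not None:
                path.append(u)
                u = parent[u]
            return path[::-1]
        for v in adj.get(u, ()):
            if v not in parent:   # visit each vertex at most once
                parent[v] = u
                queue.append(v)
    return None

# Tiny directed example graph.
adj = {0: [1, 2], 1: [3], 2: [3], 3: [4]}
path = bfs_shortest_path(adj, 0, 4)
```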

  1. Survey of decentralized control methods. [for large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Athans, M.

    1975-01-01

    An overview is presented of the types of problems that are being considered by control theorists in the area of dynamic large scale systems with emphasis on decentralized control strategies. Approaches that deal directly with decentralized decision making for large scale systems are discussed. It is shown that future advances in decentralized system theory are intimately connected with advances in the stochastic control problem with nonclassical information pattern. The basic assumptions and mathematical tools associated with the latter are summarized, and recommendations concerning future research are presented.

  2. Clearing and Labeling Techniques for Large-Scale Biological Tissues

    PubMed Central

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-01-01

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  3. The Evolution of Baryons in Cosmic Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Snedden, Ali; Arielle Phillips, Lara; Mathews, Grant James; Coughlin, Jared; Suh, In-Saeng; Bhattacharya, Aparna

    2015-01-01

The environments of galaxies play a critical role in their formation and evolution. We study these environments using cosmological simulations with star formation and supernova feedback included. From these simulations, we parse the large scale structure into clusters, filaments and voids using a segmentation algorithm adapted from medical imaging. We trace the star formation history, gas phase and metal evolution of the baryons in the intergalactic medium as a function of structure. We find that our algorithm reproduces the baryon fraction in the intracluster medium and that the majority of star formation occurs in cold, dense filaments. We present the consequences this large scale environment has for galactic halos and galaxy evolution.

  4. Large scale purification of RNA nanoparticles by preparative ultracentrifugation.

    PubMed

    Jasinski, Daniel L; Schwartz, Chad T; Haque, Farzin; Guo, Peixuan

    2015-01-01

    Purification of large quantities of supramolecular RNA complexes is of paramount importance due to the large quantities of RNA needed and the purity requirements for in vitro and in vivo assays. Purification is generally carried out by liquid chromatography (HPLC), polyacrylamide gel electrophoresis (PAGE), or agarose gel electrophoresis (AGE). Here, we describe an efficient method for the large-scale purification of RNA prepared by in vitro transcription using T7 RNA polymerase by cesium chloride (CsCl) equilibrium density gradient ultracentrifugation and the large-scale purification of RNA nanoparticles by sucrose gradient rate-zonal ultracentrifugation or cushioned sucrose gradient rate-zonal ultracentrifugation.

  5. Large-Scale Spray Releases: Initial Aerosol Test Results

    SciTech Connect

    Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.

    2012-12-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  6. Development of Large-Scale Data Visualization System for Magnetic Flux Tracing in Global MHD Simulations

    NASA Astrophysics Data System (ADS)

    Murata, K. T.; Watari, S.; Kubota, Y.; Fukazawa, K.; Tsubouchi, K.; Fujita, S.; Tanaka, T.; Den, M.; Murayama, Y.

    2011-12-01

At NICT (National Institute of Information and Communications Technology) we have been developing a new research environment named "OneSpaceNet". OneSpaceNet is a cloud-computing environment that provides researchers with rich resources for their studies, such as supercomputers, large-scale disk space, licensed applications, databases, and communication devices. The large-scale disk space is provided via Gfarm, a distributed file system. This paper first proposes a distributed, data-intensive processing system provided via Gfarm as a solution to large-scale data processing in the context of distributed data management and processing environments in the field of solar-terrestrial physics. The usefulness of a system composed of many file system nodes was examined using large-scale computer simulation data. In the parallel 3D visualization of simulation data varying in data processing granularity, optimized load balancing through FIFO scheduling or pipeline scheduling yielded good parallelization efficiency. Using this large-scale data processing system, we have developed a magnetic flux tracing system for global MHD simulations. Under the frozen-in assumption of ideal MHD plasma, we trace an element (or elements) of plasma at every step of a global MHD simulation and visualize the magnetic flux (magnetic field lines) penetrating the element(s). Since this system depends on the frozen-in assumption, we need to examine when and where the assumption breaks down before applying it to physical data analyses. Figures (a) and (b) show magnetic field lines in the vicinity of the Earth's magnetopause visualized with the present system; both show that the field lines are scattered as they advance downward. In the present talk we discuss the tracing errors and the restrictions on applying this technique.
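The FIFO-scheduled load balancing the abstract mentions can be sketched as a shared task queue from which each free worker pulls the next data chunk, so no worker idles while coarse-grained chunks remain. The sketch below is illustrative only (not the authors' system); `chunks` and `render` are hypothetical stand-ins for the simulation sub-volumes and the per-chunk visualization step.

```python
import queue
import threading

def process_chunks_fifo(chunks, n_workers, render):
    """Distribute variable-granularity chunks to workers via a shared FIFO
    queue: each worker takes the next chunk as soon as it finishes the
    previous one, giving simple dynamic load balancing."""
    tasks = queue.Queue()
    for chunk in chunks:
        tasks.put(chunk)

    results = []
    lock = threading.Lock()

    def worker():
        while True:
            try:
                chunk = tasks.get_nowait()
            except queue.Empty:
                return                      # no work left for this worker
            out = render(chunk)             # e.g. render one sub-volume
            with lock:
                results.append(out)

    threads = [threading.Thread(target=worker) for _ in range(n_workers)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results
```

Results arrive in completion order, not submission order, so callers that need ordering should tag each chunk with its index.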

  7. Resurrecting hot dark matter - Large-scale structure from cosmic strings and massive neutrinos

    NASA Technical Reports Server (NTRS)

    Scherrer, Robert J.

    1988-01-01

    These are the results of a numerical simulation of the formation of large-scale structure from cosmic-string loops in a universe dominated by massive neutrinos (hot dark matter). This model has several desirable features. The final matter distribution contains isolated density peaks embedded in a smooth background, producing a natural bias in the distribution of luminous matter. Because baryons can accrete onto the cosmic strings before the neutrinos, the galaxies will have baryon cores and dark neutrino halos. Galaxy formation in this model begins much earlier than in random-phase models. On large scales the distribution of clustered matter visually resembles the CfA survey, with large voids and filaments.

  9. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  10. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  11. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  12. Firebrands and spotting ignition in large-scale fires

    Treesearch

    Eunmo Koo; Patrick J. Pagni; David R. Weise; John P. Woycheese

    2010-01-01

    Spotting ignition by lofted firebrands is a significant mechanism of fire spread, as observed in many large-scale fires. The role of firebrands in fire propagation and the important parameters involved in spot fire development are studied. Historical large-scale fires, including wind-driven urban and wildland conflagrations and post-earthquake fires, are given as...

  13. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  14. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  15. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920's), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with these early procedures have disclosed many shortcomings, e.g., ghost images, occlusion, and shadow. Thus, providing the technical basis (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the near-future national deployment of large-scale digital orthophotos and for the revision of the Standards for National Large-scale City Digital Orthophoto in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing; (2) spatial object/feature extraction using surface material information and high-accuracy 3D DSM data; (3) 3D city model development; (4) algorithm development for generation of DTM-based and DBM-based orthophotos; (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos; and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.

  16. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  17. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  19. Developing and Understanding Methods for Large-Scale Nonlinear Optimization

    DTIC Science & Technology

    2006-07-24

    algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with many thousands of variables. Published in peer-reviewed journals: E. Eskow, B. Bader, R. Byrd, S. Crivelli, T. Head-Gordon, V. Lamberti and R. Schnabel, "An optimization approach to the
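Limited-memory methods of the kind referenced here are typically built around the L-BFGS two-loop recursion, which forms a quasi-Newton search direction from only the last few curvature pairs (s_k, y_k) rather than an n × n Hessian. A minimal sketch (plain Python lists instead of a linear-algebra library; this is the standard textbook recursion, not code from the report):

```python
def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lbfgs_direction(grad, s_list, y_list):
    """L-BFGS two-loop recursion: approximate -H^{-1} grad from stored
    pairs s_k = x_{k+1} - x_k and y_k = g_{k+1} - g_k (oldest first)."""
    rhos = [1.0 / dot(y, s) for s, y in zip(s_list, y_list)]
    q = list(grad)
    alphas = []
    # first loop: newest pair to oldest
    for s, y, rho in reversed(list(zip(s_list, y_list, rhos))):
        alpha = rho * dot(s, q)
        alphas.append(alpha)
        q = [qi - alpha * yi for qi, yi in zip(q, y)]
    # initial inverse-Hessian scaling gamma = s.y / y.y
    if s_list:
        gamma = dot(s_list[-1], y_list[-1]) / dot(y_list[-1], y_list[-1])
    else:
        gamma = 1.0
    r = [gamma * qi for qi in q]
    # second loop: oldest pair to newest
    for (s, y, rho), alpha in zip(zip(s_list, y_list, rhos),
                                  reversed(alphas)):
        beta = rho * dot(y, r)
        r = [ri + (alpha - beta) * si for ri, si in zip(r, s)]
    return [-ri for ri in r]
```

With exact curvature pairs for a quadratic, the recursion recovers the Newton direction; in general it needs only O(mn) memory for m stored pairs, which is what makes it usable with many thousands of variables.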

  20. Probabilistic Cuing in Large-Scale Environmental Search

    ERIC Educational Resources Information Center

    Smith, Alastair D.; Hood, Bruce M.; Gilchrist, Iain D.

    2010-01-01

    Finding an object in our environment is an important human ability that also represents a critical component of human foraging behavior. One type of information that aids efficient large-scale search is the likelihood of the object being in one location over another. In this study we investigated the conditions under which individuals respond to…

  1. Feasibility of large-scale aquatic microcosms. Final report

    SciTech Connect

    Pease, T.; Wyman, R.L.; Logan, D.T.; Logan, C.M.; Lispi, D.R.

    1982-02-01

    Microcosms have been used to study a number of fundamental ecological principles and more recently to investigate the effects of man-made perturbations on ecosystems. In this report the feasibility of using large-scale microcosms to assess aquatic impacts of power generating facilities is evaluated. Aquatic problems of concern to utilities are outlined, and various research approaches, including large and small microcosms, bioassays, and other laboratory experiments, are discussed. An extensive critical review and synthesis of the literature on recent microcosm research, which includes a comparison of the factors influencing physical, chemical, and biological processes in small vs large microcosms and in microcosms vs nature, led the authors to conclude that large-scale microcosms offer several advantages over other study techniques for particular types of problems. A hypothetical large-scale facility simulating a lake ecosystem is presented to illustrate the size, cost, and complexity of such facilities. The rationale for designing a lake-simulating large-scale microcosm is presented.

  2. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities' respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  3. Improving the Utility of Large-Scale Assessments in Canada

    ERIC Educational Resources Information Center

    Rogers, W. Todd

    2014-01-01

    Principals and teachers do not use large-scale assessment results because the lack of distinct and reliable subtests prevents identifying strengths and weaknesses of students and instruction, the results arrive too late to be used, and principals and teachers need assistance to use the results to improve instruction so as to improve student…

  4. Research directions in large scale systems and decentralized control

    NASA Technical Reports Server (NTRS)

    Tenney, R. R.

    1980-01-01

    Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.

  5. Ecosystem resilience despite large-scale altered hydro climatic conditions

    USDA-ARS?s Scientific Manuscript database

    Climate change is predicted to increase both drought frequency and duration, and when coupled with substantial warming, will establish a new hydroclimatological paradigm for many regions. Large-scale, warm droughts have recently impacted North America, Africa, Europe, Amazonia, and Australia result...

  6. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  7. Large-Scale Assessments and Educational Policies in Italy

    ERIC Educational Resources Information Center

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  8. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  9. Current Scientific Issues in Large Scale Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Miller, T. L. (Compiler)

    1986-01-01

    Topics in large scale atmospheric dynamics are discussed. Aspects of atmospheric blocking, the influence of transient baroclinic eddies on planetary-scale waves, cyclogenesis, the effects of orography on planetary scale flow, small scale frontal structure, and simulations of gravity waves in frontal zones are discussed.

  10. Large-Scale Assessments and Educational Policies in Italy

    ERIC Educational Resources Information Center

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  11. Large scale fire whirls: Can their formation be predicted?

    Treesearch

    J. Forthofer; Bret Butler

    2010-01-01

    Large scale fire whirls have not traditionally been recognized as a frequent phenomenon on wildland fires. However, there are anecdotal data suggesting that they can and do occur with some regularity. This paper presents a brief summary of this information and an analysis of the causal factors leading to their formation.

  12. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  13. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  14. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  15. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  16. Individual Skill Differences and Large-Scale Environmental Learning

    ERIC Educational Resources Information Center

    Fields, Alexa W.; Shelton, Amy L.

    2006-01-01

    Spatial skills are known to vary widely among normal individuals. This project was designed to address whether these individual differences are differentially related to large-scale environmental learning from route (ground-level) and survey (aerial) perspectives. Participants learned two virtual environments (route and survey) with limited…

  17. Newton Methods for Large Scale Problems in Machine Learning

    ERIC Educational Resources Information Center

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  18. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  19. Global smoothing and continuation for large-scale molecular optimization

    SciTech Connect

    More, J.J.; Wu, Zhijun

    1995-10-01

    We discuss the formulation of optimization problems that arise in the study of distance geometry, ionic systems, and molecular clusters. We show that continuation techniques based on global smoothing are applicable to these molecular optimization problems, and we outline the issues that must be resolved in the solution of large-scale molecular optimization problems.
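The global-smoothing-and-continuation idea can be illustrated on a 1D objective (this is an illustration of the technique, not the authors' formulation): for f(x) = x² + a·cos(bx), Gaussian smoothing with width σ has the closed form f_σ(x) = x² + σ² + a·e^(-b²σ²/2)·cos(bx). Minimizing progressively less-smoothed surrogates, warm-starting each stage from the last, lets plain gradient descent reach the global minimum where a direct descent would stall in a local well; all parameter values are illustrative.

```python
import math

A, B = 2.0, 6.0  # amplitude and frequency of the oscillatory term

def smoothed(x, sigma):
    # Gaussian-smoothed f(x) = x^2 + A*cos(B*x):
    #   f_sigma(x) = x^2 + sigma^2 + A*exp(-B^2 sigma^2 / 2)*cos(B*x)
    return x * x + sigma * sigma + \
        A * math.exp(-0.5 * B * B * sigma * sigma) * math.cos(B * x)

def grad(x, sigma):
    return 2 * x - A * B * math.exp(-0.5 * B * B * sigma * sigma) \
        * math.sin(B * x)

def continuation_minimize(x0, sigmas=(1.0, 0.5, 0.25, 0.1, 0.0),
                          lr=0.01, steps=500):
    """Gradient descent on successively less-smoothed surrogates,
    warm-starting each stage from the previous minimizer."""
    x = x0
    for sigma in sigmas:
        for _ in range(steps):
            x -= lr * grad(x, sigma)
    return x
```

Starting from x0 = 2.0, the continuation path tracks the smoothed minimum down to the global minimum of f near |x| ≈ 0.51, whereas running only the final (unsmoothed) stage from the same start lands in a higher local well near x ≈ 1.5.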

  20. Large-scale Eucalyptus energy farms and power cogeneration

    Treesearch

    Robert C. Noroña

    1983-01-01

    A thorough evaluation of all factors possibly affecting a large-scale planting of eucalyptus is foremost in determining the cost effectiveness of the planned operation. Seven basic areas of concern must be analyzed: (1) species selection; (2) site preparation; (3) planting; (4) weed control; (5)...

  1. Probabilistic Cuing in Large-Scale Environmental Search

    ERIC Educational Resources Information Center

    Smith, Alastair D.; Hood, Bruce M.; Gilchrist, Iain D.

    2010-01-01

    Finding an object in our environment is an important human ability that also represents a critical component of human foraging behavior. One type of information that aids efficient large-scale search is the likelihood of the object being in one location over another. In this study we investigated the conditions under which individuals respond to…

  2. Lessons from Large-Scale Renewable Energy Integration Studies: Preprint

    SciTech Connect

    Bird, L.; Milligan, M.

    2012-06-01

    In general, large-scale integration studies in Europe and the United States find that high penetrations of renewable generation are technically feasible with operational changes and increased access to transmission. This paper describes other key findings such as the need for fast markets, large balancing areas, system flexibility, and the use of advanced forecasting.

  3. The large scale microwave background anisotropy in decaying particle cosmology

    SciTech Connect

    Panek, M.

    1987-06-01

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay z_d < 3-5. 12 refs., 2 figs.

  4. Large-scale search for dark-matter axions

    SciTech Connect

    Kinion, D; van Bibber, K

    2000-08-30

    We review the status of two ongoing large-scale searches for axions which may constitute the dark matter of our Milky Way halo. The experiments are based on the microwave cavity technique proposed by Sikivie and mark a "second generation" relative to the original experiments performed by the Rochester-Brookhaven-Fermilab collaboration and the University of Florida group.

  5. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  6. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  7. The Role of Plausible Values in Large-Scale Surveys

    ERIC Educational Resources Information Center

    Wu, Margaret

    2005-01-01

    In large-scale assessment programs such as NAEP, TIMSS and PISA, students' achievement data sets provided for secondary analysts contain so-called "plausible values." Plausible values are multiple imputations of the unobservable latent achievement for each student. In this article it has been shown how plausible values are used to: (1)…
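One standard use of plausible values is pooling secondary-analysis results with Rubin's multiple-imputation rules: compute the statistic once per plausible value, average the estimates, and combine within- and between-imputation variance. A minimal sketch with illustrative numbers (function name and data are hypothetical):

```python
import statistics

def combine_plausible_values(pv_estimates, pv_sampling_vars):
    """Pool M per-plausible-value estimates with Rubin's rules:
    point estimate = mean of the M estimates;
    total variance = mean sampling variance
                     + (1 + 1/M) * between-imputation variance."""
    M = len(pv_estimates)
    estimate = sum(pv_estimates) / M
    within = sum(pv_sampling_vars) / M          # avg sampling variance
    between = statistics.variance(pv_estimates)  # variance across PVs
    total_var = within + (1 + 1 / M) * between
    return estimate, total_var
```

For example, five plausible-value mean scores of 500, 502, 498, 501, 499, each with sampling variance 4.0, pool to an estimate of 500 with total variance 7.0; the between-imputation term is what reflects the uncertainty in the latent achievement.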

  8. Large-scale silicon optical switches for optical interconnection

    NASA Astrophysics Data System (ADS)

    Qiao, Lei; Tang, Weijie; Chu, Tao

    2016-11-01

    Large-scale optical switches are in great demand for building optical interconnections in data centers and high-performance computers (HPCs). Silicon optical switches have the advantages of being compact and CMOS-process compatible, so they can easily be monolithically integrated. However, constructing silicon optical switches with large port counts is difficult. One obstacle is the non-uniformity of the switch units in large-scale silicon optical switches, which arises from fabrication error and complicates finding each unit's optimum operating point. In this paper, we propose a method to detect the optimum operating points in a large-scale switch with a limited number of built-in power monitors. We also propose methods for improving the unbalanced crosstalk between the cross and bar states of silicon electro-optic MZI switches, as well as their insertion losses. Our recent progress in large-scale silicon optical switches, including 64 × 64 thermo-optic and 32 × 32 electro-optic switches, will be introduced; to the best of our knowledge, both are the largest-scale silicon optical switches of their respective types. The switches were fabricated on 340-nm SOI substrates with CMOS 180-nm processes. The crosstalk of the 32 × 32 electro-optic switch was -19.2 dB to -25.1 dB, while that of the 64 × 64 thermo-optic switch was -30 dB to -48.3 dB.

  9. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities' respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  10. Computational Complexity, Efficiency and Accountability in Large Scale Teleprocessing Systems.

    DTIC Science & Technology

    1980-12-01

    COMPLEXITY, EFFICIENCY AND ACCOUNTABILITY IN LARGE SCALE TELEPROCESSING SYSTEMS, DAAG29-78-C-0036, STANFORD UNIVERSITY, JOHN T. GILL, MARTIN E. HELLMAN. ...solve but easy to check. We have also suggested how such random tapes can be simulated by deterministically generating "pseudorandom" numbers by a

  11. Large-Scale Assessment and English Language Learners with Disabilities

    ERIC Educational Resources Information Center

    Liu, Kristin K.; Ward, Jenna M.; Thurlow, Martha L.; Christensen, Laurene L.

    2017-01-01

    This article highlights a set of principles and guidelines, developed by a diverse group of specialists in the field, for appropriately including English language learners (ELLs) with disabilities in large-scale assessments. ELLs with disabilities make up roughly 9% of the rapidly increasing ELL population nationwide. In spite of the small overall…

  12. Large-scale silviculture experiments of western Oregon and Washington.

    Treesearch

    Nathan J. Poage; Paul D. Anderson

    2007-01-01

    We review 12 large-scale silviculture experiments (LSSEs) in western Washington and Oregon with which the Pacific Northwest Research Station of the USDA Forest Service is substantially involved. We compiled and arrayed information about the LSSEs as a series of matrices in a relational database, which is included on the compact disc published with this report and...

  13. Newton Methods for Large Scale Problems in Machine Learning

    ERIC Educational Resources Information Center

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  14. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…
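The nearest-neighbor search problem mentioned here can be stated in a few lines. This brute-force O(n) scan is my own baseline sketch, not the thesis's method; large-scale systems replace it with hashing or quantization precisely because this scan does not scale to billions of items.

```python
import math

def nearest(query, points):
    """Return the point in `points` closest to `query` (Euclidean distance)."""
    return min(points, key=lambda p: math.dist(query, p))

db = [(0.0, 0.0), (1.0, 1.0), (5.0, 5.0)]
best = nearest((0.9, 1.2), db)  # -> (1.0, 1.0)
```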

  15. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

The capability of Earth observation for large, global-scale natural phenomena needs to be improved, and new observing platforms are expected. In recent years we have studied the concept of the Moon as an Earth-observation platform. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and offers the following advantages: a large observation range, variable view angles, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, large-scale ocean change, large-scale land-surface dynamic change, and solid-Earth dynamic change. To establish a Moon-based Earth observation platform, we plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic geoscience phenomena; sensor parameter optimization and methods of Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth observation platform itself; and a fundamental scientific framework for Moon-based Earth observation.

  16. Large-scale societal changes and intentionality - an uneasy marriage.

    PubMed

    Bodor, Péter; Fokas, Nikos

    2014-08-01

    Our commentary focuses on juxtaposing the proposed science of intentional change with facts and concepts pertaining to the level of large populations or changes on a worldwide scale. Although we find a unified evolutionary theory promising, we think that long-term and large-scale, scientifically guided - that is, intentional - social change is not only impossible, but also undesirable.

  17. Large-scale screening by the automated Wassermann reaction

    PubMed Central

    Wagstaff, W.; Firth, R.; Booth, J. R.; Bowley, C. C.

    1969-01-01

    In view of the drawbacks in the use of the Kahn test for large-scale screening of blood donors, mainly those of human error through work overload and fatiguability, an attempt was made to adapt an existing automated complement-fixation technique for this purpose. This paper reports the successful results of that adaptation. PMID:5776559

  18. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  19. Cosmic strings and the large-scale structure

    NASA Technical Reports Server (NTRS)

    Stebbins, Albert

    1988-01-01

A possible problem for cosmic string models of galaxy formation is presented. If very large voids are common and if loop fragmentation is not much more efficient than presently believed, then it may be impossible for string scenarios to produce the observed large-scale structure with Omega_0 = 1 and without strong environmental biasing.

  20. Large scale structure of the sun's radio corona

    NASA Technical Reports Server (NTRS)

    Kundu, M. R.

    1986-01-01

    Results of studies of large scale structures of the corona at long radio wavelengths are presented, using data obtained with the multifrequency radioheliograph of the Clark Lake Radio Observatory. It is shown that features corresponding to coronal streamers and coronal holes are readily apparent in the Clark Lake maps.