Science.gov

Sample records for air reveals large-scale

  1. DESIGN OF LARGE-SCALE AIR MONITORING NETWORKS

    EPA Science Inventory

    The potential effects of air pollution on human health have received much attention in recent years. In the U.S. and other countries, there are extensive large-scale monitoring networks designed to collect data to inform the public of exposure risks to air pollution. A major crit...

  2. Landsat 7 Reveals Large-scale Fractal Motion of Clouds

    NASA Technical Reports Server (NTRS)

    2002-01-01

    get carried along within the vortices, but these are soon mixed into the surrounding clouds. Landsat is unique in its ability both to image the small-scale eddies that mix clear and cloudy air, down to its 30-meter pixel size, and to provide a field of view wide enough, 180 km, to reveal the connection of the turbulence to large-scale flows such as the subtropical oceanic gyres. Landsat 7, with its new onboard digital recorder, has extended this capability beyond the few Landsat ground stations to remote areas such as Alejandro Selkirk Island, and thus is gradually providing a global dynamic picture of evolving human-scale phenomena. (For more details on von Karman vortices, refer to http://climate.gsfc.nasa.gov/cahalan) Image and caption courtesy Bob Cahalan, NASA GSFC

  3. Satellite measurements of large-scale air pollution: Methods

    SciTech Connect

    Kaufman, Y.J.; Fraser, R.S.; Ferrare, R.A.

    1990-06-20

    A method is presented for simultaneous determination of the aerosol optical thickness (τ_a), particle size (r_m, geometric mean mass radius for a lognormal distribution) and the single scattering albedo (ω_0, ratio between scattering and scattering + absorption) from satellite imagery. The method is based on satellite images of the surface (land and water) in the visible and near-IR bands and is applied here to the first two channels of the Advanced Very High Resolution Radiometer (AVHRR) sensor. The aerosol characteristics are obtained from the difference in the upward radiances, detected by the satellite, between a clear and a hazy day. Therefore the method is mainly useful for remote sensing of large-scale air pollution (e.g., smoke from a large fire or concentrated anthropogenic pollution), which introduces dense aerosol into the atmosphere (aerosol optical thickness ≥ 0.4) on top of an existing aerosol. The method is very sensitive to the stability of the surface reflectance between the clear day and the hazy day. It also requires accurate satellite calibration (preferably not more than 5% error) and stable calibration with good relative values between the two bands used in the analysis. With these requirements, the aerosol optical thickness can be derived with an error of Δτ_a = 0.08-0.15. For an assumed lognormal size distribution, the particle geometric mean mass radius r_m can be derived (if good calibration is available) with an error of Δr_m = ±(0.10-0.20) μm, and ω_0 with Δω_0 = ±0.03 for ω_0 close to 1 and Δω_0 = ±(0.03-0.07) for ω_0 about 0.8. The method was applied to AVHRR images of forest fire smoke.
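
    The two quantities retrieved alongside the optical thickness can be illustrated with a small sketch (a toy illustration, not the authors' retrieval code; all numerical values are assumptions):

```python
import numpy as np

# Toy illustration of two retrieved aerosol quantities (not the paper's
# retrieval method; the input values are assumptions for demonstration).

def single_scattering_albedo(scattering, absorption):
    """omega_0: ratio of scattering to scattering + absorption."""
    return scattering / (scattering + absorption)

def lognormal_mass_distribution(r, r_m, sigma_g):
    """Relative dM/d(ln r) for a lognormal size distribution with
    geometric mean mass radius r_m and geometric std sigma_g."""
    return np.exp(-0.5 * (np.log(r / r_m) / np.log(sigma_g)) ** 2) / (
        np.sqrt(2.0 * np.pi) * np.log(sigma_g))

omega_0 = single_scattering_albedo(scattering=0.9, absorption=0.1)  # 0.9
# the mass distribution peaks at r = r_m by construction
peak = lognormal_mass_distribution(0.3, r_m=0.3, sigma_g=2.0)
```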

  4. Cloud-based large-scale air traffic flow optimization

    NASA Astrophysics Data System (ADS)

    Cao, Yi

    The ever-increasing traffic demand makes the efficient use of airspace an imperative mission, and this paper presents an effort in response to this call. First, a new aggregate model, called the Link Transmission Model (LTM), is proposed, which models nationwide traffic as a network of flight routes identified by origin-destination pairs. The traversal time of a flight route is assumed to be the mode of the distribution of historical flight records, estimated using kernel density estimation. As this simplification abstracts away physical trajectory details, the complexity of modeling is drastically decreased, resulting in efficient traffic forecasting. The predictive capability of the LTM is validated against recorded traffic data. Second, a nationwide traffic flow optimization problem with airport and en route capacity constraints is formulated based on the LTM. The optimization problem aims at alleviating traffic congestion with minimal global delays. This problem is intractable due to millions of variables. A dual decomposition method is applied to decompose the large-scale problem so that the subproblems become solvable. However, the whole problem is still computationally expensive to solve, since each subproblem is a smaller integer programming problem that pursues integer solutions. Solving an integer programming problem is known to be far more time-consuming than solving its linear relaxation. In addition, sequential execution on a standalone computer leads to linear runtime growth as the problem size increases. To address this computational efficiency problem, a parallel computing framework is designed which accommodates concurrent execution via multithreaded programming. The multithreaded version is compared with its monolithic version to show the decreased runtime. Finally, an open-source cloud computing framework, Hadoop MapReduce, is employed for better scalability and reliability. This framework is an "off-the-shelf" parallel computing model
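
    The traversal-time estimate described in the abstract, the mode of historical flight-time records obtained via kernel density estimation, can be sketched as follows; the sample data and bandwidth are made up:

```python
import numpy as np

def kde_mode(samples, bandwidth, grid_size=512):
    """Mode of a 1-D sample via a Gaussian kernel density estimate."""
    samples = np.asarray(samples, dtype=float)
    grid = np.linspace(samples.min(), samples.max(), grid_size)
    # evaluate the (unnormalized) Gaussian KDE on the grid
    z = (grid[:, None] - samples[None, :]) / bandwidth
    density = np.exp(-0.5 * z ** 2).sum(axis=1)
    return grid[np.argmax(density)]

# hypothetical traversal times (minutes) for one origin-destination route;
# the mode is robust to the delayed outliers at 120 and 140
times = [95, 97, 98, 98, 99, 100, 100, 101, 120, 140]
typical = kde_mode(times, bandwidth=2.0)
```

    Unlike the mean, the KDE mode is insensitive to a few heavily delayed flights, which is presumably why it is preferred for a nominal traversal time.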

  5. Satellite measurements of large-scale air pollution - Methods

    NASA Astrophysics Data System (ADS)

    Kaufman, Yoram J.; Ferrare, Richard A.; Fraser, Robert S.

    1990-06-01

    A technique for deriving large-scale pollution parameters from NIR and visible satellite remote-sensing images obtained over land or water is described and demonstrated on AVHRR images. The method is based on comparison of the upward radiances on clear and hazy days and permits simultaneous determination of aerosol optical thickness with error Delta tau(a) = 0.08-0.15, particle size with error + or - 100-200 nm, and single-scattering albedo with error + or - 0.03 (for albedos near 1), all assuming accurate and stable satellite calibration and stable surface reflectance between the clear and hazy days. In the analysis of AVHRR images of smoke from a forest fire, good agreement was obtained between satellite and ground-based (sun-photometer) measurements of aerosol optical thickness, but the satellite particle sizes were systematically greater than those measured from the ground. The AVHRR single-scattering albedo agreed well with a Landsat albedo for the same smoke.

  6. Satellite measurements of large-scale air pollution - Methods

    NASA Technical Reports Server (NTRS)

    Kaufman, Yoram J.; Ferrare, Richard A.; Fraser, Robert S.

    1990-01-01

    A technique for deriving large-scale pollution parameters from NIR and visible satellite remote-sensing images obtained over land or water is described and demonstrated on AVHRR images. The method is based on comparison of the upward radiances on clear and hazy days and permits simultaneous determination of aerosol optical thickness with error Delta tau(a) = 0.08-0.15, particle size with error + or - 100-200 nm, and single-scattering albedo with error + or - 0.03 (for albedos near 1), all assuming accurate and stable satellite calibration and stable surface reflectance between the clear and hazy days. In the analysis of AVHRR images of smoke from a forest fire, good agreement was obtained between satellite and ground-based (sun-photometer) measurements of aerosol optical thickness, but the satellite particle sizes were systematically greater than those measured from the ground. The AVHRR single-scattering albedo agreed well with a Landsat albedo for the same smoke.

  7. Influence of large-scale atmospheric circulation on marine air intrusion toward the East Antarctic coast

    NASA Astrophysics Data System (ADS)

    Kurita, Naoyuki; Hirasawa, Naohiko; Koga, Seizi; Matsushita, Junji; Steen-Larsen, Hans Christian; Masson-Delmotte, Valérie; Fujiyoshi, Yasushi

    2016-09-01

    Marine air intrusions into Antarctica play a key role in high-precipitation events. Here we use shipboard observations of water vapor isotopologues between Australia and Syowa Station on the East Antarctic coast to elucidate the mechanism by which large-scale circulation influences marine air intrusions. The temporal isotopic variations at Syowa reflect the meridional movement of a marine air front. They are also associated with atmospheric circulation anomalies that enhance the southward movement of cyclones over the Southern Ocean. The relationship between large-scale circulation and the movement of the front is explained by northerly winds which, in association with cyclones, move toward the Antarctic coast and push marine air carrying isotopically enriched moisture into the interior, which is otherwise covered by glacial air with isotopically depleted moisture. Future changes in large-scale circulation may have a significant impact on the frequency and intensity of marine air intrusions into Antarctica.

  8. Comparison of large scale renewable energy projects for the United States air force

    NASA Astrophysics Data System (ADS)

    Hughes, Jeffrey S.

    This thesis focused on the performance of large-scale renewable energy projects for the United States Air Force. As global energy demands continue to rise, the need to find ways to save energy and produce alternative sources of energy will increase. The Federal Government has begun to address the challenge of energy production and energy security in recent years. In order to increase both energy production and energy security for the Air Force, there is a trend toward increasing the amount of renewable energy produced on military installations. The goal of this research was to compare the estimated and actual performance of these large-scale on-site renewable energy projects at Air Force installations. The variables considered for this research were the execution methods and the renewable energy sources. The performance of each project was evaluated against factors identified in previous sustainable construction studies. The study found that the actual performance of third-party owned and operated projects deviated less from expected performance than that of Air Force owned and operated projects, and that renewable energy projects deviated less from expected performance than the high-performance buildings examined in previous studies. The study also identified factors that contributed to the gap between expected and actual performance, including optimistic modeling, unusual weather, operational issues and higher-than-expected maintenance. The results of this research are an initial step toward understanding the actual performance of large-scale renewable energy projects.

  9. MtDNA metagenomics reveals large-scale invasion of belowground arthropod communities by introduced species.

    PubMed

    Cicconardi, Francesco; Borges, Paulo A V; Strasberg, Dominique; Oromí, Pedro; López, Heriberto; Pérez-Delgado, Antonio J; Casquet, Juliane; Caujapé-Castells, Juli; Fernández-Palacios, José María; Thébaud, Christophe; Emerson, Brent C

    2017-01-31

    Using a series of standardized sampling plots within forest ecosystems in remote oceanic islands, we reveal fundamental differences between the structuring of aboveground and belowground arthropod biodiversity that are likely due to large-scale species introductions by humans. Species of beetle and spider were sampled almost exclusively from single islands, while soil-dwelling Collembola exhibited more than tenfold higher species sharing among islands. Comparison of Collembola mitochondrial metagenomic data to a database of more than 80 000 Collembola barcode sequences revealed almost 30% of sampled island species are genetically identical, or near identical, to individuals sampled from often very distant geographic regions of the world. Patterns of mtDNA relatedness among Collembola implicate human-mediated species introductions, with minimum estimates for the proportion of introduced species on the sampled islands ranging from 45% to 88%. Our results call for more attention to soil mesofauna to understand the global extent and ecological consequences of species introductions.

  10. Drought Variability in Eastern Part of Romania and its Connection with Large-Scale Air Circulation

    NASA Astrophysics Data System (ADS)

    Barbu, Nicu; Stefan, Sabina; Georgescu, Florinela

    2014-05-01

    Drought is a phenomenon that appears due to precipitation deficit and is intensified by strong winds, high temperatures, low relative humidity and high insolation; all of these factors increase evapotranspiration and thus contribute to soil water deficit. The Standardized Precipitation Evapotranspiration Index (SPEI) takes all of these factors into account. The temporal variability of drought in the eastern part of Romania over 50 years (1961-2010) is investigated. This study focuses on drought variability related to large-scale air circulation. A gridded SPEI dataset with a spatial resolution of 0.5° lat/lon (https://digital.csic.es/handle/10261/72264) was used to analyze drought periods in connection with the large-scale air circulation determined from two catalogues (GWT - GrossWetter-Typen and WLK - WetterLagenKlassifikation) defined in COST733 Action. The GWT catalogue uses sea level pressure as its input dataset; the WLK catalogue uses the geopotential field at 925 hPa and 500 hPa, wind at 700 hPa and the total water content of the entire atmospheric column. In this study we use the GWT catalogue with 18 circulation types and the WLK catalogue with 40 circulation types. The analysis for the Barlad Hydrological Basin indicated that, as expected, negative SPEI values (water deficit - drought periods) are associated with a prevailing anticyclonic regime, and positive SPEI values (water excess - rainy periods) with a prevailing cyclonic regime. In the last decade, an increase in dry periods was observed, associated with increased anticyclonic activity over Romania. According to the GWT18 catalogue, droughts are associated with the north-eastern anticyclonic circulation type (NE-A). According to the WLK40 catalogue, the dominant circulation type associated with drought is the north-west anticyclonic dry (NW-AAD) type. keywords: drought, SPEI
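
    The idea behind SPEI can be sketched loosely as follows (the real index fits a log-logistic distribution to the climatic water balance; this simplified version merely standardizes it, and the monthly values are invented):

```python
import numpy as np

def simple_spei(precip, pet):
    """Standardized climatic water balance D = P - PET.
    Negative values indicate drier-than-average conditions."""
    d = np.asarray(precip, dtype=float) - np.asarray(pet, dtype=float)
    return (d - d.mean()) / d.std()

precip = [30, 42, 10, 55, 25, 5]   # monthly precipitation, mm (invented)
pet    = [40, 35, 45, 30, 38, 50]  # potential evapotranspiration, mm (invented)
index = simple_spei(precip, pet)   # negative entries flag water-deficit months
```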

  11. A Feasibility Study on Operating Large Scale Compressed Air Energy Storage in Porous Formations

    NASA Astrophysics Data System (ADS)

    Wang, B.; Pfeiffer, W. T.; Li, D.; Bauer, S.

    2015-12-01

    Compressed air energy storage (CAES) in porous formations has been considered a promising option for large-scale energy storage for decades. This study aims at analyzing the feasibility of operating large-scale CAES in porous formations and evaluating the performance of underground porous gas reservoirs. To address these issues quantitatively, a hypothetical CAES scenario in a typical anticline structure in northern Germany was numerically simulated. Because of the rapid growth in photovoltaics, the extraction periods of the daily cycle were set to the early morning and the late afternoon in order to bypass the massive solar energy production around noon. The gas turbine scenario was defined with reference to the specifications of the Huntorf CAES power plant. The numerical simulations involved two stages, i.e. initial fill and cyclic operation, and both were carried out using the Eclipse E300 simulator (Schlumberger). Pressure loss in the gas wells was analyzed afterwards using an analytical solution. The exergy concept was applied to evaluate the amount of energy stored in the specific porous formation. The simulation results show that porous formations are a feasible option for large-scale CAES. The initial fill with shut-in periods determines the spatial distribution of the gas phase and helps to achieve higher gas saturation around the wells, and thus higher deliverability. The performance evaluation shows that the overall exergy flow of the stored compressed air is also determined by the permeability, which directly affects the deliverability of the gas reservoir and thus the number of wells required.
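
    As a rough illustration of the exergy concept invoked above (not the paper's model), the ideal specific exergy of air stored isothermally at ambient temperature is e = R_air · T0 · ln(p/p0); the storage pressure chosen below is an assumption:

```python
import math

R_AIR = 287.0    # J/(kg*K), specific gas constant of dry air
T0 = 288.15      # K, ambient reference temperature
P0 = 101_325.0   # Pa, ambient reference pressure

def specific_exergy(p_storage):
    """Ideal isothermal specific exergy (J/kg) of air stored at p_storage."""
    return R_AIR * T0 * math.log(p_storage / P0)

e = specific_exergy(70e5)  # ~350 kJ/kg for an assumed reservoir at 70 bar
```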

  12. Characterization of microbial communities in exhaust air treatment systems of large-scale pig housing facilities.

    PubMed

    Haneke, J; Lee, N M; Gaul, T W; Van den Weghe, H F A

    2010-01-01

    Exhaust air treatment has gained importance as an essential factor in intensive livestock areas due to rising emissions into the environment. Wet filter walls of multi-stage exhaust air treatment systems precipitate gaseous ammonia and dust particles from the exhaust air into washing water. Microbial communities in the biomass that developed in the washing water of five large-scale exhaust air treatment units of pig housing facilities were investigated by fluorescence in situ hybridization (FISH) and 16S rDNA sequence analyses. No "standard" nitrifying bacteria were found in the washing water. Instead, mainly α-Proteobacteria, aggregating β- and γ-Proteobacteria, a large number of Actinobacteria, as well as individual Planctomycetales and Crenarchaeota were detected after more than twelve months' operation. The main Proteobacteria species present were affiliated with the families Alcaligenaceae, Comamonadaceae and Xanthomonadaceae. Furthermore, we investigated the consumption of inorganic nitrogen compounds in the washing water of one exhaust air treatment unit during a fattening period with and without pH control. Maintaining the pH at 6.0 resulted in a ca. fivefold higher ammonium concentration and a ca. fourfold lower concentration of oxidized nitrogen compounds at the end of the fattening period.

  13. Large-scale flow phenomena in axial compressors: Modeling, analysis, and control with air injectors

    NASA Astrophysics Data System (ADS)

    Hagen, Gregory Scott

    This thesis presents a large scale model of axial compressor flows that is detailed enough to describe the modal and spike stall inception processes, and is also amenable to dynamical systems analysis and control design. The research presented here is based on the model derived by Mezic, which shows that the flows are dominated by the competition between the blade forcing of the compressor and the overall pressure differential created by the compressor. This model describes the modal stall inception process in a similar manner as the Moore-Greitzer model, but also describes the cross sectional flow velocities, and exhibits full span and part span stall. All of these flow patterns described by the model agree with experimental data. Furthermore, the initial model is altered in order to describe the effects of three dimensional spike disturbances, which can destabilize the compressor at otherwise stable operating points. The three dimensional model exhibits flow patterns during spike stall inception that also appear in experiments. The second part of this research focuses on the dynamical systems analysis of, and control design with, the PDE model of the axial flow in the compressor. We show that the axial flow model can be written as a gradient system and illustrate some stability properties of the stalled flow. This also reveals that flows with multiple stall cells correspond to higher energy states in the compressor. The model is derived with air injection actuation, and globally stabilizing distributed controls are designed. We first present a locally optimal controller for the linearized system, and then use Lyapunov analysis to show sufficient conditions for global stability. The concept of sector nonlinearities is applied to the problem of distributed parameter systems, and by analyzing the sector property of the compressor characteristic function, completely decentralized controllers are derived. Finally, the modal decomposition and Lyapunov analysis used in

  14. Large-scale structural transitions in supercoiled DNA revealed by coarse-grained simulations

    NASA Astrophysics Data System (ADS)

    Krajina, Brad; Spakowitz, Andrew

    Topological constraints, such as DNA supercoiling, play an integral role in genomic regulation and organization in living systems. However, physical understanding of the principles that underlie DNA structure and organization at biologically-relevant length-scales remains a formidable challenge. We develop a coarse-grained simulation approach for predicting equilibrium conformations of supercoiled DNA. With this approach, we study the conformational transitions that arise due to supercoiling across the full range of supercoiling densities that are commonly explored by living systems. Simulations of ring DNA molecules with lengths up to the scale of topological domains in the E. coli chromosome (~10 kilobases) reveal large-scale structural transitions elicited by supercoiling, resulting in 3 supercoiling conformational regimes: chiral coils, extended plectonemes, and branched hyper-supercoils. These results capture the non-monotonic relationship of size versus degree of supercoiling observed in experimental sedimentation studies of supercoiled DNA, and our results provide a physical explanation of the structural transitions underlying this behavior.

  15. Large-Scale Phosphoproteomics Analysis of Whole Saliva Reveals a Distinct Phosphorylation Pattern

    PubMed Central

    Stone, Matthew D.; Chen, Xiaobing; McGowan, Thomas; Bandhakavi, Sricharan; Cheng, Bin; Rhodus, Nelson L.; Griffin, Timothy J.

    2011-01-01

    In-depth knowledge of bodily fluid phosphoproteomes, such as that of whole saliva, is limited. To better understand the whole saliva phosphoproteome, we generated a large-scale catalog of phosphorylated proteins. To circumvent the wide dynamic range of phosphoprotein abundance in whole saliva, we combined dynamic range compression using hexapeptide beads, strong cation exchange HPLC peptide fractionation, and immobilized metal affinity chromatography prior to mass spectrometry. In total, 217 unique phosphopeptides were identified, representing 85 distinct phosphoproteins at 2.3% global FDR. From these peptides, 129 distinct phosphorylation sites were identified, of which 57 were previously known, but only 11 of which had been previously identified in whole saliva. Cellular localization analysis revealed that salivary phosphoproteins had a distribution similar to all known salivary proteins, but with less relative representation in the “extracellular” and “plasma membrane” categories compared to salivary glycoproteins. Sequence alignment showed that phosphorylation occurred at acidic-directed kinase, proline-directed, and basophilic motifs. This differs from plasma phosphoproteins, in which phosphorylation predominantly occurs at sequences recognized by the Golgi casein kinase. Collectively, these results suggest diverse functions for salivary phosphoproteins and multiple kinases involved in their processing and secretion. In all, this study should lay the groundwork for future elucidation of the functions of salivary protein phosphorylation. PMID:21299198

  16. Evolutionary dynamics of influenza A nucleoprotein (NP) lineages revealed by large-scale sequence analyses.

    PubMed

    Xu, Jianpeng; Christman, Mary C; Donis, Ruben O; Lu, Guoqing

    2011-12-01

    Influenza A viral nucleoprotein (NP) plays a critical role in virus replication and host adaptation; however, the underlying molecular evolutionary dynamics of NP lineages are less well-understood. In this study, large-scale analyses of 5094 NP nucleotide sequences revealed eight distinct evolutionary lineages, including three host-specific lineages (human, classical swine and equine), two cross-host lineages (Eurasian avian-like swine and swine-origin human pandemic H1N1 2009) and three geographically isolated avian lineages (Eurasian, North American and Oceanian). The average nucleotide substitution rate of the NP lineages was estimated to be 2.4 × 10⁻³ substitutions per site per year, with the highest value observed in pandemic H1N1 2009 (3.4 × 10⁻³) and the lowest in equine (0.9 × 10⁻³). The estimated time to the most recent common ancestor (TMRCA) for each lineage demonstrated that the earliest human lineage was derived around 1906, and the latest pandemic H1N1 2009 lineage dated back to December 17, 2008. A marked time gap was found between the times when the viruses emerged and were first sampled, suggesting a crucial role for long-term surveillance of newly emerging viruses. The selection analyses showed that the human lineage had six positive selection sites, whereas pandemic H1N1 2009, classical swine, Eurasian avian and Eurasian swine had only one or two sites. Protein structure analyses revealed several positive selection sites located in epitope regions or host adaptation regions, indicating strong adaptation to host immune system pressures in influenza viruses. Along with previous studies, this study provides new insights into the evolutionary dynamics of influenza A NP lineages. Further lineage analyses of other gene segments will allow better understanding of influenza A virus evolution and assist in the improvement of global influenza surveillance.
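
    A quick back-of-envelope reading of the reported rate (the ~1,497-nt NP coding length used below is an assumption for illustration, not a figure from the abstract):

```python
rate = 2.4e-3   # substitutions per site per year (average from the abstract)
sites = 1497    # assumed NP coding length in nucleotides
subs_per_year = rate * sites  # ~3.6 expected substitutions across the gene per year
```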

  17. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    PubMed

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication.

  18. Large-Scale Meta-Analysis of Human Medial Frontal Cortex Reveals Tripartite Functional Organization

    PubMed Central

    Chang, Luke J.; Banich, Marie T.; Wager, Tor D.; Yarkoni, Tal

    2016-01-01

    The functional organization of human medial frontal cortex (MFC) is a subject of intense study. Using fMRI, the MFC has been associated with diverse psychological processes, including motor function, cognitive control, affect, and social cognition. However, there have been few large-scale efforts to comprehensively map specific psychological functions to subregions of medial frontal anatomy. Here we applied a meta-analytic data-driven approach to nearly 10,000 fMRI studies to identify putatively separable regions of MFC and determine which psychological states preferentially recruit their activation. We identified regions at several spatial scales on the basis of meta-analytic coactivation, revealing three broad functional zones along a rostrocaudal axis composed of 2–4 smaller subregions each. Multivariate classification analyses aimed at identifying the psychological functions most strongly predictive of activity in each region revealed a tripartite division within MFC, with each zone displaying a relatively distinct functional signature. The posterior zone was associated preferentially with motor function, the middle zone with cognitive control, pain, and affect, and the anterior with reward, social processing, and episodic memory. Within each zone, the more fine-grained subregions showed distinct, but subtler, variations in psychological function. These results provide hypotheses about the functional organization of medial prefrontal cortex that can be tested explicitly in future studies. SIGNIFICANCE STATEMENT Activation of medial frontal cortex in fMRI studies is associated with a wide range of psychological states ranging from cognitive control to pain. However, this high rate of activation makes it challenging to determine how these various processes are topologically organized across medial frontal anatomy. We conducted a meta-analysis across nearly 10,000 studies to comprehensively map psychological states to discrete subregions in medial frontal cortex

  19. Dusty Starbursts within a z=3 Large Scale Structure revealed by ALMA

    NASA Astrophysics Data System (ADS)

    Umehata, Hideki

    The role of large-scale structure is one of the most important themes in studying galaxy formation and evolution. However, it has remained largely a mystery, especially at z > 2. On the basis of our ALMA 1.1 mm observations in a z ~ 3 protocluster field, we suggest that submillimeter galaxies (SMGs) preferentially reside in the densest environments at z ~ 3. Furthermore, combining with Chandra X-ray data, we find a rich cluster of AGN-host SMGs at the core of the protocluster. Our results indicate vigorous star formation and accelerated supermassive black hole (SMBH) growth in this node of the cosmic web.

  20. Large-scale sequencing of human influenza reveals the dynamic nature of viral genome evolution.

    PubMed

    Ghedin, Elodie; Sengamalay, Naomi A; Shumway, Martin; Zaborsky, Jennifer; Feldblyum, Tamara; Subbu, Vik; Spiro, David J; Sitz, Jeff; Koo, Hean; Bolotov, Pavel; Dernovoy, Dmitry; Tatusova, Tatiana; Bao, Yiming; St George, Kirsten; Taylor, Jill; Lipman, David J; Fraser, Claire M; Taubenberger, Jeffery K; Salzberg, Steven L

    2005-10-20

    Influenza viruses are remarkably adept at surviving in the human population over a long timescale. The human influenza A virus continues to thrive even among populations with widespread access to vaccines, and continues to be a major cause of morbidity and mortality. The virus mutates from year to year, making the existing vaccines ineffective on a regular basis, and requiring that new strains be chosen for a new vaccine. Less-frequent major changes, known as antigenic shift, create new strains against which the human population has little protective immunity, thereby causing worldwide pandemics. The most recent pandemics include the 1918 'Spanish' flu, one of the most deadly outbreaks in recorded history, which killed 30-50 million people worldwide, the 1957 'Asian' flu, and the 1968 'Hong Kong' flu. Motivated by the need for a better understanding of influenza evolution, we have developed flexible protocols that make it possible to apply large-scale sequencing techniques to the highly variable influenza genome. Here we report the results of sequencing 209 complete genomes of the human influenza A virus, encompassing a total of 2,821,103 nucleotides. In addition to increasing markedly the number of publicly available, complete influenza virus genomes, we have discovered several anomalies in these first 209 genomes that demonstrate the dynamic nature of influenza transmission and evolution. This new, large-scale sequencing effort promises to provide a more comprehensive picture of the evolution of influenza viruses and of their pattern of transmission through human and animal populations. All data from this project are being deposited, without delay, in public archives.
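
    The abstract's totals allow a quick consistency check: dividing the reported nucleotide count by the number of genomes recovers the expected length of an influenza A genome (~13.5 kb):

```python
total_nt = 2_821_103   # total nucleotides sequenced (from the abstract)
genomes = 209          # complete genomes sequenced (from the abstract)
avg_length = total_nt / genomes  # ~13,498 nt per genome, matching the ~13.5 kb influenza A genome
```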

  1. Large-Scale Mitochondrial DNA Analysis of the Domestic Goat Reveals Six Haplogroups with High Diversity

    PubMed Central

    Naderi, Saeid; Rezaei, Hamid-Reza; Taberlet, Pierre; Zundel, Stéphanie; Rafat, Seyed-Abbas; Naghash, Hamid-Reza; El-Barody, Mohamed A. A.; Ertugrul, Okan; Pompanon, François

    2007-01-01

Background From the beginning of domestication, the transportation of domestic animals resulted in genetic and demographic processes that explain their present distribution and genetic structure. Thus studying the present genetic diversity helps to better understand the history of domestic species. Methodology/Principal Findings The genetic diversity of domestic goats has been characterized with 2430 individuals from all over the Old World, including 946 new individuals from regions poorly studied until now (mainly the Fertile Crescent). These individuals represented 1540 haplotypes for the HVI segment of the mitochondrial DNA (mtDNA) control region. This large-scale study allowed the establishment of a clear nomenclature of the goat maternal haplogroups. Only five of the six previously defined groups of haplotypes were divergent enough to be considered as different haplogroups. Moreover a new mitochondrial group has been localized around the Fertile Crescent. All groups showed very high haplotype diversity. Most of this diversity was distributed among groups and within geographic regions. The weak geographic structure may result from the worldwide distribution of the dominant A haplogroup (more than 90% of the individuals). The large-scale distribution of other haplogroups (except one) may be related to human migration. The recent fragmentation of local goat populations into discrete breeds is not detectable with mitochondrial markers. The estimation of demographic parameters from mismatch analyses showed that all groups had a recent demographic expansion corresponding roughly to the period when domestication took place. But even with a large data set it remains difficult to give relative dates of expansion for different haplogroups because of large confidence intervals. Conclusions/Significance We propose standard criteria for the definition of the different haplogroups based on the result of mismatch analysis and on the use of reference sequences. Such a

  2. Dynamics of Disagreement: Large-Scale Temporal Network Analysis Reveals Negative Interactions in Online Collaboration

    NASA Astrophysics Data System (ADS)

    Tsvetkova, Milena; García-Gavilanes, Ruth; Yasseri, Taha

    2016-11-01

    Disagreement and conflict are a fact of social life. However, negative interactions are rarely explicitly declared and recorded and this makes them hard for scientists to study. In an attempt to understand the structural and temporal features of negative interactions in the community, we use complex network methods to analyze patterns in the timing and configuration of reverts of article edits to Wikipedia. We investigate how often and how fast pairs of reverts occur compared to a null model in order to control for patterns that are natural to the content production or are due to the internal rules of Wikipedia. Our results suggest that Wikipedia editors systematically revert the same person, revert back their reverter, and come to defend a reverted editor. We further relate these interactions to the status of the involved editors. Even though the individual reverts might not necessarily be negative social interactions, our analysis points to the existence of certain patterns of negative social dynamics within the community of editors. Some of these patterns have not been previously explored and carry implications for the knowledge collection practice conducted on Wikipedia. Our method can be applied to other large-scale temporal collaboration networks to identify the existence of negative social interactions and other social processes.
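
    The null-model comparison described in this abstract can be illustrated as a simple permutation test: count how often an editor reverts back their reverter, then recount after shuffling who reverted whom. The event list and counting rule below are synthetic illustrations, not the paper's actual data or procedure.

```python
import random

# Synthetic (reverter, reverted) revert events; NOT data from the study.
events = [("A", "B"), ("B", "A"), ("C", "B"), ("A", "C"),
          ("C", "A"), ("B", "C"), ("A", "B"), ("B", "A")]

def mutual_pairs(evts):
    """Count events where the reverted editor had previously reverted the reverter."""
    seen, count = set(), 0
    for reverter, reverted in evts:
        if (reverted, reverter) in seen:
            count += 1
        seen.add((reverter, reverted))
    return count

observed = mutual_pairs(events)

# Null model: shuffle the "reverted" column, discarding accidental self-reverts,
# to estimate how many mutual pairs arise by chance alone.
rng = random.Random(0)
null_counts = []
for _ in range(1000):
    reverteds = [e[1] for e in events]
    rng.shuffle(reverteds)
    shuffled = [(r, t) for (r, _), t in zip(events, reverteds) if r != t]
    null_counts.append(mutual_pairs(shuffled))

# An excess of `observed` over the null distribution suggests systematic
# "revert back your reverter" behavior rather than chance.
```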

  3. A large scale screen reveals genes that mediate electrotaxis in Dictyostelium discoideum**

    PubMed Central

    Gao, Runchi; Zhao, Siwei; Jiang, Xupin; Sun, Yaohui; Zhao, Sanjun; Gao, Jing; Borleis, Jane; Willard, Stacey; Tang, Ming; Cai, Huaqing; Kamimura, Yoichiro; Huang, Yuesheng; Jiang, Jianxin; Huang, Zunxi; Mogilner, Alex; Pan, Tingrui; Devreotes, Peter N; Zhao, Min

    2015-01-01

    Directional cell migration in an electric field, a phenomenon called galvanotaxis or electrotaxis, occurs in many types of cells, and may play an important role in wound healing and development. Small extracellular electric fields can guide the migration of amoeboid cells, and here, we established a large-scale screening approach to search for mutants with electrotaxis phenotypes from a collection of 563 Dictyostelium discoideum strains with morphological defects. We identified 28 strains that were defective in electrotaxis and 10 strains with a slightly higher directional response. Using plasmid rescue followed by gene disruption, we identified some of the mutated genes, including some previously implicated in chemotaxis. Amongst these we studied PiaA, which encodes a critical component of TORC2, a kinase protein complex that transduces changes in motility by activating the kinase PKB (also known as Akt). Furthermore, we found that electrotaxis was decreased in mutants lacking gefA, rasC, rip3, lst8 or pkbR1, genes that encode other components of the TORC2-PKB pathway. Thus, we have developed a high-throughput screening technique that will be a useful tool to elucidate the molecular mechanisms of electrotaxis. PMID:26012633

  4. Dynamics of Disagreement: Large-Scale Temporal Network Analysis Reveals Negative Interactions in Online Collaboration

    PubMed Central

    Tsvetkova, Milena; García-Gavilanes, Ruth; Yasseri, Taha

    2016-01-01

    Disagreement and conflict are a fact of social life. However, negative interactions are rarely explicitly declared and recorded and this makes them hard for scientists to study. In an attempt to understand the structural and temporal features of negative interactions in the community, we use complex network methods to analyze patterns in the timing and configuration of reverts of article edits to Wikipedia. We investigate how often and how fast pairs of reverts occur compared to a null model in order to control for patterns that are natural to the content production or are due to the internal rules of Wikipedia. Our results suggest that Wikipedia editors systematically revert the same person, revert back their reverter, and come to defend a reverted editor. We further relate these interactions to the status of the involved editors. Even though the individual reverts might not necessarily be negative social interactions, our analysis points to the existence of certain patterns of negative social dynamics within the community of editors. Some of these patterns have not been previously explored and carry implications for the knowledge collection practice conducted on Wikipedia. Our method can be applied to other large-scale temporal collaboration networks to identify the existence of negative social interactions and other social processes. PMID:27808267

  5. Systematic Phenotyping of a Large-Scale Candida glabrata Deletion Collection Reveals Novel Antifungal Tolerance Genes

    PubMed Central

    Hiller, Ekkehard; Istel, Fabian; Tscherner, Michael; Brunke, Sascha; Ames, Lauren; Firon, Arnaud; Green, Brian; Cabral, Vitor; Marcet-Houben, Marina; Jacobsen, Ilse D.; Quintin, Jessica; Seider, Katja; Frohner, Ingrid; Glaser, Walter; Jungwirth, Helmut; Bachellier-Bassi, Sophie; Chauvel, Murielle; Zeidler, Ute; Ferrandon, Dominique; Gabaldón, Toni; Hube, Bernhard; d'Enfert, Christophe; Rupp, Steffen; Cormack, Brendan; Haynes, Ken; Kuchler, Karl

    2014-01-01

The opportunistic fungal pathogen Candida glabrata is a frequent cause of candidiasis, causing infections ranging from superficial to life-threatening disseminated disease. The inherent tolerance of C. glabrata to azole drugs makes this pathogen a serious clinical threat. To identify novel genes implicated in antifungal drug tolerance, we have constructed a large-scale C. glabrata deletion library consisting of 619 unique, individually bar-coded mutant strains, each lacking one specific gene, all together representing almost 12% of the genome. Functional analysis of this library in a series of phenotypic and fitness assays identified numerous genes required for growth of C. glabrata under normal or specific stress conditions, as well as a number of novel genes involved in tolerance to clinically important antifungal drugs such as azoles and echinocandins. We identified 38 deletion strains displaying strongly increased susceptibility to caspofungin, 28 of which encode proteins that have not previously been linked to echinocandin tolerance. Our results demonstrate the potential of the C. glabrata mutant collection as a valuable resource for functional genomics studies of this important fungal pathogen of humans, and for facilitating the identification of putative novel antifungal drug targets and virulence genes. PMID:24945925

  6. Tree Age Distributions Reveal Large-Scale Disturbance-Recovery Cycles in Three Tropical Forests.

    PubMed

    Vlam, Mart; van der Sleen, Peter; Groenendijk, Peter; Zuidema, Pieter A

    2016-01-01

    Over the past few decades there has been a growing realization that a large share of apparently 'virgin' or 'old-growth' tropical forests carries a legacy of past natural or anthropogenic disturbances that have a substantial effect on present-day forest composition, structure and dynamics. Yet, direct evidence of such disturbances is scarce and comparisons of disturbance dynamics across regions even more so. Here we present a tree-ring based reconstruction of disturbance histories from three tropical forest sites in Bolivia, Cameroon, and Thailand. We studied temporal patterns in tree regeneration of shade-intolerant tree species, because establishment of these trees is indicative for canopy disturbance. In three large areas (140-300 ha), stem disks and increment cores were collected for a total of 1154 trees (>5 cm diameter) from 12 tree species to estimate the age of every tree. Using these age estimates we produced population age distributions, which were analyzed for evidence of past disturbance. Our approach allowed us to reconstruct patterns of tree establishment over a period of around 250 years. In Bolivia, we found continuous regeneration rates of three species and a peaked age distribution of a long-lived pioneer species. In both Cameroon and Thailand we found irregular age distributions, indicating strongly reduced regeneration rates over a period of 10-60 years. Past fires, windthrow events or anthropogenic disturbances all provide plausible explanations for the reported variation in tree age across the three sites. Our results support the recent idea that the long-term dynamics of tropical forests are impacted by large-scale disturbance-recovery cycles, similar to those driving temperate forest dynamics.
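
    The age-distribution analysis can be sketched as binning estimated establishment years into decades and flagging decades with strongly reduced regeneration. The establishment years below are synthetic, and the "half the median" threshold is an illustrative choice, not the study's method.

```python
from collections import Counter

# Synthetic estimated establishment years (NOT the study's 1154-tree data set).
est_years = [1790, 1805, 1812, 1818, 1823, 1841, 1844, 1848,
             1852, 1856, 1859, 1893, 1897, 1902, 1907, 1911]

# Bin establishment years into decades, keeping empty decades visible.
decades = Counter((y // 10) * 10 for y in est_years)
counts = {d: decades.get(d, 0) for d in range(1790, 1920, 10)}

# Flag decades whose regeneration falls well below the long-term median:
# candidate disturbance-recovery gaps in the age distribution.
median = sorted(counts.values())[len(counts) // 2]
gaps = [d for d, c in counts.items() if c < 0.5 * median]
```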

  7. Tree Age Distributions Reveal Large-Scale Disturbance-Recovery Cycles in Three Tropical Forests

    PubMed Central

    Vlam, Mart; van der Sleen, Peter; Groenendijk, Peter; Zuidema, Pieter A.

    2017-01-01

    Over the past few decades there has been a growing realization that a large share of apparently ‘virgin’ or ‘old-growth’ tropical forests carries a legacy of past natural or anthropogenic disturbances that have a substantial effect on present-day forest composition, structure and dynamics. Yet, direct evidence of such disturbances is scarce and comparisons of disturbance dynamics across regions even more so. Here we present a tree-ring based reconstruction of disturbance histories from three tropical forest sites in Bolivia, Cameroon, and Thailand. We studied temporal patterns in tree regeneration of shade-intolerant tree species, because establishment of these trees is indicative for canopy disturbance. In three large areas (140–300 ha), stem disks and increment cores were collected for a total of 1154 trees (>5 cm diameter) from 12 tree species to estimate the age of every tree. Using these age estimates we produced population age distributions, which were analyzed for evidence of past disturbance. Our approach allowed us to reconstruct patterns of tree establishment over a period of around 250 years. In Bolivia, we found continuous regeneration rates of three species and a peaked age distribution of a long-lived pioneer species. In both Cameroon and Thailand we found irregular age distributions, indicating strongly reduced regeneration rates over a period of 10–60 years. Past fires, windthrow events or anthropogenic disturbances all provide plausible explanations for the reported variation in tree age across the three sites. Our results support the recent idea that the long-term dynamics of tropical forests are impacted by large-scale disturbance-recovery cycles, similar to those driving temperate forest dynamics. PMID:28105034

  8. Large-scale Models Reveal the Two-component Mechanics of Striated Muscle

    PubMed Central

    Jarosch, Robert

    2008-01-01

This paper provides a comprehensive explanation of striated muscle mechanics and contraction on the basis of filament rotations. Helical proteins, particularly the coiled-coils of tropomyosin, myosin and α-actinin, shorten their H-bonds cooperatively and produce torque and filament rotations when the Coulombic net-charge repulsion of their highly charged side-chains is diminished by interaction with ions. The classical “two-component model” of active muscle differentiated a “contractile component” which stretches the “series elastic component” during force production. The contractile components are the helically shaped thin filaments of muscle that shorten the sarcomeres by clockwise drilling into the myosin cross-bridges with torque decrease (= force-deficit). Muscle stretch means drawing out the thin filament helices off the cross-bridges under passive counterclockwise rotation with torque increase (= stretch activation). Since each thin filament is anchored by four elastic α-actinin Z-filaments (provided with force-regulating sites for Ca2+ binding), the thin filament rotations change the torsional twist of the four Z-filaments as the “series elastic components”. Large-scale models simulate the changes of structure and force in the Z-band by the different Z-filament twisting stages A, B, C, D, E, F and G. Stage D corresponds to the isometric state. The basic phenomena of muscle physiology, i.e. latency relaxation, Fenn-effect, the force-velocity relation, the length-tension relation, unexplained energy, shortening heat, the Huxley-Simmons phases, etc. are explained and interpreted with the help of the model experiments. PMID:19330099

  9. Large-scale transport of a CO-enhanced air mass from Europe to the Middle East

    NASA Technical Reports Server (NTRS)

    Connors, V. S.; Miles, T.; Reichle, H. G., Jr.

    1989-01-01

On November 14, 1981, the shuttle-borne Measurement of Air Pollution from Satellites (MAPS) experiment observed a carbon monoxide (CO) enhanced air mass in the middle troposphere over the Middle East. The primary source of this polluted air was estimated by constructing adiabatic isentropic trajectories backwards from the MAPS measurement location over a 36 h period. The isentropic diagnostics indicate that CO-enhanced air was transported southeastward over the Mediterranean from an organized synoptic-scale weather regime, albeit of moderate intensity, influencing central Europe on November 12. Examination of the evolving synoptic scale vertical velocity and precipitation patterns during this period, in conjunction with Meteosat visible, infrared, and water vapor imagery, suggests that the presence of this disturbed weather system over Europe may have created upward transport of CO-enhanced air between the boundary-layer and midtropospheric levels, and subsequent entrainment in the large-scale northwesterly jet stream flow over Europe and the Mediterranean.

  10. Large-scale inhomogeneity in sapphire test masses revealed by Rayleigh scattering imaging

    NASA Astrophysics Data System (ADS)

    Yan, Zewu; Ju, Li; Eon, François; Gras, Slawomir; Zhao, Chunnong; Jacob, John; Blair, David G.

    2004-03-01

    Rayleigh scattering in test masses can introduce noise and reduce the sensitivity of laser interferometric gravitational wave detectors. In this paper, we present laser Rayleigh scattering imaging as a technique to investigate sapphire test masses. The system provides three-dimensional Rayleigh scattering mapping of entire test masses and quantitative evaluation of the Rayleigh scattering coefficient. Rayleigh scattering mapping of two sapphire samples reveals point defects as well as inhomogeneous structures in the samples. We present results showing significant non-uniform scattering within two 4.5 kg sapphire test masses manufactured by the heat exchanger method.

  11. Important aspects of Eastern Mediterranean large-scale variability revealed from data of three fixed observatories

    NASA Astrophysics Data System (ADS)

    Bensi, Manuel; Velaoras, Dimitris; Cardin, Vanessa; Perivoliotis, Leonidas; Pethiakis, George

    2015-04-01

Long-term variations of temperature and salinity observed in the Adriatic and Aegean Seas seem to be regulated by larger-scale circulation modes of the Eastern Mediterranean (EMed) Sea, such as the recently discovered feedback mechanisms, namely the BiOS (Bimodal Oscillating System) and the internal thermohaline pump theories. These theories are the results of interpretation of many years' observations, highlighting possible interactions between two key regions of the EMed. Although repeated oceanographic cruises carried out in the past or planned for the future are a very useful tool for understanding the interaction between the two basins (e.g. alternating dense water formation, salt ingressions), recent long time-series of high frequency (up to 1h) sampling have added valuable information to the interpretation of internal mechanisms for both areas (i.e. mesoscale eddies, evolution of fast internal processes, etc.). During the last 10 years, three deep observatories were deployed and maintained in the Adriatic, Ionian, and Aegean Seas: they are, respectively, the E2-M3A, the Pylos, and the E1-M3A. All are part of the largest European network of Fixed Point Open Ocean Observatories (FixO3, http://www.fixo3.eu/). Herein, from the analysis of temperature, salinity, and potential density time series collected at the three sites from the surface down to the intermediate and deep layers, we will discuss the almost perfectly anti-correlated behavior between the Adriatic and the Aegean Seas. Our data, collected almost continuously since 2006, reveal that these observatories well represent the thermohaline variability of their own areas. Interestingly, temperature and salinity in the intermediate layer suddenly increased in the South Adriatic from the end of 2011, exactly when they started decreasing in the Aegean Sea. Moreover, Pylos data used together with additional ones (e.g. Absolute dynamic topography, temperature and salinity data from other platforms) collected

  12. Acoustic Telemetry Reveals Large-Scale Migration Patterns of Walleye in Lake Huron

    PubMed Central

    Hayden, Todd A.; Holbrook, Christopher M.; Fielder, David G.; Vandergoot, Christopher S.; Bergstedt, Roger A.; Dettmers, John M.; Krueger, Charles C.; Cooke, Steven J.

    2014-01-01

    Fish migration in large freshwater lacustrine systems such as the Laurentian Great Lakes is not well understood. The walleye (Sander vitreus) is an economically and ecologically important native fish species throughout the Great Lakes. In Lake Huron walleye has recently undergone a population expansion as a result of recovery of the primary stock, stemming from changing food web dynamics. During 2011 and 2012, we used acoustic telemetry to document the timing and spatial scale of walleye migration in Lake Huron and Saginaw Bay. Spawning walleye (n = 199) collected from a tributary of Saginaw Bay were implanted with acoustic tags and their migrations were documented using acoustic receivers (n = 140) deployed throughout U.S. nearshore waters of Lake Huron. Three migration pathways were described using multistate mark-recapture models. Models were evaluated using the Akaike Information Criterion. Fish sex did not influence migratory behavior but did affect migration rate and walleye were detected on all acoustic receiver lines. Most (95%) tagged fish migrated downstream from the riverine tagging and release location to Saginaw Bay, and 37% of these fish emigrated from Saginaw Bay into Lake Huron. Remarkably, 8% of walleye that emigrated from Saginaw Bay were detected at the acoustic receiver line located farthest from the release location more than 350 km away. Most (64%) walleye returned to the Saginaw River in 2012, presumably for spawning. Our findings reveal that fish from this stock use virtually the entirety of U.S. nearshore waters of Lake Huron. PMID:25506913
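
    Model selection with the Akaike Information Criterion, as mentioned above, trades goodness of fit against parameter count. The candidate models, log-likelihoods, and parameter counts below are hypothetical placeholders, not values from the study.

```python
def aic(log_likelihood, n_params):
    """Akaike Information Criterion: AIC = 2k - 2 ln(L); lower is better."""
    return 2 * n_params - 2 * log_likelihood

# Hypothetical fitted multistate models (names and numbers are illustrative).
candidates = {
    "sex-independent movement": aic(-812.4, 5),
    "sex-dependent movement": aic(-810.9, 8),
    "time-varying movement": aic(-798.2, 14),
}

best = min(candidates, key=candidates.get)
# Delta-AIC relative to the best model; differences greater than ~2 indicate
# substantially less support for a candidate.
delta = {m: candidates[m] - candidates[best] for m in candidates}
```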

  13. Large scale RNAi reveals the requirement of nuclear envelope breakdown for nuclear import of human papillomaviruses.

    PubMed

    Aydin, Inci; Weber, Susanne; Snijder, Berend; Samperio Ventayol, Pilar; Kühbacher, Andreas; Becker, Miriam; Day, Patricia M; Schiller, John T; Kann, Michael; Pelkmans, Lucas; Helenius, Ari; Schelhaas, Mario

    2014-05-01

A two-step, high-throughput RNAi silencing screen was used to identify host cell factors required during human papillomavirus type 16 (HPV16) infection. Analysis of validated hits implicated a cluster of mitotic genes and revealed a previously undetermined mechanism for import of the viral DNA (vDNA) into the nucleus. In interphase cells, viruses were endocytosed, routed to the perinuclear area, and uncoated, but the vDNA failed to be imported into the nucleus. Upon nuclear envelope perforation in interphase cells, HPV16 infection occurred. During mitosis, the vDNA and L2 associated with host cell chromatin on the metaphase plate. Hence, we propose that HPV16 requires nuclear envelope breakdown during mitosis for access of the vDNA to the nucleoplasm. The results accentuate the value of genes found by RNAi screens for investigation of viral infections. The list of cell functions required during HPV16 infection will, moreover, provide a resource for future virus-host cell interaction studies.

  14. Acoustic telemetry reveals large-scale migration patterns of walleye in Lake Huron

    USGS Publications Warehouse

    Hayden, Todd A.; Holbrook, Christopher; Fielder, David G.; Vandergoot, Christopher S.; Bergstedt, Roger A.; Dettmers, John M.; Krueger, Charles C.; Cooke, Steven J.

    2014-01-01

    Fish migration in large freshwater lacustrine systems such as the Laurentian Great Lakes is not well understood. The walleye (Sander vitreus) is an economically and ecologically important native fish species throughout the Great Lakes. In Lake Huron walleye has recently undergone a population expansion as a result of recovery of the primary stock, stemming from changing food web dynamics. During 2011 and 2012, we used acoustic telemetry to document the timing and spatial scale of walleye migration in Lake Huron and Saginaw Bay. Spawning walleye (n = 199) collected from a tributary of Saginaw Bay were implanted with acoustic tags and their migrations were documented using acoustic receivers (n = 140) deployed throughout U.S. nearshore waters of Lake Huron. Three migration pathways were described using multistate mark-recapture models. Models were evaluated using the Akaike Information Criterion. Fish sex did not influence migratory behavior but did affect migration rate and walleye were detected on all acoustic receiver lines. Most (95%) tagged fish migrated downstream from the riverine tagging and release location to Saginaw Bay, and 37% of these fish emigrated from Saginaw Bay into Lake Huron. Remarkably, 8% of walleye that emigrated from Saginaw Bay were detected at the acoustic receiver line located farthest from the release location more than 350 km away. Most (64%) walleye returned to the Saginaw River in 2012, presumably for spawning. Our findings reveal that fish from this stock use virtually the entirety of U.S. nearshore waters of Lake Huron.

  15. The Dynamics of Sea Straits Reveals Large-Scale Modes of Variability

    NASA Astrophysics Data System (ADS)

    Rubino, Angelo; Androsov, Alexey; Zanchettin, Davide; Voltzinger, Naum

    2016-04-01

Using a very high resolution 3D numerical model we investigate the tidal dynamics in the Strait of Messina. We show that different stratifications at the southern boundaries, consistent with observed stratifications in the Ionian approaches to the Strait, yield different mean sea level heights. On this basis we search for long-term variations in sea level heights measured in the tidal stations of Catania, Messina and Marseille, and associate them with the concomitant phase of dominant modes of interannual-to-decadal climate variability in the Euro-Mediterranean sector. We focus on the atmospheric North Atlantic Oscillation (NAO) and on the Adriatic-Ionian Bimodal Oscillating System (BiOS) to illustrate the grand variability in sea level teleconnections during the last four decades. In particular, observations reveal a strong imprint of both NAO and BiOS on all sea level records in the 21st century, when NAO and BiOS are inversely correlated. In the 1990s, a well-known period of persistent positive NAO anomalies, the NAO imprint on sea level becomes weaker compared to the most recent period, although it remains clear on decadal trends, while the BiOS shows very weak positive variability. In the 1970s and early 1980s, when the NAO was in a neutral phase with weak variability, the NAO imprint on sea levels is weakest, and sea levels in Marseille and Sicily anticorrelate with each other, in contrast to the positive correlations found during the later periods. Based on this observational evidence, we discuss how our modeling results provide a basis to understand the local dynamics that contributed to determine such observed decadal variability.
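
    The anti-correlation between sea-level records described above can be checked with a plain Pearson correlation. The two anomaly series below are synthetic stand-ins for the Marseille and Sicilian tide-gauge records, constructed to be near mirror images.

```python
def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Synthetic annual sea-level anomalies (cm); NOT the actual tide-gauge data.
marseille = [1.0, 2.0, 1.5, 0.5, -0.5, -1.0]
sicily = [-0.9, -2.1, -1.4, -0.6, 0.4, 1.1]

r = pearson(marseille, sicily)  # strongly negative: anti-correlated records
```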

  16. EFFECTS OF OXYGEN AND AIR MIXING ON VOID FRACTIONS IN A LARGE SCALE SYSTEM

    SciTech Connect

    Leishear, R; Hector Guerrero, H; Michael Restivo, M

    2008-09-11

Oxygen and air mixing with spargers was performed in a 30 foot tall by 30 inch diameter column, to investigate mass transfer as air sparged up through the column and removed saturated oxygen from solution. The mixing techniques required to support this research are the focus of this paper. The fluids tested included water, water with an antifoam agent (AFA), and a high, solids content, Bingham plastic, nuclear waste simulant with AFA, referred to as AZ101 simulant, which is non-radioactive. Mixing of fluids in the column was performed using a recirculation system and an air sparger. The re-circulation system consisted of the column, a re-circulating pump, and associated piping. The air sparger was fabricated from a two inch diameter pipe concentrically installed in the column and open near the bottom of the column. The column contents were slowly re-circulated while fluids were mixed with the air sparger. Samples were rheologically tested to ensure effective mixing, as required. Once the fluids were adequately mixed, oxygen was homogeneously added through the re-circulation loop using a sintered metal oxygen sparger followed by a static mixer. Then the air sparger was re-actuated to remove oxygen from solution as air bubbled up through solution. To monitor mixing effectiveness several variables were monitored, which included flow rates, oxygen concentration, differential pressures along the column height, fluid levels, and void fractions, which are defined as the volume of gas divided by the total volume of gas and liquid. Research showed that mixing was uniform for water and water with AFA, but mixing for the AZ101 fluid was far more complex. Although mixing of AZ101 was uniform throughout most of the column, gas entrapment and settling of solids significantly affected test results. The detailed test results presented here provide some insight into the complexities of mixing and void fractions for different fluids and how the mixing process itself
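
    The void fraction monitored above can be computed directly. A minimal sketch using the standard two-phase definition (gas volume over total gas-plus-liquid volume), with illustrative volumes rather than measurements from the test column:

```python
def void_fraction(gas_volume, liquid_volume):
    """Fraction of the two-phase mixture occupied by gas, in [0, 1]."""
    total = gas_volume + liquid_volume
    if total <= 0:
        raise ValueError("total volume must be positive")
    return gas_volume / total

# e.g. 0.5 ft^3 of sparged air held up in 9.5 ft^3 of fluid -> 5% void fraction
alpha = void_fraction(0.5, 9.5)
```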

  17. Metaproteomics reveals major microbial players and their biodegradation functions in a large-scale aerobic composting plant.

    PubMed

    Liu, Dongming; Li, Mingxiao; Xi, Beidou; Zhao, Yue; Wei, Zimin; Song, Caihong; Zhu, Chaowei

    2015-11-01

Composting is an appropriate management alternative for municipal solid waste; however, our knowledge about the microbial regulation of this process is still scarce. We employed metaproteomics to elucidate the main biodegradation pathways in a municipal solid waste composting system across the main phases in a large-scale composting plant. The investigation of microbial succession revealed that Bacillales, Actinobacteria and Saccharomyces increased significantly in abundance during the composting process. The key microbial populations driving cellulose degradation differed between composting stages. Fungi were found to be the main producers of cellulase in the earlier phases. However, the cellulolytic fungal communities were gradually replaced by a purely bacterial one in the active phase, which did not support the concept that thermophilic fungi remain active throughout the thermophilic phase. The effective decomposition of cellulose required synergy between bacteria and fungi in the curing phase.

  18. The iBeetle large-scale RNAi screen reveals gene functions for insect development and physiology

    PubMed Central

Schmitt-Engel, Christian; Schultheis, Dorothea; Schwirz, Jonas; Ströhlein, Nadi; Troelenberg, Nicole; Majumdar, Upalparna; Dao, Van Anh; Grossmann, Daniela; Richter, Tobias; Tech, Maike; Dönitz, Jürgen; Gerischer, Lizzy; Theis, Mirko; Schild, Inga; Trauner, Jochen; Koniszewski, Nikolaus D. B.; Küster, Elke; Kittelmann, Sebastian; Hu, Yonggang; Lehmann, Sabrina; Siemanowski, Janna; Ulrich, Julia; Panfilio, Kristen A.; Schröder, Reinhard; Morgenstern, Burkhard; Stanke, Mario; Buchholz, Frank; Frasch, Manfred; Roth, Siegfried; Wimmer, Ernst A.; Schoppmeier, Michael; Klingler, Martin; Bucher, Gregor

    2015-01-01

    Genetic screens are powerful tools to identify the genes required for a given biological process. However, for technical reasons, comprehensive screens have been restricted to very few model organisms. Therefore, although deep sequencing is revealing the genes of ever more insect species, the functional studies predominantly focus on candidate genes previously identified in Drosophila, which is biasing research towards conserved gene functions. RNAi screens in other organisms promise to reduce this bias. Here we present the results of the iBeetle screen, a large-scale, unbiased RNAi screen in the red flour beetle, Tribolium castaneum, which identifies gene functions in embryonic and postembryonic development, physiology and cell biology. The utility of Tribolium as a screening platform is demonstrated by the identification of genes involved in insect epithelial adhesion. This work transcends the restrictions of the candidate gene approach and opens fields of research not accessible in Drosophila. PMID:26215380

  19. The iBeetle large-scale RNAi screen reveals gene functions for insect development and physiology.

    PubMed

Schmitt-Engel, Christian; Schultheis, Dorothea; Schwirz, Jonas; Ströhlein, Nadi; Troelenberg, Nicole; Majumdar, Upalparna; Dao, Van Anh; Grossmann, Daniela; Richter, Tobias; Tech, Maike; Dönitz, Jürgen; Gerischer, Lizzy; Theis, Mirko; Schild, Inga; Trauner, Jochen; Koniszewski, Nikolaus D B; Küster, Elke; Kittelmann, Sebastian; Hu, Yonggang; Lehmann, Sabrina; Siemanowski, Janna; Ulrich, Julia; Panfilio, Kristen A; Schröder, Reinhard; Morgenstern, Burkhard; Stanke, Mario; Buchholz, Frank; Frasch, Manfred; Roth, Siegfried; Wimmer, Ernst A; Schoppmeier, Michael; Klingler, Martin; Bucher, Gregor

    2015-07-28

    Genetic screens are powerful tools to identify the genes required for a given biological process. However, for technical reasons, comprehensive screens have been restricted to very few model organisms. Therefore, although deep sequencing is revealing the genes of ever more insect species, the functional studies predominantly focus on candidate genes previously identified in Drosophila, which is biasing research towards conserved gene functions. RNAi screens in other organisms promise to reduce this bias. Here we present the results of the iBeetle screen, a large-scale, unbiased RNAi screen in the red flour beetle, Tribolium castaneum, which identifies gene functions in embryonic and postembryonic development, physiology and cell biology. The utility of Tribolium as a screening platform is demonstrated by the identification of genes involved in insect epithelial adhesion. This work transcends the restrictions of the candidate gene approach and opens fields of research not accessible in Drosophila.

  20. Metaproteomics reveals major microbial players and their biodegradation functions in a large-scale aerobic composting plant

    PubMed Central

    Liu, Dongming; Li, Mingxiao; Xi, Beidou; Zhao, Yue; Wei, Zimin; Song, Caihong; Zhu, Chaowei

    2015-01-01

    Composting is an appropriate management alternative for municipal solid waste; however, our knowledge about the microbial regulation of this process is still scarce. We employed metaproteomics to elucidate the main biodegradation pathways across the main phases of a municipal solid waste composting system in a large-scale composting plant. The investigation of microbial succession revealed that Bacillales, Actinobacteria and Saccharomyces increased significantly in abundance during the composting process. The key microbial populations responsible for cellulose degradation differed between composting stages. Fungi were found to be the main producers of cellulase in the earlier phase. However, the cellulolytic fungal communities were gradually replaced by purely bacterial ones in the active phase, which does not support the concept that thermophilic fungi remain active throughout the thermophilic phase. Effective decomposition of cellulose in the curing phase required synergy between bacteria and fungi. PMID:25989417

  1. Reconstruction of air-shower parameters for large-scale radio detectors using the lateral distribution

    NASA Astrophysics Data System (ADS)

    Kostunin, D.; Bezyazeekov, P. A.; Hiller, R.; Schröder, F. G.; Lenok, V.; Levinson, E.

    2016-02-01

    We investigate features of the lateral distribution function (LDF) of the radio signal emitted by cosmic-ray air showers with primary energies Epr > 0.1 EeV, and its connection to air-shower parameters such as energy and shower maximum, using CoREAS simulations made for the configuration of the Tunka-Rex antenna array. Taking into account all significant contributions to the total radio emission, such as the geomagnetic effect, the charge excess, and atmospheric refraction, we parameterize the radio LDF. This parameterization is two-dimensional and has several free parameters. The large number of free parameters is not suitable for sparse arrays operating at low signal-to-noise ratios. Thus, exploiting symmetries, we decrease the number of free parameters based on the shower geometry and reduce the LDF to a simple one-dimensional function. The remaining parameters can be fit with a small number of points, i.e., with signals from as few as three antennas above detection threshold. Finally, we present a method for the reconstruction of air-shower parameters, in particular energy and Xmax (shower maximum), which can be reconstructed with a theoretical accuracy of better than 15% and 30 g/cm2, respectively.
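    The reduction to a one-dimensional LDF that can be fit with as few as three antennas can be illustrated with a toy sketch. The pure-exponential form, the reference distance r0 = 120 m, and the synthetic amplitudes below are illustrative assumptions, not the actual Tunka-Rex parameterization:

```python
import math

def fit_exponential_ldf(radii, amplitudes, r0=120.0):
    """Least-squares fit of ln(eps) = ln(eps0) + a1*(r - r0).

    Returns (eps0, a1). This is a hypothetical simplified 1-D LDF,
    linear in log-amplitude, so three points suffice for a stable fit.
    """
    xs = [r - r0 for r in radii]
    ys = [math.log(a) for a in amplitudes]
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    ln_eps0 = (sy - a1 * sx) / n
    return math.exp(ln_eps0), a1

# Three antennas above threshold (synthetic amplitudes following the model):
radii = [80.0, 120.0, 200.0]
amps = [150.0 * math.exp(-0.01 * (r - 120.0)) for r in radii]
eps0, a1 = fit_exponential_ldf(radii, amps)
```

In an energy reconstruction of this style, eps0 (the amplitude at the reference distance) would serve as the energy estimator and the slope a1 would carry the Xmax sensitivity; both roles are assumptions of this sketch.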

  2. MALDI imaging mass spectrometry reveals multiple clinically relevant masses in colorectal cancer using large-scale tissue microarrays.

    PubMed

    Hinsch, Andrea; Buchholz, Malte; Odinga, Sinje; Borkowski, Carina; Koop, Christina; Izbicki, Jacob R; Wurlitzer, Marcus; Krech, Till; Wilczak, Waldemar; Steurer, Stefan; Jacobsen, Frank; Burandt, Eike-Christian; Stahl, Phillip; Simon, Ronald; Sauter, Guido; Schlüter, Hartmut

    2017-01-24

    For identification of clinically relevant masses to predict status, grade, relapse and prognosis of colorectal cancer, we applied MALDI imaging mass spectrometry (IMS) to a tissue microarray (TMA) containing formalin-fixed and paraffin-embedded tissue samples from 349 patients. Analysis of our MALDI-IMS data revealed 27 different m/z signals associated with epithelial structures. Comparison of these signals showed significant associations with status, grade and Ki-67 labeling index. 15 out of 27 IMS signals revealed a significant association with survival. For 7 signals (m/z 654, 776, 788, 904, 944, 975, and 1013) the absence, and for 8 signals (m/z 643, 678, 836, 886, 898, 1095, 1459, and 1477) the presence, was associated with decreased life expectancy, including 5 masses (m/z 788, 836, 904, 944, and 1013) that provided prognostic information independently from the established prognosticators pT and pN. Combination of these 5 masses resulted in a 3-step classifier that provided prognostic information superior to univariate analysis. In addition, a total of 19 masses were associated with tumor stage, grade, metastasis, and cell proliferation. Our data demonstrate the suitability of combining IMS and large-scale TMAs to simultaneously identify and validate clinically useful molecular markers.
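    As a hypothetical illustration of how five binary mass signals could be combined into a 3-step classifier, one can count adverse marker states and threshold the count. The marker directions follow the abstract (absence of m/z 788, 904, 944, 1013 and presence of m/z 836 are adverse), but the thresholds and names below are invented for illustration and are not the published classifier:

```python
# Hypothetical marker directions taken from the abstract's five
# independently prognostic masses; thresholds are illustrative only.
ADVERSE_IF_ABSENT = {788, 904, 944, 1013}
ADVERSE_IF_PRESENT = {836}

def risk_class(detected_masses):
    """Return 'low', 'intermediate' or 'high' from a set of detected m/z.

    Counts adverse marker states (0-5) and maps them to three risk steps.
    """
    adverse = sum(1 for m in ADVERSE_IF_ABSENT if m not in detected_masses)
    adverse += sum(1 for m in ADVERSE_IF_PRESENT if m in detected_masses)
    if adverse <= 1:
        return "low"
    if adverse <= 3:
        return "intermediate"
    return "high"
```

For example, a sample in which all four protective masses are detected and m/z 836 is absent has zero adverse states and falls into the "low" step.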

  3. 25-Day Period Large-Scale Oscillations in the Argentine Basin Revealed by the TOPEX/POSEIDON Altimeter

    NASA Technical Reports Server (NTRS)

    Fu, L-L.; Cheng, B.; Qiu, B.

    1999-01-01

    The measurement of global sea surface height made by the TOPEX/POSEIDON satellite has provided the first synoptic view of large-scale oceanic variability at the intraseasonal scales from weeks to months.

  4. High prevalence of caprine arthritis encephalitis virus (CAEV) in Taiwan revealed by large-scale serological survey

    PubMed Central

    YANG, Wei-Cheng; CHEN, Hui-Yu; WANG, Chi-Young; PAN, Hung-Yu; WU, Cheng-Wei; HSU, Yun-Hsiu; SU, Jui-Chuan; CHAN, Kun-Wei

    2016-01-01

    In this study, a large-scale serological survey of caprine arthritis encephalitis virus (CAEV) infection was conducted between March 2011 and October 2012. A total of 3,437 goat blood or milk samples were collected from 65 goat farms throughout Taiwan. A commercial ELISA kit was used to detect antibodies against CAEV. The overall seropositive rate was 61.7% (2,120/3,437) in goats, and 98.5% (64/65) of goat farms had seropositive animals. These results provide the first large-scale serological evidence for the presence of CAEV infection, indicating that the disease is widespread in Taiwan. PMID:27916786

  5. Sensitivity study of a large-scale air pollution model by using high-performance computations and Monte Carlo algorithms

    NASA Astrophysics Data System (ADS)

    Ostromsky, Tz.; Dimov, I.; Georgieva, R.; Marinov, P.; Zlatev, Z.

    2013-10-01

    In this paper we present some new results of our work on the sensitivity analysis of a large-scale air pollution model, more specifically the Danish Eulerian Model (DEM). The main purpose of this study is to analyse the sensitivity of ozone concentrations with respect to the rates of some chemical reactions. The current sensitivity study considers the rates of six important chemical reactions and is done for the areas of several European cities with different geographical locations, climate, industrialization and population density. One of the most widely used variance-based techniques for sensitivity analysis, Sobol estimates and their modifications, has been used in this study. A vast number of numerical experiments with a version of the Danish Eulerian Model specially adapted for this purpose (SA-DEM) were carried out to compute global Sobol sensitivity measures. SA-DEM was implemented and run on two powerful cluster supercomputers: IBM Blue Gene/P, the most powerful parallel supercomputer in Bulgaria, and IBM MareNostrum III, the most powerful parallel supercomputer in Spain. The refined (480 × 480) mesh version of the model was used in the experiments on MareNostrum III, which is a challenging computational problem even on such a powerful machine. Some optimizations of the code with respect to parallel efficiency and memory use were performed. Tables with performance results of a number of numerical experiments on IBM Blue Gene/P and on IBM MareNostrum III are presented and analysed.
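    The variance-based Sobol approach can be sketched on a toy surrogate. The estimator below (a Saltelli-style pick-freeze scheme with independent uniform inputs) and the two-parameter linear "model" are illustrative assumptions standing in for SA-DEM, whose reaction-rate inputs and ozone output are vastly more complex:

```python
import random

def sobol_first_order(model, n_params, n_samples=20000, seed=42):
    """Monte Carlo estimate of first-order Sobol indices S_i.

    Uses two independent U(0,1) sample matrices A and B and the
    pick-freeze estimator S_i = E[y_B * (y_ABi - y_A)] / Var(y).
    Illustrative sketch only.
    """
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(n_params)] for _ in range(n_samples)]
    B = [[rng.random() for _ in range(n_params)] for _ in range(n_samples)]
    yA = [model(x) for x in A]
    yB = [model(x) for x in B]
    mean = sum(yA + yB) / (2 * n_samples)
    var = sum((y - mean) ** 2 for y in yA + yB) / (2 * n_samples)
    indices = []
    for i in range(n_params):
        # ABi: matrix A with column i replaced by column i of B
        ABi = [a[:i] + [b[i]] + a[i + 1:] for a, b in zip(A, B)]
        yABi = [model(x) for x in ABi]
        s_i = sum(yb * (yi - ya)
                  for yb, yi, ya in zip(yB, yABi, yA)) / n_samples / var
        indices.append(s_i)
    return indices

# Toy surrogate for a reaction-rate sensitivity study: Y = 3*k1 + k2.
# Analytic first-order indices are 0.9 and 0.1.
s = sobol_first_order(lambda x: 3 * x[0] + x[1], n_params=2)
```

For an additive model such as this one, the first-order indices sum to one; the gap between first-order and total-effect indices would reveal interactions.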

  6. Multimodal MR-imaging reveals large-scale structural and functional connectivity changes in profound early blindness

    PubMed Central

    Bauer, Corinna M.; Hirsch, Gabriella V.; Zajac, Lauren; Koo, Bang-Bon; Collignon, Olivier

    2017-01-01

    between occipital and frontal and somatosensory-motor areas and between temporal (mainly fusiform and parahippocampus) and parietal, frontal, and other temporal areas. Correlations in white matter connectivity and functional connectivity observed between early blind and sighted controls showed an overall high degree of association. However, comparing the relative changes in white matter and functional connectivity between early blind and sighted controls did not show a significant correlation. In summary, these findings provide complementary evidence, as well as highlight potential contradictions, regarding the nature of regional and large-scale neuroplastic reorganization resulting from early onset blindness. PMID:28328939

  7. Large-Scale Disasters

    NASA Astrophysics Data System (ADS)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  8. Bursty properties revealed in large-scale brain networks with a point-based method for dynamic functional connectivity

    PubMed Central

    Thompson, William Hedley; Fransson, Peter

    2016-01-01

    The brain is organized into large-scale spatial networks that can be detected during periods of rest using fMRI. The brain is also a dynamic organ with activity that changes over time. We developed a method to derive and quantify connections between networks as a function of time, and investigated their properties. The point-based method (PBM) presented here derives covariance matrices after clustering individual time points based upon their global spatial pattern. This method achieved increased temporal sensitivity and, together with temporal network theory, allowed us to study functional integration between resting-state networks. Our results show that functional integration between two resting-state networks predominantly occurs in bursts of activity, followed by varying intermittent periods of less connectivity. The described point-based method of dynamic resting-state functional connectivity allows for a detailed and expanded view of the temporal dynamics of resting-state connectivity that provides novel insights into how neuronal information processing is integrated in the human brain at the level of large-scale networks. PMID:27991540
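    The cluster-then-covariance idea behind a point-based method can be sketched as follows. The correlation-to-template assignment, the tiny three-ROI frames, and all function names are hypothetical simplifications, not the published PBM pipeline:

```python
import math

def correlate(u, v):
    """Pearson correlation between two spatial patterns (lists of ROI values)."""
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    du = math.sqrt(sum((a - mu) ** 2 for a in u))
    dv = math.sqrt(sum((b - mv) ** 2 for b in v))
    return num / (du * dv)

def point_based_covariance(frames, templates):
    """Assign each time point (frame) to the template its global spatial
    pattern correlates with best, then compute one ROI covariance matrix
    per cluster of time points. Simplified sketch of a point-based method."""
    clusters = {i: [] for i in range(len(templates))}
    for f in frames:
        best = max(range(len(templates)),
                   key=lambda i: correlate(f, templates[i]))
        clusters[best].append(f)
    covs = {}
    for i, fs in clusters.items():
        if len(fs) < 2:
            continue  # need at least two time points for a covariance
        n_roi = len(fs[0])
        means = [sum(f[r] for f in fs) / len(fs) for r in range(n_roi)]
        covs[i] = [[sum((f[a] - means[a]) * (f[b] - means[b])
                        for f in fs) / (len(fs) - 1)
                    for b in range(n_roi)] for a in range(n_roi)]
    return clusters, covs

# Synthetic data: two global spatial patterns, five time points (frames).
templates = [[1.0, 0.0, -1.0], [-1.0, 0.0, 1.0]]
frames = [[1.0, 0.1, -1.0], [0.8, 0.0, -1.2], [1.2, -0.1, -0.9],
          [-0.9, 0.0, 1.1], [-1.1, 0.1, 0.9]]
clusters, covs = point_based_covariance(frames, templates)
```

Because covariance is computed separately per cluster of time points rather than over the whole scan, transient (bursty) connectivity states are not averaged away, which is the gain in temporal sensitivity the abstract describes.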

  9. Captured metagenomics: large-scale targeting of genes based on ‘sequence capture’ reveals functional diversity in soils

    PubMed Central

    Manoharan, Lokeshwaran; Kushwaha, Sandeep K.; Hedlund, Katarina; Ahrén, Dag

    2015-01-01

    Microbial enzyme diversity is a key to understanding many ecosystem processes. Whole-metagenome sequencing (WMG) obtains information on functional genes, but it is costly and inefficient due to the large amount of sequencing required. In this study, we have applied a captured metagenomics technique to functional genes in soil microorganisms as an alternative to WMG. Large-scale targeting of functional genes coding for enzymes related to organic matter degradation was applied to two agricultural soil communities through captured metagenomics. Captured metagenomics uses custom-designed, hybridization-based oligonucleotide probes that enrich functional genes of interest in metagenomic libraries, in which only probe-bound DNA fragments are sequenced. The captured metagenomes were highly enriched with targeted genes while maintaining their target diversity, and their taxonomic distribution correlated well with traditional ribosomal sequencing. The captured metagenomes were highly enriched with genes related to organic matter degradation; at least five times more than similar, publicly available soil WMG projects. This target-enrichment technique also preserves the functional representation of the soils, thereby facilitating comparative metagenomics projects. Here, we present the first study that applies the captured metagenomics approach at large scale, and this novel method allows deep investigation of central ecosystem processes by studying functional gene abundances. PMID:26490729

  10. Genome resequencing in Populus: Revealing large-scale genome variation and implications on specialized-trait genomics

    SciTech Connect

    Muchero, Wellington; Labbe, Jessy L; Priya, Ranjan; DiFazio, Steven P; Tuskan, Gerald A

    2014-01-01

    To date, Populus ranks among the few plant species with a complete genome sequence and other highly developed genomic resources. With the first genome sequence among all tree species, Populus has been adopted as a suitable model organism for genomic studies in trees. However, far from being just a model species, Populus is a key renewable economic resource that plays a significant role in providing raw materials for the biofuel and pulp and paper industries. Therefore, aside from leading frontiers of basic tree molecular biology and ecological research, Populus leads frontiers in addressing global economic challenges related to fuel and fiber production. The latter fact suggests that research aimed at improving the quality and quantity of Populus as a raw material will likely drive the pursuit of more targeted and deeper research, in order to unlock the economic potential tied to the molecular biology processes that drive this tree species. Advances in genome sequence-driven technologies, such as resequencing of individual genotypes, which in turn facilitates large-scale SNP discovery and identification of large-scale polymorphisms, are key determinants of future success in these initiatives. In this treatise we discuss the implications of genome sequence-enabled technologies for Populus genomic and genetic studies of complex and specialized traits.

  11. Analysis and experimental study on formation conditions of large-scale barrier-free diffuse atmospheric pressure air plasmas in repetitive pulse mode

    NASA Astrophysics Data System (ADS)

    Li, Lee; Liu, Lun; Liu, Yun-Long; Bin, Yu; Ge, Ya-Feng; Lin, Fo-Chang

    2014-01-01

    Atmospheric air diffuse plasmas have enormous application potential in various fields of science and technology. Without a dielectric barrier, generating large-scale air diffuse plasmas is always a challenging issue. This paper discusses and analyses the formation mechanism of cold homogeneous plasma. It is proposed that generating stable diffuse atmospheric plasmas in open air must meet three conditions: high transient power with low average power, excitation in a low average E-field with a locally high E-field region, and multiple overlapping electron avalanches. Accordingly, an experimental configuration for generating large-scale barrier-free diffuse air plasmas is designed. Based on runaway electron theory, a low duty-ratio, high-voltage repetitive nanosecond pulse generator is chosen as the discharge excitation source. Using wire electrodes with small curvature radius, gaps with a highly non-uniform E-field are structured. Experimental results show that volume-scalable, barrier-free, homogeneous air non-thermal plasmas have been obtained in the gaps between the copper-wire electrodes. The area of the air cold plasmas has been up to hundreds of square centimeters. The proposed formation conditions of large-scale barrier-free diffuse air plasmas are proved to be reasonable and feasible.

  12. Analysis and experimental study on formation conditions of large-scale barrier-free diffuse atmospheric pressure air plasmas in repetitive pulse mode

    SciTech Connect

    Li, Lee; Liu, Lun; Liu, Yun-Long; Bin, Yu; Ge, Ya-Feng; Lin, Fo-Chang

    2014-01-14

    Atmospheric air diffuse plasmas have enormous application potential in various fields of science and technology. Without a dielectric barrier, generating large-scale air diffuse plasmas is always a challenging issue. This paper discusses and analyses the formation mechanism of cold homogeneous plasma. It is proposed that generating stable diffuse atmospheric plasmas in open air must meet three conditions: high transient power with low average power, excitation in a low average E-field with a locally high E-field region, and multiple overlapping electron avalanches. Accordingly, an experimental configuration for generating large-scale barrier-free diffuse air plasmas is designed. Based on runaway electron theory, a low duty-ratio, high-voltage repetitive nanosecond pulse generator is chosen as the discharge excitation source. Using wire electrodes with small curvature radius, gaps with a highly non-uniform E-field are structured. Experimental results show that volume-scalable, barrier-free, homogeneous air non-thermal plasmas have been obtained in the gaps between the copper-wire electrodes. The area of the air cold plasmas has been up to hundreds of square centimeters. The proposed formation conditions of large-scale barrier-free diffuse air plasmas are proved to be reasonable and feasible.

  13. Organization and evolution of brain lipidome revealed by large-scale analysis of human, chimpanzee, macaque, and mouse tissues.

    PubMed

    Bozek, Katarzyna; Wei, Yuning; Yan, Zheng; Liu, Xiling; Xiong, Jieyi; Sugimoto, Masahiro; Tomita, Masaru; Pääbo, Svante; Sherwood, Chet C; Hof, Patrick R; Ely, John J; Li, Yan; Steinhauser, Dirk; Willmitzer, Lothar; Giavalisco, Patrick; Khaitovich, Philipp

    2015-02-18

    Lipids are prominent components of the nervous system. Here we performed a large-scale mass spectrometry-based analysis of the lipid composition of three brain regions as well as kidney and skeletal muscle of humans, chimpanzees, rhesus macaques, and mice. The human brain shows the most distinct lipid composition: 76% of 5,713 lipid compounds examined in our study are either enriched or depleted in the human brain. Concentration levels of lipids enriched in the brain evolve approximately four times faster among primates compared with lipids characteristic of non-neural tissues and show further acceleration of change in human neocortical regions but not in the cerebellum. Human-specific concentration changes are supported by human-specific expression changes for corresponding enzymes. These results provide the first insights into the role of lipids in human brain evolution.

  14. Natural snowfall reveals large-scale flow structures in the wake of a 2.5-MW wind turbine.

    PubMed

    Hong, Jiarong; Toloui, Mostafa; Chamorro, Leonardo P; Guala, Michele; Howard, Kevin; Riley, Sean; Tucker, James; Sotiropoulos, Fotis

    2014-06-24

    To improve power production and structural reliability of wind turbines, there is a pressing need to understand how turbines interact with the atmospheric boundary layer. However, experimental techniques capable of quantifying or even qualitatively visualizing the large-scale turbulent flow structures around full-scale turbines do not exist today. Here we use snowflakes from a winter snowstorm as flow tracers to obtain velocity fields downwind of a 2.5-MW wind turbine in a sampling area of ~36 × 36 m². The spatial and temporal resolutions of the measurements are sufficiently high to quantify the evolution of blade-generated coherent motions, such as the tip and trailing sheet vortices, identify their instability mechanisms and correlate them with turbine operation, control and performance. Our experiment provides an unprecedented in situ characterization of flow structures around utility-scale turbines, and yields significant insights into the Reynolds number similarity issues presented in wind energy applications.

  15. Purkinje Cell Degeneration in pcd Mice Reveals Large Scale Chromatin Reorganization and Gene Silencing Linked to Defective DNA Repair*

    PubMed Central

    Baltanás, Fernando C.; Casafont, Iñigo; Lafarga, Vanesa; Weruaga, Eduardo; Alonso, José R.; Berciano, María T.; Lafarga, Miguel

    2011-01-01

    DNA repair protects neurons against spontaneous or disease-associated DNA damage. Dysfunctions of this mechanism underlie a growing list of neurodegenerative disorders. The Purkinje cell (PC) degeneration mutation causes the loss of nna1 expression and is associated with the postnatal degeneration of PCs. This PC degeneration dramatically affects nuclear architecture and provides an excellent model to elucidate the nuclear mechanisms involved in a whole array of neurodegenerative disorders. We used immunocytochemistry for histone variants and components of the DNA damage response, an in situ transcription assay, and in situ hybridization for telomeres to analyze changes in chromatin architecture and function. We demonstrate that the phosphorylation of H2AX, a DNA damage signal, and the trimethylation of the histone H4K20, a repressive mark, in extensive domains of the genome are epigenetic hallmarks of chromatin in degenerating PCs. These histone modifications are associated with a large-scale reorganization of chromatin, telomere clustering, and heterochromatin-induced gene silencing, all of them key factors in PC degeneration. Furthermore, ataxia telangiectasia mutated and 53BP1, two components of the DNA repair pathway, fail to be concentrated in the damaged chromatin compartments, even though the expression levels of their coding genes were slightly up-regulated. Although the mechanism by which Nna1 loss of function leads to PC neurodegeneration is undefined, the progressive accumulation of DNA damage in chromosome territories irreversibly compromises global gene transcription and seems to trigger PC degeneration and death. PMID:21700704

  16. Structural Variant Detection by Large-scale Sequencing Reveals New Evolutionary Evidence on Breed Divergence between Chinese and European Pigs

    PubMed Central

    Zhao, Pengju; Li, Junhui; Kang, Huimin; Wang, Haifei; Fan, Ziyao; Yin, Zongjun; Wang, Jiafu; Zhang, Qin; Wang, Zhiquan; Liu, Jian-Feng

    2016-01-01

    In this study, we performed genome-wide SV detection among the genomes of thirteen pigs from diverse Chinese and European breeds by next-generation sequencing, and constructed a single-nucleotide-resolution map involving 56,930 putative SVs. We first identified an SV hotspot spanning a 35 Mb region on the X chromosome specifically in the genomes of Chinese-origin individuals. Further scrutinizing this region with large-scale sequencing data from an extra 111 individuals, we obtained confirmatory evidence for our initial finding. Moreover, thirty-five SV-related genes within the hotspot region, important for reproductive ability, showed significantly different evolution rates between Chinese and European breeds. The SV hotspot identified herein offers novel evidence for assessing phylogenetic relationships, and likely explains the genetic differences in corresponding phenotypes and features among Chinese and European pig breeds. Furthermore, we employed various SVs to infer the genetic structure of the individuals surveyed. We found that SVs can clearly detect differences in genetic background among individuals, suggesting that genome-wide SVs capture the majority of genetic variation and can be applied in cladistic analyses. Characterizing whole-genome SVs demonstrated that SVs are significantly enriched/depleted with respect to various genomic features. PMID:26729041

  17. Large-Scale Phylogenomic Analyses Reveal That Two Enigmatic Protist Lineages, Telonemia and Centroheliozoa, Are Related to Photosynthetic Chromalveolates

    PubMed Central

    Burki, Fabien; Inagaki, Yuji; Bråte, Jon; Archibald, John M.; Keeling, Patrick J.; Cavalier-Smith, Thomas; Sakaguchi, Miako; Hashimoto, Tetsuo; Horak, Ales; Kumar, Surendra; Klaveness, Dag; Jakobsen, Kjetill S.; Pawlowski, Jan

    2009-01-01

    Understanding the early evolution and diversification of eukaryotes relies on a fully resolved phylogenetic tree. In recent years, most eukaryotic diversity has been assigned to six putative supergroups, but the evolutionary origin of a few major “orphan” lineages remains elusive. Two ecologically important orphan groups are the heterotrophic Telonemia and Centroheliozoa. Telonemids have been proposed to be related to the photosynthetic cryptomonads or stramenopiles and centrohelids to haptophytes, but molecular phylogenies have failed to provide strong support for any phylogenetic hypothesis. Here, we investigate the origins of Telonema subtilis (a telonemid) and Raphidiophrys contractilis (a centrohelid) by large-scale 454 pyrosequencing of cDNA libraries and including new genomic data from two cryptomonads (Guillardia theta and Plagioselmis nannoplanctica) and a haptophyte (Imantonia rotunda). We demonstrate that 454 sequencing of cDNA libraries is a powerful and fast method of sampling a high proportion of protist genes, which can yield ample information for phylogenomic studies. Our phylogenetic analyses of 127 genes from 72 species indicate that telonemids and centrohelids are members of an emerging major group of eukaryotes also comprising cryptomonads and haptophytes. Furthermore, this group is possibly closely related to the SAR clade comprising stramenopiles (heterokonts), alveolates, and Rhizaria. Our results link two additional heterotrophic lineages to the predominantly photosynthetic chromalveolate supergroup, providing a new framework for interpreting the evolution of eukaryotic cell structures and the diversification of plastids. PMID:20333193

  18. Large-scale phylogenomic analyses reveal that two enigmatic protist lineages, telonemia and centroheliozoa, are related to photosynthetic chromalveolates.

    PubMed

    Burki, Fabien; Inagaki, Yuji; Bråte, Jon; Archibald, John M; Keeling, Patrick J; Cavalier-Smith, Thomas; Sakaguchi, Miako; Hashimoto, Tetsuo; Horak, Ales; Kumar, Surendra; Klaveness, Dag; Jakobsen, Kjetill S; Pawlowski, Jan; Shalchian-Tabrizi, Kamran

    2009-07-27

    Understanding the early evolution and diversification of eukaryotes relies on a fully resolved phylogenetic tree. In recent years, most eukaryotic diversity has been assigned to six putative supergroups, but the evolutionary origin of a few major "orphan" lineages remains elusive. Two ecologically important orphan groups are the heterotrophic Telonemia and Centroheliozoa. Telonemids have been proposed to be related to the photosynthetic cryptomonads or stramenopiles and centrohelids to haptophytes, but molecular phylogenies have failed to provide strong support for any phylogenetic hypothesis. Here, we investigate the origins of Telonema subtilis (a telonemid) and Raphidiophrys contractilis (a centrohelid) by large-scale 454 pyrosequencing of cDNA libraries and including new genomic data from two cryptomonads (Guillardia theta and Plagioselmis nannoplanctica) and a haptophyte (Imantonia rotunda). We demonstrate that 454 sequencing of cDNA libraries is a powerful and fast method of sampling a high proportion of protist genes, which can yield ample information for phylogenomic studies. Our phylogenetic analyses of 127 genes from 72 species indicate that telonemids and centrohelids are members of an emerging major group of eukaryotes also comprising cryptomonads and haptophytes. Furthermore, this group is possibly closely related to the SAR clade comprising stramenopiles (heterokonts), alveolates, and Rhizaria. Our results link two additional heterotrophic lineages to the predominantly photosynthetic chromalveolate supergroup, providing a new framework for interpreting the evolution of eukaryotic cell structures and the diversification of plastids.

  19. Analyses of transcriptome sequences reveal multiple ancient large-scale duplication events in the ancestor of Sphagnopsida (Bryophyta).

    PubMed

    Devos, Nicolas; Szövényi, Péter; Weston, David J; Rothfels, Carl J; Johnson, Matthew G; Shaw, A Jonathan

    2016-07-01

    The goal of this research was to investigate whether there has been a whole-genome duplication (WGD) in the ancestry of Sphagnum (peatmoss) or the class Sphagnopsida, and to determine if the timing of any such duplication(s) and patterns of paralog retention could help explain the rapid radiation and current ecological dominance of peatmosses. RNA sequencing (RNA-seq) data were generated for nine taxa in Sphagnopsida (Bryophyta). Analyses of frequency plots for synonymous substitutions per synonymous site (Ks) between paralogous gene pairs and reconciliation of 578 gene trees were conducted to assess evidence of large-scale or genome-wide duplication events in each transcriptome. Both Ks frequency plots and gene tree-based analyses indicate multiple duplication events in the history of the Sphagnopsida. The most recent WGD event predates the divergence of Sphagnum from the two other genera of Sphagnopsida. Duplicate retention is highly variable across species, which might be best explained by local adaptation. Our analyses indicate that the last WGD could have been an important factor underlying the diversification of peatmosses and could have facilitated their rise to ecological dominance in peatlands. The timing of the duplication events and their significance in the evolutionary history of peatmosses are discussed.
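    The Ks frequency-plot analysis can be illustrated with a minimal sketch: bin pairwise Ks values between paralogs and flag local maxima as candidate duplication-burst peaks. The bin width, the synthetic Ks values, and the simple peak rule are illustrative assumptions; real analyses typically fit mixture models to the Ks distribution:

```python
def ks_histogram_peaks(ks_values, bin_width=0.1, max_ks=2.0):
    """Bin pairwise Ks values and report local maxima as candidate
    WGD peaks (returned as the lower edge of each peak bin).
    Toy sketch only."""
    n_bins = int(max_ks / bin_width)
    counts = [0] * n_bins
    for ks in ks_values:
        if 0.0 <= ks < max_ks:
            counts[int(ks / bin_width)] += 1
    peaks = [i * bin_width for i in range(1, n_bins - 1)
             if counts[i] > counts[i - 1] and counts[i] >= counts[i + 1]]
    return counts, peaks

# Synthetic paralog Ks values with an excess of pairs around Ks ~ 0.6,
# the signature expected from a single ancient duplication burst:
ks = [0.05, 0.12, 0.55, 0.58, 0.61, 0.63, 0.66, 1.42, 1.95]
counts, peaks = ks_histogram_peaks(ks)
```

A background of small-scale duplications produces an L-shaped Ks distribution; a secondary peak riding on that background, as flagged here, is the signal interpreted as a WGD.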

  20. A developmental framework for complex plasmodesmata formation revealed by large-scale imaging of the Arabidopsis leaf epidermis.

    PubMed

    Fitzgibbon, Jessica; Beck, Martina; Zhou, Ji; Faulkner, Christine; Robatzek, Silke; Oparka, Karl

    2013-01-01

    Plasmodesmata (PD) form tubular connections that function as intercellular communication channels. They are essential for transporting nutrients and for coordinating development. During cytokinesis, simple PDs are inserted into the developing cell plate, while during wall extension, more complex (branched) forms of PD are laid down. We show that complex PDs are derived from existing simple PDs in a pattern that is accelerated when leaves undergo the sink-source transition. Complex PDs are inserted initially at the three-way junctions between epidermal cells but develop most rapidly in the anisocytic complexes around stomata. For a quantitative analysis of complex PD formation, we established a high-throughput imaging platform and constructed PDQUANT, a custom algorithm that detected cell boundaries and PD numbers in different wall faces. For anticlinal walls, the number of complex PDs increased with increasing cell size, while for periclinal walls, the number of PDs decreased. Complex PD insertion was accelerated by up to threefold in response to salicylic acid treatment and challenges with mannitol. In a single 30-min run, we could derive data for up to 11k PDs from 3k epidermal cells. This facile approach opens the door to a large-scale analysis of the endogenous and exogenous factors that influence PD formation.

  1. Rank Order Coding: a Retinal Information Decoding Strategy Revealed by Large-Scale Multielectrode Array Retinal Recordings

    PubMed Central

    Maccione, Alessandro; Di Marco, Stefano; Kornprobst, Pierre

    2016-01-01

    How a population of retinal ganglion cells (RGCs) encodes the visual scene remains an open question. Going beyond individual RGC coding strategies, results in salamander suggest that the relative latencies of an RGC pair encode spatial information. Thus, a population code based on this concerted spiking could be a powerful mechanism to transmit visual information rapidly and efficiently. Here, we tested this hypothesis in mouse by recording simultaneous light-evoked responses from hundreds of RGCs, at pan-retinal level, using a new generation of large-scale, high-density multielectrode array consisting of 4096 electrodes. Interestingly, we did not find any RGCs exhibiting a clear latency tuning to the stimuli, suggesting that in mouse, individual RGC pairs may not provide sufficient information. We show that a significant amount of information is encoded synergistically in the concerted spiking of large RGC populations. Thus, the RGC population response described with relative activities, or ranks, provides more relevant information than classical independent spike count- or latency-based codes. In particular, we report for the first time that when considering the relative activities across the whole population, the wave of first stimulus-evoked spikes is an accurate indicator of stimulus content. We show that this coding strategy coexists with classical neural codes, and that it is more efficient and faster. Overall, these novel observations suggest that already at the level of the retina, concerted spiking provides a reliable and fast strategy to rapidly transmit new visual scenes. PMID:27275008
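
    The rank-based population code described above can be sketched minimally: convert each neuron's first-spike latency into its rank within the population, then compare stimuli by rank agreement. The latencies and the Spearman-style similarity below are invented for illustration, not the study's decoder.

    ```python
    import numpy as np

    def rank_code(latencies):
        """Map each neuron's first-spike latency to its rank across the population."""
        order = np.argsort(latencies)
        ranks = np.empty_like(order)
        ranks[order] = np.arange(len(latencies))
        return ranks

    def rank_similarity(r1, r2):
        """Spearman-style similarity between two population rank vectors."""
        n = len(r1)
        d = r1.astype(float) - r2.astype(float)
        return 1 - 6 * np.sum(d ** 2) / (n * (n ** 2 - 1))

    # Hypothetical first-spike latencies (ms) for 5 RGCs under two
    # presentations of the same stimulus: absolute times differ, order agrees
    a = np.array([12.0, 35.0, 20.0, 50.0, 28.0])
    b = np.array([14.0, 33.0, 22.0, 48.0, 27.0])
    print(rank_code(a))   # ranks derived from the latency order
    print(rank_similarity(rank_code(a), rank_code(b)))
    ```

    The point of the sketch is that the rank vector is invariant to jitter in absolute latency as long as the relative order of first spikes is preserved.
    
    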

  2. Targeted Sequencing Reveals Large-Scale Sequence Polymorphism in Maize Candidate Genes for Biomass Production and Composition.

    PubMed

    Muraya, Moses M; Schmutzer, Thomas; Ulpinnis, Chris; Scholz, Uwe; Altmann, Thomas

    2015-01-01

    A major goal of maize genomic research is to identify sequence polymorphisms responsible for phenotypic variation in traits of economic importance. Large-scale detection of sequence variation is critical for linking genes, or genomic regions, to phenotypes. However, due to its size and complexity, it remains expensive to generate whole genome sequences of sufficient coverage for divergent maize lines, even with access to next generation sequencing (NGS) technology. Because methods involving reduction of genome complexity, such as genotyping-by-sequencing (GBS), assess only a limited fraction of sequence variation, targeted sequencing of selected genomic loci offers an attractive alternative. We therefore designed a sequence capture assay to target 29 Mb genomic regions and surveyed a total of 4,648 genes possibly affecting biomass production in 21 diverse inbred maize lines (7 flints, 14 dents). Captured and enriched genomic DNA was sequenced using the 454 NGS platform to 19.6-fold average depth coverage, and a broad evaluation of read alignment and variant calling methods was performed to select optimal procedures for variant discovery. Sequence alignment with the B73 reference and de novo assembly identified 383,145 putative single nucleotide polymorphisms (SNPs), of which 42,685 were non-synonymous alterations and 7,139 caused frameshifts. Presence/absence variation (PAV) of genes was also detected. We found that substantial sequence variation exists among genomic regions targeted in this study, which was particularly evident within coding regions. This diversification has the potential to broaden functional diversity and generate phenotypic variation that may lead to new adaptations and the modification of important agronomic traits. Further, annotated SNPs identified here will serve as useful genetic tools and as candidates in searches for phenotype-altering DNA variation. In summary, we demonstrated that sequencing of captured DNA is a powerful approach for

  3. ChIP-on-chip significance analysis reveals large-scale binding and regulation by human transcription factor oncogenes

    PubMed Central

    Margolin, Adam A.; Palomero, Teresa; Sumazin, Pavel; Califano, Andrea; Ferrando, Adolfo A.; Stolovitzky, Gustavo

    2009-01-01

    ChIP-on-chip has emerged as a powerful tool to dissect the complex network of regulatory interactions between transcription factors and their targets. However, most ChIP-on-chip analysis methods use conservative approaches aimed at minimizing false-positive transcription factor targets. We present a model with improved sensitivity in detecting binding events from ChIP-on-chip data. Its application to human T cells, followed by extensive biochemical validation, reveals that 3 oncogenic transcription factors, NOTCH1, MYC, and HES1, bind to several thousand target gene promoters, up to an order of magnitude increase over conventional analysis methods. Gene expression profiling upon NOTCH1 inhibition shows broad-scale functional regulation across the entire range of predicted target genes, establishing a closer link between occupancy and regulation. Finally, the increased sensitivity reveals a combinatorial regulatory program in which MYC cobinds to virtually all NOTCH1-bound promoters. Overall, these results suggest an unappreciated complexity of transcriptional regulatory networks and highlight the fundamental importance of genome-scale analysis to represent transcriptional programs. PMID:19118200

  4. ChIP-on-chip significance analysis reveals large-scale binding and regulation by human transcription factor oncogenes.

    PubMed

    Margolin, Adam A; Palomero, Teresa; Sumazin, Pavel; Califano, Andrea; Ferrando, Adolfo A; Stolovitzky, Gustavo

    2009-01-06

    ChIP-on-chip has emerged as a powerful tool to dissect the complex network of regulatory interactions between transcription factors and their targets. However, most ChIP-on-chip analysis methods use conservative approaches aimed at minimizing false-positive transcription factor targets. We present a model with improved sensitivity in detecting binding events from ChIP-on-chip data. Its application to human T cells, followed by extensive biochemical validation, reveals that 3 oncogenic transcription factors, NOTCH1, MYC, and HES1, bind to several thousand target gene promoters, up to an order of magnitude increase over conventional analysis methods. Gene expression profiling upon NOTCH1 inhibition shows broad-scale functional regulation across the entire range of predicted target genes, establishing a closer link between occupancy and regulation. Finally, the increased sensitivity reveals a combinatorial regulatory program in which MYC cobinds to virtually all NOTCH1-bound promoters. Overall, these results suggest an unappreciated complexity of transcriptional regulatory networks and highlight the fundamental importance of genome-scale analysis to represent transcriptional programs.

  5. X-ray fluorescent microscopy reveals large-scale relocalization and extracellular translocation of cellular copper during angiogenesis.

    SciTech Connect

    Finney, L.; Mandava, S.; Ursos, L.; Zhang, W.; Rodi, D.; Vogt, S.; Legnini, D.; Maser, J.; Ikpatt, F.; Olopade, O. I.; Glesne, D.; Univ. of Chicago

    2007-02-13

    Although copper has been reported to influence numerous proteins known to be important for angiogenesis, the enhanced sensitivity of this developmental process to copper bioavailability has remained an enigma, because copper metalloproteins are prevalent and essential throughout all cells. Recent developments in x-ray optics at third-generation synchrotron sources have provided a resource for highly sensitive visualization and quantitation of metalloproteins in biological samples. Here, we report the application of x-ray fluorescence microscopy (XFM) to in vitro models of angiogenesis and neurogenesis, revealing a surprisingly dramatic spatial relocalization specific to capillary formation of 80-90% of endogenous cellular copper stores from intracellular compartments to the tips of nascent endothelial cell filopodia and across the cell membrane. Although copper chelation had no effect on process formation, an almost complete ablation of network formation was observed. XFM of highly vascularized ductal carcinomas showed copper clustering in putative neoangiogenic areas. This use of XFM for the study of a dynamic developmental process not only sheds light on the copper requirement for endothelial tube formation but highlights the value of synchrotron-based facilities in biological research.

  6. Large-Scale Structure of the Molecular Gas in Taurus Revealed by High Linear Dynamic Range Spectral Line Mapping

    NASA Astrophysics Data System (ADS)

    Goldsmith, Paul F.; Heyer, Mark; Narayanan, Gopal; Snell, Ronald; Li, Di; Brunt, Chris

    2008-06-01

    We report the results of a 100 deg^2 survey of the Taurus molecular cloud region in ^12CO and ^13CO J = 1→0. The image of the cloud in each velocity channel includes ≃3 × 10^6 Nyquist-sampled pixels on a 20'' grid. The high sensitivity and large spatial dynamic range of the maps reveal a very complex, highly structured cloud morphology, including filaments, cavities, and rings. The axes of the striations seen in the ^12CO emission from relatively diffuse gas are aligned with the direction of the magnetic field. We have developed a statistical method for analyzing the pixels in which ^12CO but not ^13CO is detected, which allows us to determine the CO column in the diffuse portion of the cloud, as well as in the denser regions in which we detect both isotopologues. Using a column-density-dependent model for the CO fractional abundance, we derive the mass of the region mapped to be 2.4 × 10^4 M⊙, more than twice as large as would be obtained using a canonical fixed fractional abundance of ^13CO, and a factor of 3 greater than would be obtained considering only the high column density regions. We determine that half the mass of the cloud is in regions having column density below 2.1 × 10^21 cm^-2. The distribution of young stars in the region covered is highly nonuniform, with the probability of finding a star in a pixel with a specified column density rising sharply above N(H2) = 6 × 10^21 cm^-2. We determine a relatively low star formation efficiency (mass of young stars/mass of molecular gas), between 0.3% and 1.2%, and an average star formation rate during the past 3 Myr of 8 × 10^-5 stars yr^-1.
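
    The conversion from an H2 column density map to a cloud mass is a straightforward sum over pixels, mass = Σ N(H2) × μ m_H × (pixel area). The sketch below uses a toy 2×2 map at the survey's reported median column density; the pixel size and mean molecular weight per H2 (including helium) are illustrative assumptions, not values from the paper.

    ```python
    import numpy as np

    M_H = 1.67e-24        # g, hydrogen atom mass
    MU = 2.8              # assumed mean molecular weight per H2 (includes He)
    PC_CM = 3.086e18      # cm per parsec
    MSUN_G = 1.989e33     # g per solar mass

    def cloud_mass_msun(n_h2_map, pixel_pc):
        """Sum H2 column densities (cm^-2) over square pixels of side
        pixel_pc (pc) and return the total gas mass in solar masses."""
        pixel_area_cm2 = (pixel_pc * PC_CM) ** 2
        mass_g = np.sum(n_h2_map) * MU * M_H * pixel_area_cm2
        return mass_g / MSUN_G

    # Toy 2x2 map at the survey's median column density of 2.1e21 cm^-2,
    # with an assumed 0.1 pc pixel size
    toy = np.full((2, 2), 2.1e21)
    m = cloud_mass_msun(toy, pixel_pc=0.1)
    print(m)
    ```

    The paper's mass estimate additionally requires converting CO columns to H2 via a column-density-dependent abundance model, which this sketch omits.
    
    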

  7. Large-Scale Comparative Genomics Meta-Analysis of Campylobacter jejuni Isolates Reveals Low Level of Genome Plasticity

    PubMed Central

    Taboada, Eduardo N.; Acedillo, Rey R.; Carrillo, Catherine D.; Findlay, Wendy A.; Medeiros, Diane T.; Mykytczuk, Oksana L.; Roberts, Michael J.; Valencia, C. Alexander; Farber, Jeffrey M.; Nash, John H. E.

    2004-01-01

    We have used comparative genomic hybridization (CGH) on a full-genome Campylobacter jejuni microarray to examine genome-wide gene conservation patterns among 51 strains isolated from food and clinical sources. These data have been integrated with data from three previous C. jejuni CGH studies to perform a meta-analysis that included 97 strains from the four separate data sets. Although many genes were found to be divergent across multiple strains (n = 350), many genes (n = 249) were uniquely variable in single strains. Thus, the strains in each data set comprise strains with a unique genetic diversity not found in the strains in the other data sets. Despite the large increase in the collective number of variable C. jejuni genes (n = 599) found in the meta-analysis data set, nearly half of these (n = 276) mapped to previously defined variable loci, and it therefore appears that large regions of the C. jejuni genome are genetically stable. A detailed analysis of the microarray data revealed that divergent genes could be differentiated on the basis of the amplitudes of their differential microarray signals. Of 599 variable genes, 122 could be classified as highly divergent on the basis of CGH data. Nearly all highly divergent genes (117 of 122) had divergent neighbors and showed high levels of intraspecies variability. The approach outlined here has enabled us to distinguish global trends of gene conservation in C. jejuni and has enabled us to define this group of genes as a robust set of variable markers that can become the cornerstone of a new generation of genotyping methods that use genome-wide C. jejuni gene variability data. PMID:15472310

  8. A large-scale analysis of alternative splicing reveals a key role of QKI in lung cancer.

    PubMed

    de Miguel, Fernando J; Pajares, María J; Martínez-Terroba, Elena; Ajona, Daniel; Morales, Xabier; Sharma, Ravi D; Pardo, Francisco J; Rouzaut, Ana; Rubio, Angel; Montuenga, Luis M; Pio, Ruben

    2016-11-01

    Increasing attention has been devoted in recent years to the understanding of alternative splicing in cancer. In this study, we performed a genome-wide analysis to identify cancer-associated splice variants in non-small cell lung cancer. We discovered and validated novel differences in the splicing of genes known to be relevant to lung cancer biology, such as NFIB, ENAH or SPAG9. Gene enrichment analyses revealed an important contribution of alternative splicing to cancer-related molecular functions, especially those involved in cytoskeletal dynamics. Interestingly, a substantial fraction of the altered genes found in our analysis were targets of the protein quaking (QKI), pointing to this factor as one of the most relevant regulators of alternative splicing in non-small cell lung cancer. We also found that ESYT2, one of the QKI targets, is involved in cytoskeletal organization. ESYT2-short variant inhibition in lung cancer cells resulted in a cortical distribution of actin, whereas inhibition of the long variant caused an increase of endocytosis, suggesting that the cancer-associated splicing pattern of ESYT2 has a profound impact on the biology of cancer cells. Finally, we show that low nuclear QKI expression in non-small cell lung cancer is an independent prognostic factor for disease-free survival (HR = 2.47; 95% CI = 1.11-5.46, P = 0.026). In conclusion, we identified several splicing variants with functional relevance in lung cancer largely regulated by the splicing factor QKI, a tumor suppressor associated with prognosis in lung cancer.

  9. Measuring air-sea gas-exchange velocities in a large-scale annular wind-wave tank

    NASA Astrophysics Data System (ADS)

    Mesarchaki, E.; Kräuter, C.; Krall, K. E.; Bopp, M.; Helleis, F.; Williams, J.; Jähne, B.

    2015-01-01

    In this study we present gas-exchange measurements conducted in a large-scale wind-wave tank. Fourteen chemical species spanning a wide range of solubility (dimensionless solubility, α = 0.4 to 5470) and diffusivity (Schmidt number in water, Scw = 594 to 1194) were examined under various turbulent conditions (u10 = 0.73 to 13.2 m s^-1). Additional experiments were performed under different surfactant-modulated surface states (two concentration levels of Triton X-100). This paper details the complete methodology, experimental procedure and instrumentation used to derive the total transfer velocity for all examined tracers. The results presented here demonstrate the efficacy of the proposed method, and the derived gas-exchange velocities are shown to be comparable to previous investigations. The gas transfer behaviour is exemplified by contrasting two species at the two solubility extremes, namely nitrous oxide (N2O) and methanol (CH3OH). Interestingly, a strong transfer velocity reduction (up to a factor of 3) was observed for the relatively insoluble N2O under a surfactant-covered water surface. In contrast, the surfactant effect for CH3OH, the high-solubility tracer, was significantly weaker.
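
    A common way to derive a transfer velocity in a closed tank is from the exponential decay of a waterside tracer during an evasion experiment: c(t) = c0 exp(-k (A/V) t), so k follows from the slope of ln c versus t. The sketch below is a simplified illustration under that assumption; the study's full method also handles air-side concentrations, solubility, and leakage, and the tank geometry used here is invented.

    ```python
    import numpy as np

    def transfer_velocity(times_s, conc, volume_m3, area_m2):
        """Fit ln(c) vs t and return k = -(V/A) * slope, in m/s.

        Assumes a well-mixed water body, negligible air-side concentration,
        and no leakage, so the tracer decays as a single exponential.
        """
        slope, _ = np.polyfit(times_s, np.log(conc), 1)
        return -(volume_m3 / area_m2) * slope

    # Synthetic evasion experiment with a hypothetical k and tank geometry
    k_true = 5e-5                    # m/s
    V, A = 18.0, 21.0                # m^3 water volume, m^2 surface area
    t = np.linspace(0.0, 3600.0, 10)
    c = 100.0 * np.exp(-k_true * A / V * t)
    k = transfer_velocity(t, c, V, A)
    print(k)
    ```
    
    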

  10. Large-scale soil remediation using low temperature thermal volatilization technology at the Chanute Air Force Base

    SciTech Connect

    Davis, H.A.; Silkebakken, D.M.; Ghosh, S.B.; Beardsley, G.P.

    1995-12-31

    Chanute Air Force Base (AFB) in Rantoul, Illinois, was selected for closure by the Round 1 Base Closure Commission, pursuant to the Base Realignment and Closure (BRAC) Act of 1988. As part of the requirements for base closure, Parsons Engineering Science, Inc. was retained by the Air Force Center for Environmental Excellence (AFCEE) to treat petroleum-contaminated soil using low temperature thermal volatilization (LTTV). Using this technology, over 40,000 tons of fuel contaminated soils were successfully treated using one of the largest transportable LTTV treatment units in the world. The soil treatment system, soil management procedures, cost-effectiveness, and limitations of the use of this system are described in this paper.

  11. Large Scale Variability of Mid-Tropospheric Carbon Dioxide as Observed by the Atmospheric Infrared Sounder (AIRS) on the NASA EOS Aqua Platform

    NASA Technical Reports Server (NTRS)

    Pagano, Thomas S.; Olsen, Edward T.

    2012-01-01

    The Atmospheric Infrared Sounder (AIRS) is a hyperspectral infrared instrument on the EOS Aqua spacecraft, launched on May 4, 2002. AIRS has 2378 infrared channels ranging from 3.7 microns to 15.4 microns and a 13.5 km footprint. AIRS, in conjunction with the Advanced Microwave Sounding Unit (AMSU), produces temperature profiles with 1 K/km accuracy, water vapor profiles (20%/2 km), infrared cloud height and fraction, and trace gas amounts for CO2, CO, SO2, O3 and CH4 in the mid to upper troposphere. AIRS' wide swath (±49.5°) enables daily global coverage of over 95% of the Earth's surface. AIRS data are used for weather forecasting, validating climate model distributions and processes, and observing long-range transport of greenhouse gases. In this study, we examine the large-scale and regional horizontal variability in the AIRS Mid-tropospheric Carbon Dioxide product as a function of season and associate the observed variability with known atmospheric transport processes, and sources and sinks of CO2.

  12. Large scale air pollution estimation method combining land use regression and chemical transport modeling in a geostatistical framework.

    PubMed

    Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey

    2014-04-15

    In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.
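
    The general idea of the framework, combining a regional-scale model (CTM) with a fine-scale model (LUR) and anchoring both to monitor observations, can be illustrated with a much simpler least-squares blend. This is not the paper's Bayesian Maximum Entropy method, and the NO2 values below are synthetic.

    ```python
    import numpy as np

    def blend_weights(ctm, lur, obs):
        """Fit obs ~ b0 + b1*CTM + b2*LUR at monitor locations (least squares)."""
        X = np.column_stack([np.ones_like(ctm), ctm, lur])
        beta, *_ = np.linalg.lstsq(X, obs, rcond=None)
        return beta

    def blend_predict(beta, ctm, lur):
        """Apply the fitted blend at unmonitored locations."""
        return beta[0] + beta[1] * ctm + beta[2] * lur

    # Synthetic yearly-average NO2 (ug/m^3) at 50 hypothetical monitors:
    # observations constructed as an exact linear blend for illustration
    rng = np.random.default_rng(1)
    ctm = rng.uniform(10, 40, 50)    # coarse chemical transport model output
    lur = rng.uniform(5, 60, 50)     # fine-scale land use regression output
    obs = 2.0 + 0.6 * ctm + 0.4 * lur
    beta = blend_weights(ctm, lur, obs)
    print(beta)
    ```

    The BME framework goes further by treating the model outputs as soft data and modeling the residual spatial covariance, so that estimates interpolate the monitors exactly while inheriting the models' spatial structure elsewhere.
    
    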

  13. Large scale dynamic systems

    NASA Technical Reports Server (NTRS)

    Doolin, B. F.

    1975-01-01

    Classes of large scale dynamic systems were discussed in the context of modern control theory. Specific examples discussed were in the technical fields of aeronautics, water resources and electric power.

  14. Generation of large-scale, barrier-free diffuse plasmas in air at atmospheric pressure using array wire electrodes and nanosecond high-voltage pulses

    SciTech Connect

    Teng, Yun; Li, Lee; Liu, Yun-Long; Liu, Lun; Liu, Minghai

    2014-10-15

    This paper introduces a method to generate large-scale diffuse plasmas by using a repetitive nanosecond pulse generator and a parallel array wire-electrode configuration. We investigated barrier-free diffuse plasmas produced in the open air in parallel and cross-parallel array line-line electrode configurations. We found that, when the distance between the wire-electrode pair is small, the discharges were almost extinguished. Also, glow-like diffuse plasmas with little discharge weakening were obtained in an appropriate range of line-line distances and with a cathode-grounded cross-electrode configuration. As an example, we produced a large-scale, stable diffuse plasma with a volume as large as 18 × 15 × 15 cm^3, and this discharge region can be further expanded. Additionally, using optical and electrical measurements, we showed that the electron temperature was higher than the gas temperature, which was almost the same as room temperature. Also, array electrode configurations with more wire electrodes helped to prevent the transition from diffuse discharge to arc discharge. Comparing the current waveforms of configurations with 1 cell and 9 cells, we found that adding cells significantly increased the conduction current and the electrical energy delivered in the electrode gaps.

  15. Generation of large-scale, barrier-free diffuse plasmas in air at atmospheric pressure using array wire electrodes and nanosecond high-voltage pulses

    NASA Astrophysics Data System (ADS)

    Teng, Yun; Li, Lee; Liu, Yun-Long; Liu, Lun; Liu, Minghai

    2014-10-01

    This paper introduces a method to generate large-scale diffuse plasmas by using a repetitive nanosecond pulse generator and a parallel array wire-electrode configuration. We investigated barrier-free diffuse plasmas produced in the open air in parallel and cross-parallel array line-line electrode configurations. We found that, when the distance between the wire-electrode pair is small, the discharges were almost extinguished. Also, glow-like diffuse plasmas with little discharge weakening were obtained in an appropriate range of line-line distances and with a cathode-grounded cross-electrode configuration. As an example, we produced a large-scale, stable diffuse plasma with a volume as large as 18 × 15 × 15 cm^3, and this discharge region can be further expanded. Additionally, using optical and electrical measurements, we showed that the electron temperature was higher than the gas temperature, which was almost the same as room temperature. Also, array electrode configurations with more wire electrodes helped to prevent the transition from diffuse discharge to arc discharge. Comparing the current waveforms of configurations with 1 cell and 9 cells, we found that adding cells significantly increased the conduction current and the electrical energy delivered in the electrode gaps.

  16. A Revised Method of Presenting Wavenumber-Frequency Power Spectrum Diagrams That Reveals the Asymmetric Nature of Tropical Large-scale Waves

    NASA Technical Reports Server (NTRS)

    Chao, Winston C.; Yang, Bo; Fu, Xiouhua

    2007-01-01

    The popular method of presenting wavenumber-frequency power spectrum diagrams for studying tropical large-scale waves in the literature is shown to give an incomplete presentation of these waves. The so-called "convectively coupled Kelvin (mixed Rossby-gravity) waves" are presented as existing only in the symmetric (antisymmetric) component of the diagrams. This is obviously not consistent with the published composite/regression studies of "convectively coupled Kelvin waves," which illustrate the asymmetric nature of these waves. The cause of this inconsistency is revealed in this note, and a revised method of presenting the power spectrum diagrams is proposed. When this revised method is used, "convectively coupled Kelvin waves" do show antisymmetric components, and "convectively coupled mixed Rossby-gravity waves" (also known as Yanai waves) do show a hint of symmetric components. These results bolster a published proposal that these waves be called "chimeric Kelvin waves," "chimeric mixed Rossby-gravity waves," etc. This revised method of presenting power spectrum diagrams offers a more rigorous means of comparing general circulation model (GCM) output with observations by calling attention to the capability of GCMs in correctly simulating the asymmetric characteristics of the equatorial waves.
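
    The symmetric/antisymmetric split that underlies these diagrams is a simple pointwise decomposition about the equator: S(y) = [f(y) + f(-y)]/2 and A(y) = [f(y) - f(-y)]/2, computed per longitude and time before the Fourier transforms. A minimal sketch, assuming a field whose first axis is a latitude grid symmetric about 0:

    ```python
    import numpy as np

    def decompose(field, lat):
        """Split field(lat, ...) into its equatorially symmetric and
        antisymmetric parts. Assumes `lat` is symmetric about 0 (e.g.,
        -15..15 deg), so reversing the latitude axis maps lat -> -lat."""
        assert np.allclose(lat, -lat[::-1]), "latitude grid must be symmetric about the equator"
        flipped = field[::-1, ...]
        sym = 0.5 * (field + flipped)
        asym = 0.5 * (field - flipped)
        return sym, asym

    # Toy field with a known even part (lat^2) and odd part (lat^3)
    lat = np.array([-10.0, -5.0, 0.0, 5.0, 10.0])
    f = lat ** 2 + lat ** 3
    sym, asym = decompose(f, lat)
    ```

    In the standard diagrams, the power spectra of `sym` and `asym` are then plotted separately; the note's point is that physical waves generally project onto both components.
    
    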

  17. Large scale full-length cDNA sequencing reveals a unique genomic landscape in a lepidopteran model insect, Bombyx mori.

    PubMed

    Suetsugu, Yoshitaka; Futahashi, Ryo; Kanamori, Hiroyuki; Kadono-Okuda, Keiko; Sasanuma, Shun-ichi; Narukawa, Junko; Ajimura, Masahiro; Jouraku, Akiya; Namiki, Nobukazu; Shimomura, Michihiko; Sezutsu, Hideki; Osanai-Futahashi, Mizuko; Suzuki, Masataka G; Daimon, Takaaki; Shinoda, Tetsuro; Taniai, Kiyoko; Asaoka, Kiyoshi; Niwa, Ryusuke; Kawaoka, Shinpei; Katsuma, Susumu; Tamura, Toshiki; Noda, Hiroaki; Kasahara, Masahiro; Sugano, Sumio; Suzuki, Yutaka; Fujiwara, Haruhiko; Kataoka, Hiroshi; Arunkumar, Kallare P; Tomar, Archana; Nagaraju, Javaregowda; Goldsmith, Marian R; Feng, Qili; Xia, Qingyou; Yamamoto, Kimiko; Shimada, Toru; Mita, Kazuei

    2013-09-04

    The establishment of a complete genomic sequence of silkworm, the model species of Lepidoptera, laid a foundation for its functional genomics. A more complete annotation of the genome will benefit functional and comparative studies and accelerate extensive industrial applications for this insect. To realize these goals, we embarked upon a large-scale full-length cDNA collection from 21 full-length cDNA libraries derived from 14 tissues of the domesticated silkworm and performed full sequencing by primer walking for 11,104 full-length cDNAs. The large average intron size was 1904 bp, resulting from a high accumulation of transposons. Using gene models predicted by GLEAN and published mRNAs, we identified 16,823 gene loci on the silkworm genome assembly. Orthology analysis of 153 species, including 11 insects, revealed that among three Lepidoptera including Monarch and Heliconius butterflies, the 403 largest silkworm-specific genes were composed mainly of protective immunity, hormone-related, and characteristic structural proteins. Analysis of testis-/ovary-specific genes revealed distinctive features of sexual dimorphism, including depletion of ovary-specific genes on the Z chromosome in contrast to an enrichment of testis-specific genes. More than 40% of genes expressed in specific tissues mapped in tissue-specific chromosomal clusters. The newly obtained FL-cDNA sequences enabled us to annotate the genome of this lepidopteran model insect more accurately, enhancing genomic and functional studies of Lepidoptera and comparative analyses with other insect orders, and yielding new insights into the evolution and organization of lepidopteran-specific genes.

  18. Large Scale Full-Length cDNA Sequencing Reveals a Unique Genomic Landscape in a Lepidopteran Model Insect, Bombyx mori

    PubMed Central

    Suetsugu, Yoshitaka; Futahashi, Ryo; Kanamori, Hiroyuki; Kadono-Okuda, Keiko; Sasanuma, Shun-ichi; Narukawa, Junko; Ajimura, Masahiro; Jouraku, Akiya; Namiki, Nobukazu; Shimomura, Michihiko; Sezutsu, Hideki; Osanai-Futahashi, Mizuko; Suzuki, Masataka G; Daimon, Takaaki; Shinoda, Tetsuro; Taniai, Kiyoko; Asaoka, Kiyoshi; Niwa, Ryusuke; Kawaoka, Shinpei; Katsuma, Susumu; Tamura, Toshiki; Noda, Hiroaki; Kasahara, Masahiro; Sugano, Sumio; Suzuki, Yutaka; Fujiwara, Haruhiko; Kataoka, Hiroshi; Arunkumar, Kallare P.; Tomar, Archana; Nagaraju, Javaregowda; Goldsmith, Marian R.; Feng, Qili; Xia, Qingyou; Yamamoto, Kimiko; Shimada, Toru; Mita, Kazuei

    2013-01-01

    The establishment of a complete genomic sequence of silkworm, the model species of Lepidoptera, laid a foundation for its functional genomics. A more complete annotation of the genome will benefit functional and comparative studies and accelerate extensive industrial applications for this insect. To realize these goals, we embarked upon a large-scale full-length cDNA collection from 21 full-length cDNA libraries derived from 14 tissues of the domesticated silkworm and performed full sequencing by primer walking for 11,104 full-length cDNAs. The large average intron size was 1904 bp, resulting from a high accumulation of transposons. Using gene models predicted by GLEAN and published mRNAs, we identified 16,823 gene loci on the silkworm genome assembly. Orthology analysis of 153 species, including 11 insects, revealed that among three Lepidoptera including Monarch and Heliconius butterflies, the 403 largest silkworm-specific genes were composed mainly of protective immunity, hormone-related, and characteristic structural proteins. Analysis of testis-/ovary-specific genes revealed distinctive features of sexual dimorphism, including depletion of ovary-specific genes on the Z chromosome in contrast to an enrichment of testis-specific genes. More than 40% of genes expressed in specific tissues mapped in tissue-specific chromosomal clusters. The newly obtained FL-cDNA sequences enabled us to annotate the genome of this lepidopteran model insect more accurately, enhancing genomic and functional studies of Lepidoptera and comparative analyses with other insect orders, and yielding new insights into the evolution and organization of lepidopteran-specific genes. PMID:23821615

  19. Large-scale phosphotyrosine proteomic profiling of rat renal collecting duct epithelium reveals predominance of proteins involved in cell polarity determination.

    PubMed

    Zhao, Boyang; Knepper, Mark A; Chou, Chung-Lin; Pisitkun, Trairak

    2012-01-01

    Although extensive phosphoproteomic information is available for renal epithelial cells, previous emphasis has been on phosphorylation of serines and threonines with little focus on tyrosine phosphorylation. Here we have carried out large-scale identification of phosphotyrosine sites in pervanadate-treated native inner medullary collecting ducts of rat, with a view towards identification of physiological processes in epithelial cells that are potentially regulated by tyrosine phosphorylation. The method combined antibody-based affinity purification of tyrosine phosphorylated peptides coupled with immobilized metal ion chromatography to enrich tyrosine phosphopeptides, which were identified by LC-MS/MS. A total of 418 unique tyrosine phosphorylation sites in 273 proteins were identified. A large fraction of these sites have not been previously reported on standard phosphoproteomic databases. All results are accessible via an online database: http://helixweb.nih.gov/ESBL/Database/iPY/. Analysis of surrounding sequences revealed four overrepresented motifs: [D/E]xxY*, Y*xxP, DY*, and Y*E, where the asterisk symbol indicates the site of phosphorylation. These motifs plus contextual information, integrated using the NetworKIN tool, suggest that the protein tyrosine kinases involved include members of the insulin- and ephrin-receptor kinase families. Analysis of the gene ontology (GO) terms and KEGG pathways whose protein elements are overrepresented in our data set point to structures involved in epithelial cell-cell and cell-matrix interactions ("adherens junction," "tight junction," and "focal adhesion") and to components of the actin cytoskeleton as major sites of tyrosine phosphorylation in these cells. In general, these findings mesh well with evidence that tyrosine phosphorylation plays a key role in epithelial polarity determination.
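
    The four overrepresented motifs reported above ([D/E]xxY*, Y*xxP, DY*, and Y*E, with * marking the phosphosite) can be checked mechanically against a peptide window centered on the phosphotyrosine. A minimal sketch; the peptide windows are invented for illustration, not sequences from the study.

    ```python
    def motifs_at(window, center):
        """Return the motifs matched by peptide `window`, whose
        phosphotyrosine sits at index `center`."""
        hits = []
        if center >= len(window) or window[center] != "Y":
            return hits
        if center >= 3 and window[center - 3] in "DE":
            hits.append("[D/E]xxY*")          # D or E three residues upstream
        if center + 3 < len(window) and window[center + 3] == "P":
            hits.append("Y*xxP")              # P three residues downstream
        if center >= 1 and window[center - 1] == "D":
            hits.append("DY*")                # D immediately upstream
        if center + 1 < len(window) and window[center + 1] == "E":
            hits.append("Y*E")                # E immediately downstream
        return hits

    # Hypothetical 7-mer windows with the phosphotyrosine at index 3
    print(motifs_at("EAAYEAP", 3))
    print(motifs_at("AADYAAA", 3))
    ```

    Overrepresentation analyses then compare the frequency of such matches in the identified phosphosites against a background of all tyrosines in the proteome.
    
    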

  20. Structural dynamics and cation interactions of DNA quadruplex molecules containing mixed guanine/cytosine quartets revealed by large-scale MD simulations.

    PubMed

    Spacková, N; Berger, I; Sponer, J

    2001-04-11

    Large-scale molecular dynamics (MD) simulations have been utilized to study G-DNA quadruplex molecules containing mixed GCGC and all-guanine GGGG quartet layers. Incorporation of mixed GCGC quartets into G-DNA stems substantially enhances their sequence variability. The mixed quadruplexes form rigid assemblies that require integral monovalent cations for their stabilization. The interaction of cations with the all-guanine quartets is the leading contribution to the stability of the four-stranded assemblies, while the mixed quartets are rather tolerated within the structure. The simulations predict that two cations are preferred to stabilize a four-layer quadruplex stem composed of two GCGC and two all-guanine quartets. The distribution of cations in the structure is influenced by the position of the GCGC quartets within the quadruplex, the presence and arrangement of thymidine loops connecting the guanine/cytosine stretches forming the stems, and the cation type present (Na(+) or K(+)). The simulations identify multiple nanosecond-scale stable arrangements of the thymidine loops present in the molecules investigated. In these thymidine loops, several structured pockets are identified capable of temporarily coordinating cations. However, no stable association of cations to a loop has been observed. The simulations reveal several paths through the thymidine loop regions that can be followed by the cations when exchanging between the central ion channel in the quadruplex stem and the surrounding solvent. We have carried out 20 independent simulations, with a total simulated time of 90 ns, rendering this study one of the most extensive MD investigations carried out on nucleic acids so far. The trajectories provide a largely converged characterization of the structural dynamics of these four-stranded G-DNA molecules.

  1. Effects of sex and proficiency in second language processing as revealed by a large-scale fNIRS study of school-aged children.

    PubMed

    Sugiura, Lisa; Ojima, Shiro; Matsuba-Kurita, Hiroko; Dan, Ippeita; Tsuzuki, Daisuke; Katura, Takusige; Hagiwara, Hiroko

    2015-10-01

    Previous neuroimaging studies in adults have revealed that first and second languages (L1/L2) share similar neural substrates, and that proficiency is a major determinant of the neural organization of L2 in the lexical-semantic and syntactic domains. However, little is known about neural substrates of children in the phonological domain, or about sex differences. Here, we conducted a large-scale study (n = 484) of school-aged children using functional near-infrared spectroscopy and a word repetition task, which requires a great extent of phonological processing. We investigated cortical activation during word processing, emphasizing sex differences, to clarify similarities and differences between L1 and L2, and proficiency-related differences during early L2 learning. L1 and L2 shared similar neural substrates with decreased activation in L2 compared to L1 in the posterior superior/middle temporal and angular/supramarginal gyri for both sexes. Significant sex differences were found in cortical activation within language areas during high-frequency word but not during low-frequency word processing. During high-frequency word processing, widely distributed areas including the angular/supramarginal gyri were activated in boys, while more restricted areas, excluding the angular/supramarginal gyri were activated in girls. Significant sex differences were also found in L2 proficiency-related activation: activation significantly increased with proficiency in boys, whereas no proficiency-related differences were found in girls. Importantly, cortical sex differences emerged with proficiency. Based on previous research, the present results indicate that sex differences are acquired or enlarged during language development through different cognitive strategies between sexes, possibly reflecting their different memory functions.

  2. Rock-avalanche dynamics revealed by large-scale field mapping and seismic signals at a highly mobile avalanche in the West Salt Creek valley, western Colorado

    USGS Publications Warehouse

    Coe, Jeffrey A.; Baum, Rex L.; Allstadt, Kate; Kochevar, Bernard; Schmitt, Robert G.; Morgan, Matthew L.; White, Jonathan L.; Stratton, Benjamin T.; Hayashi, Timothy A.; Kean, Jason W.

    2016-01-01

    On 25 May 2014, a rain-on-snow–induced rock avalanche occurred in the West Salt Creek valley on the northern flank of Grand Mesa in western Colorado (United States). The avalanche mobilized from a preexisting rock slide in the Green River Formation and traveled 4.6 km down the confined valley, killing three people. The avalanche was rare for the contiguous United States because of its large size (54.5 Mm3) and high mobility (height/length = 0.14). To understand the avalanche failure sequence, mechanisms, and mobility, we conducted a forensic analysis using large-scale (1:1000) structural mapping and seismic data. We used high-resolution, unmanned aircraft system imagery as a base for field mapping, and analyzed seismic data from 22 broadband stations (distances < 656 km from the rock-slide source area) and one short-period network. We inverted broadband data to derive a time series of forces that the avalanche exerted on the earth and tracked these forces using curves in the avalanche path. Our results revealed that the rock avalanche was a cascade of landslide events, rather than a single massive failure. The sequence began with an early morning landslide/debris flow that started ∼10 h before the main avalanche. The main avalanche lasted ∼3.5 min and traveled at average velocities ranging from 15 to 36 m/s. For at least two hours after the avalanche ceased movement, a central, hummock-rich core continued to move slowly. Since 25 May 2014, numerous shallow landslides, rock slides, and rock falls have created new structures and modified avalanche topography. Mobility of the main avalanche and central core was likely enhanced by valley floor material that liquefied from undrained loading by the overriding avalanche. Although the base was likely at least partially liquefied, our mapping indicates that the overriding avalanche internally deformed predominantly by sliding along discrete shear surfaces in material that was nearly dry and had substantial frictional …

  3. Multi-modal analysis of functional connectivity and cerebral blood flow reveals shared and unique effects of propofol in large-scale brain networks.

    PubMed

    Qiu, Maolin; Scheinost, Dustin; Ramani, Ramachandran; Constable, R Todd

    2017-03-01

    Anesthesia-induced changes in functional connectivity and cerebral blood flow (CBF) in large-scale brain networks have emerged as key markers of reduced consciousness. However, studies of functional connectivity disagree on which large-scale networks are altered or preserved during anesthesia, making it difficult to find a consensus among studies. Additionally, pharmacological alterations in CBF could amplify or occlude changes in connectivity due to the shared variance between CBF and connectivity. Here, we used data-driven connectivity methods and multi-modal imaging to investigate shared and unique neural correlates of reduced consciousness for connectivity in large-scale brain networks. Rs-fMRI and CBF data were collected from the same subjects during an awake condition and a deep sedation condition induced by propofol. We measured whole-brain connectivity using the intrinsic connectivity distribution (ICD), a method not reliant on pre-defined seed regions, networks of interest, or connectivity thresholds. The shared and unique variance between connectivity and CBF were investigated. Finally, to account for shared variance, we present a novel extension to ICD that incorporates CBF as a scaling factor in the calculation of global connectivity, labeled CBF-adjusted ICD. We observed altered connectivity in multiple large-scale brain networks including the default mode (DMN), salience, visual, and motor networks, and reduced CBF in the DMN, frontoparietal network, and thalamus. Regional connectivity and CBF were significantly correlated during both the awake and propofol conditions. Nevertheless, changes in connectivity and CBF between the awake and deep sedation conditions were only significantly correlated in a subsystem of the DMN, suggesting that, while there is significant shared variance between the modalities, changes due to propofol are relatively unique. Similar, but less significant, results were observed in the CBF-adjusted ICD analysis, providing …

  4. Large scale traffic simulations

    SciTech Connect

    Nagel, K.; Barrett, C.L.; Rickert, M.

    1997-04-01

    Large-scale microscopic (i.e., vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual persons' behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed well above 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches depend both on the specific questions and on the prospective user community. The approaches range from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for statistical physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.
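The "particle updates" counted above can be made concrete with a minimal single-lane cellular-automaton sketch in the Nagel–Schreckenberg style (a simplified cousin of the microsimulations the abstract reviews, not the actual TRANSIMS code). Road length, maximum speed, and braking probability are illustrative.

```python
import random

L, VMAX, P_BRAKE = 100, 5, 0.3  # circular road of L cells, illustrative parameters

def step(pos, vel, rng):
    """One parallel update: accelerate, brake to the gap, random slowdown, move."""
    n = len(pos)
    order = sorted(range(n), key=lambda i: pos[i])  # cars in road order
    new_pos, new_vel = pos[:], vel[:]
    for k, i in enumerate(order):
        ahead = order[(k + 1) % n]
        gap = (pos[ahead] - pos[i] - 1) % L      # empty cells to the next car
        v = min(vel[i] + 1, VMAX)                # 1. accelerate
        v = min(v, gap)                          # 2. brake to avoid collision
        if v > 0 and rng.random() < P_BRAKE:     # 3. random slowdown
            v -= 1
        new_vel[i] = v
        new_pos[i] = (pos[i] + v) % L            # 4. move
    return new_pos, new_vel

rng = random.Random(42)
pos = sorted(rng.sample(range(L), 20))           # 20 cars at distinct cells
vel = [0] * 20
for _ in range(100):
    pos, vel = step(pos, vel, rng)
print("mean speed:", sum(vel) / len(vel))
```

Each call to `step` performs one update per vehicle, so a city-scale run at the throughput quoted in the abstract would execute millions of such updates every simulated second.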

  5. Large scale tracking algorithms

    SciTech Connect

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination, and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low-resolution sensors, "blob" tracking is the norm. For higher-resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  6. A large-scale phylogeny of Synodontis (Mochokidae, Siluriformes) reveals the influence of geological events on continental diversity during the Cenozoic.

    PubMed

    Pinton, Aurélie; Agnèse, Jean-François; Paugy, Didier; Otero, Olga

    2013-03-01

    To explain the spatial variability of fish taxa at a large scale, two alternative proposals are usually evoked. In recent years, the debate has centred on the relative roles of present and historical processes in shaping biodiversity patterns. In Africa, the processes that determine the large-scale distribution of fishes and the role of historical contingencies have been under-investigated, given that most phylogenetic studies focus on the history of the Great Lakes. Here, we explore phylogeographic events in the evolutionary history of Synodontis (Mochokidae, Siluriformes) across Africa during the Cenozoic, focusing on the putative role of historical processes. We discuss how known geological events, together with hydrographical changes, contributed to shaping Synodontis biogeographical history. Synodontis was chosen on the basis of its high diversity and distribution in Africa: it comprises approximately 120 species that are widely distributed in all hydrographic basins except the Maghreb and South Africa. We propose the most comprehensive phylogeny of this catfish genus to date. Our results provide support for the 'hydrogeological' hypothesis, which proposes that palaeohydrological changes linked with the geological context may have driven the diversification of freshwater fish deep in the Tertiary. More precisely, the two main geological structures that shaped the hydrographical network in Africa, namely the Central African Shear Zone and the East African Rift System, appear as strong drivers of Synodontis diversification and evolution.

  7. Volunteer Conservation Action Data Reveals Large-Scale and Long-Term Negative Population Trends of a Widespread Amphibian, the Common Toad (Bufo bufo)

    PubMed Central

    Petrovan, Silviu O.

    2016-01-01

    Rare and threatened species are the most frequent focus of conservation science and action. With the ongoing shift from single-species conservation towards the preservation of ecosystem services, there is a greater need to understand abundance trends of common species because declines in common species can disproportionately impact ecosystem function. We used volunteer-collected data in two European countries, the United Kingdom (UK) and Switzerland, since the 1970s to assess national and regional trends for one of Europe’s most abundant amphibian species, the common toad (Bufo bufo). Millions of toads were moved by volunteers across roads during this period in an effort to protect them from road traffic. For Switzerland, we additionally estimated trends for the common frog (Rana temporaria), a similarly widespread and common amphibian species. We used state-space models to account for variability in detection and effort and included only populations with at least 5 years of data; 153 populations for the UK and 141 for Switzerland. Common toads declined continuously in each decade in both countries since the 1980s. Given the declines, this common species almost qualifies for International Union for the Conservation of Nature (IUCN) red-listing over this period despite volunteer conservation efforts. Reasons for the declines and wider impacts remain unknown. By contrast, common frog populations were stable or increasing in Switzerland, although there was evidence of declines after 2003. “Toads on Roads” schemes are vital citizen conservation action projects, and the data from such projects can be used for large scale trend estimations of widespread amphibians. We highlight the need for increased research into the status of common amphibian species in addition to conservation efforts focusing on rare and threatened species. PMID:27706154

  8. A large-scale genetic analysis reveals a strong contribution of the HLA class II region to giant cell arteritis susceptibility.

    PubMed

    Carmona, F David; Mackie, Sarah L; Martín, Jose-Ezequiel; Taylor, John C; Vaglio, Augusto; Eyre, Stephen; Bossini-Castillo, Lara; Castañeda, Santos; Cid, Maria C; Hernández-Rodríguez, José; Prieto-González, Sergio; Solans, Roser; Ramentol-Sintas, Marc; González-Escribano, M Francisca; Ortiz-Fernández, Lourdes; Morado, Inmaculada C; Narváez, Javier; Miranda-Filloy, José A; Beretta, Lorenzo; Lunardi, Claudio; Cimmino, Marco A; Gianfreda, Davide; Santilli, Daniele; Ramirez, Giuseppe A; Soriano, Alessandra; Muratore, Francesco; Pazzola, Giulia; Addimanda, Olga; Wijmenga, Cisca; Witte, Torsten; Schirmer, Jan H; Moosig, Frank; Schönau, Verena; Franke, Andre; Palm, Øyvind; Molberg, Øyvind; Diamantopoulos, Andreas P; Carette, Simon; Cuthbertson, David; Forbess, Lindsy J; Hoffman, Gary S; Khalidi, Nader A; Koening, Curry L; Langford, Carol A; McAlear, Carol A; Moreland, Larry; Monach, Paul A; Pagnoux, Christian; Seo, Philip; Spiera, Robert; Sreih, Antoine G; Warrington, Kenneth J; Ytterberg, Steven R; Gregersen, Peter K; Pease, Colin T; Gough, Andrew; Green, Michael; Hordon, Lesley; Jarrett, Stephen; Watts, Richard; Levy, Sarah; Patel, Yusuf; Kamath, Sanjeet; Dasgupta, Bhaskar; Worthington, Jane; Koeleman, Bobby P C; de Bakker, Paul I W; Barrett, Jennifer H; Salvarani, Carlo; Merkel, Peter A; González-Gay, Miguel A; Morgan, Ann W; Martín, Javier

    2015-04-02

    We conducted a large-scale genetic analysis on giant cell arteritis (GCA), a polygenic immune-mediated vasculitis. A case-control cohort, comprising 1,651 case subjects with GCA and 15,306 unrelated control subjects from six different countries of European ancestry, was genotyped by the Immunochip array. We also imputed HLA data with a previously validated imputation method to perform a more comprehensive analysis of this genomic region. The strongest association signals were observed in the HLA region, with rs477515 representing the highest peak (p = 4.05 × 10(-40), OR = 1.73). A multivariate model including class II amino acids of HLA-DRβ1 and HLA-DQα1 and one class I amino acid of HLA-B explained most of the HLA association with GCA, consistent with previously reported associations of classical HLA alleles like HLA-DRB1(∗)04. An omnibus test on polymorphic amino acid positions highlighted DRβ1 13 (p = 4.08 × 10(-43)) and HLA-DQα1 47 (p = 4.02 × 10(-46)), 56, and 76 (both p = 1.84 × 10(-45)) as relevant positions for disease susceptibility. Outside the HLA region, the most significant loci included PTPN22 (rs2476601, p = 1.73 × 10(-6), OR = 1.38), LRRC32 (rs10160518, p = 4.39 × 10(-6), OR = 1.20), and REL (rs115674477, p = 1.10 × 10(-5), OR = 1.63). Our study provides evidence of a strong contribution of HLA class I and II molecules to susceptibility to GCA. In the non-HLA region, we confirmed a key role for the functional PTPN22 rs2476601 variant and proposed other putative risk loci for GCA involved in Th1, Th17, and Treg cell function.

  9. Volunteer Conservation Action Data Reveals Large-Scale and Long-Term Negative Population Trends of a Widespread Amphibian, the Common Toad (Bufo bufo).

    PubMed

    Petrovan, Silviu O; Schmidt, Benedikt R

    2016-01-01

    Rare and threatened species are the most frequent focus of conservation science and action. With the ongoing shift from single-species conservation towards the preservation of ecosystem services, there is a greater need to understand abundance trends of common species because declines in common species can disproportionately impact ecosystem function. We used volunteer-collected data in two European countries, the United Kingdom (UK) and Switzerland, since the 1970s to assess national and regional trends for one of Europe's most abundant amphibian species, the common toad (Bufo bufo). Millions of toads were moved by volunteers across roads during this period in an effort to protect them from road traffic. For Switzerland, we additionally estimated trends for the common frog (Rana temporaria), a similarly widespread and common amphibian species. We used state-space models to account for variability in detection and effort and included only populations with at least 5 years of data; 153 populations for the UK and 141 for Switzerland. Common toads declined continuously in each decade in both countries since the 1980s. Given the declines, this common species almost qualifies for International Union for the Conservation of Nature (IUCN) red-listing over this period despite volunteer conservation efforts. Reasons for the declines and wider impacts remain unknown. By contrast, common frog populations were stable or increasing in Switzerland, although there was evidence of declines after 2003. "Toads on Roads" schemes are vital citizen conservation action projects, and the data from such projects can be used for large scale trend estimations of widespread amphibians. We highlight the need for increased research into the status of common amphibian species in addition to conservation efforts focusing on rare and threatened species.
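The state-space trend models mentioned in this abstract can be sketched at toy scale. Below is a scalar "local level" model: true log-abundance follows a random walk and yearly counts are observed with noise, filtered with a Kalman filter. The variances and observation series are illustrative, not the study's data or its exact model.

```python
def kalman_filter(ys, q=0.05, r=0.2, x0=0.0, p0=1.0):
    """Scalar Kalman filter for the local-level model:
    state x_t = x_{t-1} + w_t (var q), observation y_t = x_t + v_t (var r)."""
    x, p = x0, p0
    estimates = []
    for y in ys:
        p = p + q                 # predict: random-walk state grows uncertain
        k = p / (p + r)           # Kalman gain
        x = x + k * (y - x)       # update toward the noisy observation
        p = (1 - k) * p
        estimates.append(x)
    return estimates

# Illustrative yearly log-counts drifting downward, echoing the toad declines.
ys = [5.0, 4.9, 4.7, 4.8, 4.5, 4.4, 4.2, 4.1]
xs = kalman_filter(ys, x0=ys[0])
print("filtered trend:", [round(v, 2) for v in xs])
```

The filtered state smooths over year-to-year detection noise, which is why such models are preferred over raw counts for estimating population trends from volunteer data.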

  10. A Large-Scale Genetic Analysis Reveals a Strong Contribution of the HLA Class II Region to Giant Cell Arteritis Susceptibility

    PubMed Central

    Carmona, F. David; Mackie, Sarah L.; Martín, Jose-Ezequiel; Taylor, John C.; Vaglio, Augusto; Eyre, Stephen; Bossini-Castillo, Lara; Castañeda, Santos; Cid, Maria C.; Hernández-Rodríguez, José; Prieto-González, Sergio; Solans, Roser; Ramentol-Sintas, Marc; González-Escribano, M. Francisca; Ortiz-Fernández, Lourdes; Morado, Inmaculada C.; Narváez, Javier; Miranda-Filloy, José A.; Martínez-Berriochoa, Agustín; Unzurrunzaga, Ainhoa; Hidalgo-Conde, Ana; Madroñero-Vuelta, Ana B.; Fernández-Nebro, Antonio; Ordóñez-Cañizares, M. Carmen; Escalante, Begoña; Marí-Alfonso, Begoña; Sopeña, Bernardo; Magro, César; Raya, Enrique; Grau, Elena; Román, José A.; de Miguel, Eugenio; López-Longo, F. Javier; Martínez, Lina; Gómez-Vaquero, Carmen; Fernández-Gutiérrez, Benjamín; Rodríguez-Rodríguez, Luis; Díaz-López, J. Bernardino; Caminal-Montero, Luis; Martínez-Zapico, Aleida; Monfort, Jordi; Tío, Laura; Sánchez-Martín, Julio; Alegre-Sancho, Juan J.; Sáez-Comet, Luis; Pérez-Conesa, Mercedes; Corbera-Bellalta, Marc; García-Villanueva, M. Jesús; Fernández-Contreras, M. Encarnación; Sanchez-Pernaute, Olga; Blanco, Ricardo; Ortego-Centeno, Norberto; Ríos-Fernández, Raquel; Callejas, José L.; Fanlo-Mateo, Patricia; Martínez-Taboada, Víctor M.; Beretta, Lorenzo; Lunardi, Claudio; Cimmino, Marco A.; Gianfreda, Davide; Santilli, Daniele; Ramirez, Giuseppe A.; Soriano, Alessandra; Muratore, Francesco; Pazzola, Giulia; Addimanda, Olga; Wijmenga, Cisca; Witte, Torsten; Schirmer, Jan H.; Moosig, Frank; Schönau, Verena; Franke, Andre; Palm, Øyvind; Molberg, Øyvind; Diamantopoulos, Andreas P.; Carette, Simon; Cuthbertson, David; Forbess, Lindsy J.; Hoffman, Gary S.; Khalidi, Nader A.; Koening, Curry L.; Langford, Carol A.; McAlear, Carol A.; Moreland, Larry; Monach, Paul A.; Pagnoux, Christian; Seo, Philip; Spiera, Robert; Sreih, Antoine G.; Warrington, Kenneth J.; Ytterberg, Steven R.; Gregersen, Peter K.; Pease, Colin T.; Gough, Andrew; Green, Michael; Hordon, Lesley; Jarrett, Stephen; Watts, Richard; Levy, Sarah; Patel, Yusuf; Kamath, Sanjeet; Dasgupta, Bhaskar; Worthington, Jane; Koeleman, Bobby P.C.; de Bakker, Paul I.W.; Barrett, Jennifer H.; Salvarani, Carlo; Merkel, Peter A.; González-Gay, Miguel A.; Morgan, Ann W.; Martín, Javier

    2015-01-01

    We conducted a large-scale genetic analysis on giant cell arteritis (GCA), a polygenic immune-mediated vasculitis. A case-control cohort, comprising 1,651 case subjects with GCA and 15,306 unrelated control subjects from six different countries of European ancestry, was genotyped by the Immunochip array. We also imputed HLA data with a previously validated imputation method to perform a more comprehensive analysis of this genomic region. The strongest association signals were observed in the HLA region, with rs477515 representing the highest peak (p = 4.05 × 10−40, OR = 1.73). A multivariate model including class II amino acids of HLA-DRβ1 and HLA-DQα1 and one class I amino acid of HLA-B explained most of the HLA association with GCA, consistent with previously reported associations of classical HLA alleles like HLA-DRB1∗04. An omnibus test on polymorphic amino acid positions highlighted DRβ1 13 (p = 4.08 × 10−43) and HLA-DQα1 47 (p = 4.02 × 10−46), 56, and 76 (both p = 1.84 × 10−45) as relevant positions for disease susceptibility. Outside the HLA region, the most significant loci included PTPN22 (rs2476601, p = 1.73 × 10−6, OR = 1.38), LRRC32 (rs10160518, p = 4.39 × 10−6, OR = 1.20), and REL (rs115674477, p = 1.10 × 10−5, OR = 1.63). Our study provides evidence of a strong contribution of HLA class I and II molecules to susceptibility to GCA. In the non-HLA region, we confirmed a key role for the functional PTPN22 rs2476601 variant and proposed other putative risk loci for GCA involved in Th1, Th17, and Treg cell function. PMID:25817017

  11. Large Scale Comparative Proteomics of a Chloroplast Clp Protease Mutant Reveals Folding Stress, Altered Protein Homeostasis, and Feedback Regulation of Metabolism*

    PubMed Central

    Zybailov, Boris; Friso, Giulia; Kim, Jitae; Rudella, Andrea; Rodríguez, Verenice Ramírez; Asakura, Yukari; Sun, Qi; van Wijk, Klaas J.

    2009-01-01

    The clpr2-1 mutant is delayed in development due to reduction of the chloroplast ClpPR protease complex. To understand the role of Clp proteases in plastid biogenesis and homeostasis, leaf proteomes of young seedlings of clpr2-1 and wild type were compared by large-scale mass spectrometry-based quantification using an LTQ-Orbitrap and spectral counting, with significance determined by G-tests. Virtually only chloroplast-localized proteins were significantly affected, indicating that the molecular phenotype was confined to the chloroplast. A comparative chloroplast stromal proteome analysis of fully developed plants was used to complement the data set. Chloroplast unfoldase ClpB3 was strongly up-regulated in both young and mature leaves, suggesting widespread and persistent protein folding stress. The importance of ClpB3 in the clpr2-1 mutant was demonstrated by the observation that a CLPR2 and CLPB3 double mutant was seedling-lethal. The observed up-regulation of chloroplast chaperones and protein sorting components further illustrated destabilization of protein homeostasis. Delayed rRNA processing and up-regulation of a chloroplast DEAD box RNA helicase and polynucleotide phosphorylase, but no significant change in accumulation of ribosomal subunits, suggested a bottleneck in ribosome assembly or RNA metabolism. Strong up-regulation of a chloroplast translational regulator TypA/BipA GTPase suggested a specific response in plastid gene expression to the distorted homeostasis. The stromal proteases PreP1,2 were up-regulated, likely constituting compensation for reduced Clp protease activity and possibly shared substrates between the ClpP and PreP protease systems. The thylakoid photosynthetic apparatus was decreased in the seedlings, whereas several structural thylakoid-associated plastoglobular proteins were strongly up-regulated. Two thylakoid-associated reductases involved in isoprenoid and chlorophyll synthesis were up-regulated reflecting feedback from rate …

  12. Characteristics of aerosol types during large-scale transport of air pollution over the Yellow Sea region and at Cheongwon, Korea, in 2008.

    PubMed

    Kim, Hak-Sung; Chung, Yong-Seung; Lee, Sun-Gu

    2012-04-01

    Episodes of large-scale transport of airborne dust and anthropogenic pollutant particles from different sources in the East Asian continent in 2008 were identified by National Oceanic and Atmospheric Administration satellite RGB (red, green, and blue)-composite images and the mass concentrations of ground-level particulate matter. These particles were divided into dust, sea salt, smoke plume, and sulfate by an aerosol classification algorithm. To analyze the aerosol size distribution during large-scale transport of atmospheric aerosols, aerosol optical depth (AOD) and fine aerosol weighting (FW) from Moderate Resolution Imaging Spectroradiometer (MODIS) aerosol products were used over the East Asian region. Six episodes of massive airborne dust particles, originating from sandstorms in northern China, Mongolia, and the Loess Plateau of China, were observed at Cheongwon. Classified dust aerosol types were distributed on a large scale over the Yellow Sea region. The average PM10 and PM2.5 ratios to the total suspended particle (TSP) mass concentration were 70% and 15%, respectively. However, the mass concentration of PM2.5 among TSP increased to as high as 23% in an episode where dust traveled in by way of an industrial area in eastern China. In the other five episodes of anthropogenic pollutant particles that flowed into the Korean Peninsula from eastern China, the anthropogenic pollutant particles were largely detected in the form of smoke over the Yellow Sea region. The average PM10 and PM2.5 ratios to TSP were 82% and 65%, respectively. The ratio of PM2.5 mass concentrations among TSP varied significantly depending on the origin and pathway of the airborne dust particles. The average AOD for the large-scale transport of anthropogenic pollutant particles in the East Asian region was measured to be 0.42 ± 0.17, which is higher than the AOD (0.36 ± 0.13) for airborne dust particles associated with sandstorms. In particular, the region ranging from eastern …

  13. Large scale analysis of co-existing post-translational modifications in histone tails reveals global fine structure of cross-talk.

    PubMed

    Schwämmle, Veit; Aspalter, Claudia-Maria; Sidoli, Simone; Jensen, Ole N

    2014-07-01

    Mass spectrometry (MS) is a powerful analytical method for the identification and quantification of co-existing post-translational modifications in histone proteins. One of the most important challenges in current chromatin biology is to characterize the relationships between co-existing histone marks, the order and hierarchy of their deposition, and their distinct biological functions. We developed the database CrossTalkDB to organize observed and reported co-existing histone marks as revealed by MS experiments of histone proteins and their derived peptides. Statistical assessment revealed sample-specific patterns for the co-frequency of histone post-translational modifications. We implemented a new method to identify positive and negative interplay between pairs of methylation and acetylation marks in proteins. Many of the detected features were conserved between different cell types or exist across species, thereby revealing general rules for cross-talk between histone marks. The observed features are in accordance with previously reported examples of cross-talk. We observed novel types of interplay among acetylated residues, revealing positive cross-talk between nearby acetylated sites but negative cross-talk for distant ones, and for discrete methylation states at Lys-9, Lys-27, and Lys-36 of histone H3, suggesting a more differentiated functional role of methylation beyond the general expectation of enhanced activity at higher methylation states.
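The positive and negative interplay between pairs of marks described in this abstract can be illustrated with a standard log-odds co-occurrence score (in the spirit of the CrossTalkDB analysis, not its exact implementation). The peptide observations below are hypothetical.

```python
import math
from itertools import combinations

# Hypothetical peptide-level observations; each set lists marks seen together.
observations = [
    {"K9me3", "K14ac"}, {"K9me3", "K14ac"}, {"K9me3"},
    {"K27me3"}, {"K27me3", "K36me2"}, {"K14ac"}, set(),
    {"K9me3", "K14ac", "K27me3"},
]

def interplay(obs, a, b, pseudo=0.5):
    """Log-odds of observed co-occurrence vs. the independence expectation.
    Positive = the marks co-occur more often than expected; negative = less."""
    n = len(obs)
    fa = sum(a in o for o in obs) / n
    fb = sum(b in o for o in obs) / n
    fab = sum(a in o and b in o for o in obs) / n
    # pseudo-count keeps the log finite when a pair is never observed together
    return math.log2((fab * n + pseudo) / (fa * fb * n + pseudo))

marks = sorted(set().union(*observations))
for a, b in combinations(marks, 2):
    print(a, b, round(interplay(observations, a, b), 2))
```

Scoring every pair this way across cell types is what lets sample-specific patterns and conserved cross-talk rules emerge from the co-occurrence data.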

  14. Large-Scale Air Mass Characteristics Observed Over the Remote Tropical Pacific Ocean During March-April 1999: Results from PEM-Tropics B Field Experiment

    NASA Technical Reports Server (NTRS)

    Browell, Edward V.; Fenn, Marta A.; Butler, Carolyn F.; Grant, William B.; Ismail, Syed; Ferrare, Richard A.; Kooi, Susan A.; Brackett, Vincent G.; Clayton, Marian B.; Avery, Melody A.

    2001-01-01

    Eighteen long-range flights over the Pacific Ocean between 38°S and 20°N and between 166°E and 90°W were made by the NASA DC-8 aircraft during the NASA Pacific Exploratory Mission (PEM) Tropics B conducted from March 6 to April 18, 1999. Two lidar systems were flown on the DC-8 to remotely measure vertical profiles of ozone (O3), water vapor (H2O), aerosols, and clouds from near the surface to the upper troposphere along their flight track. In situ measurements of a wide range of gases and aerosols were made on the DC-8 for comprehensive characterization of the air and for correlation with the lidar remote measurements. The transition from northeasterly flow of Northern Hemispheric (NH) air on the northern side of the Intertropical Convergence Zone (ITCZ) to generally easterly flow of Southern Hemispheric (SH) air south of the ITCZ was accompanied by a significant decrease in O3, carbon monoxide, hydrocarbons, and aerosols and an increase in H2O. Trajectory analyses indicate that air north of the ITCZ came from Asia and/or the United States, while the air south of the ITCZ had a long residence time over the Pacific, perhaps originating over South America several weeks earlier. Air south of the South Pacific Convergence Zone (SPCZ) came rapidly from the west, originating over Australia or Africa. This air had enhanced O3 and aerosols and an associated decrease in H2O. Average latitudinal and longitudinal distributions of O3 and H2O were constructed from the remote and in situ O3 and H2O data, and these distributions are compared with results from PEM-Tropics A conducted in August-October 1996. During PEM-Tropics B, low O3 air was found in the SH across the entire Pacific Basin at low latitudes. This was in strong contrast to the photochemically enhanced O3 levels found across the central and eastern Pacific low latitudes during PEM-Tropics A. Nine air mass types were identified for PEM-Tropics B based on their O3, aerosols, clouds, and potential vorticity characteristics. The …

  15. Drastic Compensation of Electronic and Solvation Effects on ATP Hydrolysis Revealed through Large-Scale QM/MM Simulations Combined with a Theory of Solutions.

    PubMed

    Takahashi, Hideaki; Umino, Satoru; Miki, Yuji; Ishizuka, Ryosuke; Maeda, Shu; Morita, Akihiro; Suzuki, Makoto; Matubayasi, Nobuyuki

    2017-03-16

    Hydrolysis of adenosine triphosphate (ATP) is the "energy source" for a variety of biochemical processes. In the present work, we address key features of ATP hydrolysis: the relatively moderate value (about -10 kcal/mol) of the standard free energy of reaction, ΔGhyd, and the insensitivity of ΔGhyd to the number of excess electrons on ATP. We conducted quantum mechanical/molecular mechanical (QM/MM) simulations combined with the energy-representation theory of solutions to analyze the electronic-state and solvation contributions to ΔGhyd. It was revealed that the electronic-state contribution to ΔGhyd is largely negative (favorable) upon hydrolysis, due to the reduction of electrostatic repulsion accompanying the breakage of the P-O bond. In contrast, the solvation effect was found to strongly favor the reactant side. Thus, we showed that a drastic compensation of the two opposite effects takes place, leading to the modest value of ΔGhyd at each number of excess electrons examined. The computational analyses were also conducted for pyrophosphate ions (PPi), and the parallelism between the ATP and PPi hydrolyses was confirmed. Classical molecular dynamics simulation was further carried out to discuss the effect of the solvent environment; the insensitivity of ΔGhyd to the number of excess electrons was seen to hold in both water and ethanol solvents.

  16. Crystal structures of yeast beta-alanine synthase complexes reveal the mode of substrate binding and large scale domain closure movements.

    PubMed

    Lundgren, Stina; Andersen, Birgit; Piskur, Jure; Dobritzsch, Doreen

    2007-12-07

    Beta-alanine synthase is the final enzyme of the reductive pyrimidine catabolic pathway, which is responsible for the breakdown of uracil and thymine in higher organisms. The fold of the homodimeric enzyme from the yeast Saccharomyces kluyveri identifies it as a member of the AcyI/M20 family of metallopeptidases. Its subunit consists of a catalytic domain harboring a di-zinc center and a smaller dimerization domain. The present site-directed mutagenesis studies identify Glu(159) and Arg(322) as crucial for catalysis and His(262) and His(397) as functionally important but not essential. We determined the crystal structures of wild-type beta-alanine synthase in complex with the reaction product beta-alanine, and of the mutant E159A with the substrate N-carbamyl-beta-alanine, revealing the closed state of a dimeric AcyI/M20 metallopeptidase-like enzyme. Subunit closure is achieved by an approximately 30° rigid-body domain rotation, which completes the active site by integration of substrate-binding residues that belong to the dimerization domain of the same or the partner subunit. Substrate binding is achieved via a salt bridge, a number of hydrogen bonds, and coordination to one of the zinc ions of the di-metal center.

  17. Molecular approach to annelid regeneration: cDNA subtraction cloning reveals various novel genes that are upregulated during the large-scale regeneration of the oligochaete, Enchytraeus japonensis.

    PubMed

    Myohara, Maroko; Niva, Cintia Carla; Lee, Jae Min

    2006-08-01

    To identify genes specifically activated during annelid regeneration, suppression subtractive hybridization was performed with cDNAs from regenerating and intact Enchytraeus japonensis, a terrestrial oligochaete that can regenerate a complete organism from small body fragments within 4-5 days. Filter array screening subsequently revealed that about 38% of the forward-subtracted cDNA clones contained genes that were upregulated during regeneration. Two hundred seventy-nine of these clones were sequenced and found to contain 165 different sequences (79 known and 86 unknown). Nine clones were fully sequenced and four of these sequences were matched to known genes for glutamine synthetase, glucosidase 1, retinal protein 4, and phosphoribosylaminoimidazole carboxylase, respectively. The remaining five clones encoded an unknown open-reading frame. The expression levels of these genes were highest during blastema formation. Our present results, therefore, demonstrate the great potential of annelids as a new experimental subject for the exploration of unknown genes that play critical roles in animal regeneration.

  18. Large-scale cortical networks and cognition.

    PubMed

    Bressler, S L

    1995-03-01

    The well-known parcellation of the mammalian cerebral cortex into a large number of functionally distinct cytoarchitectonic areas presents a problem for understanding the complex cortical integrative functions that underlie cognition. How do cortical areas having unique individual functional properties cooperate to accomplish these complex operations? Do neurons distributed throughout the cerebral cortex act together in large-scale functional assemblages? This review examines the substantial body of evidence supporting the view that complex integrative functions are carried out by large-scale networks of cortical areas. Pathway tracing studies in non-human primates have revealed widely distributed networks of interconnected cortical areas, providing an anatomical substrate for large-scale parallel processing of information in the cerebral cortex. Functional coactivation of multiple cortical areas has been demonstrated by neurophysiological studies in non-human primates and several different cognitive functions have been shown to depend on multiple distributed areas by human neuropsychological studies. Electrophysiological studies on interareal synchronization have provided evidence that active neurons in different cortical areas may become not only coactive, but also functionally interdependent. The computational advantages of synchronization between cortical areas in large-scale networks have been elucidated by studies using artificial neural network models. Recent observations of time-varying multi-areal cortical synchronization suggest that the functional topology of a large-scale cortical network is dynamically reorganized during visuomotor behavior.

  19. Association analyses of large-scale glycan microarray data reveal novel host-specific substructures in influenza A virus binding glycans

    PubMed Central

    Zhao, Nan; Martin, Brigitte E.; Yang, Chun-Kai; Luo, Feng; Wan, Xiu-Feng

    2015-01-01

    Influenza A viruses can infect a wide variety of animal species and, occasionally, humans. Infection occurs through the binding formed by viral surface glycoprotein hemagglutinin and certain types of glycan receptors on host cell membranes. Studies have shown that the α2,3-linked sialic acid motif (SA2,3Gal) in avian, equine, and canine species; the α2,6-linked sialic acid motif (SA2,6Gal) in humans; and SA2,3Gal and SA2,6Gal in swine are responsible for the corresponding host tropisms. However, more detailed and refined substructures that determine host tropisms are still not clear. Thus, in this study, we applied association mining on a set of glycan microarray data for 211 influenza viruses from five host groups: humans, swine, canine, migratory waterfowl, and terrestrial birds. The results suggest that besides Neu5Acα2–6Galβ, human-origin viruses could bind glycans with Neu5Acα2–8Neu5Acα2–8Neu5Ac and Neu5Gcα2–6Galβ1–4GlcNAc substructures; Galβ and GlcNAcβ terminal substructures, without sialic acid branches, were associated with the binding of human-, swine-, and avian-origin viruses; sulfated Neu5Acα2–3 substructures were associated with the binding of human- and swine-origin viruses. Finally, through three-dimensional structure characterization, we revealed that the role of glycan chain shapes is more important than that of torsion angles or of overall structural similarities in virus host tropisms. PMID:26508590
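The association-mining step described in this abstract can be illustrated with a toy support/confidence computation. This is a hedged sketch only: the records, substructure names, and numbers below are invented for illustration and are not data from the study.

```python
# Toy illustration of association mining between glycan substructures and
# virus host groups. All records below are hypothetical examples.
records = [
    ("human", {"Neu5Aca2-6Galb", "Galb"}),
    ("human", {"Neu5Aca2-6Galb", "sulfated-Neu5Aca2-3"}),
    ("swine", {"Neu5Aca2-3Galb", "sulfated-Neu5Aca2-3"}),
    ("avian", {"Neu5Aca2-3Galb", "Galb"}),
]

def confidence(substructure, host):
    """Confidence of the rule: virus binds `substructure` -> virus origin is `host`."""
    hosts = [h for h, subs in records if substructure in subs]
    return hosts.count(host) / len(hosts) if hosts else 0.0

print(confidence("Neu5Aca2-6Galb", "human"))  # 1.0 on this toy data
```

Rules whose confidence (and support) exceed chosen thresholds would then be reported as candidate host-specific substructures.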

  20. Large-scale mitochondrial DNA analysis in Southeast Asia reveals evolutionary effects of cultural isolation in the multi-ethnic population of Myanmar

    PubMed Central

    2014-01-01

    Background Myanmar is the largest country in mainland Southeast Asia with a population of 55 million people subdivided into more than 100 ethnic groups. Ruled by changing kingdoms and dynasties and lying on the trade route between India and China, Myanmar was influenced by numerous cultures. Since its independence from British occupation, tensions between the ruling Bamar and ethnic minorities increased. Results Our aim was to search for genetic footprints of Myanmar’s geographic, historic and sociocultural characteristics and to contribute to the picture of human colonization by describing and dating new mitochondrial DNA (mtDNA) haplogroups. Therefore, we sequenced the mtDNA control region of 327 unrelated donors and the complete mitochondrial genome of 44 selected individuals according to highest quality standards. Conclusion Phylogenetic analyses of the entire mtDNA genomes uncovered eight new haplogroups and three unclassified basal M-lineages. The multi-ethnic population and the complex history of Myanmar were reflected in its mtDNA heterogeneity. Population genetic analyses of Burmese control region sequences combined with population data from neighboring countries revealed that the Myanmar haplogroup distribution showed a typical Southeast Asian pattern, but also Northeast Asian and Indian influences. The population structure of the extraordinarily diverse Bamar differed from that of the Karen people who displayed signs of genetic isolation. Migration analyses indicated a considerable genetic exchange with an overall positive migration balance from Myanmar to neighboring countries. Age estimates of the newly described haplogroups point to the existence of evolutionary windows where climatic and cultural changes gave rise to mitochondrial haplogroup diversification in Asia. PMID:24467713

  1. Association analyses of large-scale glycan microarray data reveal novel host-specific substructures in influenza A virus binding glycans

    NASA Astrophysics Data System (ADS)

    Zhao, Nan; Martin, Brigitte E.; Yang, Chun-Kai; Luo, Feng; Wan, Xiu-Feng

    2015-10-01

    Influenza A viruses can infect a wide variety of animal species and, occasionally, humans. Infection occurs through the binding formed by viral surface glycoprotein hemagglutinin and certain types of glycan receptors on host cell membranes. Studies have shown that the α2,3-linked sialic acid motif (SA2,3Gal) in avian, equine, and canine species; the α2,6-linked sialic acid motif (SA2,6Gal) in humans; and SA2,3Gal and SA2,6Gal in swine are responsible for the corresponding host tropisms. However, more detailed and refined substructures that determine host tropisms are still not clear. Thus, in this study, we applied association mining on a set of glycan microarray data for 211 influenza viruses from five host groups: humans, swine, canine, migratory waterfowl, and terrestrial birds. The results suggest that besides Neu5Acα2-6Galβ, human-origin viruses could bind glycans with Neu5Acα2-8Neu5Acα2-8Neu5Ac and Neu5Gcα2-6Galβ1-4GlcNAc substructures; Galβ and GlcNAcβ terminal substructures, without sialic acid branches, were associated with the binding of human-, swine-, and avian-origin viruses; sulfated Neu5Acα2-3 substructures were associated with the binding of human- and swine-origin viruses. Finally, through three-dimensional structure characterization, we revealed that the role of glycan chain shapes is more important than that of torsion angles or of overall structural similarities in virus host tropisms.

  2. Large Scale Screening of Digeneans for Neorickettsia Endosymbionts Using Real-Time PCR Reveals New Neorickettsia Genotypes, Host Associations and Geographic Records

    PubMed Central

    Greiman, Stephen E.; Tkach, Vasyl V.; Pulis, Eric; Fayton, Thomas J.; Curran, Stephen S.

    2014-01-01

    Digeneans are endoparasitic flatworms with complex life cycles including one or two intermediate hosts (first of which is always a mollusk) and a vertebrate definitive host. Digeneans may harbor intracellular endosymbiotic bacteria belonging to the genus Neorickettsia (order Rickettsiales, family Anaplasmataceae). Some Neorickettsia are able to invade cells of the digenean's vertebrate host and are known to cause diseases of wildlife and humans. In this study we report the results of screening 771 digenean samples for Neorickettsia collected from various vertebrates in terrestrial, freshwater, brackish, and marine habitats in the United States, China and Australia. Neorickettsia were detected using a newly designed real-time PCR protocol targeting a 152 bp fragment of the heat shock protein coding gene, GroEL, and verified with nested PCR and sequencing of a 1371 bp long region of 16S rRNA. Eight isolates of Neorickettsia have been obtained. Sequence comparison and phylogenetic analysis demonstrated that 7 of these isolates, provisionally named Neorickettsia sp. 1–7 (obtained from allocreadiid Crepidostomum affine, haploporids Saccocoelioides beauforti and Saccocoelioides lizae, faustulid Bacciger sprenti, deropegid Deropegus aspina, a lecithodendriid, and a pleurogenid) represent new genotypes and one (obtained from Metagonimoides oregonensis) was identical to a published sequence of Neorickettsia known as SF agent. All digenean species reported in this study represent new host records. Three of the 6 digenean families (Haploporidae, Pleurogenidae, and Faustulidae) are also reported for the first time as hosts of Neorickettsia. We have detected Neorickettsia in digeneans from China and Australia for the first time based on PCR and sequencing evidence. Our findings suggest that further surveys from broader geographic regions and wider selection of digenean taxa are likely to reveal new Neorickettsia lineages as well as new digenean host associations.

  3. Recent Suicidal Ideation and Suicide Attempts in a Large-Scale Survey of the U.S. Air Force: Prevalences and Demographic Risk Factors

    ERIC Educational Resources Information Center

    Snarr, Jeffery D.; Heyman, Richard E.; Slep, Amy M. Smith

    2010-01-01

    One-year prevalences of self-reported noteworthy suicidal ideation and nonfatal suicide attempts were assessed in a large sample of U.S. Air Force active duty members (N = 52,780). Participants completed the 2006 Community Assessment, which was conducted online. Over 3% of male and 5.5% of female participants reported having experienced noteworthy…

  4. Large-scale structural optimization

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, J.

    1983-01-01

    Problems encountered by aerospace designers in attempting to optimize whole aircraft are discussed, along with possible solutions. Large scale optimization, as opposed to component-by-component optimization, is hindered by computational costs, software inflexibility, concentration on a single, rather than trade-off, design methodology and the incompatibility of large-scale optimization with single program, single computer methods. The software problem can be approached by placing the full analysis outside of the optimization loop. Full analysis is then performed only periodically. Problem-dependent software can be removed from the generic code using a systems programming technique, and then embody the definitions of design variables, objective function and design constraints. Trade-off algorithms can be used at the design points to obtain quantitative answers. Finally, decomposing the large-scale problem into independent subproblems allows systematic optimization of the problems by an organization of people and machines.

  5. Large-scale generic test stand for testing of multiple configurations of air filters utilizing a range of particle size distributions.

    PubMed

    Giffin, Paxton K; Parsons, Michael S; Unz, Ronald J; Waggoner, Charles A

    2012-05-01

    The Institute for Clean Energy Technology (ICET) at Mississippi State University has developed a test stand capable of lifecycle testing of high efficiency particulate air filters and other filters specified in the American Society of Mechanical Engineers Code on Nuclear Air and Gas Treatment (AG-1). The test stand is currently equipped to test AG-1 Section FK radial flow filters, and expansion is underway to increase testing capabilities for other types of AG-1 filters. The test stand is capable of producing differential pressures of 12.45 kPa (50 in. w.c.) at volumetric air flow rates up to 113.3 m3/min (4000 CFM). Testing is performed at elevated and ambient conditions of temperature and relative humidity. Current testing utilizes three challenge aerosols: carbon black, alumina, and Arizona road dust (A1-Ultrafine). Each aerosol has a different mass median diameter to test loading over a wide range of particle sizes. The test stand is designed to monitor and maintain relative humidity and temperature to required specifications. Instrumentation is implemented on the upstream and downstream sections of the test stand as well as on the filter housing itself. Representative data are presented herein illustrating the test stand's capabilities. Digital images of the filter pack collected during and after testing are displayed after the representative data are discussed. In conclusion, the ICET test stand with AG-1 filter testing capabilities has been developed, and hurdles such as test-parameter stability and design flexibility have been overcome.

  6. A spatio-temporal screening tool for outlier detection in long term / large scale air quality observation time series and monitoring networks

    NASA Astrophysics Data System (ADS)

    Kracht, Oliver; Reuter, Hannes I.; Gerboles, Michel

    2013-04-01

    We present a consolidated screening tool for the detection of outliers in air quality monitoring data, which considers both attribute values and spatio-temporal relationships. We also present an application example: warnings on abnormal values in PM10 time series from the AirBase dataset. Spatial or temporal outliers in air quality datasets represent stations or individual measurements which differ significantly from other recordings within their spatio-temporal neighbourhood. Such abnormal values can be identified as being extreme compared to their neighbours, even though they do not necessarily differ significantly from the statistical distribution of the entire population. The identification of such outliers can be of interest as the basis of data quality control systems when several contributors report their measurements for collection into larger datasets. Beyond this, it can also provide a simple solution for investigating the accuracy of station classifications. Seen from another viewpoint, it can be used as a tool to detect irregular air pollution emission events (e.g. the influence of fires, wind erosion events, or other accidental situations). The presented procedure for outlier detection was designed on the basis of existing literature. Specifically, we adapted the "Smooth Spatial Attribute Method" that was first developed for the identification of outlier values in networks of traffic sensors [1]. Since a free and extensible simulation platform was considered important, all code was prototyped in the R environment, which is available under the GNU General Public License [2]. Our algorithms are based on the definition of a neighbourhood for each air quality measurement, corresponding to a spatio-temporal domain limited by time (e.g., +/- 2 days) and distance (e.g., +/- 1 spherical degree) around the location of ambient air monitoring stations.
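The neighbourhood-based screening idea described above can be sketched roughly as follows. This is an illustrative sketch only, not the authors' R implementation: the median/MAD decision rule, the field names, and all thresholds are assumptions chosen for the example.

```python
from datetime import date

# Sketch: an observation is flagged as an outlier when it deviates strongly
# (in robust units) from the median of its spatio-temporal neighbours,
# here defined as +/- 2 days and +/- 1 degree of latitude/longitude.

def neighbours(obs, data, max_days=2, max_deg=1.0):
    """All other observations inside obs's spatio-temporal neighbourhood."""
    return [o for o in data
            if o is not obs
            and abs((o["day"] - obs["day"]).days) <= max_days
            and abs(o["lat"] - obs["lat"]) <= max_deg
            and abs(o["lon"] - obs["lon"]) <= max_deg]

def is_outlier(obs, data, threshold=3.0):
    """Flag obs if it lies far from its neighbourhood median."""
    vals = sorted(o["pm10"] for o in neighbours(obs, data))
    n = len(vals)
    if n < 3:
        return False                      # too few neighbours to judge
    median = vals[n // 2] if n % 2 else 0.5 * (vals[n // 2 - 1] + vals[n // 2])
    mad = sorted(abs(v - median) for v in vals)[n // 2] or 1e-9  # crude spread
    return abs(obs["pm10"] - median) / mad > threshold
```

A production version would additionally weight neighbours by distance and handle station metadata, but the domain-limited neighbourhood query is the core of the screening step.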

  7. Galaxy clustering on large scales.

    PubMed Central

    Efstathiou, G

    1993-01-01

    I describe some recent observations of large-scale structure in the galaxy distribution. The best constraints come from two-dimensional galaxy surveys and studies of angular correlation functions. Results from galaxy redshift surveys are much less precise but are consistent with the angular correlations, provided the distortions in mapping between real-space and redshift-space are relatively weak. The galaxy two-point correlation function, rich-cluster two-point correlation function, and galaxy-cluster cross-correlation function are all well described on large scales (≳20 h⁻¹ Mpc, where the Hubble constant H0 = 100h km s⁻¹ Mpc⁻¹; 1 pc = 3.09 × 10¹⁶ m) by the power spectrum of an initially scale-invariant, adiabatic, cold-dark-matter Universe with Γ = Ωh ≈ 0.2. I discuss how this fits in with the Cosmic Background Explorer (COBE) satellite detection of large-scale anisotropies in the microwave background radiation and other measures of large-scale structure in the Universe. PMID:11607400

  8. FLAME facility: The effect of obstacles and transverse venting on flame acceleration and transition on detonation for hydrogen-air mixtures at large scale

    SciTech Connect

    Sherman, M.P.; Tieszen, S.R.; Benedick, W.B.

    1989-04-01

    This report describes research on flame acceleration and deflagration-to-detonation transition (DDT) for hydrogen-air mixtures carried out in the FLAME facility, and describes its relevance to nuclear reactor safety. Flame acceleration and DDT can generate high peak pressures that may cause failure of containment. FLAME is a large rectangular channel 30.5 m long, 2.44 m high, and 1.83 m wide. It is closed on the ignition end and open on the far end. The three test variables were hydrogen mole fraction (12-30%), degree of transverse venting (by moving steel top plates: 0%, 13%, and 50%), and the absence or presence of certain obstacles in the channel (zero or 33% blockage ratio). The most important variable was the hydrogen mole fraction. The presence of the obstacles tested greatly increased the flame speeds, overpressures, and tendency for DDT compared to similar tests without obstacles. Different obstacle configurations could have greater or lesser effects on flame acceleration and DDT. Large degrees of transverse venting reduced the flame speeds, overpressures, and possibility of DDT. For small degrees of transverse venting (13% top venting), the flame speeds and overpressures were higher than for no transverse venting with reactive mixtures (>18% H2), but they were lower with leaner mixtures. The effect of the turbulence generated by the flow out the vents on increasing flame speed can be larger than the effect of venting gas out of the channel and hence reducing the overpressure. With no obstacles and 50% top venting, the flame speeds and overpressures were low, and there was no DDT. For all other cases, DDT was observed above some threshold hydrogen concentration. DDT was obtained at 15% H2 with obstacles and no transverse venting. 67 refs., 62 figs.

  9. Application of bioreactor system for large-scale production of Eleutherococcus sessiliflorus somatic embryos in an air-lift bioreactor and production of eleutherosides.

    PubMed

    Shohael, A M; Chakrabarty, D; Yu, K W; Hahn, E J; Paek, K Y

    2005-11-04

    Embryogenic callus was induced from leaf explants of Eleutherococcus sessiliflorus cultured on Murashige and Skoog (MS) basal medium supplemented with 1 mg/l 2,4-dichlorophenoxyacetic acid (2,4-D), while no plant growth regulators were needed for embryo maturation. The addition of 1 mg/l 2,4-D was needed to maintain the embryogenic culture by preventing embryo maturation. Optimal embryo germination and plantlet development were achieved on MS medium with 4 mg/l gibberellic acid (GA3). Low-strength MS medium (1/2 and 1/3 strength) was more effective than full-strength MS for the production of normal plantlets with well-developed shoots and roots. The plants were successfully transferred to soil. Embryogenic callus was used to establish a suspension culture for subsequent production of somatic embryos in a bioreactor. By inoculating 10 g of embryogenic cells (fresh weight) into a 3-l balloon-type bubble bioreactor (BTBB) containing 2 l of MS medium without plant growth regulators, 121.8 g of mature somatic embryos at different developmental stages were harvested and could be separated by filtration. Cotyledonary somatic embryos were germinated, and these converted into plantlets following transfer to a 3-l BTBB containing 2 l of MS medium with 4 mg/l GA3. HPLC analysis revealed that the total eleutherosides were significantly higher in leaves of field-grown plants as compared to the different stages of somatic embryo development. However, the content of eleutheroside B was highest in germinated embryos. Germinated embryos also had higher contents of eleutheroside E and eleutheroside E1 as compared to other developmental stages. This result indicates that an efficient protocol for the mass production of E. sessiliflorus biomass can be achieved by bioreactor culture of somatic embryos and can be used as a source of medicinal raw materials.

  10. Domain regulation of imprinting cluster in Kip2/Lit1 subdomain on mouse chromosome 7F4/F5: large-scale DNA methylation analysis reveals that DMR-Lit1 is a putative imprinting control region.

    PubMed

    Yatsuki, Hitomi; Joh, Keiichiro; Higashimoto, Ken; Soejima, Hidenobu; Arai, Yuji; Wang, Youdong; Hatada, Izuho; Obata, Yayoi; Morisaki, Hiroko; Zhang, Zhongming; Nakagawachi, Tetsuji; Satoh, Yuji; Mukai, Tsunehiro

    2002-12-01

    Mouse chromosome 7F4/F5, where the imprinting domain is located, is syntenic to human 11p15.5, the locus for Beckwith-Wiedemann syndrome. The domain is thought to consist of the two subdomains Kip2 (p57(kip2))/Lit1 and Igf2/H19. Because DNA methylation is believed to be a key factor in genomic imprinting, we performed large-scale DNA methylation analysis to identify the cis-element crucial for the regulation of the Kip2/Lit1 subdomain. Ten CpG islands (CGIs) were found, and these were located at the promoter sites, upstream of genes, and within intergenic regions. Bisulphite sequencing revealed that CGIs 4, 5, 8, and 10 were differentially methylated regions (DMRs). CGIs 4, 5, and 10 were methylated paternally in somatic tissues but not in germ cells. CGI8 was methylated in oocyte and maternally in somatic tissues during development. Parental-specific DNase I hypersensitive sites (HSSs) were found near CGI8. These data indicate that CGI8, called DMR-Lit1, is not only the region for gametic methylation but might also be the imprinting control region (ICR) of the subdomain.

  11. Large-scale PACS implementation.

    PubMed

    Carrino, J A; Unkel, P J; Miller, I D; Bowser, C L; Freckleton, M W; Johnson, T G

    1998-08-01

    The transition to filmless radiology is a much more formidable task than making the request for proposal to purchase a Picture Archiving and Communications System (PACS). The Department of Defense and the Veterans Administration have been pioneers in the transformation of medical diagnostic imaging to the electronic environment. Many civilian sites are expected to implement large-scale PACS in the next five to ten years. This presentation will relate the empirical insights gleaned at our institution from a large-scale PACS implementation. Our PACS was introduced into a fully operational department (not a new hospital) in which workflow had to continue with minimal impact. Impediments to user acceptance will be addressed. The critical components of this enormous task will be discussed. The topics covered during this session will include issues such as phased implementation, interaction of devices based on the DICOM (Digital Imaging and Communications in Medicine) standard, the hospital information system (HIS)/radiology information system (RIS) interface, user approval, networking, workstation deployment, and backup procedures. The presentation will make specific suggestions regarding the implementation team, operating instructions, quality control (QC), training, and education. The concept of identifying key functional areas is relevant to transitioning the facility to be entirely on line. Special attention must be paid to specific functional areas, such as the operating rooms and trauma rooms, where the clinical requirements may not match the PACS capabilities. The printing of films may be necessary in certain circumstances. The integration of teleradiology and remote clinics into a PACS is a salient topic with respect to the overall role of the radiologists providing rapid consultation. A Web-based server allows a clinician to review images and reports on a desktop (personal) computer and thus reduces the number of dedicated PACS review workstations.

  12. Large-Scale Sequence Comparison.

    PubMed

    Lal, Devi; Verma, Mansi

    2017-01-01

    There are millions of sequences deposited in genomic databases, and it is an important task to categorize them according to their structural and functional roles. Sequence comparison is a prerequisite for proper categorization of both DNA and protein sequences, and helps in assigning a putative or hypothetical structure and function to a given sequence. There are various methods available for comparing sequences; alignment is first and foremost among them, for sequences with a small number of base pairs as well as for large-scale genome comparison. Various tools are available for performing pairwise comparison of large sequences. The best-known tools either perform global alignment or generate local alignments between the two sequences. In this chapter we first provide basic information regarding sequence comparison. This is followed by a description of the PAM and BLOSUM matrices that form the basis of sequence comparison. We also give a practical overview of currently available methods such as BLAST and FASTA, followed by a description and overview of tools available for genome comparison, including LAGAN, MUMmer, BLASTZ, and AVID.
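The global-alignment idea underlying the tools surveyed in this chapter can be sketched with the classic Needleman-Wunsch dynamic program. The match/mismatch/gap scores below are simple illustrative values, not entries from a PAM or BLOSUM matrix:

```python
def nw_score(a, b, match=1, mismatch=-1, gap=-1):
    """Optimal global alignment score of sequences a and b (Needleman-Wunsch)."""
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):
        dp[i][0] = i * gap          # align a[:i] against leading gaps
    for j in range(1, cols):
        dp[0][j] = j * gap          # align b[:j] against leading gaps
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]

print(nw_score("GATTACA", "GCATGCU"))  # 0 with these scores
```

Real tools replace the flat match/mismatch values with a substitution matrix (PAM or BLOSUM for proteins) and use heuristics (as in BLAST and FASTA) to avoid filling the full dynamic-programming table for large databases.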

  13. Large Scale Magnetostrictive Valve Actuator

    NASA Technical Reports Server (NTRS)

    Richard, James A.; Holleman, Elizabeth; Eddleman, David

    2008-01-01

    Marshall Space Flight Center's Valves, Actuators and Ducts Design and Development Branch developed a large-scale magnetostrictive valve actuator. The potential advantages of this technology are faster, more efficient valve actuators that consume less power, provide precise position control, and deliver higher flow rates than conventional solenoid valves. Magnetostrictive materials change dimensions when a magnetic field is applied; this property is referred to as magnetostriction. Magnetostriction is caused by the alignment of the magnetic domains in the material's crystalline structure with the applied magnetic field lines. Typically, the material changes shape by elongating in the axial direction and constricting in the radial direction, resulting in no net change in volume. All hardware fabrication and testing are complete. This paper will discuss the potential applications of the technology; give an overview of the as-built actuator design; discuss problems uncovered during development testing; review test data and evaluate weaknesses of the design; and discuss areas for improvement for future work. This actuator holds promise as a low-power, high-load, proportionally controlled actuator for valves requiring 440 to 1500 newtons of load.

  14. Large scale cluster computing workshop

    SciTech Connect

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of 1000s of processors that will be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  15. Large-Scale Information Systems

    SciTech Connect

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew out of collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  16. Conifer defence against insects: microarray gene expression profiling of Sitka spruce (Picea sitchensis) induced by mechanical wounding or feeding by spruce budworms (Choristoneura occidentalis) or white pine weevils (Pissodes strobi) reveals large-scale changes of the host transcriptome.

    PubMed

    Ralph, Steven G; Yueh, Hesther; Friedmann, Michael; Aeschliman, Dana; Zeznik, Jeffrey A; Nelson, Colleen C; Butterfield, Yaron S N; Kirkpatrick, Robert; Liu, Jerry; Jones, Steven J M; Marra, Marco A; Douglas, Carl J; Ritland, Kermit; Bohlmann, Jörg

    2006-08-01

    defence. Refined expression analysis using gene-specific primers and real-time PCR for selected transcripts was in agreement with microarray results for most genes tested. This study provides the first large-scale survey of insect-induced defence transcripts in a gymnosperm and provides a platform for functional investigation of plant-insect interactions in spruce. Induction of spruce genes of octadecanoid and ethylene signalling, terpenoid biosynthesis, and phenolic secondary metabolism is discussed in more detail.

  17. A study of MLFMA for large-scale scattering problems

    NASA Astrophysics Data System (ADS)

    Hastriter, Michael Larkin

    This research is centered in computational electromagnetics with a focus on solving large-scale problems accurately in a timely fashion using first-principles physics. Error control of the translation operator in 3-D is shown. A parallel implementation of the multilevel fast multipole algorithm (MLFMA) was studied with respect to parallel efficiency and scaling. The large-scale scattering program (LSSP), based on the ScaleME library, was used to solve ultra-large-scale problems, including a 200λ sphere with 20 million unknowns. As these large-scale problems were solved, techniques were developed to accurately estimate the memory requirements. Careful memory management is needed in order to solve these massive problems. The study of MLFMA in large-scale problems revealed significant errors that stemmed from inconsistencies in constants used by different parts of the algorithm. These were fixed to produce the most accurate data possible for large-scale surface scattering problems. Data was calculated on a missile-like target using both high-frequency methods and MLFMA. This data was compared and analyzed to determine possible strategies to increase data acquisition speed and accuracy through hybridization of multiple computation methods.
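    The abstract does not give its memory-estimation technique, but a back-of-the-envelope comparison shows why careful memory accounting matters at this scale: dense storage of the impedance matrix is impossible for 20 million unknowns, whereas an O(N log N) footprint is tractable. The constant in the MLFMA estimate below is purely illustrative, not taken from the paper.

```python
import math

def dense_matrix_bytes(n):
    """Storage for a dense N x N impedance matrix of complex doubles (16 bytes each)."""
    return n * n * 16

def mlfma_bytes(n, c=100):
    """Rough O(N log N) MLFMA footprint; c is an illustrative bytes-per-unknown constant."""
    return c * n * math.log2(n)

n = 20_000_000  # unknown count for the 200-wavelength sphere from the abstract
print(f"dense : {dense_matrix_bytes(n) / 1e15:.1f} PB")   # petabytes: infeasible
print(f"MLFMA : {mlfma_bytes(n) / 1e9:.1f} GB (illustrative)")
```

The petabyte-scale dense requirement is exact arithmetic; the gigabyte-scale MLFMA figure only illustrates the O(N log N) scaling argument.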

  18. Large scale mechanical metamaterials as seismic shields

    NASA Astrophysics Data System (ADS)

    Miniaci, Marco; Krushynska, Anastasiia; Bosia, Federico; Pugno, Nicola M.

    2016-08-01

    Earthquakes represent one of the most catastrophic natural events affecting mankind. At present, a universally accepted risk mitigation strategy for seismic events remains to be proposed. Most approaches are based on vibration isolation of structures rather than on the remote shielding of incoming waves. In this work, we propose a novel approach to the problem and discuss the feasibility of a passive isolation strategy for seismic waves based on large-scale mechanical metamaterials, including, for the first time, numerical analysis of both surface and guided waves, soil dissipation effects, and full 3D simulations. The study focuses on realistic structures that can be effective in frequency ranges of interest for seismic waves, and optimal design criteria are provided, exploring different metamaterial configurations combining phononic crystals and locally resonant structures across different ranges of mechanical properties. Dispersion analysis and full-scale 3D transient wave transmission simulations are carried out on finite-size systems to assess the seismic wave amplitude attenuation in realistic conditions. Results reveal that both surface and bulk seismic waves can be considerably attenuated, making this strategy viable for the protection of civil structures against seismic risk. The proposed remote shielding approach could open up new perspectives in the field of seismology and in related areas of low-frequency vibration damping or blast protection.

  19. Large Scale Metal Additive Techniques Review

    SciTech Connect

    Nycz, Andrzej; Adediran, Adeola I; Noakes, Mark W; Love, Lonnie J

    2016-01-01

    In recent years, additive manufacturing has made long strides toward becoming a mainstream production technology. Particularly strong progress has been made in large-scale polymer deposition. However, large-scale metal additive manufacturing has not yet reached parity with its polymer counterpart. This paper is a review of metal additive techniques in the context of building large structures. Current commercial devices are capable of printing metal parts on the order of several cubic feet, compared to hundreds of cubic feet on the polymer side. In order to follow the polymer progress path, several factors are considered: potential to scale, economy, environmental friendliness, material properties, feedstock availability, robustness of the process, quality and accuracy, potential for defects, and post-processing, as well as potential applications. This paper focuses on the current state of the art of large-scale metal additive technology with an emphasis on expanding the geometric limits.

  20. Large-scale regions of antimatter

    SciTech Connect

    Grobov, A. V. Rubin, S. G.

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  1. Large-scale velocity structures in turbulent thermal convection.

    PubMed

    Qiu, X L; Tong, P

    2001-09-01

    A systematic study of large-scale velocity structures in turbulent thermal convection is carried out in three different aspect-ratio cells filled with water. Laser Doppler velocimetry is used to measure the velocity profiles and statistics over varying Rayleigh numbers Ra and at various spatial positions across the whole convection cell. Large velocity fluctuations are found both in the central region and near the cell boundary. Despite the large velocity fluctuations, the flow field still maintains a large-scale quasi-two-dimensional structure, which rotates in a coherent manner. This coherent single-roll structure scales with Ra and can be divided into three regions in the rotation plane: (1) a thin viscous boundary layer, (2) a fully mixed central core region with a constant mean velocity gradient, and (3) an intermediate plume-dominated buffer region. The experiment reveals a unique driving mechanism for the large-scale coherent rotation in turbulent convection.

  2. Global Wildfire Forecasts Using Large Scale Climate Indices

    NASA Astrophysics Data System (ADS)

    Shen, Huizhong; Tao, Shu

    2016-04-01

    Using weather readings, fire early-warning systems can provide forecasts 4-6 hours in advance to minimize fire losses. The benefit would be dramatically enhanced if relatively accurate long-term projections could also be provided. Here we present a novel method for predicting global fire season severity (FSS) at least three months in advance using multiple large-scale climate indices (CIs). The predictive ability is proven effective for various geographic locations and resolutions. Globally, as well as in most continents, the El Niño Southern Oscillation (ENSO) is the dominant driving force controlling interannual FSS variability, whereas other CIs also play indispensable roles. We found that a moderate El Niño event is responsible for 465 (272-658, interquartile range) Tg of carbon release and an annual increase of 29,500 (24,500-34,800) deaths from inhalation exposure to air pollutants. Southeast Asia accounts for half of the deaths. Both the intercorrelation and the interaction of WPs and CIs are revealed, suggesting possible climate-induced modification of fire responses to weather conditions. Our models can benefit fire management in response to climate change.

  3. Large-scale integration of small molecule-induced genome-wide transcriptional responses, Kinome-wide binding affinities and cell-growth inhibition profiles reveal global trends characterizing systems-level drug action.

    PubMed

    Vidović, Dušica; Koleti, Amar; Schürer, Stephan C

    2014-01-01

    The Library of Integrated Network-based Cellular Signatures (LINCS) project is a large-scale coordinated effort to build a comprehensive systems biology reference resource. The goals of the program include the generation of a very large multidimensional data matrix and informatics and computational tools to integrate, analyze, and make the data readily accessible. LINCS data include genome-wide transcriptional signatures, biochemical protein binding profiles, cellular phenotypic response profiles and various other datasets for a wide range of cell model systems and molecular and genetic perturbations. Here we present a partial survey of this data facilitated by data standards and in particular a robust compound standardization workflow; we integrated several types of LINCS signatures and analyzed the results with a focus on mechanism of action (MoA) and chemical compounds. We illustrate how kinase targets can be related to disease models and relevant drugs. We identified some fundamental trends that appear to link Kinome binding profiles and transcriptional signatures to chemical information and biochemical binding profiles to transcriptional responses independent of chemical similarity. To fill gaps in the datasets we developed and applied predictive models. The results can be interpreted at the systems level as demonstrated based on a large number of signaling pathways. We can identify clear global relationships, suggesting robustness of cellular responses to chemical perturbation. Overall, the results suggest that chemical similarity is a useful measure at the systems level, which would support phenotypic drug optimization efforts. With this study we demonstrate the potential of such integrated analysis approaches and suggest prioritizing further experiments to fill the gaps in the current data.

  4. Survey on large scale system control methods

    NASA Technical Reports Server (NTRS)

    Mercadal, Mathieu

    1987-01-01

    The problems inherent to large-scale systems such as power networks, communication networks, and economic or ecological systems were studied. The increase in size and flexibility of future spacecraft has put those dynamical systems into the category of large-scale systems, and tools specific to this class are being sought to design control systems that can guarantee more stability and better performance. Among several survey papers, reference was found to a thorough investigation of decentralized control methods. Especially helpful was the classification made of the different existing approaches to dealing with large-scale systems. A very similar classification is used here, even though the papers surveyed differ somewhat from the ones reviewed in other surveys. Special attention is given to the applicability of the existing methods to controlling large mechanical systems such as large space structures. Some recent developments are added to this survey.

  5. Large-scale nanophotonic phased array.

    PubMed

    Sun, Jie; Timurdogan, Erman; Yaacobi, Ami; Hosseini, Ehsan Shah; Watts, Michael R

    2013-01-10

    Electromagnetic phased arrays at radio frequencies are well known and have enabled applications ranging from communications to radar, broadcasting and astronomy. The ability to generate arbitrary radiation patterns with large-scale phased arrays has long been pursued. Although it is extremely expensive and cumbersome to deploy large-scale radiofrequency phased arrays, optical phased arrays have a unique advantage in that the much shorter optical wavelength holds promise for large-scale integration. However, the short optical wavelength also imposes stringent requirements on fabrication. As a consequence, although optical phased arrays have been studied with various platforms and recently with chip-scale nanophotonics, all of the demonstrations so far are restricted to one-dimensional or small-scale two-dimensional arrays. Here we report the demonstration of a large-scale two-dimensional nanophotonic phased array (NPA), in which 64 × 64 (4,096) optical nanoantennas are densely integrated on a silicon chip within a footprint of 576 μm × 576 μm with all of the nanoantennas precisely balanced in power and aligned in phase to generate a designed, sophisticated radiation pattern in the far field. We also show that active phase tunability can be realized in the proposed NPA by demonstrating dynamic beam steering and shaping with an 8 × 8 array. This work demonstrates that a robust design, together with state-of-the-art complementary metal-oxide-semiconductor technology, allows large-scale NPAs to be implemented on compact and inexpensive nanophotonic chips. In turn, this enables arbitrary radiation pattern generation using NPAs and therefore extends the functionalities of phased arrays beyond conventional beam focusing and steering, opening up possibilities for large-scale deployment in applications such as communication, laser detection and ranging, three-dimensional holography and biomedical sciences, to name just a few.
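    As a rough illustration of the phased-array physics described above (not the authors' design tool), the far-field intensity of a uniformly spaced 2-D array is approximately the 2-D DFT of its element excitations, so applying a linear phase ramp across the aperture steers the main beam. A minimal numpy sketch with a hypothetical steering choice:

```python
import numpy as np

N = 64  # 64 x 64 array, matching the NPA scale reported above
kx, ky = 8, 0  # steer the beam to the 8th spatial-frequency bin (hypothetical)
i, j = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")

# Uniform amplitude; a linear phase ramp across the aperture steers the beam.
excitation = np.exp(2j * np.pi * (kx * i + ky * j) / N)

# Array factor: 2-D DFT of the element excitations; intensity peaks where phases align.
far_field = np.abs(np.fft.fft2(excitation)) ** 2
peak = np.unravel_index(np.argmax(far_field), far_field.shape)
print(peak)  # main lobe lands at bin (8, 0)
```

Changing (kx, ky) moves the main lobe, which is the essence of the dynamic beam steering demonstrated with the 8 × 8 active array.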

  6. The large-scale distribution of galaxies

    NASA Technical Reports Server (NTRS)

    Geller, Margaret J.

    1989-01-01

    The spatial distribution of galaxies in the universe is characterized on the basis of the six completed strips of the Harvard-Smithsonian Center for Astrophysics redshift-survey extension. The design of the survey is briefly reviewed, and the results are presented graphically. Vast low-density voids similar to the void in Bootes are found, almost completely surrounded by thin sheets of galaxies. Also discussed are the implications of the results for the survey sampling problem, the two-point correlation function of the galaxy distribution, the possibility of detecting large-scale coherent flows, theoretical models of large-scale structure, and the identification of groups and clusters of galaxies.

  7. Large-scale assembly of colloidal particles

    NASA Astrophysics Data System (ADS)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear-align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films made by doctor blade coating are covered in this study. The first topic describes an invention in large-area, low-cost color reflective displays, inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample's brilliant color returns. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals.
Capillary condensation of a condensable vapor in the interconnected macropores leads to the

  8. Large scale features and energetics of the hybrid subtropical low `Duck' over the Tasman Sea

    NASA Astrophysics Data System (ADS)

    Pezza, Alexandre Bernardes; Garde, Luke Andrew; Veiga, José Augusto Paixão; Simmonds, Ian

    2014-01-01

    New aspects of the genesis and partial tropical transition of a rare hybrid subtropical cyclone on the eastern Australian coast are presented. The `Duck' (March 2001) has attracted renewed attention because its underlying genesis mechanisms are remarkably similar to those of the first South Atlantic hurricane (March 2004). Here we put this cyclone in climate perspective, showing that it belongs to a class within the lowest 1 % frequency percentile in the Southern Hemisphere as a function of its thermal evolution. A large-scale analysis reveals a combined influence from an existing tropical cyclone and a persistent mid-latitude block. A Lagrangian tracer showed that the upper-level air parcels arriving at the cyclone's center had been modified by the blocking. Lorenz energetics is used to identify connections with both tropical and extratropical processes, and to reveal how these create the large-scale environment conducive to the development of the vortex. The results reveal that the blocking exerted the most important influence, with a strong peak in barotropic generation of kinetic energy over a large area traversed by the air parcels just before genesis. A secondary peak also coincided with the first time the cyclone developed an upper-level warm core, but with insufficient amplitude to allow for a full tropical transition. The applications of this technique are numerous and promising, particularly in the use of global climate models to infer changes in environmental parameters associated with severe storms.

  9. Management of large-scale technology

    NASA Technical Reports Server (NTRS)

    Levine, A.

    1985-01-01

    Two major themes are addressed in this assessment of the management of large-scale NASA programs: (1) how a high-technology agency fared through a decade marked by a rapid expansion of funds and manpower in the first half and an almost equally rapid contraction in the second; and (2) how NASA combined central planning and control with decentralized project execution.

  10. Evaluating Large-Scale Interactive Radio Programmes

    ERIC Educational Resources Information Center

    Potter, Charles; Naidoo, Gordon

    2009-01-01

    This article focuses on the challenges involved in conducting evaluations of interactive radio programmes in South Africa with large numbers of schools, teachers, and learners. It focuses on the role such large-scale evaluation has played during the South African radio learning programme's development stage, as well as during its subsequent…

  11. Large-scale Intermittency In A Topographically Perturbed Atmospheric Boundary Layer

    NASA Astrophysics Data System (ADS)

    Cava, D.; Schipa, S.; Giostra, U.

    Flow perturbation due to complex topography has been investigated. The analysed data were collected upstream of and at the top of a steep ridge at Inexpressible Island, Antarctica. The comparison between spectra of wind velocity data collected upstream and at the obstacle top highlights the non-equilibrium of large eddies on the ridge. In accordance with rapid distortion theory, the normalised spectra at the summit of the obstacle display a spectral lag effect. Moreover, the topographic perturbation produces low-frequency secondary spectral maxima in all wind velocity components. Time series of the wind velocity fluctuations and of the instantaneous momentum flux clearly reveal large-scale burst-like structures. The intermittent character of the large-scale perturbation has been investigated using an adaptive multiresolution data filter based on wavelet transform theory. This technique allows us to determine the occurrence and duration of the intermittent events. Quadrant analysis of the wavelet coefficients highlights the leading role of sweeps (i.e. motions of high-speed air towards the surface) in transporting momentum flux at frequencies characteristic of the large-scale topographic perturbation.
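    Quadrant analysis, mentioned above, partitions instantaneous momentum-flux samples by the signs of the velocity fluctuations u' and w'; sweeps are quadrant 4 (u' > 0, w' < 0). A minimal numpy sketch on raw u'w' samples (the paper applies the same idea to wavelet coefficients) with synthetic data:

```python
import numpy as np

def quadrant_fractions(u, w):
    """Fraction of total |u'w'| carried by each quadrant.
    1: outward interaction (u'>0, w'>0)   2: ejection (u'<0, w'>0)
    3: inward interaction  (u'<0, w'<0)   4: sweep    (u'>0, w'<0)"""
    up, wp = u - u.mean(), w - w.mean()
    flux = up * wp
    total = np.abs(flux).sum()
    masks = [(up > 0) & (wp > 0), (up < 0) & (wp > 0),
             (up < 0) & (wp < 0), (up > 0) & (wp < 0)]
    return [np.abs(flux[m]).sum() / total for m in masks]

# Synthetic record with downward momentum transport: w' anticorrelated with u'
rng = np.random.default_rng(0)
u = rng.normal(size=10_000)
w = -0.8 * u + 0.3 * rng.normal(size=10_000)
q1, q2, q3, q4 = quadrant_fractions(u, w)
print(f"sweep fraction of |flux|: {q4:.2f}")
```

With anticorrelated u' and w', ejections (q2) and sweeps (q4) dominate the flux, mirroring the sweep-led transport reported above.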

  12. Large-scale Advanced Propfan (LAP) program

    NASA Technical Reports Server (NTRS)

    Sagerser, D. A.; Ludemann, S. G.

    1985-01-01

    The propfan is an advanced propeller concept which maintains the high efficiencies traditionally associated with conventional propellers at the higher aircraft cruise speeds associated with jet transports. The large-scale advanced propfan (LAP) program extends the research done on 2 ft diameter propfan models to a 9 ft diameter article. The program includes design, fabrication, and testing of both an eight bladed, 9 ft diameter propfan, designated SR-7L, and a 2 ft diameter aeroelastically scaled model, SR-7A. The LAP program is complemented by the propfan test assessment (PTA) program, which takes the large-scale propfan and mates it with a gas generator and gearbox to form a propfan propulsion system and then flight tests this system on the wing of a Gulfstream 2 testbed aircraft.

  13. Condition Monitoring of Large-Scale Facilities

    NASA Technical Reports Server (NTRS)

    Hall, David L.

    1999-01-01

    This document provides a summary of the research conducted for the NASA Ames Research Center under grant NAG2-1182 (Condition-Based Monitoring of Large-Scale Facilities). The information includes copies of view graphs presented at NASA Ames in the final Workshop (held during December of 1998), as well as a copy of a technical report provided to the COTR (Dr. Anne Patterson-Hine) subsequent to the workshop. The material describes the experimental design, collection of data, and analysis results associated with monitoring the health of large-scale facilities. In addition to this material, a copy of the Pennsylvania State University Applied Research Laboratory data fusion visual programming tool kit was also provided to NASA Ames researchers.

  14. Large-scale instabilities of helical flows

    NASA Astrophysics Data System (ADS)

    Cameron, Alexandre; Alexakis, Alexandros; Brachet, Marc-Étienne

    2016-10-01

    Large-scale hydrodynamic instabilities of periodic helical flows of a given wave number K are investigated using three-dimensional Floquet numerical computations. In the Floquet formalism the unstable field is expanded in modes of different spatial periodicity. This allows us (i) to clearly distinguish large- from small-scale instabilities and (ii) to study modes of wave number q of arbitrarily large scale separation q ≪ K. Different flows are examined, including flows that exhibit small-scale turbulence. The growth rate σ of the most unstable mode is measured as a function of the scale separation q/K ≪ 1 and the Reynolds number Re. It is shown that the growth rate follows the scaling σ ∝ q if an AKA effect [Frisch et al., Physica D: Nonlinear Phenomena 28, 382 (1987), 10.1016/0167-2789(87)90026-1] is present, or a negative-eddy-viscosity scaling σ ∝ q² in its absence. This holds both in the Re ≪ 1 regime, where previously derived asymptotic results are verified, and at Re = O(1), which is beyond their range of validity. Furthermore, for values of Re above a critical value Re_Sc beyond which small-scale instabilities are present, the growth rate becomes independent of q and the energy of the perturbation at large scales decreases with scale separation. The nonlinear behavior of these large-scale instabilities is also examined; in the nonlinear regime the largest scales of the system are found to be the most dominant energetically. These results are interpreted using low-order models.
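    A growth rate σ such as the one measured above is commonly extracted by a least-squares fit of the logarithm of the perturbation energy, since in the linear stage E(t) ∝ exp(2σt). A minimal sketch of such a fit on synthetic data (not the authors' Floquet code; the value σ = 0.3 is an arbitrary test input):

```python
import numpy as np

def growth_rate(t, energy):
    """Fit ln E(t) = 2*sigma*t + const by least squares and return sigma."""
    slope, _ = np.polyfit(t, np.log(energy), 1)
    return 0.5 * slope

# Synthetic linear-stage energy record with sigma = 0.3 (arbitrary units)
t = np.linspace(0.0, 10.0, 200)
E = 1e-8 * np.exp(2 * 0.3 * t)
print(f"sigma = {growth_rate(t, E):.3f}")
```

Repeating the fit over a range of q and regressing log σ against log q would distinguish the σ ∝ q (AKA) scaling from the σ ∝ q² (negative eddy viscosity) scaling discussed above.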

  15. Large-Scale Aerosol Modeling and Analysis

    DTIC Science & Technology

    2008-09-30

    aerosol species up to six days in advance anywhere on the globe. NAAPS and COAMPS are particularly useful for forecasts of dust storms in areas...impact cloud processes globally. With increasing dust storms due to climate change and land use changes in desert regions, the impact of the...bacteria in large-scale dust storms is expected to significantly impact warm ice cloud formation, human health, and ecosystems globally. In Niemi et al

  16. Economically viable large-scale hydrogen liquefaction

    NASA Astrophysics Data System (ADS)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.
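    The ortho-para conversion named above as a critical aspect matters because the equilibrium para-hydrogen fraction shifts from about 25 % at room temperature to nearly 100 % at the ~20 K storage temperature, and the conversion releases heat that boils off product unless handled in the liquefier. A sketch from the standard rotational partition functions, assuming a rotational constant B/k_B ≈ 87.6 K for H2 (an approximate literature value, not from this paper):

```python
import math

B = 87.6  # H2 rotational constant expressed in kelvin (B/k_B), approximate

def para_fraction(T, jmax=20):
    """Equilibrium para fraction: even-J levels (nuclear spin weight 1)
    versus odd-J levels (weight 3), Boltzmann-weighted at temperature T."""
    z_para = sum((2 * J + 1) * math.exp(-B * J * (J + 1) / T) for J in range(0, jmax, 2))
    z_ortho = sum(3 * (2 * J + 1) * math.exp(-B * J * (J + 1) / T) for J in range(1, jmax, 2))
    return z_para / (z_para + z_ortho)

print(f"para at 300 K: {para_fraction(300.0):.2f}")  # about 0.25
print(f"para at  20 K: {para_fraction(20.0):.4f}")   # above 0.99
```

The large swing between these two values is why catalytic ortho-para conversion stages are integrated into the cold end of large-scale liquefiers.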

  17. Large-Scale Visual Data Analysis

    NASA Astrophysics Data System (ADS)

    Johnson, Chris

    2014-04-01

    Modern high-performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods, and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high-performance visualization research challenges and opportunities.

  18. Large-scale neuromorphic computing systems

    NASA Astrophysics Data System (ADS)

    Furber, Steve

    2016-10-01

    Neuromorphic computing covers a diverse range of approaches to information processing all of which demonstrate some degree of neurobiological inspiration that differentiates them from mainstream conventional computing systems. The philosophy behind neuromorphic computing has its origins in the seminal work carried out by Carver Mead at Caltech in the late 1980s. This early work influenced others to carry developments forward, and advances in VLSI technology supported steady growth in the scale and capability of neuromorphic devices. Recently, a number of large-scale neuromorphic projects have emerged, taking the approach to unprecedented scales and capabilities. These large-scale projects are associated with major new funding initiatives for brain-related research, creating a sense that the time and circumstances are right for progress in our understanding of information processing in the brain. In this review we present a brief history of neuromorphic engineering then focus on some of the principal current large-scale projects, their main features, how their approaches are complementary and distinct, their advantages and drawbacks, and highlight the sorts of capabilities that each can deliver to neural modellers.

  19. Equivalent common path method in large-scale laser comparator

    NASA Astrophysics Data System (ADS)

    He, Mingzhao; Li, Jianshuang; Miao, Dongjing

    2015-02-01

    The large-scale laser comparator is the main standard device providing accurate, reliable, and traceable measurements for high-precision large-scale line and 3D measurement instruments. It is mainly composed of a guide rail, a motion control system, an environmental parameter monitoring system, and a displacement measurement system. In the laser comparator, the main error sources are the temperature distribution, the straightness of the guide rail, and the pitch and yaw of the measuring carriage. To minimize the measurement uncertainty, an equivalent common optical path scheme is proposed and implemented. Three laser interferometers are aligned parallel to the guide rail. The displacement along an arbitrary virtual optical path is calculated from the three measured displacements without knowledge of the carriage orientation at the start and end positions. The orientation of the air-floating carriage is calculated from the displacements of the three optical paths and the positions of the three retroreflectors, which are precisely measured by a Laser Tracker. A fourth laser interferometer is placed in the virtual optical path as a reference to verify this compensation method. This paper analyzes the effect of rail straightness on the displacement measurement. The proposed method, through experimental verification, can improve the measurement uncertainty of the large-scale laser comparator.
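    The virtual-path calculation described above can be illustrated as follows: for a rigid carriage at small pitch and yaw, axial displacement varies affinely over the carriage cross-section, so three parallel interferometer readings determine the displacement along any virtual beam. The retroreflector layout and readings below are hypothetical, not the authors' data:

```python
import numpy as np

def virtual_displacement(points, d_measured, p_virtual):
    """Affine (rigid-body, small-angle) interpolation: solve d = a + b*y + c*z
    from three (y, z) retroreflector positions, then evaluate at the virtual point."""
    A = np.column_stack([np.ones(3), points[:, 0], points[:, 1]])
    a, b, c = np.linalg.solve(A, d_measured)
    return a + b * p_virtual[0] + c * p_virtual[1]

# Hypothetical retroreflector cross-section coordinates (mm) and readings (um);
# the small differences between readings encode carriage pitch and yaw.
pts = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0]])
d = np.array([50.000, 50.020, 49.990])
dv = virtual_displacement(pts, d, (50.0, 50.0))
print(dv)  # displacement along a virtual beam at the cross-section center
```

The same coefficients b and c give the carriage pitch and yaw angles, which is how the orientation is recovered from the three optical paths.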

  20. Large-Scale Fusion of Gray Matter and Resting-State Functional MRI Reveals Common and Distinct Biological Markers across the Psychosis Spectrum in the B-SNIP Cohort

    PubMed Central

    Wang, Zheng; Meda, Shashwath A.; Keshavan, Matcheri S.; Tamminga, Carol A.; Sweeney, John A.; Clementz, Brett A.; Schretlen, David J.; Calhoun, Vince D.; Lui, Su; Pearlson, Godfrey D.

    2015-01-01

    We investigated whether aberrant interactions between brain structure and function present similarly or differently across probands with psychotic illnesses [schizophrenia (SZ), schizoaffective disorder (SAD), and bipolar I disorder with psychosis (BP)] and whether these deficits are shared with their first-degree non-psychotic relatives. A total of 1199 subjects were assessed, including 220 SZ, 147 SAD, 180 psychotic BP, 150 first-degree relatives of SZ, 126 SAD relatives, 134 BP relatives, and 242 healthy controls. All subjects underwent structural MRI (sMRI) and resting-state functional MRI (rs-fMRI) scanning. Joint-independent component analysis (jICA) was used to fuse sMRI gray matter and rs-fMRI amplitude of low-frequency fluctuations data to identify the relationship between the two modalities. jICA revealed two significantly fused components. The association between functional brain alteration in a prefrontal–striatal–thalamic–cerebellar network and structural abnormalities in the default mode network was found to be common across psychotic diagnoses and correlated with cognitive function, social function, and schizo-bipolar scale scores. The fused alteration in the temporal lobe was unique to SZ and SAD. The above effects were not seen in any relative group (including those with cluster-A personality). Using a multivariate-fused approach involving two widely used imaging markers, we demonstrate both shared and distinct biological traits across the psychosis spectrum. Furthermore, our results suggest that the above traits are psychosis biomarkers rather than endophenotypes. PMID:26732139
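
    The essence of joint fusion is that both modalities are concatenated feature-wise and decomposed with a single set of per-subject loadings, so each component ties a gray-matter pattern to an fMRI pattern. A hedged sketch using SVD in place of the ICA decomposition step (jICA proper runs ICA on the concatenated matrix; all names and shapes here are illustrative):

```python
import numpy as np

def joint_fusion(sm_features, fmri_features, n_components=2):
    """Sketch of joint multimodal fusion: z-score each modality,
    concatenate along the feature axis, and decompose so that both
    modalities share one set of subject loadings.  jICA replaces the
    SVD used here with ICA on the concatenated matrix."""
    def zscore(X):
        return (X - X.mean(0)) / (X.std(0) + 1e-12)
    X = np.hstack([zscore(sm_features), zscore(fmri_features)])
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    loadings = U[:, :n_components] * S[:n_components]   # per-subject weights
    sm_maps = Vt[:n_components, :sm_features.shape[1]]  # gray-matter part
    fm_maps = Vt[:n_components, sm_features.shape[1]:]  # fMRI part
    return loadings, sm_maps, fm_maps
```

    Group differences are then tested on the shared `loadings`, which is what links a single component to both modalities at once.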

  1. Large-scale sodium spray fire code validation (SOFICOV) test

    SciTech Connect

    Jeppson, D.W.; Muhlestein, L.D.

    1985-01-01

    A large-scale sodium spray fire code validation test was performed in the HEDL 850-m³ Containment System Test Facility (CSTF) as part of the Sodium Spray Fire Code Validation (SOFICOV) program. Six hundred fifty-eight kilograms of sodium were sprayed into an air atmosphere over a period of 2400 s. The sodium spray droplet sizes and spray pattern distribution were estimated. The containment atmosphere temperature and pressure response, the containment wall temperature response and the sodium reaction rate with oxygen were measured. These results are compared to post-test predictions using the SPRAY and NACOM computer codes.

  2. Large-Scale Compton Imaging for Wide-Area Surveillance

    SciTech Connect

    Lange, D J; Manini, H A; Wright, D M

    2006-03-01

    We study the performance of a large-scale Compton imaging detector placed in a low-flying aircraft, used to search wide areas for rad/nuc threat sources. In this paper we investigate the performance potential of equipping aerial platforms with gamma-ray detectors that have photon sensitivity up to a few MeV. We simulate the detector performance, and present receiver operating characteristics (ROC) curves for a benchmark scenario using a ¹³⁷Cs source. The analysis uses a realistic environmental background energy spectrum and includes air attenuation.
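
    A ROC curve of the kind reported here is built by sweeping a detection threshold over the imager's score for source-present and background-only trials, recording true-positive versus false-positive rates. A minimal numpy sketch (the score distributions are synthetic stand-ins, not the simulated detector output):

```python
import numpy as np

def roc_curve(signal_scores, background_scores):
    """ROC from detector score distributions: sweep a threshold from
    high to low and record the true-positive rate (source detected)
    against the false-positive rate (background flagged)."""
    thresholds = np.sort(np.concatenate([signal_scores, background_scores]))[::-1]
    tpr = np.array([(signal_scores >= t).mean() for t in thresholds])
    fpr = np.array([(background_scores >= t).mean() for t in thresholds])
    return fpr, tpr

def auc(fpr, tpr):
    """Trapezoidal area under the ROC curve."""
    return float((0.5 * (tpr[1:] + tpr[:-1]) * (fpr[1:] - fpr[:-1])).sum())
```

    Comparing AUC values across detector configurations is the usual single-number summary of such benchmark scenarios.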

  3. Experimental Simulations of Large-Scale Collisions

    NASA Technical Reports Server (NTRS)

    Housen, Kevin R.

    2002-01-01

    This report summarizes research on the effects of target porosity on the mechanics of impact cratering. Impact experiments conducted on a centrifuge provide direct simulations of large-scale cratering on porous asteroids. The experiments show that large craters in porous materials form mostly by compaction, with essentially no deposition of material into the ejecta blanket that is a signature of cratering in less-porous materials. The ratio of ejecta mass to crater mass is shown to decrease with increasing crater size or target porosity. These results are consistent with the observation that large closely-packed craters on asteroid Mathilde appear to have formed without degradation to earlier craters.

  4. Large-Scale PV Integration Study

    SciTech Connect

    Lu, Shuai; Etingov, Pavel V.; Diao, Ruisheng; Ma, Jian; Samaan, Nader A.; Makarov, Yuri V.; Guo, Xinxin; Hafen, Ryan P.; Jin, Chunlian; Kirkham, Harold; Shlatz, Eugene; Frantzis, Lisa; McClive, Timothy; Karlson, Gregory; Acharya, Dhruv; Ellis, Abraham; Stein, Joshua; Hansen, Clifford; Chadliev, Vladimir; Smart, Michael; Salgo, Richard; Sorensen, Rahn; Allen, Barbara; Idelchik, Boris

    2011-07-29

    This research effort evaluates the impact of large-scale photovoltaic (PV) and distributed generation (DG) output on NV Energy’s electric grid system in southern Nevada. It analyzes the ability of NV Energy’s generation to accommodate increasing amounts of utility-scale PV and DG, and the resulting cost of integrating variable renewable resources. The study was jointly funded by the United States Department of Energy and NV Energy, and conducted by a project team comprised of industry experts and research scientists from Navigant Consulting Inc., Sandia National Laboratories, Pacific Northwest National Laboratory and NV Energy.

  5. What is a large-scale dynamo?

    NASA Astrophysics Data System (ADS)

    Nigro, G.; Pongkitiwanichakul, P.; Cattaneo, F.; Tobias, S. M.

    2017-01-01

    We consider kinematic dynamo action in a sheared helical flow at moderate to high values of the magnetic Reynolds number (Rm). We find exponentially growing solutions which, for large enough shear, take the form of a coherent part embedded in incoherent fluctuations. We argue that at large Rm large-scale dynamo action should be identified by the presence of structures coherent in time, rather than those at large spatial scales. We further argue that although the growth rate is determined by small-scale processes, the period of the coherent structures is set by mean-field considerations.

  6. Neutrinos and large-scale structure

    SciTech Connect

    Eisenstein, Daniel J.

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  7. Large-scale planar lightwave circuits

    NASA Astrophysics Data System (ADS)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  8. Colloquium: Large scale simulations on GPU clusters

    NASA Astrophysics Data System (ADS)

    Bernaschi, Massimo; Bisson, Mauro; Fatica, Massimiliano

    2015-06-01

    Graphics processing units (GPUs) are currently used as a cost-effective platform for computer simulations and big-data processing. Large-scale applications require that multiple GPUs work together, but the efficiency obtained with clusters of GPUs is, at times, sub-optimal because the GPU features are not exploited at their best. We describe how it is possible to achieve excellent efficiency for applications in statistical mechanics, particle dynamics and network analysis by using suitable memory access patterns and mechanisms such as CUDA streams and profiling tools. Similar concepts and techniques may also be applied to other problems, such as the solution of partial differential equations.

  9. Large scale phononic metamaterials for seismic isolation

    SciTech Connect

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large-scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, suggesting that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite-difference time-domain method is used in our calculations to obtain the band structures of the proposed metamaterials.

  10. Large-scale Heterogeneous Network Data Analysis

    DTIC Science & Technology

    2012-07-31

    Data for Multi-Player Influence Maximization on Social Networks." KDD 2012 (Demo). Po-Tzu Chang, Yen-Chieh Huang, Cheng-Lun Yang, Shou-De Lin, Pu-Jen Cheng. "Learning-Based Time-Sensitive Re-Ranking for Web Search." SIGIR 2012 (poster). Hung-Che Lai, Cheng-Te Li, Yi-Chen Lo, and Shou-De Lin. "Exploiting and Evaluating MapReduce for Large-Scale Graph Mining." ASONAM 2012 (Full, 16% acceptance ratio). Hsun-Ping Hsieh, Cheng-Te Li, and Shou-De Lin

  11. Large-scale Intelligent Transportation Systems simulation

    SciTech Connect

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and of Traffic Management Centers (TMCs). The TMC has probe-vehicle tracking capabilities (displaying the position and attributes of instrumented vehicles) and can provide two-way interaction with traffic, issuing advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large-scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
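
    Optimal route planning of the kind the in-vehicle navigation units perform can be sketched as Dijkstra's algorithm over the current link travel times, re-run independently by each vehicle process whenever the TMC broadcasts new link times (the network representation and function below are illustrative, not the prototype's code):

```python
import heapq

def shortest_route(link_times, origin, dest):
    """Dijkstra's algorithm over current link travel times.
    link_times: {node: {neighbor: travel_time}}
    Returns the optimal node sequence and its total travel time."""
    dist, prev = {origin: 0.0}, {}
    pq = [(0.0, origin)]
    while pq:
        d, u = heapq.heappop(pq)
        if u == dest:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in link_times.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    path, node = [dest], dest
    while node != origin:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), dist[dest]
```

    Re-running this when the TMC reports congestion on a link is what lets each simulated vehicle divert, much as real drivers respond to advisories.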

  12. Local gravity and large-scale structure

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, Roman; Vittorio, Nicola; Wyse, Rosemary F. G.

    1990-01-01

    The magnitude and direction of the observed dipole anisotropy of the galaxy distribution can in principle constrain the amount of large-scale power present in the spectrum of primordial density fluctuations. This paper confronts the data, provided by a recent redshift survey of galaxies detected by the IRAS satellite, with the predictions of two cosmological models with very different levels of large-scale power: the biased Cold Dark Matter dominated model (CDM) and a baryon-dominated model (BDM) with isocurvature initial conditions. Model predictions are investigated for the Local Group peculiar velocity, v(R), induced by mass inhomogeneities distributed out to a given radius, R, for R less than about 10,000 km/s. Several convergence measures for v(R) are developed, which can become powerful cosmological tests when deep enough samples become available. For the present data sets, the CDM and BDM predictions are indistinguishable at the 2 sigma level and both are consistent with observations. A promising discriminant between cosmological models is the misalignment angle between v(R) and the apex of the dipole anisotropy of the microwave background.
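
    In linear perturbation theory (a standard result, stated here with assumed notation rather than the paper's), the peculiar velocity induced at the origin by mass fluctuations within radius R is

```latex
\mathbf{v}(R) \;=\; \frac{H_0\, f(\Omega)}{4\pi} \int_{r<R} \delta(\mathbf{r})\, \frac{\hat{\mathbf{r}}}{r^{2}}\; \mathrm{d}^{3}r ,
\qquad f(\Omega) \approx \Omega^{0.6},
```

    so the convergence measures described above amount to tracking how v(R) settles toward a constant as R grows toward the survey depth.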

  13. Large-scale Globally Propagating Coronal Waves.

    PubMed

    Warmuth, Alexander

    Large-scale, globally propagating wave-like disturbances have been observed in the solar chromosphere and by inference in the corona since the 1960s. However, detailed analysis of these phenomena has only been conducted since the late 1990s. This was prompted by the availability of high-cadence coronal imaging data from numerous space-based instruments, which routinely show spectacular globally propagating bright fronts. Coronal waves, as these perturbations are usually referred to, have now been observed in a wide range of spectral channels, yielding a wealth of information. Many findings have supported the "classical" interpretation of the disturbances: fast-mode MHD waves or shocks that are propagating in the solar corona. However, observations that seemed inconsistent with this picture have stimulated the development of alternative models in which "pseudo waves" are generated by magnetic reconfiguration in the framework of an expanding coronal mass ejection. This has resulted in a vigorous debate on the physical nature of these disturbances. This review focuses on demonstrating how the numerous observational findings of the last one and a half decades can be used to constrain our models of large-scale coronal waves, and how a coherent physical understanding of these disturbances is finally emerging.

  14. Engineering management of large scale systems

    NASA Technical Reports Server (NTRS)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high-technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with systems theory, management science, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long-range perspective. Long-range planning has great potential to improve productivity through a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces behind the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large-scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior and decision making) are not emphasized, but they are considered.

  15. Large-scale parametric survival analysis.

    PubMed

    Mittal, Sushil; Madigan, David; Cheng, Jerry Q; Burd, Randall S

    2013-10-15

    Survival analysis has been a topic of active statistical research in the past few decades, with applications spread across several areas. Traditional applications usually consider data with only a small number of predictors and a few hundred or thousand observations. Recent advances in data acquisition techniques and computing power have led to considerable interest in analyzing very-high-dimensional data, where the number of predictor variables and the number of observations range between 10^4 and 10^6. In this paper, we present a tool for performing large-scale regularized parametric survival analysis using a variant of the cyclic coordinate descent method. Through our experiments on two real data sets, we show that application of regularized models to high-dimensional data avoids overfitting and can provide improved predictive performance and calibration over corresponding low-dimensional models.
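
    The cyclic coordinate descent pattern referred to can be illustrated on the simplest regularized model, an L1-penalized linear fit: update one coefficient at a time by soft-thresholding while holding the others fixed. This lasso sketch is a stand-in for intuition, not the authors' survival-likelihood code:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for an L1-regularised linear model.
    Minimises  0.5/n * ||y - X b||^2 + lam * ||b||_1  by sweeping the
    coordinates and applying the soft-thresholding update to each."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ b + X[:, j] * b[j]      # partial residual without b_j
            rho = X[:, j] @ r / n
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_sq[j]
    return b
```

    The appeal for 10^4-10^6 predictors is that each update touches one column at a time, and coefficients thresholded to zero can be skipped cheaply.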

  16. Primer design for large scale sequencing.

    PubMed

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-06-15

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, make this program useful for the large scale design of primers, especially in large sequencing projects.
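
    A fuzzy-logic quality score of the kind PRIDE computes can be sketched as per-criterion desirability functions multiplied together. The two criteria below (GC fraction and the Wallace-rule melting temperature) and their thresholds are illustrative assumptions; the real program combines many more criteria:

```python
def primer_quality(seq):
    """Toy primer score in the spirit of fuzzy-logic scoring.
    Uses GC fraction and the Wallace-rule melting temperature
    Tm = 2*(A+T) + 4*(G+C), each mapped to a [0, 1] desirability
    and multiplied into a single quality value."""
    seq = seq.upper()
    gc = (seq.count("G") + seq.count("C")) / len(seq)
    tm = 2 * (seq.count("A") + seq.count("T")) + 4 * (seq.count("G") + seq.count("C"))

    def desirability(x, lo, hi):
        # 1.0 inside the preferred band, tapering linearly to 0 outside it
        if lo <= x <= hi:
            return 1.0
        return max(0.0, 1.0 - min(abs(x - lo), abs(x - hi)) / (hi - lo))

    return desirability(gc, 0.40, 0.60) * desirability(tm, 52, 65)
```

    Multiplying desirabilities (rather than summing penalties) means one badly violated criterion drives the whole score toward zero, which matches the fuzzy-logic intuition of "all criteria must be acceptable".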

  17. Large scale preparation of pure phycobiliproteins.

    PubMed

    Padgett, M P; Krogmann, D W

    1987-01-01

    This paper describes simple procedures for the purification of large amounts of phycocyanin and allophycocyanin from the cyanobacterium Microcystis aeruginosa. A homogeneous natural bloom of this organism provided hundreds of kilograms of cells. Large samples of cells were broken by freezing and thawing. Repeated extraction of the broken cells with distilled water released phycocyanin first, then allophycocyanin, and provides supporting evidence for the current models of phycobilisome structure. The very low ionic strength of the aqueous extracts allowed allophycocyanin release in a particulate form so that this protein could be easily concentrated by centrifugation. Other proteins in the extract were enriched and concentrated by large scale membrane filtration. The biliproteins were purified to homogeneity by chromatography on DEAE cellulose. Purity was established by HPLC and by N-terminal amino acid sequence analysis. The proteins were examined for stability at various pHs and exposures to visible light.

  18. Large-Scale Organization of Glycosylation Networks

    NASA Astrophysics Data System (ADS)

    Kim, Pan-Jun; Lee, Dong-Yup; Jeong, Hawoong

    2009-03-01

    Glycosylation is a highly complex process to produce a diverse repertoire of cellular glycans that are frequently attached to proteins and lipids. Glycans participate in fundamental biological processes including molecular trafficking and clearance, cell proliferation and apoptosis, developmental biology, immune response, and pathogenesis. N-linked glycans found on proteins are formed by sequential attachments of monosaccharides with the help of a relatively small number of enzymes. Many of these enzymes can accept multiple N-linked glycans as substrates, thus generating a large number of glycan intermediates and their intermingled pathways. Motivated by the quantitative methods developed in complex network research, we investigate the large-scale organization of such N-glycosylation pathways in a mammalian cell. The uncovered results give the experimentally-testable predictions for glycosylation process, and can be applied to the engineering of therapeutic glycoproteins.

  19. Efficient, large scale separation of coal macerals

    SciTech Connect

    Dyrkacz, G.R.; Bloomquist, C.A.A.

    1988-01-01

    The authors believe that the separation of macerals by continuous flow centrifugation offers a simple technique for the large scale separation of macerals. With relatively little cost (approximately $10K), it provides an opportunity for obtaining quite pure maceral fractions. Although they have not completely worked out all the nuances of this separation system, they believe that the problems they have indicated can be minimized to pose only minor inconvenience. It cannot be said that this system completely bypasses the disagreeable tedium or time involved in separating macerals, nor will it by itself overcome the mental inertia required to make maceral separation an accepted necessary fact in fundamental coal science. However, they find their particular brand of continuous flow centrifugation is considerably faster than sink/float separation, can provide a good quality product with even one separation cycle, and permits the handling of more material than a conventional sink/float centrifuge separation.

  20. Large scale cryogenic fluid systems testing

    NASA Technical Reports Server (NTRS)

    1992-01-01

    NASA Lewis Research Center's Cryogenic Fluid Systems Branch (CFSB) within the Space Propulsion Technology Division (SPTD) has the ultimate goal of enabling the long term storage and in-space fueling/resupply operations for spacecraft and reusable vehicles in support of space exploration. Using analytical modeling, ground based testing, and on-orbit experimentation, the CFSB is studying three primary categories of fluid technology: storage, supply, and transfer. The CFSB is also investigating fluid handling, advanced instrumentation, and tank structures and materials. Ground based testing of large-scale systems is done using liquid hydrogen as a test fluid at the Cryogenic Propellant Tank Facility (K-site) at Lewis' Plum Brook Station in Sandusky, Ohio. A general overview of tests involving liquid transfer, thermal control, pressure control, and pressurization is given.

  1. Large-scale optimization of neuron arbors

    NASA Astrophysics Data System (ADS)

    Cherniak, Christopher; Changizi, Mark; Won Kang, Du

    1999-05-01

    At the global as well as the local scale, some of the geometry of neuron arbors (both dendrites and axons) appears to be self-organizing: their morphogenesis behaves like flowing water, that is, fluid dynamically; water flow in branching networks in turn acts like a tree composed of cords under tension, that is, vector mechanically. Branch diameters, branch angles and junction sites conform significantly to this model. The result is that such neuron tree samples globally minimize their total volume, rather than, for example, surface area or branch length. In addition, the arbors perform well at generating the cheapest topology interconnecting their terminals: their large-scale layouts are among the best of all possible connecting patterns, coming within about 5% of optimum. This model also applies comparably to arterial and river networks.
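
    The claim that junction sites minimize total wiring cost can be illustrated with a brute-force search for the junction location serving three terminals. Branch length is used here as the simplest proxy for the paper's volume cost, and the whole setup is an illustrative assumption:

```python
import numpy as np

def best_junction(terminals, grid_step=0.01):
    """Grid search for the junction point that minimises total branch
    length to three terminals -- a toy version of the claim that arbor
    junction sites minimise total wiring cost."""
    pts = np.asarray(terminals, dtype=float)
    lo, hi = pts.min(0), pts.max(0)
    best, best_cost = None, np.inf
    for x in np.arange(lo[0], hi[0] + grid_step, grid_step):
        for y in np.arange(lo[1], hi[1] + grid_step, grid_step):
            cost = np.linalg.norm(pts - (x, y), axis=1).sum()
            if cost < best_cost:
                best, best_cost = (x, y), cost
    return np.array(best), best_cost
```

    For an equilateral triangle of terminals, the minimizing junction sits at the centroid, which is the classical Fermat point for that configuration.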

  2. Grid sensitivity capability for large scale structures

    NASA Technical Reports Server (NTRS)

    Nagendra, Gopal K.; Wallerstein, David V.

    1989-01-01

    The considerations and the resultant approach used to implement design sensitivity capability for grids into a large scale, general purpose finite element system (MSC/NASTRAN) are presented. The design variables are grid perturbations with a rather general linking capability. Moreover, shape and sizing variables may be linked together. The design is general enough to facilitate geometric modeling techniques for generating design variable linking schemes in an easy and straightforward manner. Test cases have been run and validated by comparison with the overall finite difference method. The linking of a design sensitivity capability for shape variables in MSC/NASTRAN with an optimizer would give designers a powerful, automated tool to carry out practical optimization design of real life, complicated structures.
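
    Validation against the overall finite-difference method amounts to comparing each analytic sensitivity with a central difference of the response with respect to the same design variable. A generic sketch (function names and tolerances are illustrative, not MSC/NASTRAN's):

```python
def check_sensitivity(f, grad, x, h=1e-6, tol=1e-4):
    """Validate an analytic design sensitivity against central finite
    differences.  f: scalar response function of the design variables,
    grad: claimed derivative (one entry per variable)."""
    ok = True
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        fd = (f(xp) - f(xm)) / (2 * h)  # central difference estimate
        ok &= abs(fd - grad(x)[i]) <= tol * max(1.0, abs(fd))
    return ok
```

    This cross-check is cheap per variable but scales linearly with the number of design variables, which is exactly why an analytic sensitivity capability is worth implementing in a large-scale code.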

  3. Large Scale Quantum Simulations of Nuclear Pasta

    NASA Astrophysics Data System (ADS)

    Fattoyev, Farrukh J.; Horowitz, Charles J.; Schuetrumpf, Bastian

    2016-03-01

    Complex and exotic nuclear geometries, collectively referred to as "nuclear pasta", are expected to exist naturally in the crust of neutron stars and in supernova matter. Using a set of self-consistent microscopic nuclear energy density functionals, we present the first results of large-scale quantum simulations of pasta phases at baryon densities 0.03 < ρ < 0.10 fm⁻³, proton fractions 0.05

  4. Primer design for large scale sequencing.

    PubMed Central

    Haas, S; Vingron, M; Poustka, A; Wiemann, S

    1998-01-01

    We have developed PRIDE, a primer design program that automatically designs primers in single contigs or whole sequencing projects to extend the already known sequence and to double strand single-stranded regions. The program is fully integrated into the Staden package (GAP4) and accessible with a graphical user interface. PRIDE uses a fuzzy logic-based system to calculate primer qualities. The computational performance of PRIDE is enhanced by using suffix trees to store the huge amount of data being produced. A test set of 110 sequencing primers and 11 PCR primer pairs has been designed on genomic templates, cDNAs and sequences containing repetitive elements to analyze PRIDE's success rate. The high performance of PRIDE, combined with its minimal requirement of user interaction and its fast algorithm, make this program useful for the large scale design of primers, especially in large sequencing projects. PMID:9611248

  5. Large scale study of tooth enamel

    SciTech Connect

    Bodart, F.; Deconninck, G.; Martin, M.Th.

    1981-04-01

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces on a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analysed using PIXE, backscattering and nuclear reaction techniques. The results were analysed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population.

  6. Modeling the Internet's large-scale topology

    PubMed Central

    Yook, Soon-Hyung; Jeong, Hawoong; Barabási, Albert-László

    2002-01-01

    Network generators that capture the Internet's large-scale topology are crucial for the development of efficient routing protocols and modeling Internet traffic. Our ability to design realistic generators is limited by the incomplete understanding of the fundamental driving forces that affect the Internet's evolution. By combining several independent databases capturing the time evolution, topology, and physical layout of the Internet, we identify the universal mechanisms that shape the Internet's router and autonomous system level topology. We find that the physical layout of nodes forms a fractal set, determined by population density patterns around the globe. The placement of links is driven by competition between preferential attachment and linear distance dependence, a marked departure from the currently used exponential laws. The universal parameters that we extract significantly restrict the class of potentially correct Internet models and indicate that the networks created by all available topology generators are fundamentally different from the current Internet. PMID:12368484

  7. Voids in the Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    El-Ad, Hagai; Piran, Tsvi

    1997-12-01

    Voids are the most prominent feature of the large-scale structure of the universe. Still, their incorporation into quantitative analysis has been relatively recent, owing essentially to the lack of an objective tool to identify the voids and to quantify them. To overcome this, we present here the VOID FINDER algorithm, a novel tool for objectively quantifying voids in the galaxy distribution. The algorithm first classifies galaxies as either wall galaxies or field galaxies. Then, it identifies voids in the wall-galaxy distribution. Voids are defined as continuous volumes that do not contain any wall galaxies. The voids must be thicker than an adjustable limit, which is refined in successive iterations. In this way, we identify the same regions that would be recognized as voids by eye. Small breaches in the walls are ignored, avoiding artificial connections between neighboring voids. We test the algorithm using Voronoi tessellations. By appropriate scaling of the parameters with the selection function, we apply it to two redshift surveys, the dense SSRS2 and the full-sky IRAS 1.2 Jy. Both surveys show similar properties: ~50% of the volume is filled by voids. The voids have a scale of at least 40 h⁻¹ Mpc and an average underdensity of -0.9. Faint galaxies do not fill the voids, but they do populate them more than bright ones do. These results suggest that both optically and IRAS-selected galaxies delineate the same large-scale structure. Comparison with the recovered mass distribution further suggests that the observed voids in the galaxy distribution correspond well to underdense regions in the mass distribution. This confirms the gravitational origin of the voids.
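
    The wall/field classification and the empty-volume search can be sketched in two dimensions: galaxies with a close neighbour count as wall galaxies, and the void is the region farthest from all of them. The distances, grid resolution and threshold below are toy assumptions, not the paper's selection-function-scaled parameters:

```python
import numpy as np

def find_void_center(points, box=100.0, neighbor_scale=10.0, grid_n=20):
    """Toy 2-D version of the void-finding idea: galaxies with a near
    neighbour within `neighbor_scale` are classified as 'wall'; the
    void centre is the grid point farthest from any wall galaxy, and
    that distance is the void radius."""
    pts = np.asarray(points, dtype=float)
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)            # ignore self-distances
    wall = pts[d.min(axis=1) < neighbor_scale]
    xs = np.linspace(0.0, box, grid_n)
    grid = np.array([(x, y) for x in xs for y in xs])
    dist_to_wall = np.linalg.norm(
        grid[:, None, :] - wall[None, :, :], axis=-1).min(axis=1)
    i = np.argmax(dist_to_wall)
    return grid[i], dist_to_wall[i]        # void centre and radius
```

    The real algorithm works in 3-D, grows maximal empty spheres rather than taking a single farthest point, and merges overlapping spheres into continuous void volumes.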

  8. Supporting large-scale computational science

    SciTech Connect

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: (1) several commercial DBMS systems have demonstrated storage and ad-hoc query access to terabyte data sets; (2) several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93], have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme; (3) several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data; and (4) in some cases, performance is a moot issue, in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  9. Improving Recent Large-Scale Pulsar Surveys

    NASA Astrophysics Data System (ADS)

    Cardoso, Rogerio Fernando; Ransom, S.

    2011-01-01

    Pulsars are unique in that they act as celestial laboratories for precise tests of gravity and other extreme physics (Kramer 2004). There are approximately 2000 known pulsars today, which is less than ten percent of the pulsars in the Milky Way according to theoretical models (Lorimer 2004). Of these 2000 known pulsars, approximately ten percent are millisecond pulsars, objects prized for their period stability and used in detailed physics tests and searches for gravitational radiation (Lorimer 2008). As the field and instrumentation progress, pulsar astronomers attempt to overcome observational biases and detect new pulsars, consequently discovering new millisecond pulsars. We attempt to improve large-scale pulsar surveys by examining three recent pulsar surveys. The first, the Green Bank Telescope 350 MHz Drift Scan, a low-frequency isotropic survey of the northern sky, has yielded a large number of candidates that were visually inspected and identified, resulting in over 34,000 candidates viewed, dozens of detections of known pulsars, and the discovery of a new low-flux pulsar, PSRJ1911+22. The second, the PALFA survey, is a high-frequency survey of the galactic plane with the Arecibo telescope. We created a processing pipeline for the PALFA survey at the National Radio Astronomy Observatory in Charlottesville, VA, in addition to making needed modifications upon advice from the PALFA consortium. The third survey examined is a new GBT 820 MHz survey devoted to finding new millisecond pulsars by observing the target-rich environment of unidentified sources in the Fermi LAT catalogue. By approaching these three pulsar surveys at different stages, we seek to improve the success rates of large-scale surveys, and hence the possibility of ground-breaking work in both basic physics and astrophysics.
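
    At the core of every such survey pipeline is a spectral search of the dedispersed time series for a periodic signal. A minimal sketch (real pipelines add dedispersion trials, harmonic summing, acceleration searches and candidate sifting):

```python
import numpy as np

def strongest_period(timeseries, dt):
    """Return the period of the strongest non-DC Fourier component of
    a (dedispersed) time series sampled at interval dt -- the most
    basic form of the periodicity search pulsar pipelines run."""
    spec = np.abs(np.fft.rfft(timeseries - np.mean(timeseries)))
    freqs = np.fft.rfftfreq(len(timeseries), d=dt)
    k = 1 + np.argmax(spec[1:])   # skip the DC bin
    return 1.0 / freqs[k]
```

    The frequency resolution is 1/(N*dt), so longer integrations sharpen the period estimate; candidate periods found this way are then folded and inspected, which is the visual-identification stage the abstract describes.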

  10. Supporting large-scale computational science

    SciTech Connect

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  11. Introducing Large-Scale Innovation in Schools

    NASA Astrophysics Data System (ADS)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  12. Large-scale sequential quadratic programming algorithms

    SciTech Connect

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
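    The core of any SQP method is the step computation the abstract describes: at each iterate a quadratic model of the Lagrangian is minimized subject to linearized constraints. The following is a minimal sketch of that iteration on a toy equality-constrained problem, solving each QP subproblem through its KKT system with an exact Hessian; it is an illustration of the general SQP idea only, not the reduced-Hessian, MINOS-based implementation of the paper.

    ```python
    # Minimal SQP sketch: minimize f(x) = x1^2 + x2^2  subject to  c(x) = x1 + x2 - 1 = 0.
    # Each iteration solves the QP subproblem via its KKT system:
    #   [ H  A^T ] [ p      ]   [ -g ]
    #   [ A  0   ] [ lambda ] = [ -c ]

    def solve_linear(M, rhs):
        """Gaussian elimination with partial pivoting (small dense systems only)."""
        n = len(M)
        aug = [row[:] + [rhs[i]] for i, row in enumerate(M)]   # augmented matrix
        for col in range(n):
            piv = max(range(col, n), key=lambda r: abs(aug[r][col]))
            aug[col], aug[piv] = aug[piv], aug[col]
            for r in range(col + 1, n):
                f = aug[r][col] / aug[col][col]
                for k in range(col, n + 1):
                    aug[r][k] -= f * aug[col][k]
        x = [0.0] * n
        for i in range(n - 1, -1, -1):
            x[i] = (aug[i][n] - sum(aug[i][k] * x[k] for k in range(i + 1, n))) / aug[i][i]
        return x

    def sqp_step(x):
        g = [2 * x[0], 2 * x[1]]          # gradient of f
        H = [[2.0, 0.0], [0.0, 2.0]]      # (exact) Hessian of the Lagrangian
        A = [1.0, 1.0]                    # Jacobian of c
        c = x[0] + x[1] - 1.0
        KKT = [[H[0][0], H[0][1], A[0]],
               [H[1][0], H[1][1], A[1]],
               [A[0],    A[1],    0.0]]
        sol = solve_linear(KKT, [-g[0], -g[1], -c])
        p = sol[:2]                        # search direction; sol[2] is the multiplier
        return [x[0] + p[0], x[1] + p[1]]

    x = [0.0, 0.0]
    for _ in range(5):
        x = sqp_step(x)
    ```

    For this quadratic objective with a linear constraint, one step reaches the minimizer (0.5, 0.5); the large-scale methods of the paper replace the dense KKT solve with sparse, reduced-Hessian machinery.
    
    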

  13. Large-Scale Hybrid Motor Testing. Chapter 10

    NASA Technical Reports Server (NTRS)

    Story, George

    2006-01-01

    Hybrid rocket motors can be successfully demonstrated at a small scale virtually anywhere. There have been many suitcase-sized portable test stands assembled for demonstration of hybrids. They show the safety of hybrid rockets to the audiences. These small show motors and small laboratory-scale motors can give comparative burn rate data for development of different fuel/oxidizer combinations; however, the questions that are always asked when hybrids are mentioned for large-scale applications are: how do they scale, and has it been shown in a large motor? To answer those questions, large-scale motor testing is required to verify the hybrid motor at its true size. The necessity to conduct large-scale hybrid rocket motor tests to validate the burn rate from the small motors to application size has been documented in several places. Comparison of small-scale hybrid data to that of larger-scale data indicates that the fuel burn rate goes down with increasing port size, even with the same oxidizer flux. This trend holds for conventional hybrid motors with forward oxidizer injection and HTPB-based fuels. While the reason this is occurring would make a great paper or study or thesis, it is not thoroughly understood at this time. Potential causes include the fact that since hybrid combustion is boundary layer driven, the larger port sizes reduce the interaction (radiation, mixing and heat transfer) from the core region of the port. This chapter focuses on some of the large, prototype-sized testing of hybrid motors. The largest motors tested have been AMROC's 250K-lbf thrust motor at Edwards Air Force Base and the Hybrid Propulsion Demonstration Program's 250K-lbf thrust motor at Stennis Space Center. Numerous smaller tests were performed to support the burn rate, stability and scaling concepts that went into the development of those large motors.

  14. Cooling biogeophysical effect of large-scale tropical deforestation in three Earth System models

    NASA Astrophysics Data System (ADS)

    Brovkin, V.; Pugh, T.; Robertson, E.; Bathiany, S.; Jones, C.; Arneth, A.

    2015-12-01

    Vegetation cover in the tropics is limited by moisture availability. Since transpiration from forests is generally greater than from grasslands, the sensitivity of precipitation in the Amazon to large-scale deforestation has long been seen as a critical parameter of climate-vegetation interactions. Most Amazon deforestation experiments to date have been performed with interactive land-atmosphere models but prescribed sea surface temperatures (SSTs). They reveal a strong reduction in evapotranspiration and precipitation, and an increase in global surface air temperature due to reduced latent heat flux. We performed large-scale tropical deforestation experiments with three Earth system models (ESMs) including interactive ocean models, which participated in the FP7 project EMBRACE. In response to tropical deforestation, all models simulate a significant reduction in tropical precipitation, similar to the experiments with prescribed SSTs. However, all three models suggest that the response of global temperature to the deforestation is a cooling or no change, differing from the global warming found in prescribed-SST runs. Presumably, changes in the hydrological cycle and in the water vapor feedback due to deforestation operate in the direction of a global cooling. In addition, one of the models simulates a local cooling over the deforested tropical region. This is opposite to the local warming in the other models. This suggests that the balance between warming due to latent heat flux decrease and cooling due to albedo increase is rather subtle and model-dependent. Last but not least, we suggest using large-scale deforestation as a standard biogeophysical experiment for model intercomparison within the CMIP6 framework.

  15. Northern East Asian Monsoon Precipitation Revealed by Air Mass Variability and Its Prediction

    NASA Astrophysics Data System (ADS)

    Son, J. H.; Seo, K. H.

    2015-12-01

    This work provides a new perspective on the major factors controlling the East Asian summer monsoon (EASM) in July, and a promising physical-statistical forecasting scheme for the EASM ahead of summer. Dominant modes of the EASM are revealed from the variability of large-scale air masses discerned by equivalent potential temperature, and are found to be dynamically connected with anomalous sea surface temperatures (SSTs) over the three major oceans of the world and their counterparts among prevailing atmospheric oscillation or teleconnection patterns. Precipitation over Northeast Asia (NEA) during July is enhanced by tropical central Indian Ocean warming, central Pacific El Niño-related SST warming, northwestern Pacific cooling off the coast of NEA, and North Atlantic Ocean warming. Using these factors and data from the preceding spring seasons, the authors build a multiple linear regression model for seasonal forecasting. The cross-validated correlation skill for the prediction period 1994 to 2012 is up to 0.84, which far exceeds the skill level of contemporary climate models.
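    The physical-statistical approach described here, fitting a linear regression on spring predictors and scoring it by cross-validated correlation, can be sketched in miniature. The sketch below uses a single synthetic predictor and leave-one-out cross-validation; the actual model in the paper uses several SST-based predictors and a 19-year hindcast period, so this is an illustration of the scoring procedure only.

    ```python
    import math

    # Synthetic "predictor" (standing in for a spring SST index) and "predictand"
    # (standing in for a July rainfall anomaly): y is linear in x plus a small
    # alternating perturbation.
    x = [float(i) for i in range(1, 20)]
    y = [2.0 * xi + (0.3 if i % 2 == 0 else -0.3) for i, xi in enumerate(x)]

    def fit_simple(xs, ys):
        """Least-squares fit of y = a + b*x; returns (intercept, slope)."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((u - mx) * (v - my) for u, v in zip(xs, ys)) / sum((u - mx) ** 2 for u in xs)
        return my - b * mx, b

    def pearson(a, b):
        n = len(a)
        ma, mb = sum(a) / n, sum(b) / n
        num = sum((u - ma) * (v - mb) for u, v in zip(a, b))
        den = math.sqrt(sum((u - ma) ** 2 for u in a) * sum((v - mb) ** 2 for v in b))
        return num / den

    # Leave-one-out cross-validation: predict each "year" from a model
    # trained on all the other years.
    preds = []
    for i in range(len(x)):
        xs = x[:i] + x[i + 1:]
        ys = y[:i] + y[i + 1:]
        a, b = fit_simple(xs, ys)
        preds.append(a + b * x[i])

    skill = pearson(preds, y)   # cross-validated correlation skill
    ```

    The reported skill of 0.84 in the abstract is this kind of correlation between cross-validated hindcasts and observations, computed with the paper's multiple predictors.
    
    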

  16. Climate variability rather than overstocking causes recent large scale cover changes of Tibetan pastures

    NASA Astrophysics Data System (ADS)

    Lehnert, L. W.; Wesche, K.; Trachte, K.; Reudenbach, C.; Bendix, J.

    2016-04-01

    The Tibetan Plateau (TP) is a globally important “water tower” that provides water for nearly 40% of the world’s population. This supply function is claimed to be threatened by pasture degradation on the TP and the associated loss of water regulation functions. However, neither potential large scale degradation changes nor their drivers are known. Here, we analyse trends in a high-resolution dataset of grassland cover to determine the interactions among vegetation dynamics, climate change and human impacts on the TP. The results reveal that vegetation changes have regionally different triggers: While the vegetation cover has increased since the year 2000 in the north-eastern part of the TP due to an increase in precipitation, it has declined in the central and western parts of the TP due to rising air temperature and declining precipitation. Increasing livestock numbers as a result of land use changes exacerbated the negative trends but were not their exclusive driver. Thus, we conclude that climate variability instead of overgrazing has been the primary cause for large scale vegetation cover changes on the TP since the new millennium. Since areas of positive and negative changes are almost equal in extent, pasture degradation is not generally proceeding.

  17. Climate variability rather than overstocking causes recent large scale cover changes of Tibetan pastures

    PubMed Central

    Lehnert, L. W.; Wesche, K.; Trachte, K.; Reudenbach, C.; Bendix, J.

    2016-01-01

    The Tibetan Plateau (TP) is a globally important “water tower” that provides water for nearly 40% of the world’s population. This supply function is claimed to be threatened by pasture degradation on the TP and the associated loss of water regulation functions. However, neither potential large scale degradation changes nor their drivers are known. Here, we analyse trends in a high-resolution dataset of grassland cover to determine the interactions among vegetation dynamics, climate change and human impacts on the TP. The results reveal that vegetation changes have regionally different triggers: While the vegetation cover has increased since the year 2000 in the north-eastern part of the TP due to an increase in precipitation, it has declined in the central and western parts of the TP due to rising air temperature and declining precipitation. Increasing livestock numbers as a result of land use changes exacerbated the negative trends but were not their exclusive driver. Thus, we conclude that climate variability instead of overgrazing has been the primary cause for large scale vegetation cover changes on the TP since the new millennium. Since areas of positive and negative changes are almost equal in extent, pasture degradation is not generally proceeding. PMID:27073126

  18. Large-scale wind turbine structures

    NASA Technical Reports Server (NTRS)

    Spera, David A.

    1988-01-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and designing of innovative structural response. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  19. Large-scale tides in general relativity

    NASA Astrophysics Data System (ADS)

    Ip, Hiu Yan; Schmidt, Fabian

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the "separate universe" paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  20. Large-scale autostereoscopic outdoor display

    NASA Astrophysics Data System (ADS)

    Reitterer, Jörg; Fidler, Franz; Saint Julien-Wallsee, Ferdinand; Schmid, Gerhard; Gartner, Wolfgang; Leeb, Walter; Schmid, Ulrich

    2013-03-01

    State-of-the-art autostereoscopic displays are often limited in size, effective brightness, number of 3D viewing zones, and maximum 3D viewing distances, all of which are mandatory requirements for large-scale outdoor displays. Conventional autostereoscopic indoor concepts like lenticular lenses or parallax barriers cannot simply be adapted for these screens due to the inherent loss of effective resolution and brightness, which would reduce both image quality and sunlight readability. We have developed a modular autostereoscopic multi-view laser display concept with sunlight readable effective brightness, theoretically up to several thousand 3D viewing zones, and maximum 3D viewing distances of up to 60 meters. For proof-of-concept purposes a prototype display with two pixels was realized. Due to various manufacturing tolerances each individual pixel has slightly different optical properties, and hence the 3D image quality of the display has to be calculated stochastically. In this paper we present the corresponding stochastic model, we evaluate the simulation and measurement results of the prototype display, and we calculate the achievable autostereoscopic image quality to be expected for our concept.

  1. Large scale digital atlases in neuroscience

    NASA Astrophysics Data System (ADS)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and, in addition to atlases of the human brain, includes high quality atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  2. Food appropriation through large scale land acquisitions

    NASA Astrophysics Data System (ADS)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crops yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations.

  3. Large-scale carbon fiber tests

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A realistic release of carbon fibers was established by burning a minimum of 45 kg of carbon fiber composite aircraft structural components in each of five large scale, outdoor aviation jet fuel fire tests. This release was quantified by several independent assessments with various instruments developed specifically for these tests. The most likely values for the mass of single carbon fibers released ranged from 0.2 percent of the initial mass of carbon fiber for the source tests (zero wind velocity) to a maximum of 0.6 percent of the initial carbon fiber mass for dissemination tests (5 to 6 m/s wind velocity). Mean fiber lengths for fibers greater than 1 mm in length ranged from 2.5 to 3.5 mm. Mean diameters ranged from 3.6 to 5.3 micrometers which was indicative of significant oxidation. Footprints of downwind dissemination of the fire released fibers were measured to 19.1 km from the fire.

  4. Development of Large-Scale Functional Brain Networks in Children

    PubMed Central

    Supekar, Kaustubh; Musen, Mark; Menon, Vinod

    2009-01-01

    The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7–9 y) and 22 young-adults (ages 19–22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar “small-world” organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism. PMID:19621066
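    The network measures named in this abstract (path length, clustering coefficient) are standard graph quantities. As a rough illustration only, not the authors' connectivity pipeline, here is a minimal sketch computing both on a toy undirected graph, using the complete graph K4 so the expected values are known exactly.

    ```python
    from collections import deque

    # Toy undirected graph as adjacency sets: K4, the complete graph on 4 nodes,
    # for which average clustering and average path length are both exactly 1.
    adj = {0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3}, 3: {0, 1, 2}}

    def clustering_coefficient(adj):
        """Average local clustering: fraction of each node's neighbour pairs that are linked."""
        vals = []
        for v, nbrs in adj.items():
            k = len(nbrs)
            if k < 2:
                vals.append(0.0)
                continue
            links = sum(1 for u in nbrs for w in nbrs if u < w and w in adj[u])
            vals.append(2.0 * links / (k * (k - 1)))
        return sum(vals) / len(vals)

    def average_path_length(adj):
        """Mean shortest-path length over all node pairs, via BFS from every node."""
        total, pairs = 0, 0
        for s in adj:
            dist = {s: 0}
            q = deque([s])
            while q:
                v = q.popleft()
                for u in adj[v]:
                    if u not in dist:
                        dist[u] = dist[v] + 1
                        q.append(u)
            for t, d in dist.items():
                if t != s:
                    total += d
                    pairs += 1
        return total / pairs

    C = clustering_coefficient(adj)   # average clustering coefficient
    L = average_path_length(adj)      # average shortest-path length
    ```

    A "small-world" network, in the sense used in the study, combines a high clustering coefficient with a short average path length relative to a random graph of the same size.
    
    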

  5. Large-scale isentropic mixing properties of the Antarctic polar vortex from analyzed winds

    NASA Technical Reports Server (NTRS)

    Bowman, Kenneth P.

    1993-01-01

    Winds derived from analyzed geopotential height fields are used to study quasi-horizontal mixing by the large-scale flow in the lower stratosphere during austral spring. This is the period when the Antarctic ozone hole appears and disappears. Trajectories are computed for large ensembles of particles initially inside and outside the main polar vortex. Mixing and transport are diagnosed through estimates of finite time Lyapunov exponents and Lagrangian dispersion statistics of the tracer trajectories. At 450 K and above prior to the vortex breakdown: Lyapunov exponents are a factor of 2 smaller inside the vortex than outside; diffusion coefficients are an order of magnitude smaller inside than outside the vortex; and the trajectories reveal little exchange of air across the vortex boundary. At lower levels (425 and 400 K) mixing is greater, and there is substantial exchange of air across the vortex boundary. In some years there are large wave events that expel small amounts of vortex air into the mid-latitudes. At the end of the spring season during the vortex breakdown there is rapid mixing of air across the vortex boundary, which is evident in the mixing diagnostics and the tracer trajectories.
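    The finite-time Lyapunov exponent diagnostic used above can be sketched on a simple analytic flow rather than analyzed stratospheric winds: advect two nearby particles, and measure the exponential growth rate of their separation over a fixed time. The flow and numbers below are illustrative assumptions, not data from the study.

    ```python
    import math

    # Simple saddle flow dx/dt = x, dy/dt = -y: nearby particles separate
    # like e^t in the x-direction, so the finite-time Lyapunov exponent is ~1.
    def advect(p, dt, steps):
        """Forward-Euler integration of the saddle flow from point p."""
        x, y = p
        for _ in range(steps):
            x += dt * x
            y += dt * (-y)
        return x, y

    dt, steps = 1e-3, 1000            # integrate to T = 1
    T = dt * steps
    d0 = 1e-6                         # initial particle separation (in x)
    a = advect((1.0, 1.0), dt, steps)
    b = advect((1.0 + d0, 1.0), dt, steps)
    sep = math.hypot(a[0] - b[0], a[1] - b[1])
    ftle = math.log(sep / d0) / T     # finite-time Lyapunov exponent estimate
    ```

    In the study, small exponents inside the vortex and exponents roughly twice as large outside it quantify how much more rapidly tracer filaments are stretched, and hence mixed, in mid-latitude air.
    
    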

  6. Creation of the dam for the No. 2 Kambaratinskaya HPP by large-scale blasting: analysis of planning experience and lessons learned

    SciTech Connect

    Shuifer, M. I.; Argal, E. S.

    2012-05-15

    Results of complex instrument observations and videotaping during large-scale blasts (LSBs) detonated for creation of the dam at the No. 2 Kambaratinskaya HPP on the Naryn River in the Kyrgyz Republic are analyzed. The energy effectiveness of the explosives is evaluated, characteristics of LSB manifestations in seismic and air waves are revealed, and the shaping and movement of the rock mass are examined. A methodological analysis of the planning and production of the LSBs is given.

  7. Sensitivity technologies for large scale simulation.

    SciTech Connect

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias; Wilcox, Lucas C.; Hill, Judith C.; Ghattas, Omar; Berggren, Martin Olof; Akcelik, Volkan; Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large-scale optimization, uncertainty quantification, reduced-order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity-type analysis into existing codes and, equally important, on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two-level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint-based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using a novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint-based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations.
The hybrid automatic differentiation method was applied to a first
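    The adjoint idea that runs through this record can be sketched on a tiny linear model, a stand-in for the PDE solves in the actual work: for A u = b(p) and objective J = c^T u, one adjoint solve with A^T gives the full gradient dJ/dp regardless of how many parameters there are. The matrix, parameterization, and objective below are invented for illustration.

    ```python
    # Adjoint sensitivity sketch on a tiny model: A u = b(p), objective J(p) = c^T u(p).
    # Adjoint method: solve A^T lam = c once, then dJ/dp = lam^T (db/dp).

    def solve2(M, rhs):
        """Direct solve of a 2x2 linear system by Cramer's rule."""
        det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
        return [(rhs[0] * M[1][1] - M[0][1] * rhs[1]) / det,
                (M[0][0] * rhs[1] - rhs[0] * M[1][0]) / det]

    A = [[2.0, 1.0], [0.0, 3.0]]      # fixed (nonsymmetric) system matrix
    c = [1.0, 1.0]                    # objective weights

    def b(p):
        return [p, p * p]             # parameter-dependent right-hand side

    def J(p):
        u = solve2(A, b(p))
        return c[0] * u[0] + c[1] * u[1]

    p = 1.5
    At = [[A[0][0], A[1][0]], [A[0][1], A[1][1]]]     # transpose of A
    lam = solve2(At, c)                               # single adjoint solve
    dbdp = [1.0, 2.0 * p]
    dJdp_adj = lam[0] * dbdp[0] + lam[1] * dbdp[1]    # adjoint gradient

    eps = 1e-6
    dJdp_fd = (J(p + eps) - J(p - eps)) / (2 * eps)   # finite-difference check
    ```

    A direct sensitivity method would instead differentiate the forward solve once per parameter; the adjoint route is what makes gradients affordable when parameters are many and objectives are few.
    
    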

  8. Large-scale coherent structures as drivers of combustion instability

    SciTech Connect

    Schadow, K.C.; Gutmark, E.; Parr, T.P.; Parr, D.M.; Wilson, K.J.

    1987-06-01

    The role of flow coherent structures as drivers of combustion instabilities in a dump combustor was studied. Results of nonreacting tests in air and water flows as well as combustion experiments in a diffusion flame and dump combustor are discussed to provide insight into the generation process of large-scale structures in the combustor flow and their interaction with the combustion process. It is shown that the flow structures, or vortices, are formed by interaction between the flow instabilities and the chamber acoustic resonance. When these vortices dominate the reacting flow, the combustion is confined to their cores, leading to periodic heat release, which may result in the driving of high amplitude pressure oscillations. These oscillations are typical to the occurrence of combustion instabilities for certain operating conditions. The basic understanding of the interaction between flow dynamics and the combustion process opens up the possibility for rational control of combustion-induced pressure oscillations. 42 references.

  9. Large Scale Flame Spread Environmental Characterization Testing

    NASA Technical Reports Server (NTRS)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoglu, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in the chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat loss mechanisms were varied between tests. A Design of Experiments (DOE) method was imposed to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run without experimental design to additionally vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward propagating burns, produced rapid acceleratory turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in the gaseous number of moles. Top ignition, or downward propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to the surroundings during the longer, slower downward burns. One heat loss mechanism included mounting a heat exchanger directly above the burning sample in the path of the plume to act as a heat sink and more efficiently dissipate the heat due to the combustion event. This proved an effective means for chamber overpressure mitigation for those tests producing the most total heat release and thus was determined to be a feasible mitigation

  10. Synchronization of coupled large-scale Boolean networks

    NASA Astrophysics Data System (ADS)

    Li, Fangfei

    2014-03-01

    This paper investigates the complete synchronization and partial synchronization of two large-scale Boolean networks. First, the aggregation algorithm towards large-scale Boolean network is reviewed. Second, the aggregation algorithm is applied to study the complete synchronization and partial synchronization of large-scale Boolean networks. Finally, an illustrative example is presented to show the efficiency of the proposed results.
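
    Complete synchronization of a small drive-response pair can be checked by brute-force enumeration of every initial state, which is what the aggregation approach makes tractable for large networks. A toy sketch; the two-node update rules below are assumed for illustration, not taken from the paper:

```python
# Brute-force complete-synchronization check for a 2-node Boolean drive
# network coupled to a 2-node response network. Toy dynamics, assumed.
from itertools import product

def drive(x):            # x = (x1, x2), Boolean states as 0/1
    return (x[1], x[0] and x[1])

def response(x, y):      # response update is driven by the drive state x
    return (x[1], x[0] and x[1])   # here the response copies drive's update

def synchronizes(steps=10):
    """True if every initial (x0, y0) pair reaches x == y within `steps`."""
    for x0, y0 in product(product((0, 1), repeat=2), repeat=2):
        x, y = x0, y0
        for _ in range(steps):
            x, y = drive(x), response(x, y)
        if x != y:
            return False
    return True

print(synchronizes())
```

    Because the response here is slaved entirely to the drive state, the two trajectories coincide after one step; a response rule that also depended on y could fail this check.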

  12. Safety aspects of large-scale combustion of hydrogen

    SciTech Connect

    Edeskuty, F.J.; Haugh, J.J.; Thompson, R.T.

    1986-01-01

    Recent hydrogen-safety investigations have studied the possible large-scale effects from phenomena such as the accumulation of combustible hydrogen-air mixtures in large, confined volumes. Of particular interest are safe methods for the disposal of the hydrogen and the pressures which can arise from its confined combustion. Consequently, tests of the confined combustion of hydrogen-air mixtures were conducted in a 2100 m³ volume. These tests show that continuous combustion, as the hydrogen is generated, is a safe method for its disposal. It also has been seen that, for hydrogen concentrations up to 13 vol %, it is possible to predict the maximum pressures that can occur upon ignition of premixed hydrogen-air atmospheres. In addition, information has been obtained concerning the survivability of the equipment that is needed to recover from an accident involving hydrogen combustion. An accident that involved the inadvertent mixing of hydrogen and oxygen gases in a tube trailer gave evidence that, under the proper conditions, hydrogen combustion can transit to a detonation. If detonation occurs, the resulting pressures are much higher, although short in duration.
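
    The predictability of confined-combustion pressures for lean mixtures can be illustrated with a constant-volume ("adiabatic isochoric complete combustion") estimate. A hedged sketch: the heating value and mean heat capacity below are rough textbook-level assumptions, not values from these tests:

```python
# Constant-volume upper-bound pressure estimate for a lean premixed
# hydrogen-air burn, using a constant mean heat capacity. Illustrative only.
R = 8.314     # gas constant, J/(mol K)
x_h2 = 0.10   # hydrogen mole fraction (10 vol %)
T0 = 300.0    # initial temperature, K
P0 = 101.3e3  # initial pressure, Pa
dH = 242e3    # approximate lower heating value of H2, J/mol
cv = 27.0     # assumed mean molar cv of the product mixture, J/(mol K)

# Per mole of initial mixture: x_h2 mol H2 burns; H2 + 0.5 O2 -> H2O
# removes 0.5*x_h2 mol of gas.
n_ratio = 1.0 - 0.5 * x_h2
Tf = T0 + x_h2 * dH / (n_ratio * cv)   # adiabatic flame temperature estimate
Pf = P0 * n_ratio * Tf / T0            # ideal-gas isochoric pressure
print(f"AICC pressure ratio ~ {Pf / P0:.1f}")
```

    A ratio of roughly four for a 10 vol % mixture is the kind of bound such estimates give; a detonation, as the abstract notes, produces much higher transient pressures.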

  13. Very large-scale motions in a turbulent pipe flow

    NASA Astrophysics Data System (ADS)

    Lee, Jae Hwa; Jang, Seong Jae; Sung, Hyung Jin

    2011-11-01

    Direct numerical simulation of a turbulent pipe flow with ReD=35000 was performed to investigate the spatially coherent structures associated with very large-scale motions. The corresponding friction Reynolds number, based on pipe radius R, is R+=934, and the computational domain length is 30R. The computed mean flow statistics agree well with previous DNS data at ReD=44000 and 24000. Inspection of the instantaneous fields and two-point correlations of the streamwise velocity fluctuations showed that very long meandering motions exceeding 25R exist in the logarithmic and wake regions, and that the streamwise length scale increases almost linearly up to y/R ~ 0.3, while the structures in the turbulent boundary layer only reach up to the edge of the log layer. Time-resolved instantaneous fields revealed that hairpin packet-like structures grow with continuous stretching along the streamwise direction and create the very large-scale structures, meandering in the spanwise direction, consistent with the earlier conceptual model of Kim & Adrian (1999). This work was supported by the Creative Research Initiatives of NRF/MEST of Korea (No. 2011-0000423).
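
    The two-point correlation used above to extract streamwise length scales can be sketched in a few lines; the synthetic velocity signal here is an assumed stand-in for DNS data:

```python
# Two-point autocorrelation of streamwise velocity fluctuations, the
# statistic used to measure streamwise length scales. Synthetic signal.
import numpy as np

rng = np.random.default_rng(0)
n = 4096
# synthetic u' with long-wavelength content: moving-average-filtered noise
u = np.convolve(rng.standard_normal(n), np.ones(64) / 64, mode="same")
u -= u.mean()

def two_point_corr(u, max_lag):
    var = np.mean(u * u)
    return np.array([np.mean(u[:n - r] * u[r:]) / var
                     for r in range(max_lag)])

R_uu = two_point_corr(u, 200)          # R_uu[0] == 1 by construction
print(R_uu[0], R_uu[50] < R_uu[0])
```

    The lag at which R_uu decays (e.g. its integral, or a threshold crossing) gives the streamwise length scale whose near-linear growth with wall distance the abstract reports.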

  14. A study of synthetic large scales in turbulent boundary layers

    NASA Astrophysics Data System (ADS)

    Duvvuri, Subrahmanyam; Luhar, Mitul; Barnard, Casey; Sheplak, Mark; McKeon, Beverley

    2013-11-01

    Synthetic spanwise-constant spatio-temporal disturbances are excited in a turbulent boundary layer through a spatially impulsive patch of dynamic wall-roughness. The downstream flow response is studied through hot wire anemometry, pressure measurements at the wall and direct measurements of wall-shear-stress made using a novel micro-machined capacitive floating element sensor. These measurements are phase-locked to the input perturbation to recover the synthetic large-scale motion and characterize its structure and wall signature. The phase relationship between the synthetic large scale and small scale activity provides further insights into the apparent amplitude modulation effect between them, and the dynamics of wall-bounded turbulent flows in general. Results from these experiments will be discussed in the context of the critical-layer behavior revealed by the resolvent analysis of McKeon & Sharma (J Fluid Mech, 2010), and compared with similar earlier work by Jacobi & McKeon (J Fluid Mech, 2011). Model predictions are shown to be in broad agreement with experiments. The support of AFOSR grant #FA 9550-12-1-0469, Resnick Institute Graduate Research Fellowship (S.D.) and Sandia Graduate Fellowship (C.B.) are gratefully acknowledged.
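
    Phase-locking measurements to a periodic input perturbation amounts to averaging over forcing cycles. A minimal sketch with an assumed forcing frequency and a synthetic stand-in for the hot-wire signal:

```python
# Phase-locked averaging of a noisy signal against a periodic forcing,
# recovering the coherent (synthetic large-scale) component. Synthetic data.
import numpy as np

fs = 10_000.0                    # sampling rate, Hz (assumed)
f0 = 50.0                        # forcing frequency, Hz (assumed)
t = np.arange(0, 2.0, 1 / fs)
rng = np.random.default_rng(1)
signal = 0.3 * np.sin(2 * np.pi * f0 * t) + rng.standard_normal(t.size)

period = int(fs / f0)            # samples per forcing cycle
cycles = signal[: (signal.size // period) * period].reshape(-1, period)
phase_avg = cycles.mean(axis=0)  # coherent, phase-locked component
print(phase_avg.size)
```

    Averaging over cycles suppresses the incoherent turbulence by the square root of the cycle count, leaving the phase-locked large-scale signature.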

  15. Large-Scale Spacecraft Fire Safety Tests

    NASA Technical Reports Server (NTRS)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  16. The galaxy distribution and the large-scale structure of the universe

    NASA Technical Reports Server (NTRS)

    Geller, M. J.; Kurtz, M. J.; De Lapparent, V.

    1986-01-01

    Data related to the large-scale galaxy distribution are discussed. The galaxy counts of Shane-Wirtanen (1967) are analyzed; the effects of residual systematic errors on the galaxy distribution measurements are considered. The analysis reveals that the Shane-Wirtanen data are not applicable to the study of large-scale structure. A model which is capable of measuring galaxy correlation functions on scales greater than about 10 Mpc is evaluated.

  17. Training pilots to visualize large-scale spatial relationships in a stereoscopic display

    NASA Astrophysics Data System (ADS)

    Mowafy, Lyn; Thurman, Richard A.

    1993-09-01

    In flying air intercepts, a fighter pilot must plan most tactical maneuvers well before acquiring visual contact. Success depends on one's ability to create an accurate mental model of dynamic 3D spatial relationships from 2D information displays. This paper describes an Air Force training program for visualizing large-scale dynamic spatial relationships. It employs a low-cost, portable system in which the helmet-mounted stereoscopic display reveals the unobservable spatial relationships in a virtual world. We also describe recent research which evaluated the training effectiveness of this interactive three-dimensional display technology. Three display formats have been tested for their impact on the pilot's ability to encode, retain and recall functionally relevant spatial information: (1) a set of 2D orthographic plan views, (2) a flat-panel 3D perspective rendering, and (3) the 3D virtual environment. Trainees flew specified air intercepts and reviewed the flights in one of the display formats. Experts' trajectories were provided for comparison. After training, flight performance was tested on a new set of scenarios. Differences in pilots' performances under the three formats suggest how virtual environment displays can aid people learning to visualize 3D spatial relationships from 2D information.

  18. Population generation for large-scale simulation

    NASA Astrophysics Data System (ADS)

    Hannon, Andrew C.; King, Gary; Morrison, Clayton; Galstyan, Aram; Cohen, Paul

    2005-05-01

    Computer simulation is used to research phenomena ranging from the structure of the space-time continuum to population genetics and future combat.1-3 Multi-agent simulations in particular are now commonplace in many fields.4, 5 By modeling populations whose complex behavior emerges from individual interactions, these simulations help to answer questions about effects where closed-form solutions are difficult or impossible to derive.6 To be useful, simulations must accurately model the relevant aspects of the underlying domain. In multi-agent simulation, this means that the modeling must include both the agents and their relationships. Typically, each agent can be modeled as a set of attributes drawn from various distributions (e.g., height, morale, intelligence and so forth). Though these can interact (for example, agent height is related to agent weight), they are usually independent. Modeling relations between agents, on the other hand, adds a new layer of complexity, and tools from graph theory and social network analysis are finding increasing application.7, 8 Recognizing the role and proper use of these techniques, however, remains the subject of ongoing research. We recently encountered these complexities while building large-scale social simulations.9-11 One of these, the Hats Simulator, is designed to be a lightweight proxy for intelligence analysis problems. Hats models a "society in a box" consisting of many simple agents, called hats. Hats gets its name from the classic spaghetti western, in which the heroes and villains are known by the color of the hats they wear. The Hats society also has its heroes and villains, but the challenge is to identify which color hat they should be wearing based on how they behave. There are three types of hats: benign hats, known terrorists, and covert terrorists. Covert terrorists look just like benign hats but act like terrorists. Population structure can make covert hat identification significantly more

  19. In situ vitrification large-scale operational acceptance test analysis

    SciTech Connect

    Buelt, J.L.; Carter, J.G.

    1986-05-01

    A thermal treatment process is currently under study to provide possible enhancement of in-place stabilization of transuranic and chemically contaminated soil sites. The process is known as in situ vitrification (ISV). In situ vitrification is a remedial action process that destroys solid and liquid organic contaminants and incorporates radionuclides into a glass-like material that renders contaminants substantially less mobile and less likely to impact the environment. A large-scale operational acceptance test (LSOAT) was recently completed in which more than 180 t of vitrified soil were produced in each of three adjacent settings. The LSOAT demonstrated that the process conforms to the functional design criteria necessary for the large-scale radioactive test (LSRT) to be conducted following verification of the performance capabilities of the process. The energy requirements and vitrified block size, shape, and mass are sufficiently equivalent to those predicted by the ISV mathematical model to confirm its usefulness as a predictive tool. The LSOAT demonstrated an electrode replacement technique, which can be used if an electrode fails, and techniques have been identified to minimize air oxidation, thereby extending electrode life. A statistical analysis was employed during the LSOAT to identify graphite collars and an insulative surface as successful cold cap subsidence techniques. The LSOAT also showed that even under worst-case conditions, the off-gas system exceeds the flow requirements necessary to maintain a negative pressure on the hood covering the area being vitrified. The retention of simulated radionuclides and chemicals in the soil and off-gas system exceeds requirements so that projected emissions are one to two orders of magnitude below the maximum permissible concentrations of contaminants at the stack.

  20. Sheltering in buildings from large-scale outdoor releases

    SciTech Connect

    Chan, W.R.; Price, P.N.; Gadgil, A.J.

    2004-06-01

    Intentional or accidental large-scale airborne toxic releases (e.g. terrorist attacks or industrial accidents) can cause severe harm to nearby communities. Under these circumstances, taking shelter in buildings can be an effective emergency response strategy. Some examples where shelter-in-place was successful at preventing injuries and casualties have been documented [1, 2]. As public education and preparedness are vital to the success of an emergency response, many agencies have prepared documents advising the public on what to do during and after sheltering [3, 4, 5]. In this document, we focus on the role buildings play in protecting occupants. The conclusions of this article are: (1) Under most circumstances, shelter-in-place is an effective response against large-scale outdoor releases. This is particularly true for releases of short duration (a few hours or less) and chemicals that exhibit non-linear dose-response characteristics. (2) The building envelope not only restricts outdoor-indoor air exchange, but can also filter some biological or even chemical agents. Once indoors, the toxic materials can deposit or sorb onto indoor surfaces. All these processes contribute to the effectiveness of shelter-in-place. (3) Tightening the building envelope and improving filtration can enhance the protection offered by buildings. The mechanical ventilation systems present in most commercial buildings, however, should be turned off and dampers closed when sheltering from an outdoor release. (4) After the passing of the outdoor plume, some residuals will remain indoors. It is therefore important to terminate shelter-in-place promptly to minimize exposure to the toxic materials.
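
    The protective effect of the building envelope against a short release can be sketched with a one-box infiltration model, dC_in/dt = a*(C_out - C_in), where a is the air-exchange rate. The rate and plume duration below are illustrative assumptions, not values from the report:

```python
# One-box model of indoor concentration during an outdoor plume.
# Air-exchange rate and plume duration are illustrative assumptions.
import math

a = 0.5 / 3600.0        # air-exchange rate: 0.5 per hour, in 1/s (assumed)
T_plume = 2 * 3600.0    # outdoor plume lasts 2 h (assumed)
C_out = 1.0             # outdoor concentration during the plume (normalized)

# Exact solution with C_in(0) = 0 while the plume is present:
C_in_end = C_out * (1 - math.exp(-a * T_plume))
print(f"indoor concentration when the plume passes: {C_in_end:.2f}")
```

    Indoor concentration stays well below the outdoor peak for short releases, and after the plume passes the same exponential works in reverse, which is why timely termination of sheltering matters.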

  1. Unfolding large-scale online collaborative human dynamics

    PubMed Central

    Zha, Yilong; Zhou, Tao; Zhou, Changsong

    2016-01-01

    Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanism is very challenging and still rare. Self-organized online collaborative activities with a precise record of event timing provide unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double–power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to previous actions with a power-law waiting time, and (iii) population growth due to the increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal “simplicity” beyond complex interacting human activities. PMID:27911766
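
    The power-law waiting times at the core of the proposed cascading-response module can be sampled by inverse-transform sampling. The exponent and cutoff below are illustrative choices, not values fitted to the Wikipedia data:

```python
# Inverse-transform sampling of power-law waiting times:
# tau = tau_min * U^(-1/(alpha - 1)) has P(tau > x) ~ x^-(alpha - 1).
# alpha and tau_min are illustrative assumptions.
import random

rng = random.Random(42)

def powerlaw_waiting_time(alpha=2.5, tau_min=1.0):
    return tau_min * rng.random() ** (-1.0 / (alpha - 1.0))

samples = [powerlaw_waiting_time() for _ in range(100_000)]
mean = sum(samples) / len(samples)
frac_heavy = sum(s > 10 for s in samples) / len(samples)
print(round(mean, 2), frac_heavy)
```

    The heavy tail (a few percent of intervals exceed ten times the minimum) is what, combined with Poissonian initiation and population growth, produces the double power law in inter-update intervals.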

  2. Successful Physician Training Program for Large Scale EMR Implementation

    PubMed Central

    Stevens, L.A.; Mailes, E.S.; Goad, B.A.; Longhurst, C.A.

    2015-01-01

    Summary: End-user training is an essential element of electronic medical record (EMR) implementation and frequently suffers from minimal institutional investment. In addition, discussion of successful EMR training programs for physicians is limited in the literature. The authors describe a successful physician-training program at Stanford Children’s Health as part of a large-scale EMR implementation. Evaluations of classroom training, obtained at the conclusion of each class, revealed high physician satisfaction with the program. Free-text comments from learners focused on duration and timing of training, the learning environment, quality of the instructors, and specificity of training to their role or department. Based upon participant feedback and institutional experience, best-practice recommendations, including physician engagement, curricular design, and assessment of proficiency and recognition, are suggested for future provider EMR training programs. The authors strongly recommend the creation of coursework to group providers by common workflow. PMID:25848415

  3. Large-scale treeline changes recorded in Siberia

    NASA Astrophysics Data System (ADS)

    Esper, Jan; Schweingruber, Fritz H.

    2004-03-01

    Analysis of a multi-species network of western Siberian ecotone sites revealed pulses of tree invasion into genuinely treeless tundra environments in the 1940s and 1950s and after the early 1970s. In addition, increases in radial stem growth synchronous with the late 20th century treeline change are observed. Both the treeline changes and the growth increases correspond with decadal-scale periods of temperature warmer than in any other period since observations started, suggesting, even if indirectly, the sensitivity of large-scale treeline changes to this climatic forcing. The mid-20th century recruitment period reported here for the western Siberian network is compared with local findings from Europe and North America, suggesting a circumpolar trend perhaps related to climate warming patterns. For western Siberia, the presence of relict stumps nevertheless indicates that the present colonization is reoccupying sites that had tree cover earlier in the last millennium.

  4. Multitree Algorithms for Large-Scale Astrostatistics

    NASA Astrophysics Data System (ADS)

    March, William B.; Ozakin, Arkadas; Lee, Dongryeol; Riegel, Ryan; Gray, Alexander G.

    2012-03-01

    this number every week, resulting in billions of objects. At such scales, even linear-time analysis operations present challenges, particularly since statistical analyses are inherently interactive processes, requiring that computations complete within some reasonable human attention span. The quadratic (or worse) runtimes of straightforward implementations quickly become unbearable. Examples of applications. These analysis subroutines occur ubiquitously in astrostatistical work. We list just a few examples. The need to cross-match objects across different catalogs has led to various algorithms, which at some point perform an AllNN computation. 2-point and higher-order spatial correlations form the basis of spatial statistics, and are utilized in astronomy to compare the spatial structures of two datasets, such as an observed sample and a theoretical sample, forming the basis for two-sample hypothesis testing. Friends-of-friends clustering is often used to identify halos in data from astrophysical simulations. Minimum spanning tree properties have also been proposed as statistics of large-scale structure. Comparison of the distributions of different kinds of objects requires accurate density estimation, for which KDE is the overall statistical method of choice. The prediction of redshifts from optical data requires accurate regression, for which kernel regression is a powerful method. The identification of objects of various types in astronomy, such as stars versus galaxies, requires accurate classification, for which KDA is a powerful method. Overview. In this chapter, we will briefly sketch the main ideas behind recent fast algorithms which achieve, for example, linear runtimes for pairwise-distance problems, or similarly dramatic reductions in computational growth. In some cases, the runtime orders for these algorithms are mathematically provable statements, while in others we have only conjectures backed by experimental observations for the time being.
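
    The space-partitioning idea underlying these fast pairwise-distance methods can be sketched with a kd-tree solving the AllNN problem. This is a simple single-tree version for clarity, not the dual-tree/multitree algorithms themselves:

```python
# All-nearest-neighbors (AllNN) via a 2-d kd-tree with branch pruning.
# Single-tree version for illustration; multitree methods go further by
# descending trees over queries and references simultaneously.
import random

def build(points, depth=0):
    if not points:
        return None
    axis = depth % 2
    points = sorted(points, key=lambda p: p[axis])
    mid = len(points) // 2
    return (points[mid], axis,
            build(points[:mid], depth + 1), build(points[mid + 1:], depth + 1))

def nearest(node, q, best=None, exclude=None):
    """Return (squared distance, point) of q's nearest neighbor != exclude."""
    if node is None:
        return best
    p, axis, left, right = node
    if p != exclude:
        d = (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
        if best is None or d < best[0]:
            best = (d, p)
    near, far = (left, right) if q[axis] < p[axis] else (right, left)
    best = nearest(near, q, best, exclude)
    # only descend the far side if the splitting plane is closer than best
    if best is None or (q[axis] - p[axis]) ** 2 < best[0]:
        best = nearest(far, q, best, exclude)
    return best

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(500)]
tree = build(pts)
allnn = {p: nearest(tree, p, exclude=p)[1] for p in pts}
print(len(allnn))
```

    The pruning test is what turns the naive quadratic scan into the near-linear behavior these algorithms exploit.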

  5. The role of large-scale, extratropical dynamics in climate change

    SciTech Connect

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  6. Probing large-scale structure with radio observations

    NASA Astrophysics Data System (ADS)

    Brown, Shea D.

    This thesis focuses on detecting magnetized relativistic plasma in the intergalactic medium (IGM) of filamentary large-scale structure (LSS) by observing synchrotron emission emitted by structure formation shocks. Little is known about the IGM beyond the largest clusters of galaxies, and synchrotron emission holds enormous promise as a means of probing magnetic fields and relativistic particle populations in these low-density regions. I first report on observations taken at the Very Large Array and the Westerbork Synthesis Radio Telescope of the diffuse radio source 0809+39. I use these observations to demonstrate that 0809+39 is likely the first "radio relic" discovered that is not associated with a rich X-ray-emitting cluster of galaxies. I then demonstrate that an unconventional reprocessing of the NVSS polarization survey can reveal structures on scales from 15' to hundreds of degrees, far larger than the nominal shortest-baseline scale. This yields hundreds of new diffuse sources as well as the identification of a new nearby galactic loop. These observations also highlight the major obstacle that diffuse galactic foreground emission poses for any search for large-scale, low surface-brightness extragalactic emission. I therefore explore the cross-correlation of diffuse radio emission with optical tracers of LSS as a means of statistically detecting the presence of magnetic fields in the low-density regions of the cosmic web. This initial study with the Bonn 1.4 GHz radio survey yields an upper limit of 0.2 μG for large-scale filament magnetic fields. Finally, I report on new Green Bank Telescope and Westerbork Synthesis Radio Telescope observations of the famous Coma cluster of galaxies. Major findings include an extension to the Coma cluster radio relic source 1253+275, which makes its total extent ~2 Mpc, as well as a sharp edge, or "front", on the western side of the radio halo which shows a strong correlation with merger activity associated with an
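
    The cross-correlation statistic used to search for faint emission aligned with optical tracers can be sketched as a zero-lag normalized correlation between two maps; both "maps" below are synthetic stand-ins, not survey data:

```python
# Zero-lag cross-correlation between a diffuse radio map and an optical
# tracer map, as a statistical detection of a weak shared signal.
import numpy as np

rng = np.random.default_rng(3)
optical = rng.standard_normal((128, 128))                 # tracer map (synthetic)
radio = 0.2 * optical + rng.standard_normal((128, 128))   # weak shared signal

def cross_corr(a, b):
    a = (a - a.mean()) / a.std()
    b = (b - b.mean()) / b.std()
    return float((a * b).mean())

print(round(cross_corr(radio, optical), 2))
```

    A signal far below the noise in any single pixel becomes detectable in the correlation; a non-detection translates into an upper limit on the correlated (magnetized) component.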

  7. Interaction of a cumulus cloud ensemble with the large-scale environment

    NASA Technical Reports Server (NTRS)

    Arakawa, A.; Schubert, W.

    1973-01-01

    Large-scale modification of the environment by cumulus clouds is discussed in terms of entrainment, detrainment, evaporation, and subsidence. Drying, warming, and condensation by vertical displacement of air are considered as well as budget equations for mass, static energy, water vapor, and liquid water.

  8. GAS MIXING ANALYSIS IN A LARGE-SCALED SALTSTONE FACILITY

    SciTech Connect

    Lee, S

    2008-05-28

    Computational fluid dynamics (CFD) methods have been used to estimate the flow patterns, driven mainly by temperature gradients, inside the vapor space of a large-scaled Saltstone vault facility at the Savannah River Site (SRS). The purpose of this work is to examine the gas motions inside the vapor space under the current vault configurations by taking a three-dimensional, transient, momentum-energy coupled approach for the vapor space domain of the vault. The modeling calculations were based on prototypic vault geometry and expected normal operating conditions as defined by Waste Solidification Engineering. The modeling analysis was focused on the air flow patterns near the ventilated corner zones of the vapor space inside the Saltstone vault. The turbulence behavior and natural convection mechanism used in the present model were benchmarked against literature information and theoretical results. The verified model was applied to the Saltstone vault geometry for transient assessment of the air flow patterns inside the vapor space of the vault region under potential operating conditions. The baseline model considered two cases for estimating the flow patterns within the vapor space. One is the reference nominal case. The other uses a negative temperature gradient between the inner roof and top grout surface temperatures, intended as a potential bounding condition. The flow patterns of the vapor space calculated by the CFD model demonstrate that ambient air enters the vapor space of the vault through the lower-end ventilation hole and is heated by a Bénard-cell-type circulation before leaving the vault via the higher-end ventilation hole. The calculated results are consistent with the literature information. Detailed results and the cases considered in the calculations are discussed here.
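
    Whether Bénard-type circulation should appear in such a vapor space is governed by the Rayleigh number. A back-of-envelope sketch; the temperature difference and vault height below are assumed for illustration, not the facility's actual values:

```python
# Rayleigh number estimate for a heated air layer; Ra >> ~1708 implies
# buoyancy-driven (Benard-type) convective circulation. Values assumed.
g = 9.81          # gravitational acceleration, m/s^2
beta = 1 / 320.0  # thermal expansion coefficient of air near 320 K, 1/K
dT = 10.0         # roof-to-grout temperature difference, K (assumed)
L = 3.0           # vapor-space height, m (assumed)
nu = 1.8e-5       # kinematic viscosity of air, m^2/s
alpha = 2.5e-5    # thermal diffusivity of air, m^2/s

Ra = g * beta * dT * L**3 / (nu * alpha)
print(f"Ra ~ {Ra:.1e}")
```

    At meter scales and even modest temperature differences, Ra is many orders of magnitude above the convective threshold, consistent with the circulation pattern the CFD model resolves.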

  9. Biomimetic gas sensors for large-scale drying of wood particles

    NASA Astrophysics Data System (ADS)

    Paczkowski, Sebastian; Sauerwald, Tilman; Weiß, Alexander; Bauer, Marco; Kohl, Dieter; Schütz, Stefan

    2011-04-01

    The sensitivity and selectivity of insect antennae are evolutionarily tuned to the specific needs of the insect. The Australian pyrophilic beetle Merimna atrata needs freshly heated wood to bring up its offspring and, consequently, shows a very high sensitivity to volatiles specific to wood fires and heated wood. Volatile organic compounds released by wood particles heated at different temperatures were collected. Parallel trace-analytical examination and antennal responses of the pyrophilic beetles to volatiles released by the wood reveal a highly differentiated detection system of these insects for early and late products of wood fires. This enabled a selection of marker compounds that insects have used for several million years to discriminate different stages of wood fires. In the industrial production of engineered wood such as particle boards, wooden particles are dried in large-scale high-temperature dryers. Air temperatures between 150 and 600 °C are essential for the required material flow in particle board production. Despite the resulting energy efficiency of high-temperature drying, high temperatures are avoided because of the increased risk of spontaneous combustion. Losses in productivity caused by fire have a strong impact on the whole production system. In order to raise the drying temperature without risking a fire, it is important to develop a monitoring system that will reliably detect early fire stages by their characteristic volatile pattern. Thus, the perception filters and evaluation algorithms of pyrophilic insects can provide blueprints for biomimetic gas sensors for large-scale drying of wood particles. Tungsten oxide sensor elements in particular exhibit a high sensitivity to some of the key substances. Their high sensitivity and selectivity to terpenes and aldehydes, in combination with the high sensitivity and selectivity of tin oxide sensor elements to hydroxylated and phenolic compounds, both showing low cross-reactivity with water and carbon

  10. Information Tailoring Enhancements for Large-Scale Social Data

    DTIC Science & Technology

    2016-06-15

    Intelligent Automation Incorporated, Progress Report No. 3: Information Tailoring Enhancements for Large-Scale Social Data. Submitted in accordance with... The system also gathers information about entities from all news articles and displays it on over one million entity pages [5][6], and the information is made

  11. 78 FR 11632 - Record of Decision for Land Acquisition and Airspace Establishment To Support Large-Scale Marine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-19

    ...-Scale Marine Air Ground Task Force Live-Fire and Maneuver Training at the Marine Corps Air Ground... decision to establish a large-scale Marine Air Ground Task Force (MAGTF) training facility at the Marine... Marine Expeditionary Brigade (MEB)-sized MAGTF, including full-scale MEB Exercises and associated...

  12. SALSA - a Sectional Aerosol module for Large Scale Applications

    NASA Astrophysics Data System (ADS)

    Kokkola, H.; Korhonen, H.; Lehtinen, K. E. J.; Makkonen, R.; Asmi, A.; Järvenoja, S.; Anttila, T.; Partanen, A.-I.; Kulmala, M.; Järvinen, H.; Laaksonen, A.; Kerminen, V.-M.

    2007-12-01

    The sectional aerosol module SALSA is introduced. The model has been designed to be implemented in large-scale climate models, which require both accuracy and computational efficiency. We have used multiple methods to reduce the computational burden of different aerosol processes, optimizing model performance without losing the physical features relevant to climate applications. The optimizations include limiting the chemical compounds and physical processes available in different size sections of aerosol particles; dividing the size distribution into size sections of variable width, depending on the sensitivity of microphysical processing to particle size; keeping the total number of size sections describing the size distribution to a minimum; and calculating only the relevant microphysical processes affecting each size section. The ability of the module to describe different microphysical processes was evaluated against explicit microphysical models and several microphysical models used in air quality models. The results from the current module show good consistency when compared to more explicit models. The module was also used to simulate a new particle formation event typical of highly polluted conditions, with results comparable to a more explicit model setup.
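
    The sectional discretization described here (variable-width size sections, kept to a minimum number) can be sketched as a binned number distribution. The bin edges and test distribution below are illustrative assumptions, not SALSA's actual configuration:

```python
# Sectional representation of an aerosol size distribution: finer sections
# where microphysics is most size-sensitive, coarser elsewhere. Illustrative.
import numpy as np

# finer sections for small (nucleation/Aitken) particles, coarser above
edges_nm = np.concatenate([np.geomspace(3, 50, 8), np.geomspace(80, 10_000, 4)])

rng = np.random.default_rng(7)
diam_nm = rng.lognormal(mean=np.log(60), sigma=0.8, size=10_000)  # test aerosol

counts, _ = np.histogram(diam_nm, bins=edges_nm)
widths = np.diff(np.log(edges_nm))
dN_dlnD = counts / widths        # sectional number distribution, dN/d(ln D)
print(len(counts), int(counts.sum()))
```

    Variable-width sections concentrate resolution where processes such as nucleation and activation are most sensitive, which is how the module keeps the section count, and hence cost, low.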

  13. SALSA - a Sectional Aerosol module for Large Scale Applications

    NASA Astrophysics Data System (ADS)

    Kokkola, H.; Korhonen, H.; Lehtinen, K. E. J.; Makkonen, R.; Asmi, A.; Järvenoja, S.; Anttila, T.; Partanen, A.-I.; Kulmala, M.; Järvinen, H.; Laaksonen, A.; Kerminen, V.-M.

    2008-05-01

The sectional aerosol module SALSA is introduced. The model is designed for implementation in large-scale climate models, which require both accuracy and computational efficiency. We used several methods to reduce the computational burden of the different aerosol processes and to optimize model performance without losing the physical features of climate relevance: the chemical compounds and physical processes available in each size section of the aerosol particles are limited; the size distribution is divided into size sections of variable width, depending on the sensitivity of the microphysical processing to particle size; the total number of size sections describing the size distribution is kept to a minimum; and only the microphysical processes relevant to each size section are calculated. The module's treatment of the different microphysical processes was evaluated against explicit microphysics models and against several microphysical models used in air-quality models; the results show good consistency with the more explicit models. The module was also used to simulate a new-particle-formation event typical of highly polluted conditions, with results comparable to those of a more explicit model setup.

  14. Large-scale epitaxial growth kinetics of graphene: A kinetic Monte Carlo study

    SciTech Connect

    Jiang, Huijun; Hou, Zhonghuai

    2015-08-28

Epitaxial growth via chemical vapor deposition is considered the most promising route to synthesizing large-area graphene of high quality. However, it remains a major theoretical challenge to reveal the growth kinetics with both atomic-scale energetics and large-scale spatial information included. Here, we propose a minimal kinetic Monte Carlo model to address this issue on an active catalyst surface with graphene/substrate lattice mismatch, which allows us to perform large-scale simulations of the growth kinetics over a two-dimensional surface with growth fronts of complex shape. A geometry-determined large-scale growth mechanism is revealed, where the rate-dominating event is found to be C{sub 1}-attachment for concave growth-front segments and C{sub 5}-attachment for all others. This growth mechanism leads to an interesting time-resolved growth behavior that is consistent with observations in a recent scanning tunneling microscopy experiment.

  15. Distribution probability of large-scale landslides in central Nepal

    NASA Astrophysics Data System (ADS)

    Timilsina, Manita; Bhandary, Netra P.; Dahal, Ranjan Kumar; Yatabe, Ryuichi

    2014-12-01

Large-scale landslides in the Himalaya are defined as huge, deep-seated landslide masses that occurred in the geological past. They are widely distributed in the Nepal Himalaya. The steep topography and high local relief provide high potential for such failures, whereas the dynamic geology and adverse climatic conditions play a key role in their occurrence and reactivation. The major geoscientific problems related to such large-scale landslides are (1) difficulties in their identification and delineation, (2) their role as sources of small-scale failures, and (3) their reactivation. Only a few scientific publications concerning large-scale landslides in Nepal have appeared. In this context, the identification and quantification of large-scale landslides and their potential distribution are crucial. This study therefore explores the distribution of large-scale landslides in the Lesser Himalaya. It provides simple guidelines for identifying large-scale landslides based on their typical characteristics and a 3D schematic diagram. Based on the spatial distribution of landslides, geomorphological/geological parameters, and logistic regression, an equation for the large-scale landslide distribution probability is also derived. The equation is validated by applying it to another area: there, the area under the receiver operating curve is 0.699, and the distribution probability explains > 65% of the existing landslides. The regression equation can therefore be applied to areas of the Lesser Himalaya of central Nepal with similar geological and geomorphological conditions.
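
A distribution-probability equation of this kind is a logistic function of terrain predictors, and its validation rests on the area under the receiver operating curve. The sketch below illustrates both pieces; the coefficients and predictor names are placeholders, since the abstract does not give the fitted parameters.

```python
import math

def landslide_probability(slope_deg, relief_m, dist_fault_km, coefs):
    """Logistic-regression probability:
    p = 1 / (1 + exp(-(b0 + b1*slope + b2*relief + b3*dist))).
    Coefficient values are illustrative, not the paper's."""
    b0, b1, b2, b3 = coefs
    z = b0 + b1 * slope_deg + b2 * relief_m + b3 * dist_fault_km
    return 1.0 / (1.0 + math.exp(-z))

def auc(scores, labels):
    """Area under the ROC curve via the rank-sum (Mann-Whitney) identity:
    the fraction of positive/negative pairs ranked in the right order."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical cell with 30 deg slope, 500 m relief, 2 km from a fault.
p = landslide_probability(30.0, 500.0, 2.0, coefs=(-8.0, 0.2, 0.01, -0.3))
print(auc([0.9, 0.8, 0.3, 0.2], [1, 1, 0, 0]))  # → 1.0 (perfect ranking)
```

An AUC of 0.699, as reported for the validation area, would mean the model ranks a random landslide cell above a random non-landslide cell about 70% of the time.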

  16. Analyzing large-scale proteomics projects with latent semantic indexing.

    PubMed

    Klie, Sebastian; Martens, Lennart; Vizcaíno, Juan Antonio; Côté, Richard; Jones, Phil; Apweiler, Rolf; Hinneburg, Alexander; Hermjakob, Henning

    2008-01-01

Since the advent of public data repositories for proteomics data, readily accessible results from high-throughput experiments have been accumulating steadily. Several large-scale projects in particular have contributed substantially to the amount of identifications available to the community. Despite the considerable body of information amassed, very few successful analyses have been performed and published on these data, leaving the ultimate value of these projects far below their potential. A prominent reason why published proteomics data are seldom reanalyzed lies in the heterogeneous nature of the original sample collection and the subsequent data recording and processing. To illustrate that at least part of this heterogeneity can be compensated for, we here apply latent semantic analysis to the data contributed by the Human Proteome Organization's Plasma Proteome Project (HUPO PPP). Interestingly, despite the broad spectrum of instruments and methodologies applied in the HUPO PPP, our analysis reveals several clear patterns that can be used to formulate concrete recommendations for optimizing proteomics project planning as well as the choice of technologies in future experiments. It is clear from these results that the analysis of large bodies of publicly available proteomics data by noise-tolerant algorithms such as latent semantic analysis holds great promise and is currently underexploited.
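
Latent semantic analysis of this kind reduces to a truncated SVD of a term-by-experiment matrix, with comparisons made in the low-rank latent space where much of the noise has been discarded. A minimal sketch on a toy matrix follows; the HUPO PPP data and any preprocessing are not reproduced, and the matrix values are invented.

```python
import numpy as np

# Toy term-document matrix: rows = identified proteins/terms,
# columns = experiments (values invented for illustration).
X = np.array([
    [3, 0, 1, 0],
    [2, 0, 0, 1],
    [0, 4, 0, 3],
    [0, 3, 1, 2],
], dtype=float)

# Truncated SVD: keep k latent dimensions; noise lives in the rest.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2
docs = np.diag(s[:k]) @ Vt[:k, :]          # experiments in latent space

# Compare experiments by cosine similarity in the latent space.
norms = np.linalg.norm(docs, axis=0)
sim = (docs.T @ docs) / np.outer(norms, norms)
print(np.round(sim, 2))
```

Experiments sharing the same dominant terms (here, columns 2 and 4) end up close in the latent space even when their raw counts differ, which is the noise tolerance the abstract alludes to.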

  17. Bio-Inspired Wooden Actuators for Large Scale Applications

    PubMed Central

    Rüggeberg, Markus; Burgert, Ingo

    2015-01-01

    Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small scale applications down to micrometer size have been developed, but up-scaling remains challenging due to either limitations in mechanical stiffness of the material or in the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and a good machinability to reach large scale actuation and application. Amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar powered movement of a tracker for solar modules. PMID:25835386

  18. Silver nanoparticles: Large scale solvothermal synthesis and optical properties

    SciTech Connect

    Wani, Irshad A.; Khatoon, Sarvari; Ganguly, Aparna; Ahmed, Jahangeer; Ganguli, Ashok K.; Ahmad, Tokeer

    2010-08-15

Silver nanoparticles have been successfully synthesized at large scale by a simple, modified solvothermal method using ethanol as the refluxing solvent and NaBH{sub 4} as the reducing agent. The nanopowder was investigated by X-ray diffraction (XRD), transmission electron microscopy (TEM), dynamic light scattering (DLS), UV-visible spectroscopy, and BET surface-area studies. XRD reveals the monophasic nature of these highly crystalline silver nanoparticles. TEM shows monodisperse, highly uniform silver nanoparticles with a particle size of 5 nm; dynamic light scattering gives 7 nm, in good agreement with the TEM and X-ray line-broadening results. The surface area was found to be 34.5 m{sup 2}/g. UV-visible studies show an absorption band at {approx}425 nm due to surface plasmon resonance. The yield of silver nanoparticles was as high as 98.5%.
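
The X-ray line-broadening size mentioned above is conventionally obtained from the Scherrer equation, D = Kλ/(β cos θ). A small sketch with Cu Kα values; the FWHM used here is illustrative, not a number from the paper.

```python
import math

def scherrer_size(wavelength_nm, fwhm_deg, two_theta_deg, K=0.9):
    """Crystallite size D = K * lambda / (beta * cos(theta)),
    with beta the peak FWHM in radians and theta the Bragg angle."""
    beta = math.radians(fwhm_deg)
    theta = math.radians(two_theta_deg / 2.0)
    return K * wavelength_nm / (beta * math.cos(theta))

# Cu K-alpha (0.15406 nm), Ag(111) reflection near 2-theta = 38.1 deg;
# the 1.7 deg FWHM is an assumed value chosen for illustration.
print(round(scherrer_size(0.15406, 1.7, 38.1), 1))  # → 4.9 (nm)
```

A broadening of this order is consistent with the ~5 nm TEM size quoted in the abstract; instrumental broadening, which a real analysis would subtract, is ignored here.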

  19. Bio-inspired wooden actuators for large scale applications.

    PubMed

    Rüggeberg, Markus; Burgert, Ingo

    2015-01-01

    Implementing programmable actuation into materials and structures is a major topic in the field of smart materials. In particular the bilayer principle has been employed to develop actuators that respond to various kinds of stimuli. A multitude of small scale applications down to micrometer size have been developed, but up-scaling remains challenging due to either limitations in mechanical stiffness of the material or in the manufacturing processes. Here, we demonstrate the actuation of wooden bilayers in response to changes in relative humidity, making use of the high material stiffness and a good machinability to reach large scale actuation and application. Amplitude and response time of the actuation were measured and can be predicted and controlled by adapting the geometry and the constitution of the bilayers. Field tests in full weathering conditions revealed long-term stability of the actuation. The potential of the concept is shown by a first demonstrator. With the sensor and actuator intrinsically incorporated in the wooden bilayers, the daily change in relative humidity is exploited for an autonomous and solar powered movement of a tracker for solar modules.

  20. Learning networks for sustainable, large-scale improvement.

    PubMed

    McCannon, C Joseph; Perla, Rocco J

    2009-05-01

    Large-scale improvement efforts known as improvement networks offer structured opportunities for exchange of information and insights into the adaptation of clinical protocols to a variety of settings.

  1. Modified gravity and large scale flows, a review

    NASA Astrophysics Data System (ADS)

    Mould, Jeremy

    2017-02-01

    Large scale flows have been a challenging feature of cosmography ever since galaxy scaling relations came on the scene 40 years ago. The next generation of surveys will offer a serious test of the standard cosmology.

  2. Needs, opportunities, and options for large scale systems research

    SciTech Connect

    Thompson, G.L.

    1984-10-01

The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  3. Bayesian hierarchical model for large-scale covariance matrix estimation.

    PubMed

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem in a Bayesian hierarchical model framework and introduce dependency between the covariance parameters. We demonstrate the advantages of our approach over traditional approaches using simulations and OMICS data analysis.
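
The abstract does not specify the hierarchical model, but the overfitting it targets is commonly tamed by shrinking the sample covariance toward a structured target, with the shrinkage weight playing the role of prior strength. A minimal sketch of that general idea (not the authors' model; the weight here is fixed by hand rather than inferred):

```python
import numpy as np

def shrunk_covariance(X, alpha=0.2):
    """Shrink the sample covariance toward its diagonal:

        S_shrunk = (1 - alpha) * S + alpha * diag(S)

    A simple regularizer in the spirit of a prior pulling the
    off-diagonal covariance parameters toward zero; alpha acts
    like the prior weight. Illustrative, not the paper's model."""
    S = np.cov(X, rowvar=False)
    target = np.diag(np.diag(S))
    return (1 - alpha) * S + alpha * target

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 50))          # few samples, many variables
S_hat = shrunk_covariance(X, alpha=0.5)
# The raw sample covariance is singular here (rank <= 19);
# the shrunk estimate is positive definite and better conditioned.
```

With 20 samples of 50 variables the sample covariance cannot be inverted at all, which is exactly the high-variance regime the abstract describes.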

  4. Reynolds stress spectral contribution to the large scale motions in turbulent boundary layers

    NASA Astrophysics Data System (ADS)

    Hommema, Scott E.; Guala, Michele; Adrian, Ronald J.

    2003-11-01

The importance of large-scale (and very-large-scale) structures was investigated by Kim & Adrian (1999), but their contribution to the Reynolds stress has not been thoroughly addressed in the literature. In this work, the role of large-scale and very-large-scale motions in the energetics of turbulent pipe flow is investigated through spectral analysis of two-component thermal-anemometry measurements at three Reynolds numbers (Re_τ = 3815, 5884 and 7959), interpreted in the context of the hairpin-vortex-packet model of wall turbulence (Adrian, Meinhart & Tomkins, 2000). Particular attention is dedicated to the cumulative energy fraction of the uv co-spectra, which reveals that half of the energy is associated with structures larger than twice the pipe diameter. An estimate of the vertical derivative of the velocity co-spectra also sheds light on the contributions of motions of different scales to Reynolds stress production. Adrian, R.J., Meinhart, C.D., Tomkins, C.D., 2000, Vortex organization in the outer region of the turbulent boundary layer. J. Fluid Mech., vol. 422, 1-54. Kim, K.J., Adrian, R.J., 1999, Very large-scale motion in the outer layer. Phys. Fluids, vol. 11(2), 417-422.
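
The cumulative energy fraction of a uv co-spectrum can be computed from the real part of the cross-spectrum, whose sum recovers the Reynolds stress ⟨u'v'⟩. A sketch on synthetic signals (not pipe-flow data; assumes an even record length):

```python
import numpy as np

def cumulative_cospectrum_fraction(u, v, dt):
    """Cumulative fraction of the u-v covariance (Reynolds stress)
    carried by frequencies up to each bin.

    The co-spectrum is the real part of the one-sided cross-spectrum;
    by Parseval's theorem its sum equals <u'v'>. Even-length records
    are assumed (Nyquist bin handled without doubling)."""
    u = u - u.mean()
    v = v - v.mean()
    n = len(u)
    U = np.fft.rfft(u)
    V = np.fft.rfft(v)
    co = np.real(U * np.conj(V)) / n**2
    co[1:-1] *= 2.0                      # fold in negative frequencies
    freqs = np.fft.rfftfreq(n, d=dt)
    frac = np.cumsum(co) / np.sum(co)
    return freqs, frac

# Synthetic correlated signals in place of hot-wire u and v records.
rng = np.random.default_rng(1)
u = rng.normal(size=256)
v = u + 0.5 * rng.normal(size=256)
freqs, frac = cumulative_cospectrum_fraction(u, v, dt=0.01)
```

Reading off the frequency (or, via Taylor's hypothesis, the wavelength) at which `frac` crosses 0.5 gives statements of the kind quoted above, i.e. half the stress carried by structures larger than some scale.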

  5. Recursive architecture for large-scale adaptive system

    NASA Astrophysics Data System (ADS)

    Hanahara, Kazuyuki; Sugiyama, Yoshihiko

    1994-09-01

'Large scale' is a major trend in recent engineering research and development, especially in the field of aerospace structural systems. The term refers to the large physical scale of an artifact, but it usually also implies a large number of components making up the artifact. A large-scale system deployed in remote space or in the deep sea should be adaptive as well as robust by itself, because control and maintenance by human operators are difficult at such distances. One approach to realizing such a large-scale, adaptive, robust system is to build it as an assemblage of components that are each adaptive by themselves. The robustness of the system can then be achieved by using a large number of such components together with suitable adaptation and maintenance strategies. Systems of this kind have attracted wide research interest, with studies reported on decentralized motion control, configuration algorithms, and the characteristics of structural elements. In this article, a recursive architecture concept is developed and discussed for the realization of a large-scale system consisting of a number of uniform adaptive components. We propose an adaptation strategy based on this architecture and its implementation by means of hierarchically connected processing units. The robustness of the processing units, and their restoration from degeneration, are also discussed. Two- and three-dimensional adaptive truss structures are conceptually designed based on the recursive architecture.

  6. EINSTEIN'S SIGNATURE IN COSMOLOGICAL LARGE-SCALE STRUCTURE

    SciTech Connect

    Bruni, Marco; Hidalgo, Juan Carlos; Wands, David

    2014-10-10

We show how the nonlinearity of general relativity generates a characteristic non-Gaussian signal in cosmological large-scale structure that we calculate at all perturbative orders in a large-scale limit. Newtonian gravity and general relativity provide complementary theoretical frameworks for modeling large-scale structure in ΛCDM cosmology; a relativistic approach is essential to determine initial conditions, which can then be used in Newtonian simulations studying the nonlinear evolution of the matter density. Most inflationary models of the very early universe predict an almost Gaussian distribution for the primordial metric perturbation, ζ. However, we argue that it is the Ricci curvature of comoving-orthogonal spatial hypersurfaces, R, that drives structure formation at large scales. We show how the nonlinear relation between the spatial curvature, R, and the metric perturbation, ζ, translates into a specific non-Gaussian contribution to the initial comoving matter density, which we calculate for the simple case of an initially Gaussian ζ. Our analysis shows the nonlinear signature of Einstein's gravity in large-scale structure.

  7. The Internet As a Large-Scale Complex System

    NASA Astrophysics Data System (ADS)

    Park, Kihong; Willinger, Walter

    2005-06-01

The Internet may be viewed as a "complex system" with diverse features and many components that can give rise to unexpected emergent phenomena, revealing much about its own engineering. This book brings together chapter contributions from a workshop held at the Santa Fe Institute in March 2001. The volume captures a snapshot of features of the Internet that may be fruitfully approached from a complex-systems perspective, that is, using interdisciplinary tools and methods to tackle the subject. The Internet penetrates the socioeconomic fabric of everyday life; a broader and deeper grasp of the Internet may be needed to meet the challenges of the future. The resulting empirical data have already proven invaluable for gaining novel insights into the network's spatio-temporal dynamics, and can be expected to become even more important when trying to explain the Internet's complex and emergent behavior in terms of elementary networking-based mechanisms. The discoveries of fractal or self-similar network traffic traces, and of power-law behavior in network topology and World Wide Web connectivity, are instances of unsuspected, emergent system traits. Another important factor at the heart of fair, efficient, and stable sharing of network resources is user behavior. Network systems, when inhabited by selfish or greedy users, take on the traits of a noncooperative multi-party game, and their stability and efficiency are integral to understanding the overall system and its dynamics. Lastly, the fault tolerance and robustness of large-scale network systems can exhibit spatial and temporal correlations whose effective analysis and management may benefit from rescaling techniques applied in certain physical and biological systems. The book brings together several of the leading workers involved in the analysis of complex systems and in the future development of the Internet.

  8. APoc: large-scale identification of similar protein pockets

    PubMed Central

    Gao, Mu; Skolnick, Jeffrey

    2013-01-01

    Motivation: Most proteins interact with small-molecule ligands such as metabolites or drug compounds. Over the past several decades, many of these interactions have been captured in high-resolution atomic structures. From a geometric point of view, most interaction sites for grasping these small-molecule ligands, as revealed in these structures, form concave shapes, or ‘pockets’, on the protein’s surface. An efficient method for comparing these pockets could greatly assist the classification of ligand-binding sites, prediction of protein molecular function and design of novel drug compounds. Results: We introduce a computational method, APoc (Alignment of Pockets), for the large-scale, sequence order-independent, structural comparison of protein pockets. A scoring function, the Pocket Similarity Score (PS-score), is derived to measure the level of similarity between pockets. Statistical models are used to estimate the significance of the PS-score based on millions of comparisons of randomly related pockets. APoc is a general robust method that may be applied to pockets identified by various approaches, such as ligand-binding sites as observed in experimental complex structures, or predicted pockets identified by a pocket-detection method. Finally, we curate large benchmark datasets to evaluate the performance of APoc and present interesting examples to demonstrate the usefulness of the method. We also demonstrate that APoc has better performance than the geometric hashing-based method SiteEngine. Availability and implementation: The APoc software package including the source code is freely available at http://cssb.biology.gatech.edu/APoc. Contact: skolnick@gatech.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23335017

  9. Developments in large-scale coastal flood hazard mapping

    NASA Astrophysics Data System (ADS)

    Vousdoukas, Michalis I.; Voukouvalas, Evangelos; Mentaschi, Lorenzo; Dottori, Francesco; Giardino, Alessio; Bouziotas, Dimitrios; Bianchi, Alessandra; Salamon, Peter; Feyen, Luc

    2016-08-01

Coastal flooding related to marine extreme events has severe socioeconomic impacts, and even though the latter are projected to increase under the changing climate, there is a clear deficit of information and predictive capacity related to coastal flood mapping. The present contribution reports on efforts towards a new methodology for mapping coastal flood hazard at European scale, combining (i) the contribution of waves to the total water level; (ii) improved inundation modeling; and (iii) an open, physics-based framework which can be constantly upgraded, whenever new and more accurate data become available. Four inundation approaches of gradually increasing complexity and computational costs were evaluated in terms of their applicability to large-scale coastal flooding mapping: static inundation (SM); a semi-dynamic method, considering the water volume discharge over the dykes (VD); the flood intensity index approach (Iw); and the model LISFLOOD-FP (LFP). A validation test performed against observed flood extents during the Xynthia storm event showed that SM and VD can lead to an overestimation of flood extents by 232 and 209 %, while Iw and LFP showed satisfactory predictive skill. Application at pan-European scale for the present-day 100-year event confirmed that static approaches can overestimate flood extents by 56 % compared to LFP; however, Iw can deliver results of reasonable accuracy in cases when reduced computational costs are a priority. Moreover, omitting the wave contribution in the extreme total water level (TWL) can result in a ~60 % underestimation of the flooded area. The present findings have implications for impact assessment studies, since combination of the estimated inundation maps with population exposure maps revealed differences in the estimated number of people affected within the 20-70 % range.
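
The static (SM) approach evaluated above is essentially a 'bathtub' map, usually with the refinement that flooded cells must be hydraulically connected to the sea rather than merely lying below the total water level. A toy sketch of that connectivity rule on a small grid (the DEM values, grid size, and water level are all illustrative):

```python
from collections import deque

def static_inundation(dem, sea_cells, twl):
    """'Bathtub' flood map: a cell floods if its elevation is below
    the total water level AND it connects to the sea through flooded
    cells (plain thresholding would also flood enclosed depressions)."""
    rows, cols = len(dem), len(dem[0])
    flooded = [[False] * cols for _ in range(rows)]
    q = deque(c for c in sea_cells if dem[c[0]][c[1]] < twl)
    for r, c in q:
        flooded[r][c] = True
    while q:                              # breadth-first flood fill
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            rr, cc = r + dr, c + dc
            if (0 <= rr < rows and 0 <= cc < cols
                    and not flooded[rr][cc] and dem[rr][cc] < twl):
                flooded[rr][cc] = True
                q.append((rr, cc))
    return flooded

# Elevations in meters; a 2.0 m ridge shields the low right column.
dem = [[0.0, 0.5, 2.0, 0.2],
       [0.4, 1.0, 2.0, 0.3],
       [0.8, 1.5, 2.0, 0.1]]
flood = static_inundation(dem, sea_cells=[(0, 0)], twl=1.2)
print(sum(map(sum, flood)))  # → 5 flooded cells; right column stays dry
```

The connectivity check is what separates this from naive thresholding; the overestimation the abstract reports stems mainly from the static approach ignoring flow dynamics and event duration, not from connectivity.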

  10. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  11. Temperature dependence of large-scale water retention curves: A case study

    SciTech Connect

    Liu, Hui-Hai; Bodvarsson, G.S.; Dane, J.H.

    2001-10-26

    A local-scale model for temperature-dependence of water-retention curves may be applicable to large scales. Consideration of this temperature dependence is important for modeling unsaturated flow and transport in the subsurface in numerous cases. Although significant progress has been made in understanding and modeling this temperature effect, almost all the previous studies have been limited to small scales (on the order of several centimeters). Numerical experiments were used to investigate the possibility of extending a local-scale model for the temperature-dependence of water retention curves to large scales (on the order of meters). Temperature effects on large-scale hydraulic properties are of interest in many practical applications. Numerical experiment results indicate that the local-scale model can indeed be applicable to large-scale problems for special porous media with high air entry values. A typical porous medium of this kind is the porous tuff matrix in the unsaturated zone of Yucca Mountain, Nevada, the proposed geologic disposal site for national high-level nuclear wastes. Whether this finding can approximately hold for general cases needs to be investigated in future studies.

  12. Modulation analysis of large-scale discrete vortices.

    PubMed

    Cisneros, Luis A; Minzoni, Antonmaria A; Panayotaros, Panayotis; Smyth, Noel F

    2008-09-01

    The behavior of large-scale vortices governed by the discrete nonlinear Schrödinger equation is studied. Using a discrete version of modulation theory, it is shown how vortices are trapped and stabilized by the self-consistent Peierls-Nabarro potential that they generate in the lattice. Large-scale circular and polygonal vortices are studied away from the anticontinuum limit, which is the limit considered in previous studies. In addition numerical studies are performed on large-scale, straight structures, and it is found that they are stabilized by a nonconstant mean level produced by standing waves generated at the ends of the structure. Finally, numerical evidence is produced for long-lived, localized, quasiperiodic structures.
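
The discrete nonlinear Schrödinger dynamics underlying these vortices can be integrated directly on a lattice; the sketch below sets up a charge-1 discrete vortex and checks that the lattice norm (power), a conserved quantity of the DNLS flow, is preserved. The initial profile, lattice size, and parameters are illustrative, not those of the paper.

```python
import numpy as np

def dnls_rhs(psi, C=1.0):
    """Right-hand side of the 2D discrete NLS,
    i dpsi/dt + C * (discrete Laplacian) psi + |psi|^2 psi = 0,
    i.e. dpsi/dt = i (C lap(psi) + |psi|^2 psi), periodic lattice."""
    lap = (np.roll(psi, 1, 0) + np.roll(psi, -1, 0)
           + np.roll(psi, 1, 1) + np.roll(psi, -1, 1) - 4 * psi)
    return 1j * (C * lap + np.abs(psi) ** 2 * psi)

def rk4_step(psi, dt):
    """One classical Runge-Kutta step for the lattice field."""
    k1 = dnls_rhs(psi)
    k2 = dnls_rhs(psi + 0.5 * dt * k1)
    k3 = dnls_rhs(psi + 0.5 * dt * k2)
    k4 = dnls_rhs(psi + dt * k3)
    return psi + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Charge-1 discrete vortex: phase winds by 2*pi around the center.
n = 32
y, x = np.mgrid[:n, :n] - n // 2
psi = np.tanh(0.5 * np.hypot(x, y)) * np.exp(1j * np.arctan2(y, x))
norm0 = np.sum(np.abs(psi) ** 2)
for _ in range(100):
    psi = rk4_step(psi, dt=1e-3)
# After integration, sum |psi|^2 should match norm0 to RK4 accuracy.
```

Tracking the phase winding of `psi` over longer runs is how trapping (or decay) of such lattice vortices would be diagnosed numerically.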

  13. Large-scale simulations of complex physical systems

    NASA Astrophysics Data System (ADS)

    Belić, A.

    2007-04-01

    Scientific computing has become a tool as vital as experimentation and theory for dealing with scientific challenges of the twenty-first century. Large scale simulations and modelling serve as heuristic tools in a broad problem-solving process. High-performance computing facilities make possible the first step in this process - a view of new and previously inaccessible domains in science and the building up of intuition regarding the new phenomenology. The final goal of this process is to translate this newly found intuition into better algorithms and new analytical results. In this presentation we give an outline of the research themes pursued at the Scientific Computing Laboratory of the Institute of Physics in Belgrade regarding large-scale simulations of complex classical and quantum physical systems, and present recent results obtained in the large-scale simulations of granular materials and path integrals.

  14. Acoustic Studies of the Large Scale Ocean Circulation

    NASA Technical Reports Server (NTRS)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  15. A relativistic signature in large-scale structure

    NASA Astrophysics Data System (ADS)

    Bartolo, Nicola; Bertacca, Daniele; Bruni, Marco; Koyama, Kazuya; Maartens, Roy; Matarrese, Sabino; Sasaki, Misao; Verde, Licia; Wands, David

    2016-09-01

    In General Relativity, the constraint equation relating metric and density perturbations is inherently nonlinear, leading to an effective non-Gaussianity in the dark matter density field on large scales-even if the primordial metric perturbation is Gaussian. Intrinsic non-Gaussianity in the large-scale dark matter overdensity in GR is real and physical. However, the variance smoothed on a local physical scale is not correlated with the large-scale curvature perturbation, so that there is no relativistic signature in the galaxy bias when using the simplest model of bias. It is an open question whether the observable mass proxies such as luminosity or weak lensing correspond directly to the physical mass in the simple halo bias model. If not, there may be observables that encode this relativistic signature.

  16. Toward Improved Support for Loosely Coupled Large Scale Simulation Workflows

    SciTech Connect

    Boehm, Swen; Elwasif, Wael R; Naughton, III, Thomas J; Vallee, Geoffroy R

    2014-01-01

High-performance computing (HPC) workloads are increasingly leveraging loosely coupled large-scale simulations. Unfortunately, most large-scale HPC platforms, including Cray/ALPS environments, are designed for the execution of long-running jobs based on coarse-grained launch capabilities (e.g., one MPI rank per core on all allocated compute nodes). This assumption limits capability-class workload campaigns that require large numbers of discrete or loosely coupled simulations, and where time-to-solution is an untenable pacing issue. This paper describes the challenges related to supporting the fine-grained launch capabilities that are necessary for the execution of loosely coupled large-scale simulations on Cray/ALPS platforms. More precisely, we present the details of an enhanced runtime system to support this use case, and report initial results from early testing on systems at Oak Ridge National Laboratory.

  17. PKI security in large-scale healthcare networks.

    PubMed

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKI (Public Key Infrastructure) deployments have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network comprising multiple PKI domains.
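    The multi-domain trust problem the paper addresses can be illustrated with a toy model. The sketch below uses hypothetical CA and user names, and reduces path validation to graph reachability; a real deployment would also verify X.509 signatures, revocation status, and certificate policies. It shows how bridge-style cross-certification lets a relying party in one healthcare domain build a certification path to a credential issued in another:

```python
# Toy illustration of cross-domain trust in a multi-domain healthcare PKI.
# A bridge CA cross-certifies the root CAs of otherwise separate domains.
# (Hypothetical names; real validation checks signatures, not just edges.)
from collections import deque

# issuer -> set of subjects it has certified
cert_graph = {
    "BridgeCA": {"HospitalRootCA", "ClinicRootCA"},
    "HospitalRootCA": {"BridgeCA", "HospitalIssuingCA"},
    "ClinicRootCA": {"BridgeCA", "ClinicIssuingCA"},
    "HospitalIssuingCA": {"dr-smith"},
    "ClinicIssuingCA": {"nurse-jones"},
}

def trust_path(anchor, subject):
    """Breadth-first search for a certification path from a trust
    anchor to a subject; returns the path as a list, or None."""
    queue = deque([[anchor]])
    seen = {anchor}
    while queue:
        path = queue.popleft()
        if path[-1] == subject:
            return path
        for nxt in cert_graph.get(path[-1], ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# A hospital relying party can validate a clinic credential via the bridge:
print(trust_path("HospitalRootCA", "nurse-jones"))
```

    Without the bridge entries, the two domains would be disconnected and cross-domain validation would fail, which is the trust problem a multi-domain PKI has to solve.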

  18. A large-scale perspective on stress-induced alterations in resting-state networks

    NASA Astrophysics Data System (ADS)

    Maron-Katz, Adi; Vaisvaser, Sharon; Lin, Tamar; Hendler, Talma; Shamir, Ron

    2016-02-01

    Stress is known to induce large-scale neural modulations. However, its neural effect once the stressor is removed, and how this relates to subjective experience, are not fully understood. Here we used a statistically sound data-driven approach to investigate alterations in large-scale resting-state functional connectivity (rsFC) induced by acute social stress. We compared rsfMRI profiles of 57 healthy male subjects before and after stress induction. Using a parcellation-based univariate statistical analysis, we identified a large-scale rsFC change involving 490 parcel-pairs. To characterize this change, we employed statistical enrichment analysis, identifying anatomical structures that were significantly interconnected by these pairs. This analysis revealed strengthening of thalamo-cortical connectivity and weakening of cross-hemispheric parieto-temporal connectivity. These alterations were further found to be associated with changes in subjective stress reports. Integrating report-based information on stress sustainment 20 minutes post induction revealed a single significant rsFC change, between the right amygdala and the precuneus, which inversely correlated with the level of subjective recovery. Our study demonstrates the value of enrichment analysis for exploring large-scale network reorganization patterns, and provides new insight into stress-induced neural modulations and their relation to subjective experience.
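    Statistical enrichment analyses of the kind described above typically reduce to a hypergeometric tail test: given all parcel-pairs tested, how surprising is the overlap between the significantly changed pairs and the pairs interconnecting two particular anatomical structures? A minimal sketch, with illustrative numbers not taken from the study:

```python
# Hypergeometric enrichment test: the standard machinery behind
# structure-level enrichment analyses such as the one described above.
# All numbers here are illustrative, not taken from the study.
from math import comb

def hypergeom_enrichment_p(N, K, n, k):
    """Right tail P(X >= k) when drawing n items from N, of which K are
    'annotated' (here: parcel-pairs interconnecting two structures)."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# N: all parcel-pairs tested; K: pairs linking structures A and B;
# n: pairs with a significant rsFC change; k: overlap of the two sets.
p = hypergeom_enrichment_p(N=10000, K=120, n=490, k=20)
print(f"enrichment p-value: {p:.2e}")  # far below typical cutoffs
```

    Under the null, about 490 × 120 / 10000 ≈ 5.9 of the changed pairs would link the two structures by chance, so observing 20 is strong evidence of enrichment.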

  19. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    NASA Astrophysics Data System (ADS)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation of and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.

  20. Large scale purification of RNA nanoparticles by preparative ultracentrifugation.

    PubMed

    Jasinski, Daniel L; Schwartz, Chad T; Haque, Farzin; Guo, Peixuan

    2015-01-01

    Purification of large quantities of supramolecular RNA complexes is of paramount importance due to the large quantities of RNA needed and the purity requirements for in vitro and in vivo assays. Purification is generally carried out by liquid chromatography (HPLC), polyacrylamide gel electrophoresis (PAGE), or agarose gel electrophoresis (AGE). Here, we describe an efficient method for the large-scale purification of RNA prepared by in vitro transcription using T7 RNA polymerase by cesium chloride (CsCl) equilibrium density gradient ultracentrifugation and the large-scale purification of RNA nanoparticles by sucrose gradient rate-zonal ultracentrifugation or cushioned sucrose gradient rate-zonal ultracentrifugation.

  1. The Evolution of Baryons in Cosmic Large Scale Structure

    NASA Astrophysics Data System (ADS)

    Snedden, Ali; Arielle Phillips, Lara; Mathews, Grant James; Coughlin, Jared; Suh, In-Saeng; Bhattacharya, Aparna

    2015-01-01

    The environments of galaxies play a critical role in their formation and evolution. We study these environments using cosmological simulations that include star formation and supernova feedback. From these simulations, we parse the large-scale structure into clusters, filaments and voids using a segmentation algorithm adapted from medical imaging. We trace the star formation history, gas phases and metal evolution of the baryons in the intergalactic medium as a function of structure. We find that our algorithm reproduces the baryon fraction in the intracluster medium and that the majority of star formation occurs in cold, dense filaments. We present the consequences this large-scale environment has for galactic halos and galaxy evolution.

  2. [Issues of large scale tissue culture of medicinal plant].

    PubMed

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

    In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzes the status, problems and countermeasures of large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of producing medicinal plants, problems remain, such as the stability of the material, the safety of transgenic medicinal plants and the optimization of culture conditions. Establishing a sound evaluation system suited to the characteristics of each medicinal plant is the key measure to ensure the sustainable development of large-scale tissue culture of medicinal plants.

  3. Large-Scale Graph Processing Analysis using Supercomputer Cluster

    NASA Astrophysics Data System (ADS)

    Vildario, Alfrido; Fitriyani; Nugraha Nurkahfi, Galih

    2017-01-01

    Graph processing is widely used in various sectors such as automotive, traffic, image processing and many more. These applications produce graphs of large-scale dimensions, so processing them requires long computation times and high-specification resources. This research addresses the analysis of large-scale graph processing on a supercomputer cluster. We implemented graph processing using the Breadth-First Search (BFS) algorithm for the single-destination shortest path problem. The parallel BFS implementation with the Message Passing Interface (MPI) used the supercomputer cluster at the High Performance Computing Laboratory, Computational Science, Telkom University, and graphs from the Stanford Large Network Dataset Collection. The results showed that the implementation gives an average speedup of more than 30 times and an efficiency of almost 90%.
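    The two ingredients reported above, BFS shortest-path search and the speedup/efficiency metrics, can be sketched as follows. This is a serial sketch only; the MPI rank decomposition is omitted, and the timing numbers are illustrative rather than taken from the paper:

```python
# BFS shortest path to a single destination, plus the speedup and
# efficiency metrics commonly used to report parallel performance.
# (Serial sketch; the MPI decomposition is omitted.)
from collections import deque

def bfs_shortest_path_length(adj, source, dest):
    """Number of edges on a shortest source->dest path, or -1 if unreachable."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()
        if u == dest:
            return dist[u]
        for v in adj.get(u, ()):
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return -1

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, n_procs):
    return speedup(t_serial, t_parallel) / n_procs

graph = {0: [1, 2], 1: [3], 2: [3], 3: [4]}
print(bfs_shortest_path_length(graph, 0, 4))  # 3 edges (e.g. 0-1-3-4)
# A 28.8x speedup on 32 ranks corresponds to 90% parallel efficiency:
print(efficiency(t_serial=108.0, t_parallel=3.75, n_procs=32))
```

    An "efficiency of almost 90%" thus means the measured speedup was close to 0.9 times the number of processes used.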

  4. Corridors Increase Plant Species Richness at Large Scales

    SciTech Connect

    Damschen, Ellen I.; Haddad, Nick M.; Orrock,John L.; Tewksbury, Joshua J.; Levey, Douglas J.

    2006-09-01

    Habitat fragmentation is one of the largest threats to biodiversity. Landscape corridors, which are hypothesized to reduce the negative consequences of fragmentation, have become common features of ecological management plans worldwide. Despite their popularity, there is little evidence documenting the effectiveness of corridors in preserving biodiversity at large scales. Using a large-scale replicated experiment, we showed that habitat patches connected by corridors retain more native plant species than do isolated patches, that this difference increases over time, and that corridors do not promote invasion by exotic species. Our results support the use of corridors in biodiversity conservation.

  5. Clearing and Labeling Techniques for Large-Scale Biological Tissues

    PubMed Central

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-01-01

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  6. Modulational instability, wave breaking, and formation of large-scale dipoles in the atmosphere.

    PubMed

    Iafrati, A; Babanin, A; Onorato, M

    2013-05-03

    We use direct numerical simulation of the Navier-Stokes equations for a two-phase flow (water and air) to study the dynamics of the modulational instability of free surface waves and its contribution to the interaction between the ocean and atmosphere. If the steepness of the initial wave exceeds a threshold value, we observe wave-breaking events and the formation of large-scale dipole structures in the air. Because of the multiple steepening and breaking of the waves under unstable wave packets, a train of dipoles is released in the atmosphere; those dipoles propagate at a height comparable with the wavelength. The amount of energy dissipated by the breaker in water and air is considered, and contrary to expectations, we observe that the energy dissipation in air is greater than that in water. The possible consequences on the wave modeling and on the exchange of aerosols and gases between air and water are discussed.

  7. Large-scale search for dark-matter axions

    SciTech Connect

    Kinion, D; van Bibber, K

    2000-08-30

    We review the status of two ongoing large-scale searches for axions which may constitute the dark matter of our Milky Way halo. The experiments are based on the microwave cavity technique proposed by Sikivie and mark a "second generation" relative to the original experiments performed by the Rochester-Brookhaven-Fermilab collaboration and the University of Florida group.

  8. Measurement, Sampling, and Equating Errors in Large-Scale Assessments

    ERIC Educational Resources Information Center

    Wu, Margaret

    2010-01-01

    In large-scale assessments, such as state-wide testing programs, national sample-based assessments, and international comparative studies, there are many steps involved in the measurement and reporting of student achievement. There are always sources of inaccuracies in each of the steps. It is of interest to identify the source and magnitude of…

  9. Resilience of Florida Keys coral communities following large scale disturbances

    EPA Science Inventory

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  10. Large-Scale Machine Learning for Classification and Search

    ERIC Educational Resources Information Center

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  11. Efficient On-Demand Operations in Large-Scale Infrastructures

    ERIC Educational Resources Information Center

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  12. Assuring Quality in Large-Scale Online Course Development

    ERIC Educational Resources Information Center

    Parscal, Tina; Riemer, Deborah

    2010-01-01

    Student demand for online education requires colleges and universities to rapidly expand the number of courses and programs offered online while maintaining high quality. This paper outlines two universities respective processes to assure quality in large-scale online programs that integrate instructional design, eBook custom publishing, Quality…

  13. Large-Scale Assessments and Educational Policies in Italy

    ERIC Educational Resources Information Center

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  14. Improving the Utility of Large-Scale Assessments in Canada

    ERIC Educational Resources Information Center

    Rogers, W. Todd

    2014-01-01

    Principals and teachers do not use large-scale assessment results because the lack of distinct and reliable subtests prevents identifying strengths and weaknesses of students and instruction, the results arrive too late to be used, and principals and teachers need assistance to use the results to improve instruction so as to improve student…

  15. Current Scientific Issues in Large Scale Atmospheric Dynamics

    NASA Technical Reports Server (NTRS)

    Miller, T. L. (Compiler)

    1986-01-01

    Topics in large scale atmospheric dynamics are discussed. Aspects of atmospheric blocking, the influence of transient baroclinic eddies on planetary-scale waves, cyclogenesis, the effects of orography on planetary scale flow, small scale frontal structure, and simulations of gravity waves in frontal zones are discussed.

  16. Large-Scale Innovation and Change in UK Higher Education

    ERIC Educational Resources Information Center

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  17. Mixing Metaphors: Building Infrastructure for Large Scale School Turnaround

    ERIC Educational Resources Information Center

    Peurach, Donald J.; Neumerski, Christine M.

    2015-01-01

    The purpose of this analysis is to increase understanding of the possibilities and challenges of building educational infrastructure--the basic, foundational structures, systems, and resources--to support large-scale school turnaround. Building educational infrastructure often exceeds the capacity of schools, districts, and state education…

  18. Large-Scale Environmental Influences on Aquatic Animal Health

    EPA Science Inventory

    In the latter portion of the 20th century, North America experienced numerous large-scale mortality events affecting a broad diversity of aquatic animals. Short-term forensic investigations of these events have sometimes characterized a causative agent or condition, but have rare...

  19. A bibliographical survey of large-scale systems

    NASA Technical Reports Server (NTRS)

    Corliss, W. R.

    1970-01-01

    A limited, partly annotated bibliography was prepared on the subject of large-scale system control. Approximately 400 references are divided into thirteen application areas, such as large societal systems and large communication systems. A first-author index is provided.

  20. Probabilistic Cuing in Large-Scale Environmental Search

    ERIC Educational Resources Information Center

    Smith, Alastair D.; Hood, Bruce M.; Gilchrist, Iain D.

    2010-01-01

    Finding an object in our environment is an important human ability that also represents a critical component of human foraging behavior. One type of information that aids efficient large-scale search is the likelihood of the object being in one location over another. In this study we investigated the conditions under which individuals respond to…

  1. Newton Methods for Large Scale Problems in Machine Learning

    ERIC Educational Resources Information Center

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  2. Large-Scale Physical Separation of Depleted Uranium from Soil

    DTIC Science & Technology

    2012-09-01

    ERDC/EL TR-12-25, Army Range Technology Program: Large-Scale Physical Separation of Depleted Uranium from Soil (Environmental Laboratory). [Report cover and contents fragment; sections include Physical Separation, Project Background, and Materials and Methods.]

  3. Lessons from Large-Scale Renewable Energy Integration Studies: Preprint

    SciTech Connect

    Bird, L.; Milligan, M.

    2012-06-01

    In general, large-scale integration studies in Europe and the United States find that high penetrations of renewable generation are technically feasible with operational changes and increased access to transmission. This paper describes other key findings such as the need for fast markets, large balancing areas, system flexibility, and the use of advanced forecasting.

  4. Computational Complexity, Efficiency and Accountability in Large Scale Teleprocessing Systems.

    DTIC Science & Technology

    1980-12-01

    COMPLEXITY, EFFICIENCY AND ACCOUNTABILITY IN LARGE SCALE TELEPROCESSING SYSTEMS, DAAG29-78-C-0036, Stanford University, John T. Gill, Martin E. Hellman. ...solve but easy to check. We have also suggested how such random tapes can be simulated by deterministically generating "pseudorandom" numbers by a

  5. Large-scale silicon optical switches for optical interconnection

    NASA Astrophysics Data System (ADS)

    Qiao, Lei; Tang, Weijie; Chu, Tao

    2016-11-01

    Large-scale optical switches are in great demand for building optical interconnections in data centers and high-performance computers (HPCs). Silicon optical switches have the advantages of being compact and CMOS-process compatible, so they can easily be monolithically integrated. However, constructing silicon optical switches with large port counts is difficult. One difficulty is the non-uniformity of the switch units in a large-scale silicon optical switch, which arises from fabrication error and complicates finding each unit's optimum operating point. In this paper, we propose a method to detect the optimum operating point in a large-scale switch with a limited number of built-in power monitors. We also propose methods for improving the unbalanced crosstalk of the cross/bar states in silicon electro-optical MZI switches and for reducing insertion losses. Our recent progress in large-scale silicon optical switches, including 64 × 64 thermal-optical and 32 × 32 electro-optical switches, will be introduced. To the best of our knowledge, both are the largest-scale silicon optical switches of their respective types. The switches were fabricated on 340-nm SOI substrates with CMOS 180-nm processes. The crosstalk of the 32 × 32 electro-optic switch was -19.2 dB to -25.1 dB, while that of the 64 × 64 thermal-optic switch was -30 dB to -48.3 dB.
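    For readers comparing the crosstalk figures, the dB values quoted above are power ratios under the usual 10·log10 convention. A small converter:

```python
# Converting crosstalk figures between dB and linear power ratios
# (10*log10 convention for power quantities).
from math import log10

def db_to_ratio(db):
    """Linear power ratio corresponding to a level in dB."""
    return 10 ** (db / 10)

def ratio_to_db(ratio):
    """Level in dB corresponding to a linear power ratio."""
    return 10 * log10(ratio)

# -30 dB crosstalk means one part in a thousand of the power leaks:
print(db_to_ratio(-30))
```

    On this scale, the -48.3 dB figure for the thermal-optic switch corresponds to a leakage ratio of roughly 1.5e-5, almost three orders of magnitude better than -19.2 dB.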

  6. The large scale microwave background anisotropy in decaying particle cosmology

    SciTech Connect

    Panek, M.

    1987-06-01

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay, z_d < 3-5. 12 refs., 2 figs.

  7. Large Scale Survey Data in Career Development Research

    ERIC Educational Resources Information Center

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  8. The Large-Scale Structure of Scientific Method

    ERIC Educational Resources Information Center

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  9. Ecosystem resilience despite large-scale altered hydro climatic conditions

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Climate change is predicted to increase both drought frequency and duration, and when coupled with substantial warming, will establish a new hydroclimatological paradigm for many regions. Large-scale, warm droughts have recently impacted North America, Africa, Europe, Amazonia, and Australia result...

  10. US National Large-scale City Orthoimage Standard Initiative

    USGS Publications Warehouse

    Zhou, G.; Song, C.; Benjamin, S.; Schickler, W.

    2003-01-01

    The early procedures and algorithms for national digital orthophoto generation in the National Digital Orthophoto Program (NDOP) were based on earlier USGS mapping operations, such as field control, aerotriangulation (derived in the early 1920s), the quarter-quadrangle-centered format (3.75 minutes of longitude and latitude in geographic extent), 1:40,000 aerial photographs, and 2.5D digital elevation models. However, large-scale city orthophotos produced with these early procedures have disclosed many shortcomings, e.g., ghost images, occlusion, and shadow. Thus, providing the technical basis (algorithms, procedures) and experience needed for large-scale city digital orthophoto creation is essential for the forthcoming national deployment of large-scale digital orthophotos and the revision of the Standards for National Large-scale City Digital Orthophotos in the NDOP. This paper reports our initial research results as follows: (1) high-precision 3D city DSM generation through LIDAR data processing; (2) extraction of spatial objects/features from surface material information and high-accuracy 3D DSM data; (3) 3D city model development; (4) algorithm development for the generation of DTM-based and DBM-based orthophotos; (5) true orthophoto generation by merging DBM-based and DTM-based orthophotos; and (6) automatic mosaicking by optimizing and combining imagery from many perspectives.

  11. Developing and Understanding Methods for Large-Scale Nonlinear Optimization

    DTIC Science & Technology

    2006-07-24

    algorithms for large-scale unconstrained and constrained optimization problems, including limited-memory methods for problems with many thousands... "Published in peer-reviewed journals": E. Eskow, B. Bader, R. Byrd, S. Crivelli, T. Head-Gordon, V. Lamberti and R. Schnabel, "An optimization approach to the

  12. International Large-Scale Assessments: What Uses, What Consequences?

    ERIC Educational Resources Information Center

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  13. Extracting Useful Semantic Information from Large Scale Corpora of Text

    ERIC Educational Resources Information Center

    Mendoza, Ray Padilla, Jr.

    2012-01-01

    Extracting and representing semantic information from large scale corpora is at the crux of computer-assisted knowledge generation. Semantic information depends on collocation extraction methods, mathematical models used to represent distributional information, and weighting functions which transform the space. This dissertation provides a…

  14. Large scale structure of the sun's radio corona

    NASA Technical Reports Server (NTRS)

    Kundu, M. R.

    1986-01-01

    Results of studies of large scale structures of the corona at long radio wavelengths are presented, using data obtained with the multifrequency radioheliograph of the Clark Lake Radio Observatory. It is shown that features corresponding to coronal streamers and coronal holes are readily apparent in the Clark Lake maps.

  15. Moon-based Earth Observation for Large Scale Geoscience Phenomena

    NASA Astrophysics Data System (ADS)

    Guo, Huadong; Liu, Guang; Ding, Yixing

    2016-07-01

    The capability of Earth observation for large-scale, global natural phenomena needs to be improved, and new observing platforms are expected. We have studied the concept of the Moon as an Earth-observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, and has the following advantages: a large observation range, variable view angles, long-term continuous observation, and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large-scale geoscience phenomena, including large-scale atmospheric change, large-scale ocean change, large-scale land-surface dynamic change, solid-Earth dynamic change, etc. For the purpose of establishing a Moon-based Earth-observation platform, we plan to study the following five aspects: mechanisms and models of the macroscopic Earth-science phenomena observable from the Moon; optimization of sensor parameters and methods for Moon-based Earth observation; site selection and the environment of Moon-based Earth observation; the Moon-based Earth-observation platform itself; and a fundamental scientific framework for Moon-based Earth observation.

  16. Large-scale screening by the automated Wassermann reaction

    PubMed Central

    Wagstaff, W.; Firth, R.; Booth, J. R.; Bowley, C. C.

    1969-01-01

    In view of the drawbacks in the use of the Kahn test for large-scale screening of blood donors, mainly those of human error through work overload and fatiguability, an attempt was made to adapt an existing automated complement-fixation technique for this purpose. This paper reports the successful results of that adaptation. PMID:5776559

  17. Large-scale societal changes and intentionality - an uneasy marriage.

    PubMed

    Bodor, Péter; Fokas, Nikos

    2014-08-01

    Our commentary focuses on juxtaposing the proposed science of intentional change with facts and concepts pertaining to the level of large populations or changes on a worldwide scale. Although we find a unified evolutionary theory promising, we think that long-term and large-scale, scientifically guided - that is, intentional - social change is not only impossible, but also undesirable.

  18. A novel computational approach towards the certification of large-scale boson sampling

    NASA Astrophysics Data System (ADS)

    Huh, Joonsuk

    Recent proposals of boson sampling and the corresponding experiments point toward a possible disproof of the extended Church-Turing thesis. Furthermore, the application of boson sampling to molecular computation has been suggested theoretically. To date, however, only small-scale experiments with a few photons have been successfully performed. Boson sampling experiments with 20-30 photons are expected to reveal the computational superiority of the quantum device. A novel theoretical proposal for large-scale boson sampling using microwave photons is highly promising due to its deterministic photon sources and scalability. Therefore, a certification protocol for large-scale boson sampling experiments is needed to complete this exciting story. In this presentation, we propose a computational protocol toward the certification of large-scale boson sampling. The correlations of paired photon modes and the time-dependent characteristic functional with its Fourier components can show the fingerprint of large-scale boson sampling. This work was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education, Science and Technology (NRF-2015R1A6A3A04059773), the ICT R&D program of MSIP/IITP [2015-019, Fundamental Research Toward Secure Quantum Communication] and the Mueunjae Institute for Chemistry (MIC) postdoctoral fellowship.
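    The computational claim behind boson sampling runs through the matrix permanent: outcome probabilities are proportional to |perm(A)|² of n × n submatrices of the interferometer unitary, and exact permanent computation is #P-hard, with the best known exact algorithms exponential in n. A minimal sketch of Ryser's inclusion-exclusion formula (standard background, not the certification protocol proposed in the abstract):

```python
# Outcome probabilities in boson sampling are proportional to |perm(A)|^2
# for n x n submatrices A of the interferometer unitary. Ryser's
# inclusion-exclusion formula below computes the permanent exactly in
# time exponential in n, which is why 20-30 photons is already a hard
# classical regime.
from itertools import combinations

def permanent(a):
    """Matrix permanent via Ryser's formula:
    perm(A) = (-1)^n * sum_{S != {}} (-1)^|S| * prod_i sum_{j in S} a[i][j]."""
    n = len(a)
    total = 0
    for r in range(1, n + 1):
        for cols in combinations(range(n), r):
            prod = 1
            for row in a:
                prod *= sum(row[c] for c in cols)
            total += (-1) ** r * prod
    return (-1) ** n * total

ones = [[1, 1, 1], [1, 1, 1], [1, 1, 1]]
print(permanent(ones))  # permanent of the all-ones 3x3 matrix is 3! = 6
```

    Unlike the determinant, the permanent has no sign cancellations and no known polynomial-time algorithm, which is exactly what makes large-scale boson sampling hard to certify by brute-force simulation.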

  19. Generation of large-scale equatorial F-region plasma depletions during geomagnetic storms: A review

    NASA Astrophysics Data System (ADS)

    Sahai, Y.; Fagundes, P.; Bittencourt, J.; Pimenta, A.

    All-sky imaging observations of the F-region OI 630 nm nightglow emission allow us to visualize large-scale equatorial plasma depletions, generally known as transequatorial plasma bubbles. These ionospheric plasma depletions, aligned in a quasi north-south direction, are optical signatures of strong range-type equatorial spread-F. An extensive database of all-sky imaging observations of the OI 630 nm emission has been obtained at Cachoeira Paulista (22.7°S, 45.0°W; dip latitude ~16°S), Brazil, between the years 1987 and 2000. An analysis of these observations revealed that large-scale ionospheric plasma depletions normally do not occur during the months of May to August (southern winter) in the Brazilian sector. However, large-scale ionospheric plasma depletions during these months have been observed on several occasions in association with geomagnetic storms. In this paper, a detailed analysis of the events in which large-scale ionospheric plasma depletions were initiated and evolved during magnetic disturbances will be presented and discussed.

  20. Topological Properties of Some Integrated Circuits for Very Large Scale Integration Chip Designs

    NASA Astrophysics Data System (ADS)

    Swanson, S.; Lanzerotti, M.; Vernizzi, G.; Kujawski, J.; Weatherwax, A.

    2015-03-01

    This talk presents topological properties of integrated circuits for Very Large Scale Integration chip designs. These circuits can be implemented in very large scale integrated circuits, such as those in high performance microprocessors. Prior work considered basic combinational logic functions and produced a mathematical framework based on algebraic topology for integrated circuits composed of logic gates. Prior work also produced an historically-equivalent interpretation of Mr. E. F. Rent's work for today's complex circuitry in modern high performance microprocessors, where a heuristic linear relationship was observed between the number of connections and number of logic gates. This talk will examine topological properties and connectivity of more complex functionally-equivalent integrated circuits. The views expressed in this article are those of the author and do not reflect the official policy or position of the United States Air Force, Department of Defense or the U.S. Government.

  1. Interaction of a cumulus cloud ensemble with the large-scale environment. I

    NASA Technical Reports Server (NTRS)

    Arakawa, A.; Schubert, W. H.

    1974-01-01

    A theory of the interaction of a cumulus cloud ensemble with the large-scale environment is developed. In this theory, the large-scale environment is divided into the subcloud mixed layer and the region above. The time changes of the environment are governed by the heat and moisture budget equations for the subcloud mixed layer and for the region above, and by a prognostic equation for the depth of the mixed layer. In the environment above the mixed layer, the cumulus convection affects the temperature and moisture fields through cumulus-induced subsidence and detrainment of saturated air containing liquid water which evaporates in the environment. In the subcloud mixed layer, the cumulus convection does not act directly on the temperature and moisture fields, but it affects the depth of the mixed layer through cumulus-induced subsidence.

  2. The influence of large-scale wind power on global climate

    PubMed Central

    Keith, David W.; DeCarolis, Joseph F.; Denkenberger, David C.; Lenschow, Donald H.; Malyshev, Sergey L.; Pacala, Stephen; Rasch, Philip J.

    2004-01-01

    Large-scale use of wind power can alter local and global climate by extracting kinetic energy and altering turbulent transport in the atmospheric boundary layer. We report climate-model simulations that address the possible climatic impacts of wind power at regional to global scales by using two general circulation models and several parameterizations of the interaction of wind turbines with the boundary layer. We find that very large amounts of wind power can produce nonnegligible climatic change at continental scales. Although large-scale effects are observed, wind power has a negligible effect on global-mean surface temperature, and it would deliver enormous global benefits by reducing emissions of CO2 and air pollutants. Our results may enable a comparison between the climate impacts due to wind power and the reduction in climatic impacts achieved by the substitution of wind for fossil fuels. PMID:15536131

  3. The influence of large-scale wind power on global climate.

    PubMed

    Keith, David W; Decarolis, Joseph F; Denkenberger, David C; Lenschow, Donald H; Malyshev, Sergey L; Pacala, Stephen; Rasch, Philip J

    2004-11-16

    Large-scale use of wind power can alter local and global climate by extracting kinetic energy and altering turbulent transport in the atmospheric boundary layer. We report climate-model simulations that address the possible climatic impacts of wind power at regional to global scales by using two general circulation models and several parameterizations of the interaction of wind turbines with the boundary layer. We find that very large amounts of wind power can produce nonnegligible climatic change at continental scales. Although large-scale effects are observed, wind power has a negligible effect on global-mean surface temperature, and it would deliver enormous global benefits by reducing emissions of CO2 and air pollutants. Our results may enable a comparison between the climate impacts due to wind power and the reduction in climatic impacts achieved by the substitution of wind for fossil fuels.

  4. Large-scale urbanization effects on eastern Asian summer monsoon circulation and climate

    NASA Astrophysics Data System (ADS)

    Chen, Haishan; Zhang, Ye; Yu, Miao; Hua, Wenjian; Sun, Shanlei; Li, Xing; Gao, Chujie

    2016-07-01

    Impacts of large-scale urbanization over eastern China on East Asian summer monsoon circulation and climate are investigated by comparing three 25-year climate simulations with and without incorporating modified land cover maps reflecting two different idealized large-scale urbanization scenarios. The global atmospheric general circulation model CAM4.0, which includes an urban canopy parameterization scheme, is employed in this study. The large-scale urbanization over eastern China leads to a significant warming over most of the expanded urban areas, characterized by an increase of 3 K for surface skin temperature, 2.25 K for surface air temperature, significant warming of both daily minimum and daily maximum air temperatures, and 0.4 K for the averaged urban-rural temperature difference. The urbanization is also accompanied by an increase in surface sensible heat flux, a decrease of the net surface shortwave and long-wave radiation, and an enhanced surface thermal heating to the atmosphere in most Eastern Asia areas. It is noted that the responses of the East Asian summer monsoon circulation exhibit an evident month-to-month variation. Across eastern China, the summer monsoon in early summer is strengthened by the large-scale urbanization, but weakened (intensified) over the southern (northern) part of East Asia in late summer. Meanwhile, early summer precipitation is intensified in northern and northeastern China and suppressed south of ~35°N, but late summer precipitation is evidently suppressed over northeast China, the Korean Peninsula and Japan with enhancements in southern China, the South China Sea, and the oceanic region south and southeast of the Taiwan Island. This study highlights the evidently distinct month-to-month responses of the monsoon system to the large-scale urbanization, which might be attributed to different basic states, internal feedbacks (cloud, rainfall) as well as a dynamic adjustment of the atmosphere. Further investigation is required

  5. Large Scale Computing and Storage Requirements for High Energy Physics

    SciTech Connect

    Gerber, Richard A.; Wasserman, Harvey

    2010-11-24

    The National Energy Research Scientific Computing Center (NERSC) is the leading scientific computing facility for the Department of Energy's Office of Science, providing high-performance computing (HPC) resources to more than 3,000 researchers working on about 400 projects. NERSC provides large-scale computing resources and, crucially, the support and expertise needed for scientists to make effective use of them. In November 2009, NERSC, DOE's Office of Advanced Scientific Computing Research (ASCR), and DOE's Office of High Energy Physics (HEP) held a workshop to characterize the HPC resources needed at NERSC to support HEP research through the next three to five years. The effort is part of NERSC's legacy of anticipating users' needs and deploying resources to meet those demands. The workshop revealed several key points, in addition to achieving its goal of collecting and characterizing computing requirements. The chief findings: (1) Science teams need access to a significant increase in computational resources to meet their research goals; (2) Research teams need to be able to read, write, transfer, store online, archive, analyze, and share huge volumes of data; (3) Science teams need guidance and support to implement their codes on future architectures; and (4) Projects need predictable, rapid turnaround of their computational jobs to meet mission-critical time constraints. This report expands upon these key points and includes others. It also presents a number of case studies as representative of the research conducted within HEP. Workshop participants were asked to codify their requirements in this case study format, summarizing their science goals, methods of solution, current and three-to-five year computing requirements, and software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, multi-core environment that is expected to dominate HPC architectures over the next few years.
The report includes

  6. A first large-scale flood inundation forecasting model

    SciTech Connect

    Schumann, Guy J-P; Neal, Jeffrey C.; Voisin, Nathalie; Andreadis, Konstantinos M.; Pappenberger, Florian; Phanthuwongpakdee, Kay; Hall, Amanda C.; Bates, Paul D.

    2013-11-04

    At present, continental to global scale flood forecasting focuses on predicting discharge at a point, with little attention to the detail and accuracy of local scale inundation predictions. Yet, inundation is actually the variable of interest and all flood impacts are inherently local in nature. This paper proposes a first large scale flood inundation ensemble forecasting model that uses best available data and modeling approaches in data scarce areas and at continental scales. The model was built for the Lower Zambezi River in southeast Africa to demonstrate current flood inundation forecasting capabilities in large data-scarce regions. The inundation model domain has a surface area of approximately 170,000 km2. ECMWF meteorological data were used to force the VIC (Variable Infiltration Capacity) macro-scale hydrological model which simulated and routed daily flows to the input boundary locations of the 2-D hydrodynamic model. Efficient hydrodynamic modeling over large areas still requires model grid resolutions that are typically larger than the width of many river channels that play a key role in flood wave propagation. We therefore employed a novel sub-grid channel scheme to describe the river network in detail whilst at the same time representing the floodplain at an appropriate and efficient scale. The modeling system was first calibrated using water levels on the main channel from the ICESat (Ice, Cloud, and land Elevation Satellite) laser altimeter and then applied to predict the February 2007 Mozambique floods. Model evaluation showed that simulated flood edge cells were within a distance of about 1 km (one model resolution) compared to an observed flood edge of the event. Our study highlights that physically plausible parameter values and satisfactory performance can be achieved at spatial scales ranging from tens to several hundreds of thousands of km2 and at model grid resolutions up to several km2. However, initial model test runs in forecast mode

  7. Evaluation of variational principle based model for LDPE large scale film blowing process

    NASA Astrophysics Data System (ADS)

    Kolarik, Roman; Zatloukal, Martin

    2013-04-01

    In this work, a variational principle based film blowing model combined with the Pearson and Petrie formulation, considering non-isothermal processing conditions and a novel generalized Newtonian model capable of capturing steady shear and uniaxial extensional viscosities, has been validated by using an experimentally determined bubble shape and velocity profile for an LDPE sample on a large scale film blowing line. It has been revealed that a minute change in the flow activation energy can significantly influence the film stretching level.

  8. The Large Scale Synthesis of Aligned Plate Nanostructures

    NASA Astrophysics Data System (ADS)

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-07-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ‧ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential.

  9. Large-scale linear nonparallel support vector machine solver.

    PubMed

    Tian, Yingjie; Ping, Yuan

    2014-02-01

    Twin support vector machines (TWSVMs), as the representative nonparallel hyperplane classifiers, have shown effectiveness over standard SVMs in some respects. However, they still have serious defects restricting their further study and real applications: (1) they have to compute and store the inverse matrices before training, which is intractable for many applications where data appear with a huge number of instances as well as features; (2) TWSVMs lose sparseness by using a quadratic loss function that makes the proximal hyperplane close enough to the class itself. This paper proposes a sparse linear nonparallel support vector machine, termed L1-NPSVM, to deal with large-scale data based on an efficient solver, the dual coordinate descent (DCD) method. Both theoretical analysis and experiments indicate that our method is not only suitable for large scale problems, but also performs as well as TWSVMs and SVMs.
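    The authors' L1-NPSVM solver is not reproduced in this abstract, but the dual coordinate descent method it builds on can be sketched for a standard linear L1-loss (hinge) SVM. This is a generic illustration of DCD, not the authors' algorithm; all names below are ours:

```python
import numpy as np

def dcd_linear_svm(X, y, C=1.0, epochs=50, seed=0):
    """Dual coordinate descent for a linear L1-loss SVM (no bias term).

    Updates one dual variable alpha_i at a time, keeping the primal
    weight vector w = sum_j alpha_j * y_j * x_j in sync.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    alpha = np.zeros(n)
    w = np.zeros(d)
    q_diag = (X * X).sum(axis=1)          # diagonal of the Gram matrix
    for _ in range(epochs):
        for i in rng.permutation(n):
            g = y[i] * w.dot(X[i]) - 1.0  # partial gradient of the dual objective
            alpha_new = min(max(alpha[i] - g / q_diag[i], 0.0), C)
            w += (alpha_new - alpha[i]) * y[i] * X[i]
            alpha[i] = alpha_new
    return w

# Toy linearly separable problem.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-2.0, 0.5, (20, 2)), rng.normal(2.0, 0.5, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)
w = dcd_linear_svm(X, y)
accuracy = np.mean(np.sign(X.dot(w)) == y)
```

Because each update touches only one training point and the weight vector, no inverse matrices need to be computed or stored, which is the property that makes DCD-style solvers attractive at large scale.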

  10. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    SciTech Connect

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
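    The abstract's low-rank approximation of the kernel matrix via a small set of prototype points is in the spirit of the Nyström method. A generic sketch of that idea (an illustration under our own choice of kernel and sizes, not the authors' PVM code):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.1):
    """Gaussian RBF kernel matrix between the rows of A and the rows of B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))

m = 40                                    # number of "prototype" points
idx = rng.choice(len(X), size=m, replace=False)
C = rbf_kernel(X, X[idx])                 # n x m cross-kernel
W = rbf_kernel(X[idx], X[idx])            # m x m prototype kernel
K_approx = C @ np.linalg.pinv(W) @ C.T    # rank-m Nystrom approximation

K = rbf_kernel(X, X)                      # full kernel, only for checking
rel_err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
```

Storing C and W requires O(nm + m^2) memory instead of O(n^2) for the full kernel, which is what makes prototype-based approximations attractive for large-scale SSL.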

  11. Electron drift in a large scale solid xenon

    SciTech Connect

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phase of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric fields. Furthermore, it is demonstrated that the electron drift speed in large scale solid xenon is a factor of two faster than that in the liquid phase.
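    The reported numbers can be cross-checked with simple arithmetic (speeds and drift length are taken from the abstract; the variable names and implied drift times are ours):

```python
# Back-of-envelope check of the reported electron drift speeds.
drift_length_cm = 8.0          # uniform-field drift region quoted in the abstract
v_liquid = 0.193               # cm/us at 163 K (liquid phase)
v_solid = 0.397                # cm/us at 157 K (solid phase)

t_liquid_us = drift_length_cm / v_liquid   # implied drift time in liquid, ~41.5 us
t_solid_us = drift_length_cm / v_solid     # implied drift time in solid, ~20.2 us
speed_ratio = v_solid / v_liquid           # ~2.06, the "factor of two"
```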

  12. Instrumentation Development for Large Scale Hypersonic Inflatable Aerodynamic Decelerator Characterization

    NASA Technical Reports Server (NTRS)

    Swanson, Gregory T.; Cassell, Alan M.

    2011-01-01

    Hypersonic Inflatable Aerodynamic Decelerator (HIAD) technology is currently being considered for multiple atmospheric entry applications as the limitations of traditional entry vehicles have been reached. The Inflatable Re-entry Vehicle Experiment (IRVE) has successfully demonstrated this technology as a viable candidate with a sub-orbital flight of a 3.0 m diameter vehicle. To further this technology, large scale HIADs (6.0-8.5 m) must be developed and tested. To characterize the performance of large scale HIAD technology, new instrumentation concepts must be developed to accommodate the flexible nature of the inflatable aeroshell. Many of the concepts that are under consideration for the HIAD FY12 subsonic wind tunnel test series are discussed below.

  13. The workshop on iterative methods for large scale nonlinear problems

    SciTech Connect

    Walker, H.F.; Pernice, M.

    1995-12-01

    The aim of the workshop was to bring together researchers working on large scale applications with numerical specialists of various kinds. Applications that were addressed included reactive flows (combustion and other chemically reacting flows, tokamak modeling), porous media flows, cardiac modeling, chemical vapor deposition, image restoration, macromolecular modeling, and population dynamics. Numerical areas included Newton iterative (truncated Newton) methods, Krylov subspace methods, domain decomposition and other preconditioning methods, large scale optimization and optimal control, and parallel implementations and software. This report offers a brief summary of workshop activities and information about the participants. Interested readers are encouraged to look into an online proceedings available at http://www.usi.utah.edu/logan.proceedings. There, the material offered here is augmented with hypertext abstracts that include links to locations such as speakers' home pages, PostScript copies of talks and papers, cross-references to related talks, and other information about topics addressed at the workshop.

  14. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    SciTech Connect

    Nusser, Adi; Branchini, Enzo; Davis, Marc E-mail: branchin@fis.uniroma3.it

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.

  15. The Large Scale Synthesis of Aligned Plate Nanostructures

    PubMed Central

    Zhou, Yang; Nash, Philip; Liu, Tian; Zhao, Naiqin; Zhu, Shengli

    2016-01-01

    We propose a novel technique for the large-scale synthesis of aligned-plate nanostructures that are self-assembled and self-supporting. The synthesis technique involves developing nanoscale two-phase microstructures through discontinuous precipitation followed by selective etching to remove one of the phases. The method may be applied to any alloy system in which the discontinuous precipitation transformation goes to completion. The resulting structure may have many applications in catalysis, filtering and thermal management depending on the phase selection and added functionality through chemical reaction with the retained phase. The synthesis technique is demonstrated using the discontinuous precipitation of a γ′ phase, (Ni, Co)3Al, followed by selective dissolution of the γ matrix phase. The production of the nanostructure requires heat treatments on the order of minutes and can be performed on a large scale making this synthesis technique of great economic potential. PMID:27439672

  16. Long gradient mode and large-scale structure observables

    NASA Astrophysics Data System (ADS)

    Allahyari, Alireza; Firouzjaee, Javad T.

    2017-03-01

    We extend the study of long-mode perturbations to other large-scale observables such as cosmic rulers, galaxy-number counts, and halo bias. The long mode is a pure gradient mode that is still outside an observer's horizon. We insist that gradient-mode effects on observables vanish. It is also crucial that the expressions for observables are relativistic. This allows us to show that the effects of a gradient mode on the large-scale observables vanish identically in a relativistic framework. To study the potential modulation effect of the gradient mode on halo bias, we derive a consistency condition to the first order in gradient expansion. We find that the matter variance at a fixed physical scale is not modulated by the long gradient mode perturbations when the consistency condition holds. This shows that the contribution of long gradient modes to bias vanishes in this framework.

  17. Lagrangian space consistency relation for large scale structure

    SciTech Connect

    Horn, Bart; Hui, Lam; Xiao, Xiao E-mail: lh399@columbia.edu

    2015-09-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space.

  18. Large Scale Deformation of the Western US Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2001-01-01

    Destructive earthquakes occur throughout the western US Cordillera (WUSC), not just within the San Andreas fault zone. But because we do not understand the present-day large-scale deformations of the crust throughout the WUSC, our ability to assess the potential for seismic hazards in this region remains severely limited. To address this problem, we are using a large collection of Global Positioning System (GPS) networks which spans the WUSC to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work can roughly be divided into an analysis of the GPS observations to infer the deformation field across and within the entire plate boundary zone and an investigation of the implications of this deformation field regarding plate boundary dynamics.

  19. In the fast lane: large-scale bacterial genome engineering.

    PubMed

    Fehér, Tamás; Burland, Valerie; Pósfai, György

    2012-07-31

    The last few years have witnessed rapid progress in bacterial genome engineering. The long-established, standard ways of DNA synthesis, modification, transfer into living cells, and incorporation into genomes have given way to more effective, large-scale, robust genome modification protocols. Expansion of these engineering capabilities is due to several factors. Key advances include: (i) progress in oligonucleotide synthesis and in vitro and in vivo assembly methods, (ii) optimization of recombineering techniques, (iii) introduction of parallel, large-scale, combinatorial, and automated genome modification procedures, and (iv) rapid identification of the modifications by barcode-based analysis and sequencing. Combination of the brute force of these techniques with sophisticated bioinformatic design and modeling opens up new avenues for the analysis of gene functions and cellular network interactions, but also in engineering more effective producer strains. This review presents a summary of recent technological advances in bacterial genome engineering.

  20. Electron drift in a large scale solid xenon

    DOE PAGES

    Yoo, J.; Jaskierny, W. F.

    2015-08-21

    A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phase of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric fields. Furthermore, it is demonstrated that the electron drift speed in large scale solid xenon is a factor of two faster than that in the liquid phase.

  1. LARGE-SCALE MOTIONS IN THE PERSEUS GALAXY CLUSTER

    SciTech Connect

    Simionescu, A.; Werner, N.; Urban, O.; Allen, S. W.; Fabian, A. C.; Sanders, J. S.; Mantz, A.; Nulsen, P. E. J.; Takei, Y.

    2012-10-01

    By combining large-scale mosaics of ROSAT PSPC, XMM-Newton, and Suzaku X-ray observations, we present evidence for large-scale motions in the intracluster medium of the nearby, X-ray bright Perseus Cluster. These motions are suggested by several alternating and interleaved X-ray bright, low-temperature, low-entropy arcs located along the east-west axis, at radii ranging from ~10 kpc to over a Mpc. Thermodynamic features qualitatively similar to these have previously been observed in the centers of cool-core clusters, and were successfully modeled as a consequence of the gas sloshing/swirling motions induced by minor mergers. Our observations indicate that such sloshing/swirling can extend out to larger radii than previously thought, on scales approaching the virial radius.

  2. The CLASSgal code for relativistic cosmological large scale structure

    SciTech Connect

    Dio, Enea Di; Montanari, Francesco; Durrer, Ruth; Lesgourgues, Julien E-mail: Francesco.Montanari@unige.ch E-mail: Ruth.Durrer@unige.ch

    2013-11-01

    We present accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum C_ℓ(z_1, z_2) and the corresponding correlation function ξ(θ, z_1, z_2) of the matter density and the galaxy number fluctuations in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
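    The two observables the code computes are related by a standard Legendre transform (general spherical harmonic analysis, not specific to CLASSgal): ξ(θ, z_1, z_2) = Σ_ℓ (2ℓ+1)/(4π) C_ℓ(z_1, z_2) P_ℓ(cos θ). A toy numerical sketch with an assumed power-law spectrum (the spectrum and ℓ-range are ours, for illustration only):

```python
import numpy as np
from numpy.polynomial.legendre import legval

# xi(theta) = sum_l (2l+1)/(4*pi) * C_l * P_l(cos(theta))
lmax = 200
ells = np.arange(lmax + 1)
C_ell = np.zeros(lmax + 1)
C_ell[2:] = 1.0 / ells[2:] ** 2                 # toy spectrum (assumption)

coeffs = (2 * ells + 1) / (4 * np.pi) * C_ell   # Legendre-series coefficients
xi_5deg = legval(np.cos(np.deg2rad(5.0)), coeffs)
xi_0 = legval(1.0, coeffs)                      # theta = 0 value; P_l(1) = 1
```

Since |P_ℓ(x)| ≤ 1 on [-1, 1] and the coefficients here are non-negative, ξ(θ) is bounded by its θ = 0 value, which the toy check confirms.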

  3. A Cloud Computing Platform for Large-Scale Forensic Computing

    NASA Astrophysics Data System (ADS)

    Roussev, Vassil; Wang, Liqiang; Richard, Golden; Marziale, Lodovico

    The timely processing of massive digital forensic collections demands the use of large-scale distributed computing resources and the flexibility to customize the processing performed on the collections. This paper describes MPI MapReduce (MMR), an open implementation of the MapReduce processing model that outperforms traditional forensic computing techniques. MMR provides linear scaling for CPU-intensive processing and super-linear scaling for indexing-related workloads.
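    MMR itself is not shown in this abstract, but the MapReduce processing model it implements can be illustrated with a minimal word-count sketch. The function names below are hypothetical and are not the MMR API:

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, value) pairs from each input record."""
    for record in records:
        for token in record.split():
            yield token, 1

def shuffle(pairs):
    """Shuffle: group all values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's values into a single result."""
    return {key: sum(values) for key, values in groups.items()}

docs = ["forensic data processing", "forensic image indexing"]
counts = reduce_phase(shuffle(map_phase(docs)))
```

Because map and reduce tasks are independent per record and per key, the framework can distribute them across many nodes, which is the source of the scaling behavior reported for MMR.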

  4. Large-Scale Weather Disturbances in Mars’ Southern Extratropics

    NASA Astrophysics Data System (ADS)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2015-11-01

    Between late autumn and early spring, Mars’ middle and high latitudes within its atmosphere support strong mean thermal gradients between the tropics and poles. Observations from both the Mars Global Surveyor (MGS) and Mars Reconnaissance Orbiter (MRO) indicate that this strong baroclinicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). These extratropical weather disturbances are key components of the global circulation. Such wave-like disturbances act as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of large-scale, traveling extratropical synoptic-period disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively lifted and radiatively active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to their northern-hemisphere counterparts, southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are examined. Simulations that adopt Mars’ full topography, compared to simulations that utilize synthetic topographies emulating key large-scale features of the southern middle latitudes, indicate that Mars’ transient barotropic/baroclinic eddies are highly influenced by the great impact basins of this hemisphere (e.g., Argyre and Hellas). The occurrence of a southern storm zone in late winter and early spring appears to be anchored to the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre

  5. Health-Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2015-04-01

    Award Number: W81XWH-13-1-0020. Title: Health-Terrain: Visualizing Large Scale Health Data. Principal Investigator: Shiaofen Fang, Ph.D. Report Date: April 2015; Report Type: Annual; Dates Covered: 7 Mar 2014 - 6 Mar 2015. ... (1) creating a concept space data model, which represents a schema tailored to support diverse visualizations and provides a uniform ontology that

  6. A Holistic Management Architecture for Large-Scale Adaptive Networks

    DTIC Science & Technology

    2007-09-01

    A Holistic Management Architecture for Large-Scale Adaptive Networks, by Michael R. Clement. Thesis in Technology Management, Naval Postgraduate School, September 2007. Thesis Advisor: Alex Bordetsky; Second Reader...

  7. Large-Scale Optimization for Bayesian Inference in Complex Systems

    SciTech Connect

    Willcox, Karen; Marzouk, Youssef

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as ``reduce then sample'' and ``sample then reduce.'' In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their

  8. Large-scale detection of recombination in nucleotide sequences

    NASA Astrophysics Data System (ADS)

    Chan, Cheong Xin; Beiko, Robert G.; Ragan, Mark A.

    2008-01-01

    Genetic recombination following a genetic transfer event can produce heterogeneous phylogenetic histories within sets of genes that share a common ancestral origin. Delineating recombination events will enhance our understanding in genome evolution. However, the task of detecting recombination is not trivial due to effect of more-recent evolutionary changes that can obscure such event from detection. In this paper, we demonstrate the use of a two-phase strategy for detecting recombination events on a large-scale dataset.

  9. Multimodel Design of Large Scale Systems with Multiple Decision Makers.

    DTIC Science & Technology

    1982-08-01

    virtue. 5- , Lead me from darkneu to light. - Lead me from death to eternal Life. ( Vedic Payer) p. I, MULTIMODEL DESIGN OF LARGE SCALE SYSTEMS WITH...guidance during the course of *: this research . He would also like to thank Professors W. R. Perkins, P. V. Kokotovic, T. Basar, and T. N. Trick for...thesis concludes with Chapter 7 where we summarize the results obtained, outline the main contributions, and indicate directions for future research . 7- I

  10. Turbulent amplification of large-scale magnetic fields

    NASA Technical Reports Server (NTRS)

    Montgomery, D.; Chen, H.

    1984-01-01

    Previously-introduced methods for analytically estimating the effects of small-scale turbulent fluctuations on large-scale dynamics are extended to fully three-dimensional magnetohydrodynamics. The problem becomes algebraically tractable in the presence of sufficiently large spectral gaps. The calculation generalizes 'alpha dynamo' calculations, except that the velocity fluctuations and magnetic fluctuations are treated on an independent and equal footing. Earlier expressions for the 'alpha coefficients' of turbulent magnetic field amplification are recovered as a special case.

  11. Space transportation booster engine thrust chamber technology, large scale injector

    NASA Technical Reports Server (NTRS)

    Schneider, J. A.

    1993-01-01

    The objective of the Large Scale Injector (LSI) program was to deliver a 21 inch diameter, 600,000 lbf thrust class injector to NASA/MSFC for hot fire testing. The hot fire test program would demonstrate the feasibility and integrity of the full scale injector, including combustion stability, chamber wall compatibility (thermal management), and injector performance. The 21 inch diameter injector was delivered in September of 1991.

  12. Relic vector field and CMB large scale anomalies

    SciTech Connect

    Chen, Xingang; Wang, Yi E-mail: yw366@cam.ac.uk

    2014-10-01

    We study the most general effects of relic vector fields on the inflationary background and density perturbations. Such effects are observable if the number of inflationary e-folds is close to the minimum requirement to solve the horizon problem. We show that this can potentially explain two CMB large scale anomalies: the quadrupole-octopole alignment and the quadrupole power suppression. We discuss its effect on the parity anomaly. We also provide analytical template for more detailed data comparison.

  13. The large-scale anisotropy with the PAMELA calorimeter

    NASA Astrophysics Data System (ADS)

    Karelin, A.; Adriani, O.; Barbarino, G.; Bazilevskaya, G.; Bellotti, R.; Boezio, M.; Bogomolov, E.; Bongi, M.; Bonvicini, V.; Bottai, S.; Bruno, A.; Cafagna, F.; Campana, D.; Carbone, R.; Carlson, P.; Casolino, M.; Castellini, G.; De Donato, C.; De Santis, C.; De Simone, N.; Di Felice, V.; Formato, V.; Galper, A.; Koldashov, S.; Koldobskiy, S.; Krut'kov, S.; Kvashnin, A.; Leonov, A.; Malakhov, V.; Marcelli, L.; Martucci, M.; Mayorov, A.; Menn, W.; Mergé, M.; Mikhailov, V.; Mocchiutti, E.; Monaco, A.; Mori, N.; Munini, R.; Osteria, G.; Palma, F.; Panico, B.; Papini, P.; Pearce, M.; Picozza, P.; Ricci, M.; Ricciarini, S.; Sarkar, R.; Simon, M.; Scotti, V.; Sparvoli, R.; Spillantini, P.; Stozhkov, Y.; Vacchi, A.; Vannuccini, E.; Vasilyev, G.; Voronov, S.; Yurkin, Y.; Zampa, G.; Zampa, N.

    2015-10-01

    The large-scale anisotropy (or the so-called star-diurnal wave) has been studied using the calorimeter of the space-born experiment PAMELA. The cosmic ray anisotropy has been obtained for the Southern and Northern hemispheres simultaneously in the equatorial coordinate system for the time period 2006-2014. The dipole amplitude and phase have been measured for energies 1-20 TeV n-1.

  14. Supporting large scale applications on networks of workstations

    NASA Technical Reports Server (NTRS)

    Cooper, Robert; Birman, Kenneth P.

    1989-01-01

    Distributed applications on networks of workstations are an increasingly common way to satisfy computing needs. However, existing mechanisms for distributed programming exhibit poor performance and reliability as application size increases. Extension of the ISIS distributed programming system to support large scale distributed applications by providing hierarchical process groups is discussed. Incorporation of hierarchy in the program structure and exploitation of this to limit the communication and storage required in any one component of the distributed system is examined.

  15. The Phoenix series large scale LNG pool fire experiments.

    SciTech Connect

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential safety of the public and property from an accidental, and even more importantly intentional spills, have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards from a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.

  16. Information Tailoring Enhancements for Large Scale Social Data

    DTIC Science & Technology

    2016-03-15

    Social Data Progress Report No. 2 Reporting Period: December 16, 2015 – March 15, 2016 Contract No. N00014-15-P-5138 Sponsored by ONR...Intelligent Automation Incorporated Progress Report No. 2 Information Tailoring Enhancements for Large-Scale Social Data Submitted in accordance with...robustness. We imporoved the (i) messaging architecture, (ii) data redundancy, and (iii) service availability of Scraawl computational framework

  17. Host Immunity via Mutable Virtualized Large-Scale Network Containers

    DTIC Science & Technology

    2016-07-25

    system for host immunity that combines virtualization , emulation, and mutable network configurations. This system is deployed on a single host, and...entire !Pv4 address space within 5 Host Immunity via Mutable Virtualized Large-Scale Network Containers 45 minutes from a single machine. Second, when...URL, and we call it URL marker. A URL marker records the information about its parent web page’s URL and the user ID who collects the URL. Thus, when

  18. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    DTIC Science & Technology

    1985-10-07

    ORGANIZATION NAME AND ADDRESS 10. PROGRAM ELEMENT. PROJECT. TASK* Artificial Inteligence Laboratory AREA Is WORK UNIT NUMBERS 545 Technology Square...D-R162 422 CONCURRENT PROGRMMIZNG USING f"OS XL?ITP TEH l’ LARGE-SCALE PARALLELISH(U) NASI AC E Al CAMBRIDGE ARTIFICIAL INTELLIGENCE L. G AGHA ET AL...RESOLUTION TEST CHART N~ATIONAL BUREAU OF STANDA.RDS - -96 A -E. __ _ __ __’ .,*- - -- •. - MASSACHUSETTS INSTITUTE OF TECHNOLOGY ARTIFICIAL

  19. Developing and Understanding Methods for Large Scale Nonlinear Optimization

    DTIC Science & Technology

    2001-12-01

    development of new algorithms for large-scale uncon- strained and constrained optimization problems, including limited-memory methods for problems with...analysis of tensor and SQP methods for singular con- strained optimization", to appear in SIAM Journal on Optimization. Published in peer-reviewed...Mathematica, Vol III, Journal der Deutschen Mathematiker-Vereinigung, 1998. S. Crivelli, B. Bader, R. Byrd, E. Eskow, V. Lamberti , R.Schnabel and T

  20. Large-scale Alfvén vortices

    SciTech Connect

    Onishchenko, O. G.; Horton, W.; Scullion, E.; Fedun, V.

    2015-12-15

    The new type of large-scale vortex structures of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius and characterised by magnetic flux ropes carrying orbital angular momentum. The structure of the toroidal and radial velocity, fluid and magnetic field vorticity, the longitudinal electric current in the plane orthogonal to the external magnetic field are discussed.

  1. Analysis plan for 1985 large-scale tests. Technical report

    SciTech Connect

    McMullan, F.W.

    1983-01-01

    The purpose of this effort is to assist DNA in planning for large-scale (upwards of 5000 tons) detonations of conventional explosives in the 1985 and beyond time frame. Primary research objectives were to investigate potential means to increase blast duration and peak pressures. This report identifies and analyzes several candidate explosives. It examines several charge designs and identifies advantages and disadvantages of each. Other factors including terrain and multiburst techniques are addressed as are test site considerations.

  2. Dispersal Mutualism Incorporated into Large-Scale, Infrequent Disturbances

    PubMed Central

    Parker, V. Thomas

    2015-01-01

    Because of their influence on succession and other community interactions, large-scale, infrequent natural disturbances also should play a major role in mutualistic interactions. Using field data and experiments, I test whether mutualisms have been incorporated into large-scale wildfire by whether the outcomes of a mutualism depend on disturbance. In this study a seed dispersal mutualism is shown to depend on infrequent, large-scale disturbances. A dominant shrubland plant (Arctostaphylos species) produces seeds that make up a persistent soil seed bank and requires fire to germinate. In post-fire stands, I show that seedlings emerging from rodent caches dominate sites experiencing higher fire intensity. Field experiments show that rodents (Perimyscus californicus, P. boylii) do cache Arctostaphylos fruit and bury most seed caches to a sufficient depth to survive a killing heat pulse that a fire might drive into the soil. While the rodent dispersal and caching behavior itself has not changed compared to other habitats, the environmental transformation caused by wildfire converts the caching burial of seed from a dispersal process to a plant fire adaptive trait, and provides the context for stimulating subsequent life history evolution in the plant host. PMID:26151560

  3. Impact of Large-scale Geological Architectures On Recharge

    NASA Astrophysics Data System (ADS)

    Troldborg, L.; Refsgaard, J. C.; Engesgaard, P.; Jensen, K. H.

    Geological and hydrogeological data constitutes the basis for assessment of ground- water flow pattern and recharge zones. The accessibility and applicability of hard ge- ological data is often a major obstacle in deriving plausible conceptual models. Nev- ertheless focus is often on parameter uncertainty caused by the effect of geological heterogeneity due to lack of hard geological data, thus neglecting the possibility of alternative conceptualizations of the large-scale geological architecture. For a catchment in the eastern part of Denmark we have constructed different geologi- cal models based on different conceptualization of the major geological trends and fa- cies architecture. The geological models are equally plausible in a conceptually sense and they are all calibrated to well head and river flow measurements. Comparison of differences in recharge zones and subsequently well protection zones emphasize the importance of assessing large-scale geological architecture in hydrological modeling on regional scale in a non-deterministic way. Geostatistical modeling carried out in a transitional probability framework shows the possibility of assessing multiple re- alizations of large-scale geological architecture from a combination of soft and hard geological information.

  4. Line segment extraction for large scale unorganized point clouds

    NASA Astrophysics Data System (ADS)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.

  5. Reliability assessment for components of large scale photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Ahadi, Amir; Ghadimi, Noradin; Mirabbasi, Davar

    2014-10-01

    Photovoltaic (PV) systems have significantly shifted from independent power generation systems to a large-scale grid-connected generation systems in recent years. The power output of PV systems is affected by the reliability of various components in the system. This study proposes an analytical approach to evaluate the reliability of large-scale, grid-connected PV systems. The fault tree method with an exponential probability distribution function is used to analyze the components of large-scale PV systems. The system is considered in the various sequential and parallel fault combinations in order to find all realistic ways in which the top or undesired events can occur. Additionally, it can identify areas that the planned maintenance should focus on. By monitoring the critical components of a PV system, it is possible not only to improve the reliability of the system, but also to optimize the maintenance costs. The latter is achieved by informing the operators about the system component's status. This approach can be used to ensure secure operation of the system by its flexibility in monitoring system applications. The implementation demonstrates that the proposed method is effective and efficient and can conveniently incorporate more system maintenance plans and diagnostic strategies.

  6. Learning Short Binary Codes for Large-scale Image Retrieval.

    PubMed

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms prove to be effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually the code length shorter than 100 b) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of data, MCR can generate one bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only top-ranked bits with minimum cost-values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve comparative performance as the state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.

  7. Homogenization of Large-Scale Movement Models in Ecology

    USGS Publications Warehouse

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion,and the method will be equally straightforward for more complex models. ?? 2010 Society for Mathematical Biology.

  8. Large-scale quantization from local correlations in space plasmas

    NASA Astrophysics Data System (ADS)

    Livadiotis, George; McComas, David J.

    2014-05-01

    This study examines the large-scale quantization that can characterize the phase space of certain physical systems. Plasmas are such systems where large-scale quantization, ħ*, is caused by Debye shielding that structures correlations between particles. The value of ħ* is constant—some 12 orders of magnitude larger than the Planck constant—across a wide range of space plasmas, from the solar wind in the inner heliosphere to the distant plasma in the inner heliosheath and the local interstellar medium. This paper develops the foundation and advances the understanding of the concept of plasma quantization; in particular, we (i) show the analogy of plasma to Planck quantization, (ii) show the key points of plasma quantization, (iii) construct some basic quantum mechanical concepts for the large-scale plasma quantization, (iv) investigate the correlation between plasma parameters that implies plasma quantization, when it is approximated by a relation between the magnetosonic energy and the plasma frequency, (v) analyze typical space plasmas throughout the heliosphere and show the constancy of plasma quantization over many orders of magnitude in plasma parameters, (vi) analyze Advanced Composition Explorer (ACE) solar wind measurements to develop another measurement of the value of ħ*, and (vii) apply plasma quantization to derive unknown plasma parameters when some key observable is missing.

  9. Channel capacity of next generation large scale MIMO systems

    NASA Astrophysics Data System (ADS)

    Alshammari, A.; Albdran, S.; Matin, M.

    2016-09-01

    Information rate that can be transferred over a given bandwidth is limited by the information theory. Capacity depends on many factors such as the signal to noise ratio (SNR), channel state information (CSI) and the spatial correlation in the propagation environment. It is very important to increase spectral efficiency in order to meet the growing demand for wireless services. Thus, Multiple input multiple output (MIMO) technology has been developed and applied in most of the wireless standards and it has been very successful in increasing capacity and reliability. As the demand is still increasing, attention now is shifting towards large scale multiple input multiple output (MIMO) which has a potential of bringing orders of magnitude of improvement in spectral and energy efficiency. It has been shown that users channels decorrelate after increasing the number of antennas. As a result, inter-user interference can be avoided since energy can be focused on precise directions. This paper investigates the limits of channel capacity for large scale MIMO. We study the relation between spectral efficiency and the number of antenna N. We use time division duplex (TDD) system in order to obtain CSI using training sequence in the uplink. The same CSI is used for the downlink because the channel is reciprocal. Spectral efficiency is measured for channel model that account for small scale fading while ignoring the effect of large scale fading. It is shown the spectral efficiency can be improved significantly when compared to single antenna systems in ideal circumstances.

  10. Sparse approximation through boosting for learning large scale kernel machines.

    PubMed

    Sun, Ping; Yao, Xin

    2010-06-01

    Recently, sparse approximation has become a preferred method for learning large scale kernel machines. This technique attempts to represent the solution with only a subset of original data points also known as basis vectors, which are usually chosen one by one with a forward selection procedure based on some selection criteria. The computational complexity of several resultant algorithms scales as O(NM(2)) in time and O(NM) in memory, where N is the number of training points and M is the number of basis vectors as well as the steps of forward selection. For some large scale data sets, to obtain a better solution, we are sometimes required to include more basis vectors, which means that M is not trivial in this situation. However, the limited computational resource (e.g., memory) prevents us from including too many vectors. To handle this dilemma, we propose to add an ensemble of basis vectors instead of only one at each forward step. The proposed method, closely related to gradient boosting, could decrease the required number M of forward steps significantly and thus a large fraction of computational cost is saved. Numerical experiments on three large scale regression tasks and a classification problem demonstrate the effectiveness of the proposed approach.

  11. Alteration of Large-Scale Chromatin Structure by Estrogen Receptor

    PubMed Central

    Nye, Anne C.; Rajendran, Ramji R.; Stenoien, David L.; Mancini, Michael A.; Katzenellenbogen, Benita S.; Belmont, Andrew S.

    2002-01-01

    The estrogen receptor (ER), a member of the nuclear hormone receptor superfamily important in human physiology and disease, recruits coactivators which modify local chromatin structure. Here we describe effects of ER on large-scale chromatin structure as visualized in live cells. We targeted ER to gene-amplified chromosome arms containing large numbers of lac operator sites either directly, through a lac repressor-ER fusion protein (lac rep-ER), or indirectly, by fusing lac repressor with the ER interaction domain of the coactivator steroid receptor coactivator 1. Significant decondensation of large-scale chromatin structure, comparable to that produced by the ∼150-fold-stronger viral protein 16 (VP16) transcriptional activator, was produced by ER in the absence of estradiol using both approaches. Addition of estradiol induced a partial reversal of this unfolding by green fluorescent protein-lac rep-ER but not by wild-type ER recruited by a lac repressor-SRC570-780 fusion protein. The chromatin decondensation activity did not require transcriptional activation by ER nor did it require ligand-induced coactivator interactions, and unfolding did not correlate with histone hyperacetylation. Ligand-induced coactivator interactions with helix 12 of ER were necessary for the partial refolding of chromatin in response to estradiol using the lac rep-ER tethering system. This work demonstrates that when tethered or recruited to DNA, ER possesses a novel large-scale chromatin unfolding activity. PMID:11971975

  12. Geospatial Optimization of Siting Large-Scale Solar Projects

    SciTech Connect

    Macknick, J.; Quinby, T.; Caulfield, E.; Gerritsen, M.; Diffendorfer, J.; Haines, S.

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  13. Assessing salivary cortisol in large-scale, epidemiological research.

    PubMed

    Adam, Emma K; Kumari, Meena

    2009-11-01

    Salivary cortisol measures are increasingly being incorporated into large-scale, population-based, or epidemiological research, in which participants are selected to be representative of particular communities or populations of interest, and sample sizes are in the order of hundreds to tens of thousands of participants. These approaches to studying salivary cortisol provide important advantages but pose a set of challenges. The representative nature of sampling, and large samples sizes associated with population-based research offer high generalizability and power, and the ability to examine cortisol functioning in relation to: (a) a wide range of social environments; (b) a diverse array individuals and groups; and (c) a broad set of pre-disease and disease outcomes. The greater importance of high response rates (to maintain generalizability) and higher costs associated with this type of large-scale research, however, requires special adaptations of existing ambulatory cortisol protocols. These include: using the most efficient sample collection protocol possible that still adequately address the specific cortisol-related questions at hand, and ensuring the highest possible response and compliance rates among those individuals invited to participate. Examples of choices made, response rates obtained, and examples of results obtained from existing epidemiological cortisol studies are offered, as are suggestions for the modeling and interpretation of salivary cortisol data obtained in large-scale epidemiological research.

  14. Large-scale investigation of genomic markers for severe periodontitis.

    PubMed

    Suzuki, Asami; Ji, Guijin; Numabe, Yukihiro; Ishii, Keisuke; Muramatsu, Masaaki; Kamoi, Kyuichi

    2004-09-01

    The purpose of the present study was to investigate the genomic markers for periodontitis, using large-scale single-nucleotide polymorphism (SNP) association studies comparing healthy volunteers and patients with periodontitis. Genomic DNA was obtained from 19 healthy volunteers and 22 patients with severe periodontitis, all of whom were Japanese. The subjects were genotyped at 637 SNPs in 244 genes on a large scale, using the TaqMan polymerase chain reaction (PCR) system. Statistically significant differences in allele and genotype frequencies were analyzed with Fisher's exact test. We found statistically significant differences (P < 0.01) between the healthy volunteers and patients with severe periodontitis in the following genes; gonadotropin-releasing hormone 1 (GNRH1), phosphatidylinositol 3-kinase regulatory 1 (PIK3R1), dipeptidylpeptidase 4 (DPP4), fibrinogen-like 2 (FGL2), and calcitonin receptor (CALCR). These results suggest that SNPs in the GNRH1, PIK3R1, DPP4, FGL2, and CALCR genes are genomic markers for severe periodontitis. Our findings indicate the necessity of analyzing SNPs in genes on a large scale (i.e., genome-wide approach), to identify genomic markers for periodontitis.

  15. Large-scale data mining pilot project in human genome

    SciTech Connect

    Musick, R.; Fidelis, R.; Slezak, T.

    1997-05-01

    This whitepaper briefly describes a new, aggressive effort in large-scale data mining at Livermore National Labs. The implications of `large-scale` will be clarified in a later section. In the short term, this effort will focus on several mission-critical questions of the Genome project. We will adapt current data mining techniques to the Genome domain, quantify the accuracy of inference results, and lay the groundwork for a more extensive effort in large-scale data mining. A major aspect of the approach is that it builds on a fully-staffed data warehousing effort in the human Genome area. The long-term goal is a strong applications-oriented research program in large-scale data mining. The tools and skill set gained will be directly applicable to a wide spectrum of tasks involving large spatial and multidimensional data. This includes applications in ensuring non-proliferation, stockpile stewardship, enabling Global Ecology (Materials Database Industrial Ecology), advancing the Biosciences (Human Genome Project), and supporting data for others (Battlefield Management, Health Care).

  16. Large-scale biodiversity patterns in freshwater phytoplankton.

    PubMed

    Stomp, Maayke; Huisman, Jef; Mittelbach, Gary G; Litchman, Elena; Klausmeier, Christopher A

    2011-11-01

    Our planet shows striking gradients in the species richness of plants and animals, from high biodiversity in the tropics to low biodiversity in polar and high-mountain regions. Recently, similar patterns have been described for some groups of microorganisms, but the large-scale biogeographical distribution of freshwater phytoplankton diversity is still largely unknown. We examined the species diversity of freshwater phytoplankton sampled from 540 lakes and reservoirs distributed across the continental United States and found strong latitudinal, longitudinal, and altitudinal gradients in phytoplankton biodiversity, demonstrating that microorganisms can show substantial geographic variation in biodiversity. Detailed analysis using structural equation models indicated that these large-scale biodiversity gradients in freshwater phytoplankton diversity were mainly driven by local environmental factors, although there were residual direct effects of latitude, longitude, and altitude as well. Specifically, we found that phytoplankton species richness was an increasing saturating function of lake chlorophyll a concentration, increased with lake surface area and possibly increased with water temperature, resembling effects of productivity, habitat area, and temperature on diversity patterns commonly observed for macroorganisms. In turn, these local environmental factors varied along latitudinal, longitudinal, and altitudinal gradients. These results imply that changes in land use or climate that affect these local environmental factors are likely to have major impacts on large-scale biodiversity patterns of freshwater phytoplankton.
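The "increasing saturating function" of chlorophyll a reported above is commonly modeled with a Michaelis-Menten-type curve. The sketch below uses illustrative parameter values (`s_max`, `k`), not values fitted in the study:

```python
def saturating_richness(chl_a, s_max=60.0, k=5.0):
    """Hypothetical Michaelis-Menten-style saturating response: species
    richness rises quickly at low chlorophyll a and levels off near s_max.
    (s_max and k are illustrative, not fitted values from the study.)"""
    return s_max * chl_a / (k + chl_a)

for chl in (1.0, 5.0, 25.0, 125.0):
    print(f"chl-a {chl:6.1f} ug/L -> ~{saturating_richness(chl):.1f} species")
```

The half-saturation constant `k` is the chlorophyll a concentration at which richness reaches half of `s_max`; richness never exceeds `s_max`, capturing the saturation the authors describe.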

  17. A model of plasma heating by large-scale flow

    NASA Astrophysics Data System (ADS)

    Pongkitiwanichakul, P.; Cattaneo, F.; Boldyrev, S.; Mason, J.; Perez, J. C.

    2015-12-01

    In this work, we study the process of energy dissipation triggered by a slow large-scale motion of a magnetized conducting fluid. Our consideration is motivated by the problem of heating the solar corona, which is believed to be governed by fast reconnection events set off by the slow motion of magnetic field lines anchored in the photospheric plasma. To elucidate the physics governing the disruption of the imposed laminar motion and the energy transfer to small scales, we propose a simplified model where the large-scale motion of magnetic field lines is prescribed not at the footpoints but rather imposed volumetrically. As a result, the problem can be treated numerically with an efficient, highly accurate spectral method, allowing us to use a resolution and statistical ensemble exceeding those of the previous work. We find that, even though the large-scale deformations are slow, they eventually lead to reconnection events that drive a turbulent state at smaller scales. The small-scale turbulence displays many of the universal features of field-guided magnetohydrodynamic turbulence like a well-developed inertial range spectrum. Based on these observations, we construct a phenomenological model that gives the scalings of the amplitude of the fluctuations and the energy-dissipation rate as functions of the input parameters. We find good agreement between the numerical results and the predictions of the model.

  18. Large-scale flow generation by inhomogeneous helicity.

    PubMed

    Yokoi, N; Brandenburg, A

    2016-03-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with nonuniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of homogeneous turbulent helicity. This result confirms the validity of the inhomogeneous helicity effect in large-scale flow generation and suggests that a vortex dynamo is possible even in incompressible turbulence where there is no baroclinicity effect.

  19. Large-scale flow experiments for managing river systems

    USGS Publications Warehouse

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  20. New probes of Cosmic Microwave Background large-scale anomalies

    NASA Astrophysics Data System (ADS)

    Aiola, Simone

    Fifty years of Cosmic Microwave Background (CMB) data played a crucial role in constraining the parameters of the LambdaCDM model, where Dark Energy, Dark Matter, and Inflation are the three most important pillars not yet understood. Inflation prescribes an isotropic universe on large scales, and it generates spatially-correlated density fluctuations over the whole Hubble volume. CMB temperature fluctuations on scales bigger than a degree in the sky, affected by modes on super-horizon scale at the time of recombination, are a clean snapshot of the universe after inflation. In addition, the accelerated expansion of the universe, driven by Dark Energy, leaves a hardly detectable imprint in the large-scale temperature sky at late times. Such fundamental predictions have been tested with current CMB data and found to be in tension with what we expect from our simple LambdaCDM model. Is this tension just a random fluke or a fundamental issue with the present model? In this thesis, we present a new framework to probe the lack of large-scale correlations in the temperature sky using CMB polarization data. Our analysis shows that if a suppression in the CMB polarization correlations is detected, it will provide compelling evidence for new physics on super-horizon scale. To further analyze the statistical properties of the CMB temperature sky, we constrain the degree of statistical anisotropy of the CMB in the context of the observed large-scale dipole power asymmetry. We find evidence for a scale-dependent dipolar modulation at 2.5sigma. To isolate late-time signals from the primordial ones, we test the anomalously high Integrated Sachs-Wolfe effect signal generated by superstructures in the universe. We find that the detected signal is in tension with the expectations from LambdaCDM at the 2.5sigma level, which is somewhat smaller than what has been previously argued. To conclude, we describe the current status of CMB observations on small scales, highlighting the

  1. Robust large-scale parallel nonlinear solvers for simulations.

    SciTech Connect

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. 
The advantage of Bouaricha's method is that it can use any existing linear solver, which makes it simple to write
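Broyden's method, contrasted with Newton's method in the abstract above, replaces the Jacobian with a secant-style approximation built from successive steps. Below is a minimal sketch of the inverse-update ("second") variant on a toy 2-by-2 system; it is an illustration of the general technique, not the report's large-scale limited-memory implementation:

```python
def broyden2(f, x0, tol=1e-10, max_iter=50):
    """Broyden's 'second' (inverse-update) method for a system f(x) = 0.

    Maintains an approximation H of the inverse Jacobian and updates it
    from successive steps, so no analytic Jacobian is ever evaluated.
    """
    n = len(x0)
    x = list(x0)
    fx = f(x)
    # Start from H = identity (a common, if crude, initial guess).
    H = [[float(i == j) for j in range(n)] for i in range(n)]
    for _ in range(max_iter):
        dx = [-sum(H[i][j] * fx[j] for j in range(n)) for i in range(n)]
        x_new = [x[i] + dx[i] for i in range(n)]
        f_new = f(x_new)
        if max(abs(v) for v in f_new) < tol:
            return x_new
        df = [f_new[i] - fx[i] for i in range(n)]
        # Update: H <- H + (dx - H df) df^T / (df^T df)
        Hdf = [sum(H[i][j] * df[j] for j in range(n)) for i in range(n)]
        denom = sum(v * v for v in df)
        for i in range(n):
            for j in range(n):
                H[i][j] += (dx[i] - Hdf[i]) * df[j] / denom
        x, fx = x_new, f_new
    return x

# Toy system: x^2 + y^2 = 4 and x*y = 1, started near a root.
root = broyden2(lambda v: [v[0] ** 2 + v[1] ** 2 - 4.0, v[0] * v[1] - 1.0],
                [2.0, 0.5])
```

Each iteration costs only matrix-vector work; the limited-memory variant discussed in the report avoids storing H explicitly so the same idea scales to large systems.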

  2. Foundational perspectives on causality in large-scale brain networks.

    PubMed

    Mannino, Michael; Bressler, Steven L

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  3. Foundational perspectives on causality in large-scale brain networks

    NASA Astrophysics Data System (ADS)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  4. Evolution of Large-Scale Circulation during TOGA COARE: Model Intercomparison and Basic Features.

    NASA Astrophysics Data System (ADS)

    Lau, K.-M.; Sheu, P. J.; Schubert, S.; Ledvina, D.; Weng, H.

    1996-05-01

    An intercomparison study of the evolution of large-scale circulation features during TOGA COARE has been carried out using data from three 4D assimilation systems: the National Meteorological Center (NMC, currently known as the National Center for Environmental Prediction), the Navy Fleet Numerical Oceanography Center, and the NASA Goddard Space Flight Center. Results show that the preliminary assimilation products, though somewhat crude, can provide important information concerning the evolution of the large-scale atmospheric circulation over the tropical western Pacific during TOGA COARE. Large-scale features such as sea level pressure, rotational wind field, and temperature are highly consistent among models. However, the rainfall and wind divergence distributions show poor agreement among models, even though some useful information can still be derived. All three models show a continuous background rain over the Intensive Flux Area (IFA), even during periods with suppressed convection, in contrast to the radar-estimated rainfall, which is more episodic. This may reflect a generic deficiency in the oversimplified representation of large-scale rain in all three models. Based on the comparative model diagnostics, a consistent picture of large-scale evolution and multiscale interaction during TOGA COARE emerges. The propagation of the Madden and Julian Oscillation (MJO) from the equatorial Indian Ocean region into the western Pacific foreshadows the establishment of westerly wind events over the COARE region. The genesis and maintenance of the westerly wind (WW) events during TOGA COARE are related to the establishment of a large-scale east-west pressure dipole between the Maritime Continent and the equatorial central Pacific. This pressure dipole could be identified in part with the ascending (low pressure) and descending (high pressure) branches of the MJO and in part with the fluctuations of the austral summer monsoon. Accompanying the development of WW over the

  5. 75 FR 51843 - In the Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-23

    ... Matter of Certain Large Scale Integrated Circuit Semiconductor Chips and Products Containing the Same... certain large scale integrated circuit semiconductor chips and products containing same by reason...

  6. Large-scale fires and time trends of PCDDs/DFs in sediments.

    PubMed

    Sakai, S; Deguchi, S; Takatsuki, H; Uchibo, A

    2001-01-01

    Drastic increases in PCDDs/DFs concentrations were identified in the uppermost layers of a sediment core sample taken from the coastal area of Kobe City. As large-scale fires caused by the Great Hanshin-Awaji earthquake were deemed to be a possible cause, we performed additional sampling of sediment cores and surface sediment samples, estimating the total amount of PCDDs/DFs released from fires and presuming the load to sediments by individual transport routes, such as air and water, using an air diffusion model to investigate the influence of fires. The total amount of PCDDs/DFs released from fires was estimated at 2000 g-total PCDDs/DFs, 22 g-TEQ. Increases in PCDDs/DFs generated in fires were principally transported through water rather than air. If 20% of the total PCDDs/DFs formed in fires had entered water, it would correspond to the entire increase of PCDDs/DFs concentration in sediment cores.

  7. Energetics and Structural Characterization of the large-scale Functional Motion of Adenylate Kinase

    NASA Astrophysics Data System (ADS)

    Formoso, Elena; Limongelli, Vittorio; Parrinello, Michele

    2015-02-01

    Adenylate Kinase (AK) is a signal transducing protein that regulates cellular energy homeostasis balancing between different conformations. An alteration of its activity can lead to severe pathologies such as heart failure, cancer and neurodegenerative diseases. A comprehensive elucidation of the large-scale conformational motions that rule the functional mechanism of this enzyme is of great value to guide rationally the development of new medications. Here using a metadynamics-based computational protocol we elucidate the thermodynamics and structural properties underlying the AK functional transitions. The free energy estimation of the conformational motions of the enzyme allows characterizing the sequence of events that regulate its action. We reveal the atomistic details of the most relevant enzyme states, identifying residues such as Arg119 and Lys13, which play a key role during the conformational transitions and represent druggable spots to design enzyme inhibitors. Our study offers tools that open new areas of investigation on large-scale motion in proteins.

  8. Predicting protein functions from redundancies in large-scale protein interaction networks

    NASA Technical Reports Server (NTRS)

    Samanta, Manoj Pratim; Liang, Shoudan

    2003-01-01

    Interpreting data from large-scale protein interaction experiments has been a challenging task because of the widespread presence of random false positives. Here, we present a network-based statistical algorithm that overcomes this difficulty and allows us to derive functions of unannotated proteins from large-scale interaction data. Our algorithm uses the insight that if two proteins share significantly larger number of common interaction partners than random, they have close functional associations. Analysis of publicly available data from Saccharomyces cerevisiae reveals >2,800 reliable functional associations, 29% of which involve at least one unannotated protein. By further analyzing these associations, we derive tentative functions for 81 unannotated proteins with high certainty. Our method is not overly sensitive to the false positives present in the data. Even after adding 50% randomly generated interactions to the measured data set, we are able to recover almost all (approximately 89%) of the original associations.
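The core insight above — that two proteins sharing more interaction partners than expected by chance implies a functional association — can be scored with a hypergeometric tail probability. This is a simplified stand-in for the paper's statistic, with hypothetical network size and degrees:

```python
from math import comb

def shared_partner_pvalue(n_total, deg_a, deg_b, n_shared):
    """P-value that proteins A and B share at least n_shared interaction
    partners by chance, given their degrees in an n_total-protein network
    (hypergeometric tail; a simplified stand-in for the paper's statistic)."""
    denom = comb(n_total, deg_b)
    return sum(
        comb(deg_a, k) * comb(n_total - deg_a, deg_b - k)
        for k in range(n_shared, min(deg_a, deg_b) + 1)
    ) / denom

# Hypothetical: a 4,000-protein network; A has 20 partners, B has 30,
# and 5 of them are shared. Expected overlap by chance is only ~0.15,
# so 5 shared partners is highly significant.
p = shared_partner_pvalue(4000, 20, 30, 5)
```

Small p-values flag protein pairs whose overlap is unlikely under random wiring, which is how unannotated proteins inherit tentative functions from well-annotated neighbors.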

  9. Dynamics of large-scale instabilities in conductors electrically exploded in strong magnetic fields

    NASA Astrophysics Data System (ADS)

    Datsko, I. M.; Chaikovsky, S. A.; Labetskaya, N. A.; Oreshkin, V. I.; Ratakhin, N. A.

    2014-11-01

    The growth of large-scale instabilities during the propagation of a nonlinear magnetic diffusion wave through a conductor was studied experimentally. The experiment was carried out using the MIG terawatt pulsed power generator at a peak current of up to 2.5 MA with a 100 ns rise time. It was observed that instabilities with a wavelength of 150 μm developed on the surface of the hollow part of the conductor within 160 ns after the onset of current flow, whereas the surface of the solid rod remained almost unperturbed. A system of equations describing the propagation of a nonlinear diffusion wave through a conductor and the growth of thermal instabilities has been solved numerically. It has been revealed that the development of large-scale instabilities is closely related to the propagation of a nonlinear magnetic diffusion wave.

  10. Large-scale inhomogeneities in solutions of low molar mass compounds and mixtures of liquids: supramolecular structures or nanobubbles?

    PubMed

    Sedlák, Marián; Rak, Dmytro

    2013-02-28

    In textbooks, undersaturated solutions of low molar mass compounds and mixtures of freely miscible liquids are considered homogeneous at length scales appreciably exceeding the dimensions of individual molecules. However, growing experimental evidence reveals that this is not the case. Large-scale structures with sizes on the order of 100 nm are present in solutions and mixtures used in everyday life and research practice, especially in aqueous systems. These mesoscale inhomogeneities are long-lived, and the (relatively slow) kinetics of their formation can be monitored upon mixing the components. Nevertheless, the nature of these structures and the mechanisms behind their formation are not yet clear. Since it was previously suggested that these could be nanobubbles stabilized by adsorbed solute at the gas/solvent interface, we devote the current study to addressing this question. Static and dynamic light scattering was used to investigate solutions and mixtures prepared at ordinary conditions (equilibrated with air at 1 atm), prepared with degassed solvent, and solutions and mixtures degassed after formation of large structures. The behavior of large structures in strong gravitational centrifugal fields was also investigated. Systems from various categories were chosen for this study: aqueous solutions of an inorganic ionic compound (MgSO4), an organic ionic compound (citric acid), an uncharged organic compound (urea), and a mixture of water with an organic solvent freely miscible with water (tert-butyl alcohol). The results obtained show that these structures are not nanobubbles in all cases. Visualization of large-scale structures via nanoparticle tracking analysis (NTA) is presented. NTA results confirm conclusions from our previous light scattering work.

  11. Large-scale smart passive system for civil engineering applications

    NASA Astrophysics Data System (ADS)

    Jung, Hyung-Jo; Jang, Dong-Doo; Lee, Heon-Jae; Cho, Sang-Won

    2008-03-01

    The smart passive system consisting of a magnetorheological (MR) damper and an electromagnetic induction (EMI) part has recently been proposed. The EMI part can generate the input current for the MR damper from the vibration of a structure according to Faraday's law of electromagnetic induction. The control performance of the smart passive system has been demonstrated mainly by numerical simulations. It was verified from the numerical results that the system can effectively reduce the structural responses of civil engineering structures such as buildings and bridges. On the other hand, the experimental validation of the system has not yet been sufficiently conducted. In this paper, the feasibility of applying the smart passive system to real-scale structures is investigated. To do this, a large-scale smart passive system is designed, manufactured, and tested. The system consists of a large-capacity MR damper, which has a maximum force level of approximately +/-10,000 N, a maximum stroke level of +/-35 mm, and a maximum current level of 3 A, and a large-scale EMI part, which is designed to generate sufficient induced current for the damper. The applicability of the smart passive system to large real-scale structures is examined through a series of shaking table tests. The magnitudes of the induced current of the EMI part under various sinusoidal excitation inputs are measured. According to the test results, the large-scale EMI part shows the possibility that it could generate sufficient current or power to change the damping characteristics of the large-capacity MR damper.
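The EMI principle invoked above (Faraday's law turning structural vibration into damper current) can be illustrated numerically. All coil and excitation parameters here are hypothetical, not the tested device's specifications:

```python
import math

def emi_induced_current(n_turns, b_peak, area, freq_hz, coil_resistance, t):
    """Illustrative EMI-part model: a magnet oscillating sinusoidally in a coil
    gives flux Phi(t) = B * A * sin(2*pi*f*t); Faraday's law then yields
    emf = -N * dPhi/dt, and the current delivered to the MR damper is emf/R.
    All parameter values used below are hypothetical."""
    omega = 2 * math.pi * freq_hz
    emf = -n_turns * b_peak * area * omega * math.cos(omega * t)
    return emf / coil_resistance

# Peak current for a hypothetical coil: 500 turns, 0.4 T peak field,
# 20 cm^2 cross-section, 2 Hz structural vibration, 5 ohm circuit.
i_peak = abs(emi_induced_current(500, 0.4, 20e-4, 2.0, 5.0, 0.0))
print(f"peak induced current ~ {i_peak:.2f} A")
```

With these illustrative numbers the induced current is on the order of 1 A, i.e., within the 3 A input range quoted for the large-capacity MR damper, which is the kind of sizing argument an EMI part design must satisfy.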

  12. Solving large scale structure in ten easy steps with COLA

    SciTech Connect

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  13. Infectious diseases in large-scale cat hoarding investigations.

    PubMed

    Polak, K C; Levy, J K; Crawford, P C; Leutenegger, C M; Moriello, K A

    2014-08-01

    Animal hoarders accumulate animals in over-crowded conditions without adequate nutrition, sanitation, and veterinary care. As a result, animals rescued from hoarding frequently have a variety of medical conditions including respiratory infections, gastrointestinal disease, parasitism, malnutrition, and other evidence of neglect. The purpose of this study was to characterize the infectious diseases carried by clinically affected cats and to determine the prevalence of retroviral infections among cats in large-scale cat hoarding investigations. Records were reviewed retrospectively from four large-scale seizures of cats from failed sanctuaries from November 2009 through March 2012. The number of cats seized in each case ranged from 387 to 697. Cats were screened for feline leukemia virus (FeLV) and feline immunodeficiency virus (FIV) in all four cases and for dermatophytosis in one case. A subset of cats exhibiting signs of upper respiratory disease or diarrhea had been tested for infections by PCR and fecal flotation for treatment planning. Mycoplasma felis (78%), calicivirus (78%), and Streptococcus equi subspecies zooepidemicus (55%) were the most common respiratory infections. Feline enteric coronavirus (88%), Giardia (56%), Clostridium perfringens (49%), and Tritrichomonas foetus (39%) were most common in cats with diarrhea. The seroprevalence of FeLV and FIV were 8% and 8%, respectively. In the one case in which cats with lesions suspicious for dermatophytosis were cultured for Microsporum canis, 69/76 lesional cats were culture-positive; of these, half were believed to be truly infected and half were believed to be fomite carriers. Cats from large-scale hoarding cases had high risk for enteric and respiratory infections, retroviruses, and dermatophytosis. Case responders should be prepared for mass treatment of infectious diseases and should implement protocols to prevent transmission of feline or zoonotic infections during the emergency response and when

  14. Statistical analysis of large-scale neuronal recording data

    PubMed Central

    Reed, Jamie L.; Kaas, Jon H.

    2010-01-01

    Relating stimulus properties to the response properties of individual neurons and neuronal networks is a major goal of sensory research. Many investigators implant electrode arrays in multiple brain areas and record from chronically implanted electrodes over time to answer a variety of questions. Technical challenges related to analyzing large-scale neuronal recording data are not trivial. Several analysis methods traditionally used by neurophysiologists do not account for dependencies in the data that are inherent in multi-electrode recordings. In addition, when neurophysiological data are not best modeled by the normal distribution and when the variables of interest may not be linearly related, extensions of the linear modeling techniques are recommended. A variety of methods exist to analyze correlated data, even when data are not normally distributed and the relationships are nonlinear. Here we review expansions of the Generalized Linear Model designed to address these data properties. Such methods are used in other research fields, and the application to large-scale neuronal recording data will enable investigators to determine the variable properties that convincingly contribute to the variances in the observed neuronal measures. Standard measures of neuron properties such as response magnitudes can be analyzed using these methods, and measures of neuronal network activity such as spike timing correlations can be analyzed as well. We have done just that in recordings from 100-electrode arrays implanted in the primary somatosensory cortex of owl monkeys. Here we illustrate how one example method, Generalized Estimating Equations analysis, is a useful method to apply to large-scale neuronal recordings. PMID:20472395
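Generalized Estimating Equations, recommended above for multi-electrode data, account for within-cluster dependence through a working correlation structure. As one small, self-contained piece of that machinery, here is a moment estimate of the exchangeable (compound-symmetry) correlation parameter from clustered residuals; the data are hypothetical, and a real analysis would use a statistics package rather than this sketch:

```python
from statistics import mean, pstdev

def exchangeable_alpha(clusters):
    """Moment estimate of the exchangeable working-correlation parameter
    used in GEE: the average pairwise product of standardized residuals
    within clusters. `clusters` is a list of residual lists, one per
    grouping unit (e.g., electrode or animal; hypothetical here)."""
    pooled = [r for cluster in clusters for r in cluster]
    scale = pstdev(pooled) or 1.0
    products = []
    for cluster in clusters:
        z = [r / scale for r in cluster]
        for j in range(len(z)):
            for k in range(j + 1, len(z)):
                products.append(z[j] * z[k])
    return mean(products)

# Two hypothetical electrodes whose residuals move together within each:
alpha = exchangeable_alpha([[1.0, 0.8, 1.2], [-1.1, -0.9, -1.0]])
print(f"estimated within-cluster correlation ~ {alpha:.2f}")
```

An `alpha` near zero would justify treating observations as independent; values near one, as in this toy example, signal exactly the dependence that naive per-electrode analyses ignore.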

  15. The Large-Scale Current System During Auroral Substorms

    NASA Astrophysics Data System (ADS)

    Gjerloev, Jesper

    2015-04-01

    The substorm process has been discussed for more than four decades and new empirical large-scale models continue to be published. The continued activity implies both the importance and the complexity of the problem. We recently published a new model of the large-scale substorm current system (Gjerloev and Hoffman, JGR, 2014). Based on data from >100 ground magnetometers (obtained from SuperMAG), 116 isolated substorms, global auroral images (obtained by the Polar VIS Earth Camera), and a careful normalization technique, we derived an empirical model of the ionospheric equivalent current system. Our model yields some unexpected features that appear inconsistent with the classical single current wedge current system. One of these features is a distinct latitudinal shift of the westward electrojet (WEJ) current between the pre- and post-midnight region, and we find evidence that these two WEJ regions are quasi-disconnected. This, and other observational facts, led us to propose a modified 3D current system configuration that consists of 2 wedge type systems: a current wedge in the pre-midnight region (bulge current wedge), and another current wedge system in the post-midnight region (oval current wedge). The two wedge systems are shifted in latitude but overlap in local time in the midnight region. Our model is at considerable variance with previous global models and conceptual schematics of the large-scale substorm current system. We speculate that the data coverage, the methodologies and the techniques used in these previous global studies are the cause of the differences in solutions. In this presentation we present our model, compare with other published models and discuss possible causes for the differences.

  16. Solving large scale structure in ten easy steps with COLA

    NASA Astrophysics Data System (ADS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_solar/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_solar/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
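    The LPT trajectories that define COLA's comoving frame can be sketched in one dimension. This is a toy illustration under assumed units (a single sinusoidal displacement mode), not the paper's code: it generates first-order LPT (Zel'dovich) positions x(q, a) = q + D(a) psi(q), the frame about which COLA integrates residual displacements.

```python
# Toy 1D Zel'dovich approximation: each particle moves along
# x(q, a) = q + D(a) * psi(q), with growth factor D and displacement psi.
# COLA integrates only the residual x - x_LPT with the N-body force;
# here we generate the LPT trajectories themselves.
import math

n = 64
box = 100.0                                   # Mpc/h, toy box size
q = [box * i / n for i in range(n)]           # Lagrangian grid positions
# single-mode displacement field (assumption: one Fourier mode for clarity)
psi = [0.5 * math.sin(2 * math.pi * qi / box) for qi in q]

def zeldovich_positions(D):
    """Positions at growth factor D; valid until shell crossing."""
    return [(qi + D * pi) % box for qi, pi in zip(q, psi)]

x_early = zeldovich_positions(0.1)
x_late = zeldovich_positions(1.0)

# Before shell crossing the particle ordering is preserved (no stream mixing):
print(all(a < b for a, b in zip(x_late, x_late[1:])))
```

    The amplitude is small enough that dx/dq stays positive, so no shell crossing occurs and the mapping remains single-stream over the whole box.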

  17. Improving Design Efficiency for Large-Scale Heterogeneous Circuits

    NASA Astrophysics Data System (ADS)

    Gregerson, Anthony

    Despite increases in logic density, many Big Data applications must still be partitioned across multiple computing devices in order to meet their strict performance requirements. Among the most demanding of these applications is high-energy physics (HEP), which uses complex computing systems consisting of thousands of FPGAs and ASICs to process the sensor data created by experiments at particle accelerators such as the Large Hadron Collider (LHC). Designing such computing systems is challenging due to the scale of the systems, the exceptionally high-throughput and low-latency performance constraints that necessitate application-specific hardware implementations, the requirement that algorithms are efficiently partitioned across many devices, and the possible need to update the implemented algorithms during the lifetime of the system. In this work, we describe our research to develop flexible architectures for implementing such large-scale circuits on FPGAs. In particular, this work is motivated by (but not limited in scope to) high-energy physics algorithms for the Compact Muon Solenoid (CMS) experiment at the LHC. To make efficient use of logic resources in multi-FPGA systems, we introduce Multi-Personality Partitioning, a novel form of the graph partitioning problem, and present partitioning algorithms that can significantly improve resource utilization on heterogeneous devices while also reducing inter-chip connections. To reduce the high communication costs of Big Data applications, we also introduce Information-Aware Partitioning, a partitioning method that analyzes the data content of application-specific circuits, characterizes their entropy, and selects circuit partitions that enable efficient compression of data between chips. We employ our information-aware partitioning method to improve the performance of the hardware validation platform for evaluating new algorithms for the CMS experiment. 
Together, these research efforts help to improve the efficiency
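    As a flavor of the partitioning problems discussed (though not the Multi-Personality or Information-Aware algorithms themselves), a minimal best-swap local search for balanced two-way partitioning can be sketched as follows; the example graph is an illustrative assumption.

```python
# Balanced two-way graph partitioning by best-swap local search:
# repeatedly apply the single (a, b) swap that most reduces the cut,
# keeping both sides the same size, until no swap improves it.
def cut_size(edges, part_a):
    """Number of edges crossing the partition."""
    return sum((u in part_a) != (v in part_a) for u, v in edges)

def best_swap_bisection(nodes, edges, part_a):
    part_a = set(part_a)
    part_b = set(nodes) - part_a
    while True:
        base = cut_size(edges, part_a)
        best = None
        for a_node in part_a:
            for b_node in part_b:
                trial = (part_a - {a_node}) | {b_node}
                c = cut_size(edges, trial)
                if best is None or c < best[0]:
                    best = (c, a_node, b_node)
        if best is None or best[0] >= base:
            return part_a, part_b, base
        _, a_node, b_node = best
        part_a = (part_a - {a_node}) | {b_node}
        part_b = (part_b - {b_node}) | {a_node}

# Two triangles joined by one bridge edge; the optimal cut is the bridge.
nodes = range(6)
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
a, b, cut = best_swap_bisection(nodes, edges, {0, 3, 4})
print(cut)  # → 1
```

    Production partitioners (and the thesis's variants) add balance constraints per resource type and smarter gain structures, but the cut-minimization objective is the same.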

  18. LARGE-SCALE CO2 TRANSPORTATION AND DEEP OCEAN SEQUESTRATION

    SciTech Connect

    Hamid Sarv

    1999-03-01

    Technical and economical feasibility of large-scale CO2 transportation and ocean sequestration at depths of 3000 meters or greater was investigated. Two options were examined for transporting and disposing of the captured CO2. In one case, CO2 was pumped from a land-based collection center through long pipelines laid on the ocean floor. Another case considered oceanic tanker transport of liquid carbon dioxide to an offshore floating structure for vertical injection to the ocean floor. In the latter case, a novel concept based on subsurface towing of a 3000-meter pipe and attaching it to the offshore structure was considered. Budgetary cost estimates indicate that for distances greater than 400 km, tanker transportation and offshore injection through a 3000-meter vertical pipe provides the best method for delivering liquid CO2 to deep ocean floor depressions. For shorter distances, CO2 delivery by parallel-laid subsea pipelines is more cost-effective. Estimated costs for 500-km transport and storage at a depth of 3000 meters by subsea pipelines and tankers were 1.5 and 1.4 dollars per ton of stored CO2, respectively. At these prices, the economics of ocean disposal are highly favorable. Future work should focus on addressing technical issues that are critical to the deployment of a large-scale CO2 transportation and disposal system. Pipe corrosion, structural design of the transport pipe, and dispersion characteristics of sinking CO2 effluent plumes have been identified as areas that require further attention. Our planned activities in the next phase include laboratory-scale corrosion testing, structural analysis of the pipeline, analytical and experimental simulations of CO2 discharge and dispersion, and a conceptual economic and engineering evaluation of large-scale implementation.

  19. Large-Scale periodic solar velocities: An observational study

    NASA Technical Reports Server (NTRS)

    Dittmer, P. H.

    1977-01-01

    Observations of large-scale solar velocities were made using the mean field telescope and Babcock magnetograph of the Stanford Solar Observatory. Observations were made in the magnetically insensitive ion line at 5124 Å, with light from the center (limb) of the disk right (left) circularly polarized, so that the magnetograph measures the difference in wavelength between center and limb. Computer calculations are made of the wavelength difference produced by global pulsations for spherical harmonics up to second order and of the signal produced by displacing the solar image relative to polarizing optics or diffraction grating.

  20. UAV Data Processing for Large Scale Topographical Mapping

    NASA Astrophysics Data System (ADS)

    Tampubolon, W.; Reinhardt, W.

    2014-06-01

    Large-scale topographical mapping in developing countries is a prominent challenge for the geospatial industry: demand is increasing significantly while mapping budgets remain limited. Since the advent of Act Nr.4/yr.2011 on Geospatial Information in Indonesia, large-scale topographical mapping has been a high priority for supporting nationwide development, e.g. detailed spatial planning. Large-scale topographical mapping usually relies on conventional aerial survey campaigns to provide high-resolution 3D geospatial data. Having grown out of the aero-modelling hobby, Unmanned Aerial Vehicles (UAVs) offer an alternative, semi-photogrammetric means of aerial data acquisition suitable for relatively small Areas of Interest (AOI), i.e. <5,000 hectares. For detailed spatial planning in Indonesia this area size can serve as a mapping unit, since planning usually concentrates on the sub-district (kecamatan) level. In this paper, different camera and processing software systems are analyzed to identify the optimal components of a UAV data acquisition campaign in combination with the data processing scheme. The selected AOI covers the cultural heritage site of Borobudur Temple, one of the Seven Wonders of the World. A detailed accuracy assessment concentrates first on the object features of the temple itself. Planimetric (2D) feature compilation and digital terrain models (3D) are integrated to provide Digital Elevation Models (DEM) as the main product of the topographic mapping activity. Incorporating the optimum number of GCPs in the UAV photo data processing increases accuracy at a high resolution of 5 cm Ground Sampling Distance (GSD). Finally, this result is used as the benchmark for alternative geospatial
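    The 5 cm GSD figure follows from the standard photogrammetric relation GSD = pixel pitch × flying height / focal length. The camera parameters below are illustrative assumptions, not those of the systems surveyed in the paper.

```python
# Ground Sampling Distance (GSD) relation used when planning UAV flights:
# GSD = pixel_pitch * flying_height / focal_length (all in meters).
def gsd(pixel_pitch_m, flying_height_m, focal_length_m):
    return pixel_pitch_m * flying_height_m / focal_length_m

# e.g. a 4.5 micron pixel pitch and a 15 mm lens at ~167 m altitude
# yield roughly the 5 cm GSD targeted in the abstract.
print(round(gsd(4.5e-6, 167.0, 0.015), 4))  # → 0.0501
```

    Inverting the same relation gives the maximum flying height allowed for a target GSD, which is how such campaigns are typically planned.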

  1. Large scale obscuration and related climate effects open literature bibliography

    SciTech Connect

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.

  2. Water-based scintillators for large-scale liquid calorimetry

    SciTech Connect

    Winn, D.R.; Raftery, D.

    1985-02-01

    We have investigated primary and secondary solvent intermediates in search of a recipe to create a bulk liquid scintillator with water as the bulk solvent and common fluors as the solutes. As we are not concerned with energy resolution below 1 MeV in large-scale experiments, light-output at the 10% level of high-quality organic solvent based scintillators is acceptable. We have found encouraging performance from industrial surfactants as primary solvents for PPO and POPOP. This technique may allow economical and environmentally safe bulk scintillator for kiloton-sized high energy calorimetry.

  3. Enabling Large-Scale Biomedical Analysis in the Cloud

    PubMed Central

    Lin, Ying-Chih; Yu, Chin-Sheng; Lin, Yen-Jen

    2013-01-01

    Recent progress in high-throughput instrumentation has led to astonishing growth in both the volume and complexity of biomedical data collected from various sources. Data at this scale pose serious challenges for storage and computing technologies. Cloud computing is a promising alternative because it jointly addresses scalable storage and high-performance computing for large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications can help biomedical research make the vast amount of diverse data meaningful and usable. PMID:24288665

  4. Large-Scale Measurement of Absolute Protein Glycosylation Stoichiometry.

    PubMed

    Sun, Shisheng; Zhang, Hui

    2015-07-07

    Protein glycosylation is one of the most important protein modifications. Glycosylation site occupancy alteration has been implicated in human diseases and cancers. However, current glycoproteomic methods focus on the identification and quantification of glycosylated peptides and glycosylation sites but not glycosylation occupancy or glycoform stoichiometry. Here we describe a method for large-scale determination of the absolute glycosylation stoichiometry using three independent relative ratios. Using this method, we determined 117 absolute N-glycosylation occupancies in OVCAR-3 cells. Finally, we investigated the possible functions and the determinants for partial glycosylation.

  5. Large scale mortality of nestling ardeids caused by nematode infection.

    PubMed

    Wiese, J H; Davidson, W R; Nettles, V F

    1977-10-01

    During the summer of 1976, an epornitic of verminous peritonitis caused by Eustrongylides ignotus resulted in large scale mortality of young herons and egrets on Pea Patch Island, Delaware. Mortality was highest (84%) in snowy egret nestlings (Egretta thula) and less severe in great egrets (Casmerodius albus), Louisiana herons (Hydranassa tricolor), little blue herons (Florida caerulea), and black-crowned night herons (Nycticorax nycticorax). Most deaths occurred within the first 4 weeks after hatching. Migration of E. ignotus resulted in multiple perforations of the visceral organs, escape of intestinal contents into the body cavity and subsequent bacterial peritonitis. Killifish (Fundulus heteroclitus) served as the source of infective larvae.

  6. Integrated High Accuracy Portable Metrology for Large Scale Structural Testing

    NASA Astrophysics Data System (ADS)

    Klaas, Andrej; Richardson, Paul; Burguete, Richard; Harris, Linden

    2014-06-01

    As the performance and accuracy of analysis tools increase, bespoke solutions are more regularly being requested to perform high-accuracy measurements on structural tests to validate these methods. These can include optical methods and full-field techniques in place of the more traditional point measurements. As each test is unique, it presents its own individual challenges. In this paper two recent large-scale tests performed by Airbus will be presented, and the metrology solutions that were identified for them will be discussed.

  7. Large-scale normal fluid circulation in helium superflows

    NASA Astrophysics Data System (ADS)

    Galantucci, Luca; Sciacca, Michele; Barenghi, Carlo F.

    2017-01-01

    We perform fully coupled numerical simulations of helium II pure superflows in a channel, with vortex-line density typical of experiments. Peculiar to our model is the computation of the back-reaction of the superfluid vortex motion on the normal fluid and the presence of solid boundaries. We recover the uniform vortex-line density experimentally measured employing second sound resonators and we show that pure superflow in helium II is associated with a large-scale circulation of the normal fluid which can be detected using existing particle-tracking visualization techniques.

  8. Large-scale genotoxicity assessments in the marine environment.

    PubMed Central

    Hose, J E

    1994-01-01

    There are a number of techniques for detecting genotoxicity in the marine environment, and many are applicable to large-scale field assessments. Certain tests can be used to evaluate responses in target organisms in situ while others utilize surrogate organisms exposed to field samples in short-term laboratory bioassays. Genotoxicity endpoints appear distinct from traditional toxicity endpoints, but some have chemical or ecotoxicologic correlates. One versatile end point, the frequency of anaphase aberrations, has been used in several large marine assessments to evaluate genotoxicity in the New York Bight, in sediment from San Francisco Bay, and following the Exxon Valdez oil spill. PMID:7713029

  9. Towards large scale production and separation of carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Alvarez, Noe T.

    Since their discovery, carbon nanotubes (CNTs) have boosted the research and applications of nanotechnology; however, many applications of CNTs are inaccessible because they depend upon large-scale CNT production and separations. Type, chirality and diameter control of CNTs determine many of their physical properties, and such control is still not accessible. This thesis studies the fundamentals of scalable selective reactions of HiPCo CNTs as well as the early phase of routes to an inexpensive approach for large-scale CNT production. In the growth part, this thesis covers a complete wet-chemistry process of catalyst and catalyst support deposition for growth of vertically aligned (VA) CNTs. A wet-chemistry preparation process has significant importance for CNT synthesis through chemical vapor deposition (CVD). CVD is by far the most suitable and inexpensive process for large-scale CNT production when compared to other common processes such as laser ablation and arc discharge. However, its potential has been limited by low-yielding and difficult preparation processes for the catalyst and its support, which has reduced its competitiveness. The wet-chemistry process takes advantage of current nanoparticle technology to deposit the catalyst and the catalyst support as a thin film of nanoparticles, making the protocol simple compared to electron-beam evaporation and sputtering processes. In the CNT selective reactions part, this thesis studies UV irradiation of individually dispersed HiPCo CNTs, which generates auto-selective reactions in the liquid phase with good control over their diameter and chirality. This technique is ideal for large-scale, continuous-process separation of CNTs by diameter and type. Additionally, an innovatively simple catalyst deposition through abrasion is demonstrated. Simple friction between the catalyst and the substrate deposits a high enough density of metal catalyst particles for successful CNT growth. This simple approach has

  10. Clusters as cornerstones of large-scale structure.

    NASA Astrophysics Data System (ADS)

    Gottlöber, S.; Retzlaff, J.; Turchaninov, V.

    1997-04-01

    Galaxy clusters are one of the best tracers of large-scale structure in the Universe on scales well above 100 Mpc. The authors investigate here the clustering properties of a redshift sample of Abell/ACO clusters and compare the observational sample with mock samples constructed from N-body simulations on the basis of four different cosmological models. The authors discuss the power spectrum, the Minkowski functionals and the void statistics of these samples and conclude that the SCDM and TCDM models are ruled out whereas the ΛCDM and BSI models are in agreement with the observational data.

  11. Large-Scale Patterns of Filament Channels and Filaments

    NASA Astrophysics Data System (ADS)

    Mackay, Duncan

    2016-07-01

    In this review the properties and large-scale patterns of filament channels and filaments will be considered. Initially, the global formation locations of filament channels and filaments are discussed, along with their hemispheric pattern. Next, observations of the formation of filament channels and filaments are described where two opposing views are considered. Finally, the wide range of models that have been constructed to consider the formation of filament channels and filaments over long time-scales are described, along with the origin of the hemispheric pattern of filaments.

  12. Quantum computation for large-scale image classification

    NASA Astrophysics Data System (ADS)

    Ruan, Yue; Chen, Hanwu; Tan, Jianing; Li, Xi

    2016-10-01

    Due to the lack of an effective quantum feature extraction method, there is currently no effective way to perform quantum image classification or recognition. In this paper, for the first time, a global quantum feature extraction method based on Schmidt decomposition is proposed. A revised quantum learning algorithm is also proposed that classifies images by computing the Hamming distance between these features. From the experimental results on the benchmark database Caltech 101 and an analysis of the algorithm, an effective approach to large-scale image classification is derived and proposed against the background of big data.
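    The classical half of the proposed classifier, nearest-neighbor matching of binary feature vectors by Hamming distance, can be sketched directly (the quantum feature extraction itself is not reproduced here, and the tiny training set is hypothetical).

```python
# Nearest-neighbor classification of binary feature vectors by Hamming
# distance: count differing bits, return the label of the closest example.
def hamming(u, v):
    return sum(a != b for a, b in zip(u, v))

def classify(features, labeled):
    """Return the label of the training vector nearest in Hamming distance."""
    return min(labeled, key=lambda item: hamming(features, item[0]))[1]

# hypothetical 4-bit features for two image classes
train = [((0, 0, 1, 1), "faces"), ((1, 1, 0, 0), "planes")]
print(classify((0, 1, 1, 1), train))  # → faces
```

    The quantum variant would evaluate these distances in superposition; the decision rule itself is unchanged.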

  13. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    NASA Technical Reports Server (NTRS)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for the large scale composite manufacturing is an important goal to produce light weight components for heavy lift launch vehicles. NASA and an industry team successfully employed a building block approach using low-cost Automated Tape Layup (ATL) of autoclave and Out-of-Autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc from a 10 meter cylindrical barrel. Lessons learned highlight the manufacturing challenges required to produce light weight composite structures such as fairings for heavy lift launch vehicles.

  14. Large-scale genotoxicity assessments in the marine environment

    SciTech Connect

    Hose, J.E.

    1994-12-01

    There are a number of techniques for detecting genotoxicity in the marine environment, and many are applicable to large-scale field assessments. Certain tests can be used to evaluate responses in target organisms in situ while others utilize surrogate organisms exposed to field samples in short-term laboratory bioassays. Genotoxicity endpoints appear distinct from traditional toxicity endpoints, but some have chemical or ecotoxicologic correlates. One versatile end point, the frequency of anaphase aberrations, has been used in several large marine assessments to evaluate genotoxicity in the New York Bight, in sediment from San Francisco Bay, and following the Exxon Valdez oil spill. 31 refs., 2 tabs.

  15. Solving Large-scale Eigenvalue Problems in SciDAC Applications

    SciTech Connect

    Yang, Chao

    2005-06-29

    Large-scale eigenvalue problems arise in a number of DOE applications. This paper provides an overview of the recent development of eigenvalue computation in the context of two SciDAC applications. We emphasize the importance of Krylov subspace methods and point out their limitations. We discuss the value of alternative approaches that are more amenable to the use of preconditioners, and report progress in using multi-level algebraic substructuring techniques to speed up eigenvalue calculation. In addition to methods for linear eigenvalue problems, we also examine new approaches to solving two types of nonlinear eigenvalue problems arising from SciDAC applications.
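    Krylov subspace methods build on repeated matrix-vector products; power iteration, their simplest relative, illustrates the idea on a small symmetric matrix with known eigenvalues 3 and 1 (a toy, not one of the SciDAC problem sizes).

```python
# Power iteration: repeated matvecs drive the iterate toward the dominant
# eigenvector; a Rayleigh quotient then estimates the dominant eigenvalue.
def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def power_iteration(A, x, iters=100):
    for _ in range(iters):
        y = matvec(A, x)
        norm = max(abs(v) for v in y)
        x = [v / norm for v in y]
    # Rayleigh quotient estimate of the dominant eigenvalue
    Ax = matvec(A, x)
    return sum(a * b for a, b in zip(x, Ax)) / sum(v * v for v in x)

A = [[2.0, 1.0], [1.0, 2.0]]   # symmetric; eigenvalues are 3 and 1
print(power_iteration(A, [1.0, 0.0]))  # → ~3.0
```

    Full Krylov methods (Lanczos, Arnoldi) keep the whole sequence of iterates rather than only the last one, which is what makes them converge to several eigenpairs at once.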

  16. Analysis Plan for 1985 Large-Scale Tests.

    DTIC Science & Technology

    1983-01-01

    Keywords: large-scale blasting agents; multiburst; ANFO; shock waves. The plan covers multiburst techniques, test site considerations, and candidate explosives, including bulk (loose) ANFO, bagged ANFO, APEX 1360, nitric acid and nitropropane, nitropropane nitrate (NPN), DBA-22M, and a hardening emulsion.

  17. Frequency domain multiplexing for large-scale bolometer arrays

    SciTech Connect

    Spieler, Helmuth

    2002-05-31

    The development of planar fabrication techniques for superconducting transition-edge sensors has brought large-scale arrays of 1000 pixels or more to the realm of practicality. This raises the problem of reading out a large number of sensors with a tractable number of connections. A possible solution is frequency-domain multiplexing. I summarize basic principles, present various circuit topologies, and discuss design trade-offs, noise performance, cross-talk and dynamic range. The design of a practical device and its readout system is described with a discussion of fabrication issues, practical limits and future prospects.
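    The frequency-domain multiplexing principle can be sketched with a toy two-channel example: each sensor amplitude-modulates its own carrier on a shared line, and synchronous (lock-in) demodulation recovers each amplitude. The carrier frequencies and amplitudes are arbitrary assumptions, not real bolometer bias frequencies.

```python
# Two channels share one wire by riding on distinct carriers; multiplying
# by a carrier and averaging (lock-in detection) isolates that channel.
import math

fs, n = 1000, 1000                      # 1 s of samples
carriers = {"ch1": 50.0, "ch2": 80.0}   # Hz, integer cycles per window
amps = {"ch1": 1.0, "ch2": 0.5}

t = [i / fs for i in range(n)]
line = [sum(amps[c] * math.cos(2 * math.pi * f * ti) for c, f in carriers.items())
        for ti in t]

def demodulate(signal, f):
    """Multiply by the carrier and average; the factor 2 restores amplitude."""
    return 2.0 / len(signal) * sum(s * math.cos(2 * math.pi * f * i / fs)
                                   for i, s in enumerate(signal))

print(round(demodulate(line, 50.0), 6))  # → 1.0
print(round(demodulate(line, 80.0), 6))  # → 0.5
```

    Orthogonality of the carriers over a whole number of cycles is what keeps the channels from leaking into each other, the same property a practical readout preserves with carrier spacing and filtering.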

  18. A Modular Ring Architecture for Large Scale Neural Network Implementations

    NASA Astrophysics Data System (ADS)

    Jump, Lance B.; Ligomenides, Panos A.

    1989-11-01

    Constructing fully parallel, large scale, neural networks is complicated by the problems of providing for massive interconnectivity and of overcoming fan in/out limitations in area-efficient VLSI/WSI realizations. A modular, bus switched, neural ring architecture employing primitive ring (pRing) processors is proposed, which solves the fan in/out and connectivity problems by a dynamically reconfigurable communication ring that synchronously serves identical, radially connected, processing elements. It also allows cost versus performance trade-offs by the assignment of variable numbers of logical neurons to each physical processing element.
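    The ring idea can be sketched in software (assumed details, not the actual pRing microarchitecture): each processing element holds one neuron's weights while activations circulate around the ring one hop per step, so every PE eventually sees every input without a full crossbar interconnect.

```python
# Systolic-ring evaluation of a fully connected layer: activations rotate
# around the ring; each PE accumulates one neuron's weighted sum.
# Simplifying assumption: #neurons == #inputs (square layer).
def ring_layer(weights, activations):
    n = len(activations)
    acc = [0.0] * len(weights)            # one partial sum per PE/neuron
    ring = list(activations)              # values currently held by each PE
    for step in range(n):
        for pe, row in enumerate(weights):
            # after `step` rotations, PE `pe` holds the activation that
            # started at position (pe + step) mod n
            src = (pe + step) % n
            acc[pe] += row[src] * ring[pe]
        ring = ring[1:] + ring[:1]        # rotate the ring by one hop
    return acc

W = [[1.0, 2.0], [3.0, 4.0]]
x = [10.0, 20.0]
print(ring_layer(W, x))  # → [50.0, 110.0]
```

    The result matches a direct matrix-vector product, but each PE only ever talks to its ring neighbor, which is the fan-in/fan-out relief the architecture targets.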

  19. Design of a large-scale CFB boiler

    SciTech Connect

    Darling, S.; Li, S.

    1997-12-31

    Many CFB boilers sized 100--150 MWe are in operation, and several others sized 150--250 MWe are in operation or under construction. The next step for CFB technology is the 300--400 MWe size range. This paper will describe Foster Wheeler's large-scale CFB boiler experience and the design for a 300 MWe CFB boiler. The authors will show how the design incorporates Foster Wheeler's unique combination of extensive utility experience and CFB boiler experience. All the benefits of CFB technology which include low emissions, fuel flexibility, low maintenance and competitive cost are now available in the 300--400 MWe size range.

  20. Simplified DGS procedure for large-scale genome structural study.

    PubMed

    Jung, Yong-Chul; Xu, Jia; Chen, Jun; Kim, Yeong; Winchester, David; Wang, San Ming

    2009-11-01

    Ditag genome scanning (DGS) uses next-generation DNA sequencing to sequence the ends of ditag fragments produced by restriction enzymes. These sequences are compared to known genome sequences to determine their structure. In order to use DGS for large-scale genome structural studies, we have substantially revised the original protocol by replacing the in vivo genomic DNA cloning with in vitro adaptor ligation, eliminating the ditag concatemerization steps, and replacing the 454 sequencer with Solexa or SOLiD sequencers for ditag sequence collection. This revised protocol further increases genome coverage and resolution and allows DGS to be used to analyze multiple genomes simultaneously.
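    The core mapping step of DGS can be sketched with toy sequences (illustrative strings, not the real protocol or restriction enzymes): short tags from both ends of a fragment are located on the reference genome, and the implied span is compared with the fragment length; a large discrepancy would flag a structural change.

```python
# Toy ditag mapping: take a short tag from each end of a fragment, find
# both tags on the reference, and report the span they imply.
def map_ditag(reference, fragment, tag_len=4):
    left, right = fragment[:tag_len], fragment[-tag_len:]
    start = reference.find(left)
    end = reference.find(right, start + tag_len)
    if start == -1 or end == -1:
        return None                       # tag not found: unmappable ditag
    return start, end + tag_len           # implied span on the reference

reference = "AACCGGTTACGTACGTGGCCAATT"
fragment = "AACCGGTTACGTACGTGGCCAATT"     # fragment matching the reference
span = map_ditag(reference, fragment)
implied = span[1] - span[0]
print(implied, len(fragment))             # equal: no structural change implied
```

    Real DGS maps millions of such tag pairs with alignment tools rather than exact string search, but the inference (implied span vs. expected fragment size) is the same.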

  1. Decentrally stabilizable linear and bilinear large-scale systems

    NASA Technical Reports Server (NTRS)

    Siljak, D. D.; Vukcevic, M. B.

    1977-01-01

    Two classes of large-scale systems are identified, which can always be stabilized by decentralized feedback control. For the class of systems composed of interconnected linear subsystems, we can choose local controllers for the subsystems to achieve stability of the overall system. The same linear feedback scheme can be used to stabilize a class of linear systems with bilinear interconnections. In this case, however, the scheme is used to establish a finite region of stability for the overall system. The stabilization algorithm is applied to the design of a control system for the Large Space Telescope.
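    The stabilization claim can be checked numerically in a toy case (assumed numbers, not from the paper): two identical unstable scalar subsystems with symmetric coupling, each closed by local feedback chosen from the subsystem alone.

```python
# Decentralized stabilization sketch: each scalar subsystem x_i' = a x_i
# (unstable, a > 0) gets local feedback u_i = -k x_i; with symmetric
# coupling eps the closed-loop matrix is [[a-k, eps], [eps, a-k]], whose
# eigenvalues are (a-k) ± eps.
def closed_loop_eigs(a, k, eps):
    d = a - k
    return (d - eps, d + eps)

a, k, eps = 1.0, 3.0, 0.5   # local gain chosen without knowing the coupling
eigs = closed_loop_eigs(a, k, eps)
print(eigs, all(e < 0 for e in eigs))  # → (-2.5, -1.5) True
```

    As long as the local stability margin (k - a) exceeds the coupling strength eps, both eigenvalues stay in the left half-plane, which is the spirit of the decentralized result.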

  2. Large-scale structure from wiggly cosmic strings

    NASA Astrophysics Data System (ADS)

    Vachaspati, Tanmay; Vilenkin, Alexander

    1991-08-01

    Recent simulations of the evolution of cosmic strings indicate the presence of small-scale structure on the strings. It is shown that wakes produced by such 'wiggly' cosmic strings can result in the efficient formation of large-scale structure and large streaming velocities in the universe without significantly affecting the microwave-background isotropy. It is also argued that the motion of strings will lead to the generation of a primordial magnetic field. The most promising version of this scenario appears to be the one in which the universe is dominated by light neutrinos.

  3. Structure and function of large-scale brain systems.

    PubMed

    Koziol, Leonard F; Barker, Lauren A; Joyce, Arthur W; Hrin, Skip

    2014-01-01

    This article introduces the functional neuroanatomy of large-scale brain systems. Both the structure and functions of these brain networks are presented. All human behavior is the result of interactions within and between these brain systems. This system of brain function completely changes our understanding of how cognition and behavior are organized within the brain, replacing the traditional lesion model. Understanding behavior within the context of brain network interactions has profound implications for modifying abstract constructs such as attention, learning, and memory. These constructs also must be understood within the framework of a paradigm shift, which emphasizes ongoing interactions within a dynamically changing environment.

  4. Climate variability rather than overstocking causes recent large scale cover changes of Tibetan pastures

    NASA Astrophysics Data System (ADS)

    Lehnert, Lukas; Wesche, Karsten; Trachte, Katja; Reudenbach, Christoph; Miehe, Georg; Bendix, Jörg

    2016-04-01

    The Tibetan Plateau has been entitled "Third-Pole-Environment" because of its outstanding importance for the climate and the hydrology in East and South-east Asia. Its climatological and hydrological influences are strongly affected by the local grassland vegetation which is supposed to be subject to ongoing degradation. On a local scale, numerous studies focused on grassland degradation of the Tibetan pastures. However, because methods and scales substantially differed among previous studies, the overall pattern of the degradation in the Tibetan Plateau is unknown. Consequently, a satellite based approach was selected to cope with the spatial limitations. Therefore, a MODIS-based vegetation cover product was developed which is fully validated against 600 in situ measurements covering a wide extent of the Tibetan Plateau. The vegetation cover as a proxy for grassland degradation is modelled with low error rates using support vector machine regressions. To identify the changes in the vegetation cover, the trends seen in the new vegetation cover product since the beginning of the new millennium were analysed. The drivers of the vegetation changes were identified by the analysis of trends of climatic variables (precipitation and 2 m air temperature) and land-use (livestock numbers) over the same time. The results reveal that - in contrast to the prevailing opinion - pasture degradation on the Tibetan Plateau is not a generally proceeding process because areas of positive and negative changes are almost equal in extent. The positive and negative vegetation changes have regionally different triggers: While, from 2000 on, the vegetation cover has increased in the north-eastern part of the Tibetan Plateau due to increasing precipitation, it has declined in the central and western parts due to rising air temperature and declining precipitation. 
Increasing livestock numbers as a result of land-use changes exacerbated the negative trends but, contrary to the assumptions of
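The trend analysis described above amounts to fitting a per-pixel linear trend to an annual time series of vegetation cover. A minimal ordinary-least-squares sketch is shown below; the cover values are hypothetical, not data from the study:

```python
def linear_trend(years, values):
    """Ordinary least-squares slope of `values` against `years`."""
    n = len(years)
    my = sum(years) / n
    mv = sum(values) / n
    num = sum((y - my) * (v - mv) for y, v in zip(years, values))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Hypothetical per-pixel annual vegetation-cover fractions, 2000-2009
years = list(range(2000, 2010))
greening = [0.30 + 0.010 * (y - 2000) for y in years]  # cover increasing
browning = [0.40 - 0.008 * (y - 2000) for y in years]  # cover declining

assert linear_trend(years, greening) > 0
assert linear_trend(years, browning) < 0
```

The sign of the fitted slope classifies a pixel as a positive or negative change; the study additionally compares such trends against trends in precipitation, temperature and livestock numbers.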

  5. The combustion behavior of large scale lithium titanate battery

    PubMed Central

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-01

Safety remains a major obstacle to the large-scale application of lithium batteries, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large scale lithium batteries, three 50 Ah Li(NixCoyMnz)O2/Li4Ti5O12 batteries at different states of charge (SOC) were heated until they caught fire. The variation in flame size is depicted to characterize the combustion behavior directly, while the mass loss rate, temperature and heat release rate are used to analyze the underlying reactions in more depth. Based on these observations, the combustion process is divided into three basic stages; it becomes more complicated at higher SOC, with a sudden ejection of smoke. The reason is a phase change in the Li(NixCoyMnz)O2 material from a layered structure to a spinel structure. The critical ignition temperatures are 112–121°C on the anode tab and 139–147°C on the upper surface for all cells, but the heating time and combustion time become shorter as SOC increases. The results indicate that the battery fire hazard increases with SOC. Analysis suggests that internal short circuits and the Li+ distribution are the main causes of these differences. PMID:25586064

  6. Topographically Engineered Large Scale Nanostructures for Plasmonic Biosensing

    PubMed Central

    Xiao, Bo; Pradhan, Sangram K.; Santiago, Kevin C.; Rutherford, Gugu N.; Pradhan, Aswini K.

    2016-01-01

We demonstrate that a nanostructured metal thin film can achieve enhanced transmission efficiency and sharp resonances, and we use a large-scale, high-throughput nanofabrication technique for the plasmonic structures. The fabrication technique combines the features of nanoimprint and soft lithography to topographically construct metal thin films with nanoscale patterns. Metal nanogratings developed using this method show significantly enhanced optical transmission (up to a one-order-of-magnitude enhancement) and sharp resonances with a full width at half maximum (FWHM) of ~15 nm in the zero-order transmission using an incoherent white light source. These nanostructures are sensitive to the surrounding environment, and the resonance shifts as the refractive index changes. We derive an analytical method using a spatial Fourier transformation to understand the enhancement phenomenon and the sensing mechanism. Real-time monitoring of protein-protein interactions in microfluidic cells integrated with these nanostructures is demonstrated to be effective for biosensing. The perpendicular transmission configuration and large-scale structures provide a feasible platform for label-free surface plasmon resonance (SPR) sensing without sophisticated optical instrumentation. PMID:27072067

  7. Semantic overlay network for large-scale spatial information indexing

    NASA Astrophysics Data System (ADS)

    Zou, Zhiqiang; Wang, Yue; Cao, Kai; Qu, Tianshan; Wang, Zhongmin

    2013-08-01

The increased demand for online spatial information services poses new challenges to the combined field of Computer Science and Geographic Information Science. Amongst others, these include fast indexing of spatial data in distributed networks. In this paper we propose a novel semantic overlay network for large-scale multi-dimensional spatial information indexing, called SON_LSII, which has a hybrid structure integrating a semantic quad-tree and a Chord ring. SON_LSII is a small-world overlay network that achieves a very competitive trade-off between indexing efficiency and maintenance overhead. To create SON_LSII, we use an effective semantic clustering strategy that considers two aspects: the semantics of the spatial information that each peer holds in the overlay network, and physical network performance. Based on SON_LSII, a mapping method is used to reduce the multi-dimensional features into a single dimension, and an efficient indexing algorithm is presented to support complex range queries of the spatial information with a massive number of concurrent users. Results from extensive experiments demonstrate that SON_LSII is superior to existing overlay networks in various respects, including scalability, maintenance, rate of indexing hits, indexing logical hops, and adaptability. Thus, the proposed SON_LSII can be used for large-scale spatial information indexing.
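The abstract mentions mapping multi-dimensional spatial features onto a single dimension before indexing on a 1-D structure such as a Chord ring, but does not specify the mapping. A Z-order (Morton) curve is one common, locality-preserving choice; the sketch below is illustrative only (function name and bit width are assumptions):

```python
def morton_encode(x, y, bits=16):
    """Interleave the bits of two non-negative integer coordinates
    into a single Z-order key. Nearby (x, y) cells tend to receive
    nearby keys, so a 1-D ring can serve range queries on 2-D data."""
    key = 0
    for i in range(bits):
        key |= ((x >> i) & 1) << (2 * i)      # x bits -> even positions
        key |= ((y >> i) & 1) << (2 * i + 1)  # y bits -> odd positions
    return key

assert morton_encode(0, 0) == 0
assert morton_encode(1, 0) == 1
assert morton_encode(0, 1) == 2
assert morton_encode(3, 3) == 15  # 0b1111: fully interleaved
```

A range query over a 2-D extent then decomposes into a small number of contiguous key intervals on the ring.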

  8. The combustion behavior of large scale lithium titanate battery.

    PubMed

    Huang, Peifeng; Wang, Qingsong; Li, Ke; Ping, Ping; Sun, Jinhua

    2015-01-14

Safety remains a major obstacle to the large-scale application of lithium batteries, yet knowledge of battery combustion behavior is limited. To investigate the combustion behavior of large scale lithium batteries, three 50 Ah Li(Ni(x)Co(y)Mn(z))O2/Li(4)Ti(5)O(12) batteries at different states of charge (SOC) were heated until they caught fire. The variation in flame size is depicted to characterize the combustion behavior directly, while the mass loss rate, temperature and heat release rate are used to analyze the underlying reactions in more depth. Based on these observations, the combustion process is divided into three basic stages; it becomes more complicated at higher SOC, with a sudden ejection of smoke. The reason is a phase change in the Li(Ni(x)Co(y)Mn(z))O2 material from a layered structure to a spinel structure. The critical ignition temperatures are 112-121 °C on the anode tab and 139-147 °C on the upper surface for all cells, but the heating time and combustion time become shorter as SOC increases. The results indicate that the battery fire hazard increases with SOC. Analysis suggests that internal short circuits and the Li(+) distribution are the main causes of these differences.

  9. Maestro: An Orchestration Framework for Large-Scale WSN Simulations

    PubMed Central

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123

  10. Large-Scale Low-Boom Inlet Test Overview

    NASA Technical Reports Server (NTRS)

    Hirt, Stefanie

    2011-01-01

This presentation provides a high-level overview of the Large-Scale Low-Boom Inlet Test and was presented at the Fundamental Aeronautics 2011 Technical Conference. In October 2010 a low-boom supersonic inlet concept with flow control was tested in the 8'x6' supersonic wind tunnel at NASA Glenn Research Center (GRC). The primary objectives of the test were to evaluate the stability and operability of a large-scale low-boom supersonic inlet concept by acquiring performance and flowfield validation data, and to evaluate simple, passive, bleedless inlet boundary layer control options. During this effort two models were tested: a dual-stream inlet intended to model potential flight hardware, and a single-stream design to study a zero-degree external cowl angle and to permit surface flow visualization of the vortex generator flow control on the internal centerbody surface. The tests were conducted by a team of researchers from NASA GRC, Gulfstream Aerospace Corporation, the University of Illinois at Urbana-Champaign, and the University of Virginia.

  11. Wall turbulence manipulation by large-scale streamwise vortices

    NASA Astrophysics Data System (ADS)

    Iuso, Gaetano; Onorato, Michele; Spazzini, Pier Giorgio; di Cicca, Gaetano Maria

    2002-12-01

This paper describes an experimental study of the manipulation of a fully developed turbulent channel flow through large-scale streamwise vortices originated by vortex generator jets distributed along the wall in the spanwise direction. Apart from the interest in flow management itself, an important aim of the research is to observe the response of the flow to external perturbations as a technique for investigating the structure of turbulence. Considerable mean and fluctuating skin friction reductions, locally as high as 30% and 50% respectively, were measured for an optimal forcing flow intensity. Mean and fluctuating velocity profiles are also greatly modified by the manipulating large-scale vortices; in particular, attenuation of the turbulence intensity was measured. Moreover, the flow manipulation caused an increase in the longitudinal coherence of the wall organized motions, accompanied by a reduced frequency of burst events, demonstrated by a reduction of the velocity time-derivative PDFs and by a higher intermittency. A strong transversal periodic organization of the flow field was observed, including some typical behaviours in each of the periodic boxes originating from the interaction of the vortex pairs. Results are interpreted and discussed in terms of management of the near-wall turbulent structures and with reference to the wall-turbulence regeneration mechanisms suggested in the literature.

  12. Power suppression at large scales in string inflation

    SciTech Connect

    Cicoli, Michele; Downes, Sean; Dutta, Bhaskar E-mail: sddownes@physics.tamu.edu

    2013-12-01

    We study a possible origin of the anomalous suppression of the power spectrum at large angular scales in the cosmic microwave background within the framework of explicit string inflationary models where inflation is driven by a closed string modulus parameterizing the size of the extra dimensions. In this class of models the apparent power loss at large scales is caused by the background dynamics which involves a sharp transition from a fast-roll power law phase to a period of Starobinsky-like slow-roll inflation. An interesting feature of this class of string inflationary models is that the number of e-foldings of inflation is inversely proportional to the string coupling to a positive power. Therefore once the string coupling is tuned to small values in order to trust string perturbation theory, enough e-foldings of inflation are automatically obtained without the need of extra tuning. Moreover, in the less tuned cases the sharp transition responsible for the power loss takes place just before the last 50-60 e-foldings of inflation. We illustrate these general claims in the case of Fibre Inflation where we study the strength of this transition in terms of the attractor dynamics, finding that it induces a pivot from a blue to a redshifted power spectrum which can explain the apparent large scale power loss. We compute the effects of this pivot for example cases and demonstrate how magnitude and duration of this effect depend on model parameters.

  13. Knocking down highly-ordered large-scale nanowire arrays.

    PubMed

    Pevzner, Alexander; Engel, Yoni; Elnathan, Roey; Ducobni, Tamir; Ben-Ishai, Moshit; Reddy, Koteeswara; Shpaisman, Nava; Tsukernik, Alexander; Oksman, Mark; Patolsky, Fernando

    2010-04-14

The large-scale assembly of nanowire elements with controlled and uniform orientation and density at spatially well-defined locations on solid substrates presents one of the most significant challenges facing their integration in real-world electronic applications. Here, we present the universal "knocking-down" approach, based on the controlled in-place planarization of nanowire elements, for the formation of large-scale ordered nanowire arrays. The controlled planarization of the nanowires is achieved by the use of an appropriate elastomer-covered rigid-roller device. After being knocked down, each nanowire in the array can be easily addressed electrically, by a simple single photolithographic step, to yield a large number of nanoelectrical devices with an unprecedented high-fidelity rate. The approach allows controlling, in only two simple steps, all possible array parameters, that is, nanowire dimensions, chemical composition, orientation, and density. The resulting knocked-down arrays can be further used for the creation of massive nanoelectronic-device arrays. More than a million devices have already been fabricated with yields over 98% on substrate areas of up to, but not limited to, 10 cm(2).

  14. IP over optical multicasting for large-scale video delivery

    NASA Astrophysics Data System (ADS)

    Jin, Yaohui; Hu, Weisheng; Sun, Weiqiang; Guo, Wei

    2007-11-01

In IPTV systems, multicasting will play a crucial role in the delivery of high-quality video services, as it can significantly improve bandwidth efficiency. However, the scalability and signal quality of current IPTV can barely compete with existing broadcast digital TV systems, since it is difficult to implement large-scale multicasting with end-to-end guaranteed quality of service (QoS) in a packet-switched IP network. The China 3TNet project aimed to build a high-performance broadband trial network to support large-scale concurrent streaming media and interactive multimedia services. The innovative idea of 3TNet is that an automatically switched optical network (ASON) with the capability of dynamic point-to-multipoint (P2MP) connections replaces the conventional IP multicasting network in the transport core, while the edge remains an IP multicasting network. In this paper, we introduce the network architecture and discuss challenges in such IP over optical multicasting for video delivery.

  15. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    NASA Astrophysics Data System (ADS)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

Large-scale mapping of limited areas, especially cultural heritage sites, is a critical task. Optical and non-optical sensors, such as LiDAR units, have been developed to sizes and weights that can be lifted by unmanned aerial platforms. At the same time there is increasing emphasis on solutions that give users access to 3D information faster and more cheaply. Considering the multitude of platforms and cameras, the advancement of algorithms, and the increase in available computing power, this challenge should be, and indeed is, further investigated. In this paper a short review of current UAS technologies is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large-scale mapping. Furthermore, a thorough analysis, review, and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process dense and mostly unordered sequences of digital images, is conducted and presented. As a test data set, we use rich optical and thermal data from both fixed-wing and multi-rotor platforms over an archaeological excavation with adverse height variations, using different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  16. High Speed Networking and Large-scale Simulation in Geodynamics

    NASA Technical Reports Server (NTRS)

    Kuang, Weijia; Gary, Patrick; Seablom, Michael; Truszkowski, Walt; Odubiyi, Jide; Jiang, Weiyuan; Liu, Dong

    2004-01-01

Large-scale numerical simulation has been one of the most important approaches for understanding global geodynamical processes. In this approach, peta-scale floating point operations (pflops) are often required to carry out a single physically meaningful numerical experiment. For example, to model convective flow in the Earth's core and generation of the geomagnetic field (geodynamo), simulation for one magnetic free-decay time (approximately 15,000 years) with a modest resolution of 150 in each of three spatial dimensions would require approximately 0.2 pflops. If such a numerical model is used to predict geomagnetic secular variation over decades and longer, with e.g. an ensemble Kalman filter assimilation approach, approximately 30 (and perhaps more) independent simulations of similar scale would be needed for one data assimilation analysis. Obviously, such a simulation would require an enormous computing resource that exceeds the capacity of any single facility currently at our disposal. One solution is to utilize a very fast network (e.g. 10 Gb optical networks) and available middleware (e.g. the Globus Toolkit) to allocate available but often heterogeneous resources for such large-scale computing efforts. At NASA GSFC, we are experimenting with such an approach by networking several clusters for geomagnetic data assimilation research. We shall present our initial testing results in the meeting.
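The resource estimate above is simple arithmetic; a back-of-envelope sketch using only the figures quoted in the abstract:

```python
# Figures taken from the abstract: ~0.2 peta floating-point operations
# per simulation of one magnetic free-decay time, and ~30 independent
# ensemble members per data assimilation analysis.
OPS_PER_RUN = 0.2e15   # ~0.2 peta-operations per simulation
ENSEMBLE_SIZE = 30     # independent members per analysis

total_ops = OPS_PER_RUN * ENSEMBLE_SIZE  # ~6e15 operations per analysis
```

At roughly 6 peta-operations per analysis, the total indeed exceeds what a single mid-2000s cluster could deliver in reasonable wall-clock time, motivating the networked multi-cluster approach.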

  17. Maestro: an orchestration framework for large-scale WSN simulations.

    PubMed

    Riliskis, Laurynas; Osipov, Evgeny

    2014-03-18

Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.

  18. Detecting differential protein expression in large-scale population proteomics

    SciTech Connect

    Ryu, Soyoung; Qian, Weijun; Camp, David G.; Smith, Richard D.; Tompkins, Ronald G.; Davis, Ronald W.; Xiao, Wenzhong

    2014-06-17

Mass spectrometry-based high-throughput quantitative proteomics shows great potential in clinical biomarker studies, identifying and quantifying thousands of proteins in biological samples. However, methods are needed to appropriately handle issues unique to mass spectrometry data in order to detect as many biomarker proteins as possible. One issue is that different mass spectrometry experiments generate quite different total numbers of quantified peptides, which can result in more missing peptide abundances in an experiment with a smaller total number of quantified peptides. Another is that quantification is sometimes absent, especially for less abundant peptides, and such missing values carry information about the peptide abundance. Here, we propose a Significance Analysis for Large-scale Proteomics Studies (SALPS) that handles missing peptide intensity values arising from both mechanisms. Our model performs robustly on both simulated data and proteomics data from a large clinical study. Because variation in patient sample quality and drift in instrument performance are unavoidable in clinical studies performed over the course of several years, we believe that our approach will be useful for analyzing large-scale clinical proteomics data.
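SALPS itself models the missingness mechanisms statistically, which the abstract does not detail. As a simplified stand-in, the sketch below merely computes a Welch t-statistic over the observed (non-missing) intensities of one peptide; all values are hypothetical:

```python
import math

def welch_t(xs, ys):
    """Welch's t-statistic for two samples, skipping missing (None) values.
    Note: discarding missing values throws away the abundance information
    they carry, which is precisely what SALPS is designed to avoid."""
    xs = [x for x in xs if x is not None]
    ys = [y for y in ys if y is not None]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    vx = sum((x - mx) ** 2 for x in xs) / (len(xs) - 1)
    vy = sum((y - my) ** 2 for y in ys) / (len(ys) - 1)
    return (mx - my) / math.sqrt(vx / len(xs) + vy / len(ys))

# Hypothetical log-intensities for one peptide; None marks a missing value
control = [20.1, 19.8, None, 20.3, 20.0]
cases   = [21.2, 21.5, 21.1, None, 21.4]
assert welch_t(cases, control) > 0   # higher abundance in cases
```

A model like SALPS would instead treat the two None entries as informative (likely low-abundance) observations rather than dropping them.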

  19. Brief Mental Training Reorganizes Large-Scale Brain Networks

    PubMed Central

    Tang, Yi-Yuan; Tang, Yan; Tang, Rongxiang; Lewis-Peacock, Jarrod A.

    2017-01-01

Emerging evidence has shown that one form of mental training, mindfulness meditation, can improve attention, emotion regulation and cognitive performance by changing brain activity and structural connectivity. However, whether and how short-term mindfulness meditation alters large-scale brain networks is not well understood. Here, we applied a novel data-driven technique, multivariate pattern analysis (MVPA), to resting-state fMRI (rsfMRI) data to identify changes in brain activity patterns and assess the neural mechanisms induced by a brief mindfulness training, integrative body-mind training (IBMT), which was previously reported in our series of randomized studies. Whole-brain rsfMRI was performed on an undergraduate group who received 2 weeks of IBMT with 30 min per session (5 h of training in total). Classifiers were trained on measures of functional connectivity in these fMRI data, and they were able to reliably differentiate (with 72% accuracy) patterns of connectivity before vs. after the IBMT training. After training, an increase in positive functional connections (60 connections) was detected, primarily involving the bilateral superior/middle occipital gyrus, bilateral frontal operculum, bilateral superior temporal gyrus, right superior temporal pole, bilateral insula, caudate and cerebellum. These results suggest that brief mental training alters the functional connectivity of large-scale brain networks at rest, possibly involving a portion of the neural circuitry supporting attention, cognitive and affective processing, awareness, sensory integration and reward processing. PMID:28293180

  20. Large Scale Organization of a Near Wall Turbulent Boundary Layer

    NASA Astrophysics Data System (ADS)

    Stanislas, Michel; Dekou Tiomajou, Raoul Florent; Foucaut, Jean Marc

    2016-11-01

This study lies in the context of the investigation of large-scale coherent structures in a near-wall turbulent boundary layer. An experimental database at high Reynolds numbers (Reθ = 9830 and Reθ = 19660) was obtained in the LML wind tunnel with stereo PIV at 4 Hz and hot-wire anemometry at 30 kHz. A Linear Stochastic Estimation procedure is used to reconstruct a three-component field resolved in space and time. Algorithms were developed to extract coherent structures from the reconstructed field; a sample 3D view of the structures is depicted in Figure 1. Uniform-momentum regions are characterized by their mean hydraulic diameter in the YZ plane, their lifetime and their contribution to the Reynolds stresses. The vortical motions are characterized by their position, radius, circulation and vorticity, in addition to their lifetime and their number computed at a fixed distance from the wall. The spatial organization of the structures was investigated through a correlation of their respective indicator functions in the spanwise direction. The simplified large-scale model that arises is compared to those available in the literature. Figure 1: streamwise low (green) and high (yellow) uniform-momentum regions with positive (red) and negative (blue) vortical motions. This work was supported by Campus International pour la Sécurité et l'Intermodalité des Transports.

  1. Ecohydrological modeling for large-scale environmental impact assessment.

    PubMed

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach-level accuracy similar to regional-scale models, thereby allowing for impact assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds, and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used to develop adaptive neuro-fuzzy inference system (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model.

  2. The Impact of Large Scale Environments on Cluster Entropy Profiles

    NASA Astrophysics Data System (ADS)

    Trierweiler, Isabella; Su, Yuanyuan

    2017-01-01

    We perform a systematic analysis of 21 clusters imaged by the Suzaku satellite to determine the relation between the richness of cluster environments and entropy at large radii. Entropy profiles for clusters are expected to follow a power-law, but Suzaku observations show that the entropy profiles of many clusters are significantly flattened beyond 0.3 Rvir. While the entropy at the outskirts of clusters is thought to be highly dependent on the large scale cluster environment, the exact nature of the environment/entropy relation is unclear. Using the Sloan Digital Sky Survey and 6dF Galaxy Survey, we study the 20 Mpc large scale environment for all clusters in our sample. We find no strong relation between the entropy deviations at the virial radius and the total luminosity of the cluster surroundings, indicating that accretion and mergers have a more complex and indirect influence on the properties of the gas at large radii. We see a possible anti-correlation between virial temperature and richness of the cluster environment and find that density excess appears to play a larger role in the entropy flattening than temperature, suggesting that clumps of gas can lower entropy.

  3. The effective field theory of cosmological large scale structures

    SciTech Connect

    Carrasco, John Joseph M.; Hertzberg, Mark P.; Senatore, Leonardo

    2012-09-20

Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s² ≈ 10⁻⁶c² and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)⁴. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc⁻¹.

  4. Large-scale Direct Targeting for Drug Repositioning and Discovery

    PubMed Central

    Zheng, Chunli; Guo, Zihu; Huang, Chao; Wu, Ziyin; Li, Yan; Chen, Xuetong; Fu, Yingxue; Ru, Jinlong; Ali Shar, Piar; Wang, Yuan; Wang, Yonghua

    2015-01-01

A system-level identification of direct drug-target interactions is vital to drug repositioning and discovery. However, experimental identification on a large scale remains challenging and expensive even today. Available computational models mainly focus on predicting indirect interactions, or direct interactions on a small scale. To address these problems, in this work a novel algorithm termed weighted ensemble similarity (WES) has been developed to identify direct drug targets based on a large-scale set of 98,327 drug-target relationships. WES involves: (1) identifying the key ligand structural features that are highly related to pharmacological properties within an ensemble framework; (2) determining whether a drug interacts with a target by evaluating the overall (ensemble) similarity rather than the judgment of a single ligand; and (3) integrating the standardized ensemble similarities (Z scores) with a Bayesian network and a multivariate kernel approach to make predictions. All this leads WES to predict direct drug targets with external and experimental test accuracies of 70% and 71%, respectively. This shows that the WES method provides a potential in silico model for drug repositioning and discovery. PMID:26155766
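Step (3) above standardizes raw ensemble similarities into Z scores before they are integrated. A minimal sketch of that standardization step follows; the target names and similarity values are illustrative, not from the study, and the full WES pipeline (Bayesian network, kernel integration) is not reproduced:

```python
import math

def zscores(values):
    """Standardize raw similarity scores to Z scores (population SD)."""
    m = sum(values) / len(values)
    sd = math.sqrt(sum((v - m) ** 2 for v in values) / len(values))
    return [(v - m) / sd for v in values]

# Hypothetical similarities of one drug against the known ligand
# ensemble of each candidate target (names are placeholders).
ensemble_sims = {"target_A": 0.82, "target_B": 0.35, "target_C": 0.41}
names = list(ensemble_sims)
z = dict(zip(names, zscores([ensemble_sims[n] for n in names])))

best = max(z, key=z.get)   # highest standardized ensemble similarity
assert best == "target_A"
```

Standardizing per ensemble makes similarities comparable across targets whose ligand sets differ in size and diversity, which is the point of the Z-score step.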

  5. Modulation of energetic coherent motions by large-scale topography

    NASA Astrophysics Data System (ADS)

    Lai, Wing; Hamed, Ali M.; Troolin, Dan; Chamorro, Leonardo P.

    2016-11-01

The distinctive characteristics and dynamics of the large-scale coherent motions induced over 2D and 3D large-scale wavy walls were explored experimentally with time-resolved volumetric PIV, and selected wall-normal high-resolution stereo PIV, in a refractive-index-matching channel. The 2D wall consists of a sinusoidal wave in the streamwise direction with amplitude-to-wavelength ratio a/λx = 0.05, while the 3D wall has an additional wave in the spanwise direction with a/λy = 0.1. The flow was characterized at Re ≈ 8000, based on the bulk velocity and the channel half height. The walls are such that the amplitude-to-boundary-layer-thickness ratio is a/δ99 ≈ 0.1, resembling geophysical topography. Insight into the dynamics of the coherent motions, the Reynolds stresses and the spatial interaction of sweep and ejection events will be discussed in terms of the modulation by the wall topography.

  6. Very-large-scale coherent motions in open channel flows

    NASA Astrophysics Data System (ADS)

    Zhong, Qiang; Hussain, Fazle; Li, Dan-Xun

    2016-11-01

    Very-large-scale coherent structures (VLSSs), whose characteristic length is of the order of 10h (h is the water depth), are found to exist in the log and outer layers near the bed of open channel flows. For decades researchers have speculated that large coherent structures may exist in open channel flows, but conclusive evidence has been lacking. The present study employed pre-multiplied velocity power spectral and co-spectral analyses of time-resolved PIV data obtained in open channel flows. In all cases, two modes, large-scale structures (of the order of h) and VLSSs, dominate the log and outer layers of the turbulent boundary layer. More than half of the TKE and 40% of the Reynolds shear stress in the log and outer layers are contributed by VLSSs. The difference in VLSS strength between open and closed channel flows leads to a pronounced redistribution of TKE near the free surface of open channel flows, a unique phenomenon that sets open channel flows apart from other wall-bounded turbulent flows. Funded by the China Postdoctoral Science Foundation (No. 2015M580105) and the National Natural Science Foundation of China (No. 51127006).

  7. Resonant plankton patchiness induced by large-scale turbulent flow

    NASA Astrophysics Data System (ADS)

    McKiver, William J.; Neufeld, Zoltán

    2011-01-01

    Here we study how large-scale variability of oceanic plankton is affected by mesoscale turbulence in a spatially heterogeneous environment. We consider a phytoplankton-zooplankton (PZ) ecosystem model, with different types of zooplankton grazing functions, coupled to a turbulent flow described by the two-dimensional Navier-Stokes equations, representing large-scale horizontal transport in the ocean. We characterize the system using a dimensionless parameter, γ=TB/TF, which is the ratio of the ecosystem biological time scale TB and the flow time scale TF. Through numerical simulations, we examine how the PZ system depends on the time-scale ratio γ and find that the variance of both species changes significantly, with maximum phytoplankton variability at intermediate mixing rates. Through an analysis of the linearized population dynamics, we find an analytical solution based on the forced harmonic oscillator, which explains the behavior of the ecosystem, where there is resonance between the advection and the ecosystem predator-prey dynamics when the forcing time scales match the ecosystem time scales. We also examine the dependence of the power spectra on γ and find that the resonance behavior leads to different spectral slopes for phytoplankton and zooplankton, in agreement with observations.
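
The forced-harmonic-oscillator analogy can be illustrated with a minimal sketch: the steady-state response of a damped, driven oscillator peaks when the forcing frequency matches the natural frequency, mirroring the maximum plankton variance found at intermediate mixing rates. This uses the generic textbook amplitude formula, not the paper's linearized PZ model; parameter values are arbitrary:

```python
import math

def response_amplitude(omega, omega0=1.0, damping=0.1, forcing=1.0):
    """Steady-state amplitude of a damped, driven harmonic oscillator:
    A(w) = F / sqrt((w0^2 - w^2)^2 + (c*w)^2)."""
    return forcing / math.sqrt((omega0**2 - omega**2) ** 2
                               + (damping * omega) ** 2)

# Scan forcing frequencies; the response is maximal near resonance
# (w close to w0), i.e. when forcing and natural time scales match.
freqs = [0.1 * k for k in range(1, 31)]
amps = [response_amplitude(w) for w in freqs]
peak = freqs[amps.index(max(amps))]
```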

  9. Large-scale anisotropy in stably stratified rotating flows

    SciTech Connect

    Marino, R.; Mininni, P. D.; Rosenberg, D. L.; Pouquet, A.

    2014-08-28

    We present results from direct numerical simulations of the Boussinesq equations in the presence of rotation and/or stratification, both in the vertical direction. The runs are forced isotropically and randomly at small scales and have spatial resolutions of up to 1024^3 grid points and Reynolds numbers of ≈1000. We first show that solutions with negative energy flux and inverse cascades develop in rotating turbulence, whether or not stratification is present. However, the purely stratified case is characterized instead by an early-time, highly anisotropic transfer to large scales with almost zero net isotropic energy flux. This is consistent with previous studies that observed the development of vertically sheared horizontal winds, although only at substantially later times. However, and unlike previous works, when sufficient scale separation is allowed between the forcing scale and the domain size, the total energy displays a perpendicular (horizontal) spectrum with power-law behavior compatible with ~k_⊥^(-5/3), including in the absence of rotation. In this latter, purely stratified case, such a spectrum is the result of a direct cascade of the energy contained in the large-scale horizontal wind, as is evidenced by a strong positive flux of energy in the parallel direction at all scales, including the largest resolved scales.

  11. The effect of large scale inhomogeneities on the luminosity distance

    NASA Astrophysics Data System (ADS)

    Brouzakis, Nikolaos; Tetradis, Nikolaos; Tzavara, Eleftheria

    2007-02-01

    We study the form of the luminosity distance as a function of redshift in the presence of large-scale inhomogeneities, with sizes of order 10 Mpc or larger. We approximate the Universe through the Swiss-cheese model, with each spherical region described by the Lemaître-Tolman-Bondi metric. We study the propagation of light beams in this background, assuming that the locations of the source and the observer are random. We derive the optical equations for the evolution of the beam area and shear. Through their integration we determine the configurations that can lead to an increase of the luminosity distance relative to the homogeneous cosmology. We find that this can be achieved if the Universe is composed of spherical void-like regions, with matter concentrated near their surface. For inhomogeneities consistent with the observed large-scale structure, the relative increase of the luminosity distance is of the order of a few per cent at redshifts near 1, and falls short of explaining the substantial increase required by the supernova data. On the other hand, the effect we describe is important for the correct determination of the energy content of the Universe from observations.

  12. Exact-Differential Large-Scale Traffic Simulation

    SciTech Connect

    Hanai, Masatoshi; Suzumura, Toyotaro; Theodoropoulos, Georgios; Perumalla, Kalyan S

    2015-01-01

    Analyzing large-scale traffic by simulation requires repeated execution under many patterns of scenarios or parameters. Such repeated execution involves substantial redundancy, because the change from one scenario to the next is usually minor, for example, blocking a single road or changing the speed limit on a few roads. In this paper, we propose a new redundancy-reduction technique, called exact-differential simulation, which simulates only the changed portions of a scenario in later executions while producing exactly the same results as a whole simulation. The paper consists of two main efforts: (i) the key idea and algorithm of exact-differential simulation, and (ii) a method to build large-scale traffic simulation on top of it. In experiments on a Tokyo traffic simulation, exact-differential simulation improves elapsed time over whole simulation by a factor of 7.26 on average, and by a factor of 2.26 even in the worst case.
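
The caching idea behind exact-differential simulation can be sketched with a toy model in which roads are independent, so only changed roads need recomputation while the result stays bit-identical to a full re-run. The real technique must also propagate changes through event dependencies between roads; this sketch, with hypothetical names, only illustrates the reuse of unchanged results:

```python
def simulate_road(road):
    """Stand-in for an expensive per-road computation (travel time)."""
    length, speed = road
    return length / speed

def full_simulation(roads):
    """Whole simulation: compute every road from scratch."""
    return {rid: simulate_road(r) for rid, r in roads.items()}

def differential_simulation(prev_results, prev_roads, new_roads):
    """Recompute only roads whose parameters changed; reuse the rest.
    For independent roads this equals full_simulation(new_roads) exactly."""
    out = dict(prev_results)
    for rid, road in new_roads.items():
        if prev_roads.get(rid) != road:
            out[rid] = simulate_road(road)
    for rid in set(prev_roads) - set(new_roads):
        del out[rid]
    return out
```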

  13. Halo detection via large-scale Bayesian inference

    NASA Astrophysics Data System (ADS)

    Merson, Alexander I.; Jasche, Jens; Abdalla, Filipe B.; Lahav, Ofer; Wandelt, Benjamin; Jones, D. Heath; Colless, Matthew

    2016-08-01

    We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect haloes of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HAmiltonian Density Estimation and Sampling algorithm (HADES), and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter haloes. To demonstrate the capability of our approach, we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information, we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of haloes in the mock catalogue increases as a function of the signal to noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
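
The final step, summing over density-field realisations to build detection-probability maps, can be sketched as follows. This is a hedged toy version: `toy_halo_prob` stands in for the simulation-calibrated relation between density amplitude and halo detection, and is not the paper's calibration:

```python
def detection_probability(realisations, halo_prob):
    """Average, over posterior density-field realisations, the calibrated
    probability that each cell hosts a halo above the mass threshold."""
    n = len(realisations)
    cells = len(realisations[0])
    return [sum(halo_prob(r[i]) for r in realisations) / n
            for i in range(cells)]

def toy_halo_prob(delta):
    """Toy calibration: detection probability rises with density amplitude."""
    return min(1.0, max(0.0, delta / 2.0))
```

Averaging over realisations is what carries the posterior uncertainty of the inferred density field into the final detection map.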

  14. Brief Mental Training Reorganizes Large-Scale Brain Networks.

    PubMed

    Tang, Yi-Yuan; Tang, Yan; Tang, Rongxiang; Lewis-Peacock, Jarrod A

    2017-01-01

    Emerging evidence has shown that one form of mental training, mindfulness meditation, can improve attention, emotion regulation and cognitive performance by changing brain activity and structural connectivity. However, whether and how short-term mindfulness meditation alters large-scale brain networks is not well understood. Here, we applied a novel data-driven technique, multivariate pattern analysis (MVPA), to resting-state fMRI (rsfMRI) data to identify changes in brain activity patterns and assess the neural mechanisms induced by a brief mindfulness training, integrative body-mind training (IBMT), which was previously reported in our series of randomized studies. Whole-brain rsfMRI was performed on an undergraduate group who received 2 weeks of IBMT with 30 min per session (5 h of training in total). Classifiers were trained on measures of functional connectivity in these fMRI data, and they were able to reliably differentiate (with 72% accuracy) patterns of connectivity from before vs. after the IBMT training. After training, an increase in positive functional connections (60 connections) was detected, primarily involving bilateral superior/middle occipital gyrus, bilateral frontal operculum, bilateral superior temporal gyrus, right superior temporal pole, bilateral insula, caudate and cerebellum. These results suggest that brief mental training alters the functional connectivity of large-scale brain networks at rest, which may involve a portion of the neural circuitry supporting attention, cognitive and affective processing, awareness, sensory integration and reward processing.
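
The MVPA step can be illustrated with a minimal classifier over functional-connectivity feature vectors. This is a sketch of the general approach (a nearest-centroid classifier distinguishing pre- from post-training patterns), not the classifier or features used in the study:

```python
def nearest_centroid_fit(samples, labels):
    """Train a minimal MVPA-style classifier: one centroid per class,
    where each sample is a functional-connectivity feature vector."""
    groups = {}
    for x, y in zip(samples, labels):
        groups.setdefault(y, []).append(x)
    return {y: [sum(col) / len(xs) for col in zip(*xs)]
            for y, xs in groups.items()}

def nearest_centroid_predict(centroids, x):
    """Assign the class whose centroid is closest in squared
    Euclidean distance."""
    def dist2(a, b):
        return sum((u - v) ** 2 for u, v in zip(a, b))
    return min(centroids, key=lambda y: dist2(centroids[y], x))
```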

  15. Large-scale columnar vortices in rotating turbulence

    NASA Astrophysics Data System (ADS)

    Yokoyama, Naoto; Takaoka, Masanori

    2016-11-01

    In rotating turbulence, flow structures are affected by the angular velocity of the system's rotation. When the angular velocity is small, a three-dimensional, statistically isotropic flow, which has the Kolmogorov spectrum over the whole inertial subrange, is formed. When the angular velocity increases, the flow becomes two-dimensional and anisotropic, and the energy spectrum exhibits a k^(-2) power law at small wavenumbers in addition to the Kolmogorov spectrum at large wavenumbers. When the angular velocity decreases, the flow returns to the isotropic state. It is numerically found that the transition between the isotropic and anisotropic flows is hysteretic: the critical angular velocity at which the flow transitions from anisotropic to isotropic differs from that of the reverse transition. It is also observed that the large-scale columnar structures in the anisotropic flow depend on the external force which maintains the statistically steady state. In some cases, small-scale anticyclonic structures are aligned in a columnar structure apart from the cyclonic Taylor column. The formation mechanism of the large-scale columnar structures will be discussed. This work was partially supported by JSPS KAKENHI.

  16. Large scale CMB anomalies from thawing cosmic strings

    SciTech Connect

    Ringeval, Christophe; Yamauchi, Daisuke; Yokoyama, Jun'ichi; Bouchet, François R. E-mail: yamauchi@resceu.s.u-tokyo.ac.jp E-mail: bouchet@iap.fr

    2016-02-01

    Cosmic strings formed during inflation are expected to be either diluted over super-Hubble distances, i.e., invisible today, or to have crossed our past light cone very recently. We discuss the latter situation, in which a few strings imprint their signature in the Cosmic Microwave Background (CMB) anisotropies after recombination. Being almost frozen in the Hubble flow, these strings are quasi-static and evade almost all of the previously derived constraints on their tension, while being able to source large-scale anisotropies in the CMB sky. Using a local variance estimator on thousands of numerically simulated Nambu-Goto all-sky maps, we compute the expected signal and show that it can mimic a dipole modulation at large angular scales while being negligible at small angles. Interestingly, such a scenario generically produces one cold spot from the thawing of a cosmic string loop. Mixed with anisotropies of inflationary origin, we find that a few strings of tension GU = O(1) × 10^(-6) match the amplitude of the dipole modulation reported in the Planck satellite measurements and could be at the origin of other large-scale anomalies.

  17. A visual backchannel for large-scale events.

    PubMed

    Dörk, Marian; Gruen, Daniel; Williamson, Carey; Carpendale, Sheelagh

    2010-01-01

    We introduce the concept of a Visual Backchannel as a novel way of following and exploring online conversations about large-scale events. Microblogging communities, such as Twitter, are increasingly used as digital backchannels for the timely exchange of brief comments and impressions during political speeches, sport competitions, natural disasters, and other large events. Currently, shared updates are typically displayed in the form of a simple list, making it difficult to get an overview of the fast-paced discussion as it happens in the moment and of how it evolves over time. In contrast, our Visual Backchannel design provides an evolving, interactive, and multi-faceted visual overview of large-scale ongoing conversations on Twitter. To visualize a continuously updating information stream, we include visual saliency for what is happening now and what has just happened, set in the context of the evolving conversation. As part of a fully web-based coordinated-view system, we introduce Topic Streams, a temporally adjustable stacked graph visualizing topics over time; a People Spiral representing participants and their activity; and an Image Cloud encoding the popularity of event photos by size. Together with a post listing, these mutually linked views support cross-filtering along topics, participants, and time ranges. We discuss our design considerations, in particular with respect to evolving visualizations of dynamically changing data. Initial feedback indicates significant interest and suggests several unanticipated uses.

  18. Systematic renormalization of the effective theory of Large Scale Structure

    SciTech Connect

    Abolhasani, Ali Akbar; Mirbabayi, Mehrdad; Pajer, Enrico

    2016-05-31

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as it is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contribution of short distance perturbations to large scale density contrast δ and momentum density π(k) scale as k^2 and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.

  19. Scalable WIM: effective exploration in large-scale astrophysical environments.

    PubMed

    Li, Yinggang; Fu, Chi-Wing; Hanson, Andrew J

    2006-01-01

    Navigating through large-scale virtual environments such as simulations of the astrophysical Universe is difficult. The huge spatial range of astronomical models and the dominance of empty space make it hard for users to travel across cosmological scales effectively, and the problem of wayfinding further impedes the user's ability to acquire reliable spatial knowledge of astronomical contexts. We introduce a new technique called the scalable world-in-miniature (WIM) map as a unifying interface to facilitate travel and wayfinding in a virtual environment spanning gigantic spatial scales: Power-law spatial scaling enables rapid and accurate transitions among widely separated regions; logarithmically mapped miniature spaces offer a global overview mode when the full context is too large; 3D landmarks represented in the WIM are enhanced by scale, positional, and directional cues to augment spatial context awareness; a series of navigation models are incorporated into the scalable WIM to improve the performance of travel tasks posed by the unique characteristics of virtual cosmic exploration. The scalable WIM user interface supports an improved physical navigation experience and assists pragmatic cognitive understanding of a visualization context that incorporates the features of large-scale astronomy.
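
The logarithmically mapped miniature space can be sketched as a coordinate transform that compresses an astronomical range of distances into a bounded miniature. This is an illustrative formula, not the published WIM implementation; the bounds are arbitrary placeholders:

```python
import math

def wim_radius(r, r_min=1.0, r_max=1e20, miniature_max=1.0):
    """Logarithmically compress a radial distance r (spanning many
    orders of magnitude) into a bounded world-in-miniature radius
    in [0, miniature_max]."""
    r = min(max(r, r_min), r_max)
    return miniature_max * math.log(r / r_min) / math.log(r_max / r_min)
```

Equal factors in distance map to equal steps in the miniature, so both planetary and cosmological scales stay visible and reachable in one overview.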

  20. Extending SME to Handle Large-Scale Cognitive Modeling.

    PubMed

    Forbus, Kenneth D; Ferguson, Ronald W; Lovett, Andrew; Gentner, Dedre

    2016-06-20

    Analogy and similarity are central phenomena in human cognition, involved in processes ranging from visual perception to conceptual change. To capture this centrality requires that a model of comparison be able to integrate with other processes and handle the size and complexity of the representations required by the tasks being modeled. This paper describes extensions to the Structure-Mapping Engine (SME) since its inception in 1986 that have increased its scope of operation. We first review the basic SME algorithm, describe psychological evidence for SME as a process model, and summarize its role in simulating similarity-based retrieval and generalization. Then we describe five techniques now incorporated into SME that have enabled it to tackle large-scale modeling tasks: (a) greedy merging rapidly constructs one or more best interpretations of a match in polynomial time, O(n^2 log n); (b) incremental operation enables mappings to be extended as new information is retrieved or derived about the base or target, to model situations where information in a task is updated over time; (c) ubiquitous predicates model the varying degrees to which items may suggest alignment; (d) structural evaluation of analogical inferences models aspects of plausibility judgments; (e) match filters enable large-scale task models to communicate constraints to SME to influence the mapping process. We illustrate via examples from published studies how these extensions enable SME to capture a broader range of psychological phenomena than before.
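
The greedy-merging step (a) can be sketched as follows, under the simplifying assumption that structural consistency reduces to one-to-one correspondence between base and target items; real SME enforces much richer structural constraints, and all names here are illustrative:

```python
def greedy_merge(match_hypotheses):
    """Greedily build one globally consistent mapping from scored local
    match hypotheses (score, base_item, target_item). Consistency here
    means one-to-one correspondence, a stand-in for SME's structural
    constraints."""
    merged, used_base, used_target, score = [], set(), set(), 0.0
    # Sorting dominates the cost, giving the n log n flavor of the
    # greedy-merge step; each hypothesis is then accepted or rejected
    # in constant time.
    for s, b, t in sorted(match_hypotheses, reverse=True):
        if b not in used_base and t not in used_target:
            merged.append((b, t))
            used_base.add(b)
            used_target.add(t)
            score += s
    return merged, score
```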

  1. Large-scale chromatin structure of inducible genes: transcription on a condensed, linear template

    PubMed Central

    Hu, Yan; Kireev, Igor; Plutz, Matt; Ashourian, Nazanin

    2009-01-01

    The structure of interphase chromosomes, and in particular the changes in large-scale chromatin structure accompanying transcriptional activation, remain poorly characterized. Here we use light microscopy and in vivo immunogold labeling to directly visualize the interphase chromosome conformation of 1–2 Mbp chromatin domains formed by multi-copy BAC transgenes containing 130–220 kb of genomic DNA surrounding the DHFR, Hsp70, or MT gene loci. We demonstrate near-endogenous transcription levels in the context of large-scale chromatin fibers compacted nonuniformly well above the 30-nm chromatin fiber. An approximately 1.5–3-fold extension of these large-scale chromatin fibers accompanies transcriptional induction and active genes remain mobile. Heat shock–induced Hsp70 transgenes associate with the exterior of nuclear speckles, with Hsp70 transcripts accumulating within the speckle. Live-cell imaging reveals distinct dynamic events, with Hsp70 transgenes associating with adjacent speckles, nucleating new speckles, or moving to preexisting speckles. Our results call for reexamination of classical models of interphase chromosome organization. PMID:19349581

  2. High mutational rates of large-scale duplication and deletion in Daphnia pulex

    PubMed Central

    Keith, Nathan; Tucker, Abraham E.; Jackson, Craig E.; Sung, Way; Lucas Lledó, José Ignacio; Schrider, Daniel R.; Schaack, Sarah; Dudycha, Jeffry L.; Ackerman, Matthew; Younge, Andrew J.; Shaw, Joseph R.; Lynch, Michael

    2016-01-01

    Knowledge of the genome-wide rate and spectrum of mutations is necessary to understand the origin of disease and the genetic variation driving all evolutionary processes. Here, we provide a genome-wide analysis of the rate and spectrum of mutations obtained in two Daphnia pulex genotypes via separate mutation-accumulation (MA) experiments. Unlike most MA studies that utilize haploid, homozygous, or self-fertilizing lines, D. pulex can be propagated ameiotically while maintaining a naturally heterozygous, diploid genome, allowing the capture of the full spectrum of genomic changes that arise in a heterozygous state. While base-substitution mutation rates are similar to those in other multicellular eukaryotes (about 4 × 10^(-9) per site per generation), we find that the rates of large-scale (>100 kb) de novo copy-number variants (CNVs) are significantly elevated relative to those seen in previous MA studies. The heterozygosity maintained in this experiment allowed for estimates of gene-conversion processes. While most of the conversion tract lengths we report are similar to those generated by meiotic processes, we also find larger tract lengths that are indicative of mitotic processes. Comparison of MA lines to natural isolates reveals that a majority of large-scale CNVs in natural populations are removed by purifying selection. The mutations observed here share similarities with disease-causing, complex, large-scale CNVs, thereby demonstrating that MA studies in D. pulex serve as a system for studying the processes leading to such alterations. PMID:26518480

  3. Modeling dynamic functional information flows on large-scale brain networks.

    PubMed

    Lv, Peili; Guo, Lei; Hu, Xintao; Li, Xiang; Jin, Changfeng; Han, Junwei; Li, Lingjiang; Liu, Tianming

    2013-01-01

    Growing evidence from the functional neuroimaging field suggests that human brain functions are realized via dynamic functional interactions on large-scale structural networks. Even in the resting state, functional brain networks exhibit remarkable temporal dynamics. However, computational modeling of such dynamic functional information flows on large-scale brain networks has rarely been explored. In this paper, we present a novel computational framework to explore this problem using multimodal resting-state fMRI (R-fMRI) and diffusion tensor imaging (DTI) data. Recent literature reports, including our own studies, have demonstrated that resting-state brain networks dynamically undergo a set of distinct brain states. Within each quasi-stable state, functional information flows from one set of structural brain nodes to other sets of nodes, analogous to message packets being routed on the Internet from source to destination. Therefore, based on the large-scale structural brain networks constructed from DTI data, we employ a dynamic programming strategy to infer functional information transition routines on structural networks, from which the hub routers that most frequently participate in these routines are identified. It is interesting that a majority of those hub routers are located within the default mode network (DMN), revealing a possible mechanism for the critical functional hub roles played by the DMN in the resting state. Also, application of this framework to a post-traumatic stress disorder (PTSD) dataset demonstrated interesting differences in hub router distributions between PTSD patients and healthy controls.

  4. Debottlenecking recombinant protein production in Bacillus megaterium under large-scale conditions--targeted precursor feeding designed from metabolomics.

    PubMed

    Korneli, Claudia; Bolten, Christoph Josef; Godard, Thibault; Franco-Lara, Ezequiel; Wittmann, Christoph

    2012-06-01

    In the present work, the impact of large production scale was investigated for Bacillus megaterium expressing green fluorescent protein (GFP). Specifically designed scale-down studies, mimicking the intermittent and continuous nutrient supply of large- and small-scale processes, were carried out for this purpose. The recombinant strain revealed a 40% reduced GFP yield under the large-scale conditions. Together with extended carbon loss via formation of acetate and carbon dioxide, this indicated obvious limitations in the underlying metabolism of B. megaterium under large-scale conditions. Quantitative analysis of intracellular amino acids via validated fast-filtration protocols revealed that their levels differed strongly between the two scenarios. During cultivation in the large-scale set-up, the availability of most amino acids, which serve as key building blocks of the recombinant protein, was substantially reduced. This was most pronounced for tryptophan, aspartate, histidine, glutamine, and lysine. In contrast, alanine was increased, probably related to a bottleneck at the level of pyruvate, which also triggered acetate overflow metabolism. The precursor quantifications could then be exploited to verify the presumed bottlenecks and improve recombinant protein production under large-scale conditions. Addition of only 5 mM tryptophan, aspartate, histidine, glutamine, and lysine to the feed solution increased the GFP yield by 100%. This rational concept of improving the lab-scale productivity of recombinant microorganisms under suboptimal feeding conditions emulating large scale can easily be extended to other processes and production hosts.

  5. Modelling large-scale halo bias using the bispectrum

    NASA Astrophysics Data System (ADS)

    Pollack, Jennifer E.; Smith, Robert E.; Porciani, Cristiano

    2012-03-01

    We study the relation between the density distribution of tracers for large-scale structure and the underlying matter distribution, commonly termed bias, in the Λ cold dark matter framework. In particular, we examine the validity of the local model of biasing at quadratic order in the matter density. This model is characterized by the parameters b1 and b2. Using an ensemble of N-body simulations, we apply several statistical methods to estimate the parameters. We measure halo and matter fluctuations smoothed on various scales. We find that, whilst the fits are reasonably good, the parameters vary with smoothing scale. We argue that, for real-space measurements, owing to the mixing of wavemodes, no smoothing scale can be found for which the parameters are independent of smoothing. However, this is not the case in Fourier space. We measure halo and halo-mass power spectra and from these construct estimates of the effective large-scale bias as a guide for b1. We measure the configuration dependence of the halo bispectra Bhhh and reduced bispectra Qhhh for very large-scale k-space triangles. From these data, we constrain b1 and b2, taking into account the full bispectrum covariance matrix. Using the lowest-order perturbation theory, we find that for Bhhh the best-fitting parameters are in reasonable agreement with one another as the triangle scale is varied, although the fits become poor as smaller scales are included. The same is true for Qhhh. The best-fitting values were found to depend on the discreteness correction. This led us to consider halo-mass cross-bispectra. The results from these statistics supported our earlier findings. We then developed a test to explore whether the inconsistency in the recovered bias parameters could be attributed to missing higher-order corrections in the models. We prove that low-order expansions are not sufficiently accurate to model the data, even on scales k1 ~ 0.04 h Mpc^(-1). If robust inferences concerning bias are to be drawn
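
The quadratic local bias model can be illustrated with a least-squares fit of b1 and b2 from matched, smoothed fields. This is a generic sketch of fitting delta_h = b1*delta + (b2/2)*delta^2 on synthetic values, not the bispectrum-based estimators used in the paper:

```python
def fit_quadratic_bias(delta_m, delta_h):
    """Least-squares fit of the local bias model
    delta_h = b1*delta + (b2/2)*delta**2 from matched field values,
    solving the 2x2 normal equations directly."""
    s11 = sum(d * d for d in delta_m)
    s12 = sum(d ** 3 / 2 for d in delta_m)
    s22 = sum(d ** 4 / 4 for d in delta_m)
    r1 = sum(d * h for d, h in zip(delta_m, delta_h))
    r2 = sum(d * d * h / 2 for d, h in zip(delta_m, delta_h))
    det = s11 * s22 - s12 * s12
    b1 = (s22 * r1 - s12 * r2) / det
    b2 = (s11 * r2 - s12 * r1) / det
    return b1, b2
```

On data generated exactly from the model, the fit recovers the input parameters; the paper's point is that on real halo fields the recovered values drift with smoothing scale.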

  6. Large-Scale Graphene Film Deposition for Monolithic Device Fabrication

    NASA Astrophysics Data System (ADS)

    Al-shurman, Khaled

    Since 1958, the concept of the integrated circuit (IC) has driven great technological developments and helped shrink electronic devices. Nowadays, an IC consists of more than a million densely packed transistors. The majority of current ICs use silicon as the semiconductor material. According to Moore's law, the number of transistors on a microchip doubles roughly every two years. However, silicon device manufacturing is approaching its physical limits: as circuitry shrinks toward seven nanometers, quantum effects such as tunneling become difficult to control. Hence, there is an urgent need for a new platform material to replace Si. Graphene is considered a promising material with enormous potential applications in many electronic and optoelectronic devices due to its superior properties. There are several techniques to produce graphene films. Among these, chemical vapor deposition (CVD) offers a very convenient method for fabricating large-scale graphene films. Though CVD is suitable for large-area growth of graphene, the resulting films must be transferred to silicon-based substrates. Furthermore, the graphene films thus achieved are, in fact, not single crystalline. Also, graphene fabrication utilizing Cu and Ni at high growth temperature contaminates the CVD chamber as well as the substrate that holds the Si CMOS circuitry. So, lowering the deposition temperature is another technological milestone for the successful adoption of graphene in integrated circuit fabrication. In this research, direct large-scale graphene film fabrication on silicon-based platforms (i.e., SiO2 and Si3N4) at low temperature was achieved. With a focus on low-temperature graphene growth, hot-filament chemical vapor deposition (HF-CVD) was utilized to synthesize graphene film using a 200 nm thick nickel film. Raman spectroscopy was utilized to examine graphene formation on the bottom side of the Ni film.

  7. Climatological context for large-scale coral bleaching

    NASA Astrophysics Data System (ADS)

    Barton, A. D.; Casey, K. S.

    2005-12-01

    Large-scale coral bleaching was first observed in 1979 and has occurred throughout virtually all of the tropics since that time. Severe bleaching may result in the loss of live coral and in a decline of the integrity of the impacted coral reef ecosystem. Despite the extensive scientific research and increased public awareness of coral bleaching, uncertainties remain about the past and future of large-scale coral bleaching. In order to reduce these uncertainties and place large-scale coral bleaching in the longer-term climatological context, specific criteria and methods for using historical sea surface temperature (SST) data to examine coral bleaching-related thermal conditions are proposed by analyzing three 132-year SST reconstructions: ERSST, HadISST1, and GISST2.3b. These methodologies are applied to case studies at Discovery Bay, Jamaica (77.27°W, 18.45°N), Sombrero Reef, Florida, USA (81.11°W, 24.63°N), Academy Bay, Galápagos, Ecuador (90.31°W, 0.74°S), Pearl and Hermes Reef, Northwest Hawaiian Islands, USA (175.83°W, 27.83°N), Midway Island, Northwest Hawaiian Islands, USA (177.37°W, 28.25°N), Davies Reef, Australia (147.68°E, 18.83°S), and North Male Atoll, Maldives (73.35°E, 4.70°N). The results of this study show that (1) the historical SST data provide a useful long-term record of thermal conditions in reef ecosystems, giving important insight into the thermal history of coral reefs, and (2) while coral bleaching and anomalously warm SSTs have occurred over much of the world in recent decades, case studies in the Caribbean, Northwest Hawaiian Islands, and parts of other regions such as the Great Barrier Reef exhibited SST conditions and cumulative thermal stress prior to 1979 that were comparable to those conditions observed during the strong, frequent coral bleaching events since 1979. This climatological context and knowledge of past environmental conditions in reef ecosystems may foster a better understanding of how coral reefs will

  8. Large-scale dimension densities for heart rate variability analysis

    NASA Astrophysics Data System (ADS)

    Raab, Corinna; Wessel, Niels; Schirdewan, Alexander; Kurths, Jürgen

    2006-04-01

    In this work, we reanalyze the heart rate variability (HRV) data from the 2002 Computers in Cardiology (CiC) Challenge using the concept of large-scale dimension densities and additionally apply this technique to data of healthy persons and of patients with cardiac diseases. The large-scale dimension density (LASDID) is estimated from the time series using a normalized Grassberger-Procaccia algorithm, which leads to a suitable correction of systematic errors produced by boundary effects in the rather large scales of a system. This way, it is possible to analyze rather short, nonstationary, and unfiltered data, such as HRV. Moreover, this method allows us to analyze short parts of the data and to look for differences between day and night. The circadian changes in the dimension density enable us to distinguish almost completely between real data and computer-generated data from the CiC 2002 challenge using only one parameter. In the second part we analyzed the data of 15 patients with atrial fibrillation (AF), 15 patients with congestive heart failure (CHF), 15 elderly healthy subjects (EH), as well as 18 young and healthy persons (YH). With our method we are able to separate completely the AF group (ρ_ls^μ = 0.97±0.02) from the others and, especially during daytime, the CHF patients show significant differences from the young and elderly healthy volunteers (CHF, 0.65±0.13; EH, 0.54±0.05; YH, 0.57±0.05; p<0.05 for both comparisons). Moreover, for the CHF patients we find no circadian changes in ρ_ls^μ (day, 0.65±0.13; night, 0.66±0.12; n.s.), in contrast to healthy controls (day, 0.54±0.05; night, 0.61±0.05; p=0.002). Correlation analysis showed no statistically significant relation between standard HRV and circadian LASDID, demonstrating a possibly independent application of our method for clinical risk stratification.
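
    The Grassberger-Procaccia approach underlying LASDID starts from the correlation sum: the fraction of pairs of delay-embedded points lying closer than a radius r. The plain (non-normalized) sketch below runs on a surrogate series; the embedding parameters and data are hypothetical, and the paper's boundary-effect correction is not reproduced here.

```python
import numpy as np

def correlation_sum(x, r, m=2, tau=1):
    """Fraction of embedded point pairs closer than r (Chebyshev metric)."""
    n = len(x) - (m - 1) * tau
    # Delay embedding: each row is (x[i], x[i+tau], ..., x[i+(m-1)*tau]).
    emb = np.array([x[i : i + (m - 1) * tau + 1 : tau] for i in range(n)])
    dists = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    iu = np.triu_indices(n, k=1)          # each pair counted once
    return float(np.mean(dists[iu] < r))

rng = np.random.default_rng(1)
x = rng.random(500)                        # surrogate "RR-interval" series
cs = [correlation_sum(x, r) for r in (0.05, 0.1, 0.2, 0.4)]
```

The dimension estimate then comes from the slope of log C(r) versus log r; LASDID's normalization corrects that slope in the large-scale regime.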

  9. Aft-End Flow of a Large-Scale Lifting Body During Free-Flight Tests

    NASA Technical Reports Server (NTRS)

    Banks, Daniel W.; Fisher, David F.

    2006-01-01

    Free-flight tests of a large-scale lifting-body configuration, the X-38 aircraft, were conducted using tufts to characterize the flow on the aft end, specifically in the inboard region of the vertical fins. Pressure data was collected on the fins and base. Flow direction and movement were correlated with surface pressure and flight condition. The X-38 was conceived to be a rescue vehicle for the International Space Station. The vehicle shape was derived from the U.S. Air Force X-24 lifting body. Free-flight tests of the X-38 configuration were conducted at the NASA Dryden Flight Research Center at Edwards Air Force Base, California from 1997 to 2001.

  10. Rapid large-scale oligonucleotide selection for microarrays.

    PubMed

    Rahmann, Sven

    2002-01-01

    We present the first algorithm that selects oligonucleotide probes (e.g. 25-mers) for microarray experiments on a large scale. For example, oligos for human genes can be found within 50 hours. This becomes possible by using the longest common substring as a specificity measure for candidate oligos. We present an algorithm, based on a suffix array with additional information, that is efficient in both memory usage and running time and ranks all candidate oligos according to their specificity. We also introduce the concept of master sequences to describe the sequences from which oligos are to be selected. Constraints such as oligo length, melting temperature, and self-complementarity are incorporated in the master sequence at a preprocessing stage and thus kept separate from the main selection problem. As a result, custom oligos can now be designed for any sequenced genome, just as the technology for on-site chip synthesis is becoming increasingly mature.
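
    The longest common substring between a candidate oligo and the rest of the transcriptome is the specificity measure described above: the shorter the longest match elsewhere, the more specific the probe. The paper computes this at scale with a suffix array; the quadratic-time dynamic program below is a slow but equivalent illustration on invented toy sequences.

```python
def longest_common_substring(s, t):
    """Length of the longest substring occurring in both s and t (O(|s||t|) DP)."""
    best = 0
    prev = [0] * (len(t) + 1)
    for a in s:
        cur = [0] * (len(t) + 1)
        for j, b in enumerate(t, 1):
            if a == b:
                # Extend the common substring ending at the previous characters.
                cur[j] = prev[j - 1] + 1
                best = max(best, cur[j])
        prev = cur
    return best

# A candidate oligo is more specific when its longest match elsewhere is short.
oligo = "ACGTACGTTT"          # hypothetical candidate
background = "GGGACGTACGAAA"  # hypothetical off-target sequence
print(longest_common_substring(oligo, background))  # → 7 ("ACGTACG")
```

A suffix array over the background lets the same quantity be computed for all candidates at once, which is what makes genome-scale selection feasible.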

  11. Lightweight computational steering of very large scale molecular dynamics simulations

    SciTech Connect

    Beazley, D.M.; Lomdahl, P.S.

    1996-09-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages.

  12. Solving large scale traveling salesman problems by chaotic neurodynamics.

    PubMed

    Hasegawa, Mikio; Ikeguchi, Tohru; Aihara, Kazuyuki

    2002-03-01

    We propose a novel approach for solving large scale traveling salesman problems (TSPs) by chaotic dynamics. First, we realize the tabu search on a neural network, by utilizing the refractory effects as the tabu effects. Then, we extend it to a chaotic neural network version. We propose two types of chaotic searching methods, which are based on two different tabu searches. While the first one requires neurons of the order of n^2 for an n-city TSP, the second one requires only n neurons. Moreover, an automatic parameter tuning method of our chaotic neural network is presented for easy application to various problems. Finally, we show that our method with n neurons is applicable to large TSPs such as an 85,900-city problem and exhibits better performance than the conventional stochastic searches and the tabu searches.
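
    The tabu-search component can be illustrated independently of the chaotic neural network: repeatedly take the best non-tabu 2-opt move, even when it worsens the tour, while a short-term memory forbids recently used moves. This is a toy sketch; the move representation, tabu-list length, and iteration budget are arbitrary choices, not the paper's.

```python
import itertools
import math
import random

def tour_length(pts, tour):
    """Total length of the closed tour visiting pts in the given order."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def tabu_two_opt(pts, iters=200, tabu_len=20, seed=0):
    """Tabu search over 2-opt segment reversals (toy sketch)."""
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    rng.shuffle(tour)
    best = tour[:]
    tabu = []
    for _ in range(iters):
        move, gain = None, -float("inf")
        for i, j in itertools.combinations(range(n), 2):
            if (i, j) in tabu:
                continue  # recently used moves are forbidden
            cand = tour[:i] + tour[i:j][::-1] + tour[j:]
            g = tour_length(pts, tour) - tour_length(pts, cand)
            if g > gain:
                move, gain = (i, j), g
        if move is None:
            break
        i, j = move  # accept the best non-tabu move, improving or not
        tour = tour[:i] + tour[i:j][::-1] + tour[j:]
        tabu.append(move)
        tabu = tabu[-tabu_len:]
        if tour_length(pts, tour) < tour_length(pts, best):
            best = tour[:]
    return best

pts = [(0, 0), (0, 1), (1, 0), (1, 1), (2, 0), (2, 1), (3, 0), (3, 1)]
best = tabu_two_opt(pts)  # a permutation of 0..7, no longer than the start tour
```

In the paper's formulation, the neural refractory effect plays the role of this explicit tabu list, and chaotic dynamics replace the deterministic move selection.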

  13. Atypical Behavior Identification in Large Scale Network Traffic

    SciTech Connect

    Best, Daniel M.; Hafen, Ryan P.; Olsen, Bryan K.; Pike, William A.

    2011-10-23

    Cyber analysts are faced with the daunting challenge of identifying exploits and threats within potentially billions of daily records of network traffic. Enterprise-wide cyber traffic involves hundreds of millions of distinct IP addresses and results in data sets ranging from terabytes to petabytes of raw data. Creating behavioral models and identifying trends based on those models requires data intensive architectures and techniques that can scale as data volume increases. Analysts need scalable visualization methods that foster interactive exploration of data and enable identification of behavioral anomalies. Developers must carefully consider application design, storage, processing, and display to provide usability and interactivity with large-scale data. We present an application that highlights atypical behavior in enterprise network flow records. This is accomplished by utilizing data intensive architectures to store the data, aggregation techniques to optimize data access, statistical techniques to characterize behavior, and a visual analytic environment to render the behavioral trends, highlight atypical activity, and allow for exploration.

  14. Scalable parallel distance field construction for large-scale applications

    SciTech Connect

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan -Liu; Kolla, Hemanth; Chen, Jacqueline H.

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named the parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial location. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. In conclusion, our work greatly extends the usability of distance fields for demanding applications.
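
    For intuition, a distance field simply stores, at every grid node, the distance to the nearest point of the surface of interest. The brute-force O(nodes × surface points) sketch below shows the quantity being computed on a tiny 2D grid; it bears no resemblance to the paper's scalable parallel distance tree, and the grid and surface points are invented.

```python
import numpy as np

def distance_field(grid_shape, surface_pts):
    """Brute-force unsigned distance from every grid node to the nearest surface point."""
    # All grid node coordinates, one row per node.
    coords = np.stack(np.meshgrid(*[np.arange(s) for s in grid_shape],
                                  indexing="ij"), axis=-1).reshape(-1, len(grid_shape))
    pts = np.asarray(surface_pts, dtype=float)
    # Distance from each node to each surface point; keep the minimum.
    d = np.linalg.norm(coords[:, None, :] - pts[None, :, :], axis=2).min(axis=1)
    return d.reshape(grid_shape)

# Toy "surface": two corner points of a 5x5 grid.
field = distance_field((5, 5), [(0, 0), (4, 4)])
print(field[0, 0], field[2, 2])  # 0.0 and sqrt(8) to the nearest corner
```

Scalable methods avoid this all-pairs computation by propagating distances outward from the surface and, in the paper's case, by distributing the level-set structure across nodes.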

  15. Investigation of flow fields within large scale hypersonic inlet models

    NASA Technical Reports Server (NTRS)

    Gnos, A. V.; Watson, E. C.; Seebaugh, W. R.; Sanator, R. J.; Decarlo, J. P.

    1973-01-01

    Analytical and experimental investigations were conducted to determine the internal flow characteristics in model passages representative of hypersonic inlets for use at Mach numbers to about 12. The passages were large enough to permit measurements to be made in both the core flow and boundary layers. The analytical techniques for designing the internal contours and predicting the internal flow-field development accounted for coupling between the boundary layers and inviscid flow fields by means of a displacement-thickness correction. Three large-scale inlet models, each having a different internal compression ratio, were designed to provide high internal performance with an approximately uniform static-pressure distribution at the throat station. The models were tested in the Ames 3.5-Foot Hypersonic Wind Tunnel at a nominal free-stream Mach number of 7.4 and a unit free-stream Reynolds number of 8.86 × 10^6 per meter.

  16. Large-scale quantum networks based on graphs

    NASA Astrophysics Data System (ADS)

    Epping, Michael; Kampermann, Hermann; Bruß, Dagmar

    2016-05-01

    Society depends increasingly on information exchange and communication. In the quantum world, security and privacy are built-in features of information processing. The essential ingredient for exploiting these quantum advantages is the resource of entanglement, which can be shared between two or more parties. The distribution of entanglement over large distances constitutes a key challenge for current research and development. Due to losses of the transmitted quantum particles, which typically scale exponentially with the distance, intermediate quantum repeater stations are needed. Here we show how to generalise the quantum repeater concept to the multipartite case, by describing large-scale quantum networks, i.e. network nodes and their long-distance links, consistently in the language of graphs and graph states. This unifying approach comprises both the distribution of multipartite entanglement across the network, and the protection against errors via encoding. The correspondence to graph states also provides a tool for optimising the architecture of quantum networks.

  17. Large scale structure of the globular cluster population in Coma

    NASA Astrophysics Data System (ADS)

    Gagliano, Alexander T.; O'Neill, Conor; Madrid, Juan P.

    2016-01-01

    A search for globular cluster candidates in the Coma Cluster was carried out using Hubble Space Telescope data taken with the Advanced Camera for Surveys. We combine different observing programs including the Coma Treasury Survey in order to obtain the large-scale distribution of globular clusters in Coma. Globular cluster candidates were selected through careful morphological inspection and a detailed analysis of their magnitude and colors in the two available wavebands, F475W (Sloan g) and F814W (I). Color-magnitude diagrams, radial density plots and density maps were then created to characterize the globular cluster population in Coma. Preliminary results show the structure of the intergalactic globular cluster system throughout Coma, drawn from one of the largest globular cluster catalogues to date. The spatial distribution of globular clusters shows clear overdensities, or bridges, between Coma galaxies. It also becomes evident that galaxies of similar luminosity have vastly different numbers of associated globular clusters.

  18. Large-Scale Advanced Prop-Fan (LAP) blade design

    NASA Technical Reports Server (NTRS)

    Violette, John A.; Sullivan, William E.; Turnberg, Jay E.

    1984-01-01

    This report covers the design analysis of a very thin, highly swept, propeller blade to be used in the Large-Scale Advanced Prop-Fan (LAP) test program. The report includes: design requirements and goals, a description of the blade configuration which meets requirements, a description of the analytical methods utilized/developed to demonstrate compliance with the requirements, and the results of these analyses. The methods described include: finite element modeling, predicted aerodynamic loads and their application to the blade, steady state and vibratory response analyses, blade resonant frequencies and mode shapes, bird impact analysis, and predictions of stalled and unstalled flutter phenomena. Summarized results include deflections, retention loads, stress/strength comparisons, foreign object damage resistance, resonant frequencies and critical speed margins, resonant vibratory mode shapes, calculated boundaries of stalled and unstalled flutter, and aerodynamic and acoustic performance calculations.

  19. Recovery Act - Large Scale SWNT Purification and Solubilization

    SciTech Connect

    Michael Gemano; Dr. Linda B. McGown

    2010-10-07

    The goal of this Phase I project was to establish a quantitative foundation for development of binary G-gels for large-scale, commercial processing of SWNTs and to develop scientific insight into the underlying mechanisms of solubilization, selectivity and alignment. In order to accomplish this, we performed systematic studies to determine the effects of G-gel composition and experimental conditions that will enable us to achieve our goals that include (1) preparation of ultra-high purity SWNTs from low-quality, commercial SWNT starting materials, (2) separation of MWNTs from SWNTs, (3) bulk, non-destructive solubilization of individual SWNTs in aqueous solution at high concentrations (10-100 mg/mL) without sonication or centrifugation, (4) tunable enrichment of subpopulations of the SWNTs based on metallic vs. semiconductor properties, diameter, or chirality and (5) alignment of individual SWNTs.

  20. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge based agents (KBA) which engage in a collaborative problem solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  1. Large Scale Bacterial Colony Screening of Diversified FRET Biosensors

    PubMed Central

    Litzlbauer, Julia; Schifferer, Martina; Ng, David; Fabritius, Arne; Thestrup, Thomas; Griesbeck, Oliver

    2015-01-01

    Biosensors based on Förster Resonance Energy Transfer (FRET) between fluorescent protein mutants have started to revolutionize physiology and biochemistry. However, many types of FRET biosensors show relatively small FRET changes, making measurements with these probes challenging when used under sub-optimal experimental conditions. Thus, a major effort in the field currently lies in designing new optimization strategies for these types of sensors. Here we describe procedures for optimizing FRET changes by large scale screening of mutant biosensor libraries in bacterial colonies. We describe optimization of biosensor expression, permeabilization of bacteria, software tools for analysis, and screening conditions. The procedures reported here may help in improving FRET changes in multiple suitable classes of biosensors. PMID:26061878

  2. Large-scale structure non-Gaussianities with modal methods

    NASA Astrophysics Data System (ADS)

    Schmittfull, Marcel

    2016-10-01

    Relying on a separable modal expansion of the bispectrum, the implementation of a fast estimator for the full bispectrum of a 3d particle distribution is presented. The computational cost of accurate bispectrum estimation is negligible relative to simulation evolution, so the bispectrum can be used as a standard diagnostic whenever the power spectrum is evaluated. As an application, the time evolution of gravitational and primordial dark matter bispectra was measured in a large suite of N-body simulations. The bispectrum shape changes characteristically when the cosmic web becomes dominated by filaments and halos, therefore providing a quantitative probe of 3d structure formation. Our measured bispectra are determined by ~ 50 coefficients, which can be used as fitting formulae in the nonlinear regime and for non-Gaussian initial conditions. We also compare the measured bispectra with predictions from the Effective Field Theory of Large Scale Structures (EFTofLSS).

  3. Towards Online Multiresolution Community Detection in Large-Scale Networks

    PubMed Central

    Huang, Jianbin; Sun, Heli; Liu, Yaguang; Song, Qinbao; Weninger, Tim

    2011-01-01

    The investigation of community structure in networks has aroused great interest in multiple disciplines. One of the challenges is to find local communities from a starting vertex in a network without global information about the entire network. Many existing methods are accurate only under a priori assumptions about network properties and with predefined parameters. In this paper, we introduce a new quality function of local community and present a fast local expansion algorithm for uncovering communities in large-scale networks. The proposed algorithm can detect multiresolution communities from a source vertex or communities covering the whole network. Experimental results show that the proposed algorithm is efficient and well-behaved in both real-world and synthetic networks. PMID:21887325
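
    Local expansion methods of the kind described above grow a community greedily from the seed, adding a neighboring vertex whenever it improves a local quality function. The toy sketch below uses a simple internal-versus-boundary edge ratio; this quality function is a stand-in for illustration, not the one proposed in the paper, and the graph is invented.

```python
def local_expand(adj, seed):
    """Greedily grow a community from `seed` under a toy local quality function."""
    comm = {seed}

    def quality(c):
        # Internal edges counted once; boundary edges leave the community.
        internal = sum(1 for u in c for v in adj[u] if v in c) / 2
        boundary = sum(1 for u in c for v in adj[u] if v not in c)
        return internal / (internal + boundary) if internal + boundary else 0.0

    improved = True
    while improved:
        improved = False
        frontier = {v for u in comm for v in adj[u]} - comm
        for v in sorted(frontier):
            if quality(comm | {v}) > quality(comm):
                comm.add(v)
                improved = True
    return comm

# Two 4-cliques joined by a single bridge edge (3-4); hypothetical toy graph.
adj = {
    0: [1, 2, 3], 1: [0, 2, 3], 2: [0, 1, 3], 3: [0, 1, 2, 4],
    4: [3, 5, 6, 7], 5: [4, 6, 7], 6: [4, 5, 7], 7: [4, 5, 6],
}
print(local_expand(adj, 0))  # → {0, 1, 2, 3}
```

Only the seed's neighborhood is ever inspected, which is what lets such methods run without global knowledge of the network.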

  4. Large-scale characterization of the murine cardiac proteome.

    PubMed

    Cosme, Jake; Emili, Andrew; Gramolini, Anthony O

    2013-01-01

    Cardiomyopathies are diseases of the heart that result in impaired cardiac muscle function. This dysfunction can progress to an inability to supply blood to the body. Cardiovascular diseases play a large role in overall global morbidity. Investigating the protein changes in the heart during disease can uncover pathophysiological mechanisms and potential therapeutic targets. Establishing a global protein expression "footprint" can facilitate more targeted studies of diseases of the heart. In this technical review, we present methods to elucidate the heart's proteome through subfractionation of the cellular compartments, reducing sample complexity and improving detection of lower-abundance proteins during multidimensional protein identification technology analysis. Analyzing the cytosolic, microsomal, and mitochondrial subproteomes separately in order to characterize the murine cardiac proteome is advantageous because it simplifies complex cardiac protein mixtures. In combination with bioinformatic analysis and genome correlation, large-scale protein changes can be identified at the cellular-compartment level in this animal model.

  5. Nuclear-pumped lasers for large-scale applications

    SciTech Connect

    Anderson, R.E.; Leonard, E.M.; Shea, R.E.; Berggren, R.R.

    1988-01-01

    Efficient initiation of large-volume chemical lasers may be achieved by neutron-induced reactions which produce charged particles in the final state. When a burst-mode nuclear reactor is used as the neutron source, both a sufficiently intense neutron flux and a sufficiently short initiation pulse may be possible. Proof-of-principle experiments are planned to demonstrate lasing in a direct nuclear-pumped large-volume system; to study the effects of various neutron-absorbing materials on laser performance; to study the effects of long initiation pulse lengths; to determine the performance of large-scale optics and the beam quality that may be obtained; and to assess the performance of alternative designs of burst systems that increase the neutron output and burst repetition rate. 21 refs., 7 figs., 5 tabs.

  6. Large Scale Deformation of the Western U.S. Cordillera

    NASA Technical Reports Server (NTRS)

    Bennett, Richard A.

    2002-01-01

    Over the past couple of years, with support from NASA, we used a large collection of data from GPS, VLBI, SLR, and DORIS networks which span the Western U.S. Cordillera (WUSC) to precisely quantify present-day large-scale crustal deformations in a single uniform reference frame. Our work was roughly divided into an analysis of these space geodetic observations to infer the deformation field across and within the entire plate boundary zone, and an investigation of the implications of this deformation field regarding plate boundary dynamics. Following the determination of the first generation WUSC velocity solution, we placed high priority on the dissemination of the velocity estimates. With in-kind support from the Smithsonian Astrophysical Observatory, we constructed a web-site which allows anyone to access the data, and to determine their own velocity reference frame.

  8. Galaxy clustering and the origin of large-scale flows

    NASA Technical Reports Server (NTRS)

    Juszkiewicz, R.; Yahil, A.

    1989-01-01

    Peebles's 'cosmic virial theorem' is extended from its original range of validity at small separations, where hydrostatic equilibrium holds, to large separations, in which linear gravitational stability theory applies. The rms pairwise velocity difference at separation r is shown to depend on the spatial galaxy correlation function xi(x) only for x less than r. Gravitational instability theory can therefore be tested by comparing the two up to the maximum separation for which both can reliably be determined, and there is no dependence on the poorly known large-scale density and velocity fields. With the expected improvement in the data over the next few years, however, this method should yield a reliable determination of omega.

  9. Mass Efficiencies for Common Large-Scale Precision Space Structures

    NASA Technical Reports Server (NTRS)

    Williams, R. Brett; Agnes, Gregory S.

    2005-01-01

    This paper presents a mass-based trade study for large-scale deployable triangular trusses, where the longerons can be monocoque tubes, isogrid tubes, or coilable longeron trusses. Such structures are typically used to support heavy reflectors, solar panels, or other instruments, and are subject to thermal gradients that can vary a great deal based on orbital altitude, location in orbit, and self-shadowing. While multilayer insulation (MLI) blankets are commonly used to minimize the magnitude of these thermal disturbances, they subject the truss to a nonstructural mass penalty. This paper investigates the impact of these add-on thermal protection layers on selecting the lightest precision structure for a given loading scenario.

  10. Large scale self energy calculations for ion-surface interactions

    NASA Astrophysics Data System (ADS)

    Kürpick, P.; Thumm, U.

    1996-03-01

    We present large-scale non-perturbative self-energy calculations for the interaction of an ion with a metal surface. Using both the simple jellium potential and more sophisticated ab initio potentials (P. J. Jennings, R. O. Jones, and M. Weinert, Phys. Rev. B 37, 6113 (1988)), we study the complex self-energy matrix for various n-manifolds, allowing for the calculation of diabatic and adiabatic non-perturbative level shifts and widths, and hybrid orbitals (P. Kürpick and U. Thumm, to be published). Besides these self-energy calculations, a new adiabatic close-coupling calculation is being developed that will be applied to the interaction of ions in various charge states with metal surfaces.

  11. In-line unit for large-scale condensate pumps

    SciTech Connect

    Tazetdinov, A.G.

    1983-09-01

    An in-line unit has been tested at the VNIIAEN for a screw centrifugal stage. Three alternative preincorporated axial impellers, designed by the Voznesenskii-Pekin method for the sleeve action, were tested with two types of centrifugal impellers. The optimum values of relative vortex current, mean peripheral component of the absolute velocity, and mean peripheral velocity were obtained. The existence of an optimum value for the mean relative vortex is explained. The results of these tests led to the design of a third alternative CI and AI No. 4 as specified. As a result of the tests, a screw centrifugal stage with high energy and cavitational indices, a unit needed for the development of large-scale condensate pumps, was obtained.

  12. Grid infrastructure to support science portals for large scale instruments.

    SciTech Connect

    von Laszewski, G.; Foster, I.

    1999-09-29

    Soon, a new generation of scientific workbenches will be developed as a collaborative effort among various research institutions in the US. These scientific workbenches will be accessed on the Web via portals. Reusable components are needed to build such portals for different scientific disciplines, allowing uniform desktop access to remote resources. Such components will include tools and services enabling easy collaboration, job submission, job monitoring, component discovery, and persistent object storage. Based on experience gained from Grand Challenge applications for large-scale instruments, we demonstrate how Grid infrastructure components can be used to support the implementation of science portals. The availability of these components will simplify the prototype implementation of a common portal architecture.

  13. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    NASA Astrophysics Data System (ADS)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  14. Computational solutions to large-scale data management and analysis

    PubMed Central

    Schadt, Eric E.; Linderman, Michael D.; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P.

    2011-01-01

    Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist — such as cloud and heterogeneous computing — to successfully tackle our big data problems. PMID:20717155

  15. Large-Scale All-Dielectric Metamaterial Perfect Reflectors

    SciTech Connect

    Moitra, Parikshit; Slovick, Brian A.; li, Wei; Kravchencko, Ivan I.; Briggs, Dayrl P.; Krishnamurthy, S.; Valentine, Jason

    2015-05-08

    All-dielectric metamaterials offer a potential low-loss alternative to plasmonic metamaterials at optical frequencies. In this paper, we take advantage of the low absorption loss as well as the simple unit cell geometry to demonstrate large-scale (centimeter-sized) all-dielectric metamaterial perfect reflectors made from silicon cylinder resonators. These perfect reflectors, operating in the telecommunications band, were fabricated using self-assembly based nanosphere lithography. In spite of the disorder originating from the self-assembly process, the average reflectance of the metamaterial perfect reflectors is 99.7% at 1530 nm, surpassing the reflectance of metallic mirrors. Moreover, the spectral separation of the electric and magnetic resonances can be chosen to achieve the required reflection bandwidth while maintaining a high tolerance to disorder. Finally, the scalability of this design could lead to new avenues of manipulating light for low-loss and large-area photonic applications.

  16. Optical modulation of aqueous metamaterial properties at large scale.

    PubMed

    Yang, Sui; Wang, Yuan; Ni, Xingjie; Zhang, Xiang

    2015-11-02

    Dynamical control of metamaterials by adjusting their shape and structure has been developed to achieve desired optical functionalities and to enable modulation and selection of spectral responses. However, it is still challenging to realize such manipulation at large scale. Recently, it has been shown that metamaterial structures of the desired high (or low) symmetry can be self-assembled in solution under external light stimuli. Using this approach, we systematically investigated the optical control process and report here a dynamical manipulation of the magnetic properties of metamaterials. Under external laser excitation, we demonstrated that selected magnetic properties of metamaterials can be tuned within chosen wavelength ranges. The magnetic dipole selectivity and tunability were further quantified by in situ spectral measurements.

  17. Studies on Editing Patterns in Large-scale Wikis

    NASA Astrophysics Data System (ADS)

    Boulain, Philip; Shadbolt, Nigel; Gibbins, Nicholas

    Wiki systems have developed over the past years as lightweight, community-editable, web-based hypertext systems. With the emergence of Semantic Wikis, these collections of interlinked documents have also gained a dual role as ad-hoc RDF graphs. However, their roots lie in the limited hypertext capabilities of the World Wide Web: embedded links, without support for composite objects or transclusion. In this chapter, we present experimental evidence that hyperstructure changes, as opposed to content changes, form a substantial proportion of editing effort on a large-scale wiki. We then follow this with an in-depth experiment, studying how individual editors work to edit articles on the wiki. These experiments are set in the wider context of a study of how the technologies developed during decades of hypertext research may be applied to improve management of wiki document structure and, with semantic wikis, knowledge structure.

  18. Large-Scale Quantitative Analysis of Painting Arts

    PubMed Central

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877
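    The three measures can be illustrated with a small, hedged sketch; the quantization scheme, entropy-based variety measure, and brightness-difference roughness proxy below are illustrative assumptions, not the authors' exact definitions, and the random array merely stands in for a digitized painting:

```python
import numpy as np

# Illustrative stand-ins for the paper's three measures (assumed forms):
#   color usage   -> histogram of quantized colors
#   color variety -> Shannon entropy of that histogram
#   roughness     -> rms of nearest-neighbour brightness differences
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3))  # stand-in for a painting scan

# Quantize each RGB channel to 4 levels -> 64 possible color codes
q = img // 64
codes = q[..., 0] * 16 + q[..., 1] * 4 + q[..., 2]
counts = np.bincount(codes.ravel(), minlength=64)
p = counts[counts > 0] / codes.size
color_entropy = -np.sum(p * np.log2(p))   # at most 6 bits for 64 codes

# Simple roughness proxy on the brightness channel
brightness = img.mean(axis=2)
roughness = np.sqrt(np.mean(np.diff(brightness, axis=1) ** 2))

print(f"color variety: {color_entropy:.2f} bits, roughness: {roughness:.1f}")
```

    A real analysis would replace the random array with scanned paintings and estimate the roughness exponent from how such brightness fluctuations scale with separation.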

  19. U-shaped Vortex Structures in Large Scale Cloud Cavitation

    NASA Astrophysics Data System (ADS)

    Cao, Yantao; Peng, Xiaoxing; Xu, Lianghao; Hong, Fangwen

    2015-12-01

    The control of cloud cavitation, especially large scale cloud cavitation (LSCC), is always a hot issue in the field of cavitation research. However, little is known about the evolution of cloud cavitation, since it is associated with turbulence and vortex flow. In this article, the structure of cloud cavitation shed by sheet cavitation around different hydrofoils and a wedge was observed in detail with a high-speed camera (HSC). It was found that U-shaped vortex structures always existed in the development process of LSCC. The results indicated that LSCC evolution was related to this kind of vortex structure, and it may be a universal characteristic of LSCC. The vortex strength of the U-shaped structures over a cycle was then analyzed with numerical results.

  20. Large scale ocean circulation from the GRACE GGM01 Geoid

    NASA Astrophysics Data System (ADS)

    Tapley, B. D.; Chambers, D. P.; Bettadpur, S.; Ries, J. C.

    2003-11-01

    The GRACE Gravity Model 01 (GGM01), computed from 111 days of GRACE K-band ranging (KBR) data, is differenced from a global mean sea surface (MSS) computed from a decade of satellite altimetry to determine a mean dynamic ocean topography (DOT). As a test of the GGM01 gravity model, large-scale zonal and meridional surface geostrophic currents are computed from the topography and are compared with those derived from a mean hydrographic surface. Reduction in residual RMS between the two by 30-60% (and increased correlation) indicates that the GGM01 geoid represents a dramatic improvement over older geoid models, which were developed from multiple satellite tracking data, altimetry, and surface gravity measurements. For the first time, all major current systems are clearly observed in the DOT from space-based measurements.
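    The step from dynamic topography to surface geostrophic currents follows the standard balance u = -(g/f) ∂η/∂y, v = (g/f) ∂η/∂x. The sketch below applies it to a toy Gaussian "gyre"; the grid spacing, latitude, and topography amplitude are illustrative assumptions, not GGM01 values:

```python
import numpy as np

# Surface geostrophic currents from a gridded dynamic topography eta [m]:
#   u = -(g/f) d(eta)/dy ,  v = (g/f) d(eta)/dx
g = 9.81                                  # gravity, m/s^2
omega = 7.292e-5                          # Earth's rotation rate, rad/s
f = 2 * omega * np.sin(np.radians(35.0))  # Coriolis parameter at 35N, 1/s

dx = dy = 100e3                           # toy grid spacing, m
y, x = np.mgrid[0:20, 0:20] * dx
# Toy topography: 1 m high Gaussian bump, 300 km wide (assumed values)
eta = np.exp(-((x - 1e6) ** 2 + (y - 1e6) ** 2) / (2 * 300e3 ** 2))

deta_dy, deta_dx = np.gradient(eta, dy, dx)  # gradients along axes 0 and 1
u = -(g / f) * deta_dy                    # zonal velocity, m/s
v = (g / f) * deta_dx                     # meridional velocity, m/s

print(f"peak surface speed: {np.hypot(u, v).max():.2f} m/s")
```

    The resulting tenths-of-m/s speeds for a 1 m topography signal are the scale at which the geoid improvement described above matters.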

  1. The dynamics of large-scale arrays of coupled resonators

    NASA Astrophysics Data System (ADS)

    Borra, Chaitanya; Pyles, Conor S.; Wetherton, Blake A.; Quinn, D. Dane; Rhoads, Jeffrey F.

    2017-03-01

    This work describes an analytical framework suitable for the analysis of large-scale arrays of coupled resonators, including those which feature amplitude and phase dynamics, inherent element-level parameter variation, nonlinearity, and/or noise. In particular, this analysis allows for the consideration of coupled systems in which the number of individual resonators is large, extending as far as the continuum limit corresponding to an infinite number of resonators. Moreover, this framework permits analytical predictions for the amplitude and phase dynamics of such systems. The utility of this analytical methodology is explored through the analysis of a system of N non-identical resonators with global coupling, including both reactive and dissipative components, physically motivated by an electromagnetically-transduced microresonator array. In addition to the amplitude and phase dynamics, the behavior of the system as the number of resonators varies is investigated and the convergence of the discrete system to the infinite-N limit is characterized.

  2. Distributed Coordinated Control of Large-Scale Nonlinear Networks

    DOE PAGES

    Kundu, Soumya; Anghel, Marian

    2015-11-08

    We provide a distributed coordinated approach to the stability analysis and control design of large-scale nonlinear dynamical systems by using a vector Lyapunov functions approach. In this formulation the large-scale system is decomposed into a network of interacting subsystems and the stability of the system is analyzed through a comparison system. However, finding such a comparison system is not trivial. In this work, we propose a sum-of-squares based completely decentralized approach for computing the comparison systems for networks of nonlinear systems. Moreover, based on the comparison systems, we introduce a distributed optimal control strategy in which the individual subsystems (agents) coordinate with their immediate neighbors to design local control policies that can exponentially stabilize the full system under initial disturbances. We illustrate the control algorithm on a network of interacting Van der Pol systems.

  3. Honeycomb: Visual Analysis of Large Scale Social Networks

    NASA Astrophysics Data System (ADS)

    van Ham, Frank; Schulz, Hans-Jörg; Dimicco, Joan M.

    The rise in the use of social network sites allows us to collect large amounts of user reported data on social structures and analysis of this data could provide useful insights for many of the social sciences. This analysis is typically the domain of Social Network Analysis, and visualization of these structures often proves invaluable in understanding them. However, currently available visual analysis tools are not very well suited to handle the massive scale of this network data, and often resort to displaying small ego networks or heavily abstracted networks. In this paper, we present Honeycomb, a visualization tool that is able to deal with much larger scale data (with millions of connections), which we illustrate by using a large scale corporate social networking site as an example. Additionally, we introduce a new probability based network metric to guide users to potentially interesting or anomalous patterns and discuss lessons learned during design and implementation.

  4. Modeling Failure Propagation in Large-Scale Engineering Networks

    NASA Astrophysics Data System (ADS)

    Schläpfer, Markus; Shapiro, Jonathan L.

    The simultaneous unavailability of several technical components within large-scale engineering systems can lead to high stress, rendering them prone to cascading events. In order to gain qualitative insights into the failure propagation mechanisms resulting from independent outages, we adopt a minimalistic model representing the components and their interdependencies by an undirected, unweighted network. The failure dynamics are modeled by an anticipated accelerated “wearout” process being dependent on the initial degree of a node and on the number of failed nearest neighbors. The results of the stochastic simulations imply that the influence of the network topology on the speed of the cascade highly depends on how the number of failed nearest neighbors shortens the life expectancy of a node. As a formal description of the decaying networks we propose a continuous-time mean field approximation, estimating the average failure rate of the nearest neighbors of a node based on the degree-degree distribution.
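    A minimal sketch of this kind of model follows; the network size, edge probability, base outage rate, and per-neighbour hazard boost are assumed illustrative values, and the synchronous discrete-time rule only mimics the accelerated wearout process described above:

```python
import random

random.seed(1)
N = 200
# Undirected, unweighted random network of interdependent components
adj = {i: set() for i in range(N)}
for i in range(N):
    for j in range(i + 1, N):
        if random.random() < 0.03:
            adj[i].add(j)
            adj[j].add(i)

base_rate = 0.001   # independent outage probability per step (assumed)
boost = 0.05        # extra hazard per failed neighbour (assumed)
failed = set()
history = []
for step in range(100):
    snapshot = set(failed)          # synchronous update within a step
    for i in range(N):
        if i in snapshot:
            continue
        # Wearout accelerates with the number of failed nearest neighbours
        p = min(1.0, base_rate + boost * len(adj[i] & snapshot))
        if random.random() < p:
            failed.add(i)
    history.append(len(failed))

print(f"failed after {len(history)} steps: {len(failed)}/{N}")
```

    Averaging `history` over many random graphs and seeds is what a mean-field approximation of the kind proposed in the paper would aim to reproduce.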

  5. Large-Scale Quantitative Analysis of Painting Arts

    NASA Astrophysics Data System (ADS)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.

  6. Performance Health Monitoring of Large-Scale Systems

    SciTech Connect

    Rajamony, Ram

    2014-11-20

    This report details the progress made on the ASCR funded project Performance Health Monitoring (PHM) for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  7. Towards large-scale, human-based, mesoscopic neurotechnologies.

    PubMed

    Chang, Edward F

    2015-04-08

    Direct human brain recordings have transformed the scope of neuroscience in the past decade. Progress has relied upon currently available neurophysiological approaches in the context of patients undergoing neurosurgical procedures for medical treatment. While this setting has provided precious opportunities for scientific research, it also has presented significant constraints on the development of new neurotechnologies. A major challenge now is how to achieve high-resolution spatiotemporal neural recordings at a large scale. By narrowing the gap between current approaches, new directions tailored to the mesoscopic (intermediate) scale of resolution may overcome the barriers towards safe and reliable human-based neurotechnology development, with major implications for advancing both basic research and clinical translation.

  8. Experiments of Buoyant Thermocapillary Convection of Large Scale Liquid Bridge

    NASA Astrophysics Data System (ADS)

    Duan, Li; Kang, Qi

    Thermocapillary-driven convection in a large scale liquid bridge was investigated experimentally in this paper. We used 2 cSt silicone oil (Pr = 28.571), observed the onset of oscillation in liquid bridges with different aspect ratios (A = l/d) and volumes, analyzed the transformation of temperature oscillation frequency and phase, and discussed hydrothermal waves. The column diameter of the liquid bridge was 20 mm. Owing to the limit imposed by gravity, we constructed bridges 3 mm to 4.25 mm in height. With the help of five azimuthal thermocouples inserted into the bridge interior, we found that temperature oscillations in the flow field start simultaneously, that bridges with different aspect ratios and volumes have different flow modes, and that as the temperature difference increases the frequency grows approximately linearly while the oscillation phase of each temperature curve changes continuously. Bridges with different aspect ratios take different routes to chaos.

  9. A mini review: photobioreactors for large scale algal cultivation.

    PubMed

    Gupta, Prabuddha L; Lee, Seung-Mok; Choi, Hee-Jeong

    2015-09-01

    Microalgae cultivation has gained much interest in terms of the production of foods, biofuels, and bioactive compounds and offers a great potential option for cleaning the environment through CO2 sequestration and wastewater treatment. Although open pond cultivation is the most affordable option, it tends to allow insufficient control of growth conditions and carries a risk of contamination. In contrast, while posing minimal risk of contamination, closed photobioreactors offer better control of culture conditions, such as CO2 supply, water supply, optimal temperatures, efficient exposure to light, culture density, pH levels, and mixing rates. For large scale production of biomass, efficient photobioreactors are required. This review paper describes general design considerations pertaining to photobioreactor systems for cultivating microalgae for biomass production. It also discusses the current challenges in the design of photobioreactors for the production of low-cost biomass.

  10. Large scale simulations of the great 1906 San Francisco earthquake

    NASA Astrophysics Data System (ADS)

    Nilsson, S.; Petersson, A.; Rodgers, A.; Sjogreen, B.; McCandless, K.

    2006-12-01

    As part of a multi-institutional simulation effort, we present large scale computations of the ground motion during the great 1906 San Francisco earthquake using a new finite difference code called WPP. The material data base for northern California provided by USGS together with the rupture model by Song et al. is demonstrated to lead to a reasonable match with historical data. In our simulations, the computational domain covered 550 km by 250 km of northern California down to 40 km depth, so a 125 m grid size corresponds to about 2.2 billion grid points. To accommodate these large grids, the simulations were run on 512-1024 processors on one of the supercomputers at Lawrence Livermore National Lab. A wavelet compression algorithm enabled storage of time-dependent volumetric data. Nevertheless, the first 45 seconds of the earthquake still generated 1.2 TByte of disk space and the 3-D post processing was done in parallel.
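    The quoted grid size can be sanity-checked with simple arithmetic. Assuming uniform 125 m spacing throughout the domain (production meshes are often coarsened with depth, which brings the count down toward the figure quoted above):

```python
# Uniform-spacing estimate for a 550 km x 250 km x 40 km domain at 125 m
nx = int(550e3 / 125)   # 4400 points
ny = int(250e3 / 125)   # 2000 points
nz = int(40e3 / 125)    # 320 points
total = nx * ny * nz
print(f"{total:,} points = {total / 1e9:.1f} billion")  # 2,816,000,000
```

    The uniform-grid count (about 2.8 billion) is the same order of magnitude as the roughly 2.2 billion points cited for the actual runs.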

  11. Automated Sequence Preprocessing in a Large-Scale Sequencing Environment

    PubMed Central

    Wendl, Michael C.; Dear, Simon; Hodgson, Dave; Hillier, LaDeana

    1998-01-01

    A software system for transforming fragments from four-color fluorescence-based gel electrophoresis experiments into assembled sequence is described. It has been developed for large-scale processing of all trace data, including shotgun and finishing reads, regardless of clone origin. Design considerations are discussed in detail, as are programming implementation and graphic tools. The importance of input validation, record tracking, and use of base quality values is emphasized. Several quality analysis metrics are proposed and applied to sample results from recently sequenced clones. Such quantities prove to be a valuable aid in evaluating modifications of sequencing protocol. The system is in full production use at both the Genome Sequencing Center and the Sanger Centre, for which combined weekly production is ∼100,000 sequencing reads. PMID:9750196

  12. Theoretical expectations for bulk flows in large-scale surveys

    NASA Technical Reports Server (NTRS)

    Feldman, Hume A.; Watkins, Richard

    1994-01-01

    We calculate the theoretical expectation for the bulk motion of a large-scale survey of the type recently carried out by Lauer and Postman. Included are the effects of survey geometry, errors in the distance measurements, clustering properties of the sample, and different assumed power spectra. We considered the power spectrum calculated from the Infrared Astronomy Satellite (IRAS)-QDOT survey, as well as spectra from hot + cold and standard cold dark matter models. We find that measurement uncertainty, sparse sampling, and clustering can lead to a much larger expectation for the bulk motion of a cluster sample than for the volume as a whole. However, our results suggest that the expected bulk motion is still inconsistent with that reported by Lauer and Postman at the 95%-97% confidence level.
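    In linear theory, the expected rms bulk flow in a window of radius R is sigma_v^2(R) = (H0^2 f^2 / 2 pi^2) times the integral of P(k) W^2(kR) dk, with W the top-hat window. The sketch below evaluates this for a toy power spectrum; the spectrum shape, normalization, and growth rate are assumed, and the survey-geometry and distance-error effects that are central to the paper are ignored:

```python
import numpy as np

H0 = 100.0   # Hubble constant, km/s per Mpc/h
f = 0.5      # linear growth rate (assumed)

def P(k):
    # Toy CDM-like power spectrum, arbitrary normalization (assumed shape)
    return k / (1.0 + (k / 0.05) ** 2) ** 2

def W(x):
    # Fourier transform of a spherical top-hat window
    return 3.0 * (np.sin(x) - x * np.cos(x)) / x ** 3

def sigma_v(R, kmin=1e-4, kmax=10.0, n=20000):
    # sigma_v^2 = (H0^2 f^2 / 2 pi^2) * Integral P(k) W(kR)^2 dk
    k = np.linspace(kmin, kmax, n)
    dk = k[1] - k[0]
    integrand = P(k) * W(k * R) ** 2
    return H0 * f * np.sqrt(integrand.sum() * dk / (2.0 * np.pi ** 2))

for R in (20, 50, 100):   # window radius in Mpc/h
    print(f"R = {R:3d} Mpc/h   sigma_v ~ {sigma_v(R):.1f} (arbitrary units)")
```

    The monotonic decrease of sigma_v with R is the generic expectation that sparse sampling, clustering, and measurement errors can inflate, which is the effect the paper quantifies.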

  13. INUNDATION PATTERNS AND FATALITY ANALYSIS ON LARGE-SCALE FLOOD

    NASA Astrophysics Data System (ADS)

    Ikeuchi, Koji; Ochi, Shigeo; Yasuda, Goro; Okamura, Jiro; Aono, Masashi

    In order to enhance the emergency preparedness for large-scale floods of the Ara River, we categorized the inundation patterns and calculated fatality estimates. We devised an effective continuous embankment elevation estimation method employing light detection and ranging data analysis. Drainage pump capabilities, in terms of operatable inundation depth and operatable duration limited by fuel supply logistics, were modeled from pump station data of each site along the rivers. Fatality reduction effects due to the enhancement of the drainage capabilities were calculated. We found proper operations of the drainage facilities can decrease the number of estimated fatalities considerably in some cases. We also estimated the difference of risk between floods with 200 years return period and those with 1000 years return period. In some of the 1000 years return period cases, we found the estimated fatalities jumped up whereas the populations in inundated areas changed only a little.

  14. Stability of large scale chromomagnetic fields in the early universe

    NASA Astrophysics Data System (ADS)

    Elmfors, Per; Persson, David

    1999-01-01

    It is well known that Yang-Mills theory in vacuum has a perturbative instability to spontaneously form a large scale magnetic field (the Savvidy mechanism) and that a constant field is unstable so that a possible ground state has to be inhomogeneous over the non-perturbative scale Λ (the Copenhagen vacuum). We argue that this spontaneous instability does not occur at high temperature when the induced field strength gB~Λ2 is much weaker than the magnetic mass squared (g2T)2. At high temperature, oscillations of gauge fields acquire a thermal mass M~gT and we show that this mass stabilizes a magnetic field which is constant over length scales shorter than the magnetic screening length (g2T)-1. We therefore conclude that there is no indication for any spontaneous generation of weak non-abelian magnetic fields in the early universe.

  15. Towards large-scale plasma-assisted synthesis of nanowires

    NASA Astrophysics Data System (ADS)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents a three-dimensional problem of large-scale synthesis connected with the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  16. Statistics of Caustics in Large-Scale Structure Formation

    NASA Astrophysics Data System (ADS)

    Feldbrugge, Job L.; Hidding, Johan; van de Weygaert, Rien

    2016-10-01

    The cosmic web is a complex spatial pattern of walls, filaments, cluster nodes and underdense void regions. It emerged through gravitational amplification from the Gaussian primordial density field. Here we infer analytical expressions for the spatial statistics of caustics in the evolving large-scale mass distribution. In our analysis, following the quasi-linear Zel'dovich formalism and confined to the 1D and 2D situation, we compute number density and correlation properties of caustics in cosmic density fields that evolve from Gaussian primordial conditions. The analysis can be straightforwardly extended to the 3D situation. Moreover, we are currently extending the approach to the non-linear regime of structure formation by including higher order Lagrangian approximations and Lagrangian effective field theory.
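    In the 1D Zel'dovich approximation the logic can be sketched in a few lines: particles map from Lagrangian position q to x = q + D Ψ(q), and a caustic first forms when the Jacobian 1 + D Ψ'(q) reaches zero. The sinusoidal displacement field below is an illustrative assumption, not a Gaussian realization:

```python
import numpy as np

# 1D Zel'dovich mapping x(q) = q + D * Psi(q); shell crossing (a caustic)
# occurs where the Jacobian J = 1 + D * dPsi/dq vanishes.
q = np.linspace(0.0, 2.0 * np.pi, 2000)
psi = -np.sin(q)                 # toy displacement field (assumed)
dpsi = np.gradient(psi, q)

# The first caustic appears at the growth factor where min(J) hits zero
D_c = -1.0 / dpsi.min()
jac = 1.0 + D_c * dpsi
print(f"critical growth factor D_c = {D_c:.3f}, min Jacobian = {jac.min():.2e}")
```

    Locating the zero crossings of J for a Gaussian random Ψ, rather than a sinusoid, is what yields the caustic number densities and correlations studied in the paper.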

  17. Retention of memory for large-scale spaces.

    PubMed

    Ishikawa, Toru

    2013-01-01

    This study empirically examined the retention of large-scale spatial memory, taking different types of spatial knowledge and levels of sense of direction into consideration. A total of 38 participants learned a route from a video and conducted spatial tasks immediately after learning the route and after 2 weeks or 3 months had passed. Results showed that spatial memory decayed over time, at a faster rate for the first 2-week period than for the subsequent period of up to 3 months, although it was not completely forgotten even after 3 months. The rate of forgetting differed depending on the type of knowledge, with landmark and route knowledge deteriorating at a much faster rate than survey knowledge. Sense of direction affected both the acquisition and the retention of survey knowledge. Survey knowledge by people with a good sense of direction was more accurate and decayed much less than that by people with a poor sense of direction.

  18. Large-Scale All-Dielectric Metamaterial Perfect Reflectors

    DOE PAGES

    Moitra, Parikshit; Slovick, Brian A.; li, Wei; ...

    2015-05-08

    All-dielectric metamaterials offer a potential low-loss alternative to plasmonic metamaterials at optical frequencies. In this paper, we take advantage of the low absorption loss as well as the simple unit cell geometry to demonstrate large-scale (centimeter-sized) all-dielectric metamaterial perfect reflectors made from silicon cylinder resonators. These perfect reflectors, operating in the telecommunications band, were fabricated using self-assembly based nanosphere lithography. In spite of the disorder originating from the self-assembly process, the average reflectance of the metamaterial perfect reflectors is 99.7% at 1530 nm, surpassing the reflectance of metallic mirrors. Moreover, the spectral separation of the electric and magnetic resonances can be chosen to achieve the required reflection bandwidth while maintaining a high tolerance to disorder. Finally, the scalability of this design could lead to new avenues of manipulating light for low-loss and large-area photonic applications.

  19. Measuring Large-Scale Social Networks with High Resolution

    PubMed Central

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr; Cuttone, Andrea; Madsen, Mette My; Larsen, Jakob Eg; Lehmann, Sune

    2014-01-01

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years—the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1 000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data-types measured, and the technical infrastructure in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper is concluded with early results from data analysis, illustrating the importance of multi-channel high-resolution approach to data collection. PMID:24770359

  20. Phase Correlations and Topological Measures of Large-Scale Structure

    NASA Astrophysics Data System (ADS)

    Coles, P.

    The process of gravitational instability initiated by small primordial density perturbations is a vital ingredient of cosmological models that attempt to explain how galaxies and large-scale structure formed in the Universe. In the standard picture (the "concordance" model), a period of accelerated expansion ("inflation") generated density fluctuations with simple statistical properties through quantum processes (Starobinsky [82], [83], [84]; Guth [39]; Guth & Pi [40]; Albrecht & Steinhardt [2]; Linde [55]). In this scenario the primordial density field is assumed to form a statistically homogeneous and isotropic Gaussian random field (GRF). Over years of observational scrutiny this paradigm has strengthened its hold in the minds of cosmologists and has survived many tests, culminating in those furnished by the Wilkinson Microwave Anisotropy Probe (WMAP; Bennett et al. [7]; Hinshaw et al. [45]).
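    The Gaussian hypothesis has a simple operational signature: the Fourier phases of a GRF are independent and uniformly distributed, so phase correlations diagnose non-Gaussianity. A minimal numerical illustration, using white noise as a stand-in for the primordial field:

```python
import numpy as np

# For a Gaussian random field the Fourier phases are uniformly
# distributed; departures from this (phase correlations) signal
# non-Gaussian, gravitationally evolved structure.
rng = np.random.default_rng(42)
field = rng.normal(size=(256, 256))      # white-noise Gaussian field
phases = np.angle(np.fft.fft2(field))

# Circular mean of exp(i*phase): near zero for uniform random phases
R = np.abs(np.exp(1j * phases).mean())
print(f"phase coherence R = {R:.4f} (near 0 for a Gaussian field)")
```

    Applying the same statistic to an N-body density field, rather than white noise, would reveal the phase correlations that topological measures of large-scale structure are designed to capture.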

  1. Large-Scale Activity Initiated BY Halo CMEs

    NASA Astrophysics Data System (ADS)

    Chertok, I.; Grechnev, V.

    We summarize results of our recent studies of CME-associated EUV dimmings and coronal waves by `derotated' fixed-difference SOHO/EIT heliograms at 195 Å with 12-min intervals and at 171, 195, 284, 304 Å with 6-h intervals. Correctness of the derotated fixed-difference technique is confirmed by the consideration of the Bastille Day 2000 event. We also demonstrate that long narrow channeled dimmings and anisotropic coronal waves are typical of the complex global solar magnetosphere near the solar cycle maximum. Homology of large-scale dimmings and coronal waves takes place in a series of recurrent eruptive events. Along with dimmings coinciding entirely or partially in all four EIT bands, there exist dimmings that appear different, mainly in the transition-region line of 304 Å and high-temperature coronal line of 284 Å.

  2. Large-scale testing of structural clay tile infilled frames

    SciTech Connect

    Flanagan, R.D.; Bennett, R.M.

    1993-03-18

    A summary of large-scale cyclic static tests of structural clay tile infilled frames is given. In-plane racking tests examined the effects of varying frame stiffness, varying infill size, infill offset from frame centerline, and single and double wythe infill construction. Out-of-plane tests examined infilled frame response to inertial loadings and inter-story drift loadings. Sequential in-plane and out-of-plane loadings were performed to determine the effects of orthogonal damage and degradation on both strength and stiffness. A combined out-of-plane inertial and in-plane racking test was conducted to investigate the interaction of multi-directional loading. To determine constitutive properties of the infills, prism compression, mortar compression and various unit tile tests were performed.

  3. Applications of large-scale density functional theory in biology

    NASA Astrophysics Data System (ADS)

    Cole, Daniel J.; Hine, Nicholas D. M.

    2016-10-01

    Density functional theory (DFT) has become a routine tool for the computation of electronic structure in the physics, materials and chemistry fields. Yet the application of traditional DFT to problems in the biological sciences is hindered, to a large extent, by the unfavourable scaling of the computational effort with system size. Here, we review some of the major software and functionality advances that enable insightful electronic structure calculations to be performed on systems comprising many thousands of atoms. We describe some of the early applications of large-scale DFT to the computation of the electronic properties and structure of biomolecules, as well as to paradigmatic problems in enzymology, metalloproteins, photosynthesis and computer-aided drug design. With this review, we hope to demonstrate that first principles modelling of biological structure-function relationships is approaching reality.

  4. A large-scale crop protection bioassay data set

    NASA Astrophysics Data System (ADS)

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J. P.; Bellis, Louisa J.; Bento, A. Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P.

    2015-07-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database.

  5. Impact of Parallel Computing on Large Scale Aeroelastic Computations

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    Aeroelasticity is computationally one of the most intensive fields in aerospace engineering. Though the computational speed of supercomputers has increased substantially over the last three decades, it is still inadequate for large-scale aeroelastic computations using high-fidelity flow and structural equations. In addition to reaching a saturation in computational speed because of changes in economics, computer manufacturers are ceasing production of mainframe-type supercomputers. This has left computational aeroelasticians facing the gigantic task of finding alternate approaches for fulfilling their needs. The alternate path to overcome the speed and availability limitations of mainframe-type supercomputers is to use parallel computers. During this decade several different architectures have evolved. In FY92 the US Government started the High Performance Computing and Communication (HPCC) program. As a participant in this program, NASA developed several parallel computational tools for aeroelastic applications. This talk describes the impact of those application tools on high-fidelity multidisciplinary analysis.

  6. A large-scale evaluation of computational protein function prediction.

    PubMed

    Radivojac, Predrag; Clark, Wyatt T; Oron, Tal Ronnen; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kaßner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Boehm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas A; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-03-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based critical assessment of protein function annotation (CAFA) experiment. Fifty-four methods representing the state of the art for protein function prediction were evaluated on a target set of 866 proteins from 11 organisms. Two findings stand out: (i) today's best protein function prediction algorithms substantially outperform widely used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is considerable need for improvement of currently available tools.
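The protein-centric score behind CAFA-style evaluations can be sketched as a maximum F-measure over prediction-score thresholds. The sketch below is an illustrative simplification, not the paper's exact protocol: the threshold grid, the skipping of ontology-term propagation, and the treatment of proteins with no predictions at a given threshold are all assumptions.

```python
# Hedged sketch of an F-max style metric: predictions map each protein
# to {term: score}; truth maps each protein to its annotated term set.
# Precision is averaged over proteins with at least one prediction at
# threshold t; recall is averaged over ALL benchmark proteins.

def fmax(predictions, truth, thresholds=None):
    """Return the maximum protein-centric F1 over score thresholds."""
    if thresholds is None:
        thresholds = [i / 100 for i in range(1, 100)]
    best = 0.0
    for t in thresholds:
        precisions, recalls = [], []
        for protein, true_terms in truth.items():
            scored = predictions.get(protein, {})
            predicted = {term for term, s in scored.items() if s >= t}
            tp = len(predicted & true_terms)
            if predicted:
                precisions.append(tp / len(predicted))
            recalls.append(tp / len(true_terms))
        if not precisions:
            continue  # no protein predicted anything at this threshold
        p = sum(precisions) / len(precisions)
        r = sum(recalls) / len(recalls)
        if p + r > 0:
            best = max(best, 2 * p * r / (p + r))
    return best
```

Sweeping the threshold rewards methods that attach well-calibrated scores to their predictions, not just a fixed term list.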

  7. A large-scale crop protection bioassay data set

    PubMed Central

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J. P.; Bellis, Louisa J.; Bento, A. Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P.

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database. PMID:26175909

  8. Large scale anisotropic bias from primordial non-Gaussianity

    SciTech Connect

    Baghram, Shant; Firouzjahi, Hassan; Namjoo, Mohammad Hossein E-mail: mh.namjoo@ipm.ir

    2013-08-01

    In this work we study the large scale structure bias in models of anisotropic inflation. We use the Peak Background Splitting method in Excursion Set Theory to find the scale-dependent bias. We show that the amplitude of the bias is modified by a direction-dependent factor. In the specific anisotropic inflation model which we study, the scale-dependent bias vanishes at leading order when the long wavelength mode in squeezed limit is aligned with the anisotropic direction in the sky. We also extend the scale-dependent bias formulation to the general situations with primordial anisotropy. We find some selection rules indicating that some specific parts of a generic anisotropic bispectrum is picked up by the bias parameter. We argue that the anisotropic bias is mainly sourced by the angle between the anisotropic direction and the long wavelength mode in the squeezed limit.
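For orientation, in the isotropic case the scale-dependent bias from local-type primordial non-Gaussianity (which the abstract's analysis generalizes to anisotropic bispectra) takes the following standard form in one common convention; this is textbook background, not a result taken from the paper itself:

```latex
\Delta b(k) \;=\; \frac{3\, f_{\rm NL}\,(b_g - 1)\,\delta_c\,\Omega_m H_0^2}{c^2\, k^2\, T(k)\, D(z)}
```

Here $b_g$ is the Gaussian bias, $\delta_c \approx 1.686$ the spherical-collapse threshold, $T(k)$ the transfer function and $D(z)$ the linear growth factor. The $1/k^2$ growth on large scales is what makes halo bias such a sensitive probe of $f_{\rm NL}$, and the direction-dependent factor discussed in the abstract multiplies this kind of amplitude.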

  9. Optimal Wind Energy Integration in Large-Scale Electric Grids

    NASA Astrophysics Data System (ADS)

    Albaijat, Mohammad H.

    The major concern in electric grid operation is operating in the most economical and reliable fashion to ensure affordability and continuity of electricity supply. This dissertation investigates challenges that affect electric grid reliability and economic operation: 1. congestion of transmission lines, 2. transmission line expansion, 3. large-scale wind energy integration, and 4. optimal placement of Phasor Measurement Units (PMUs) for highest electric grid observability. Congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, expansion of transmission line capacity must be evaluated against methods that ensure optimal electric grid operation: it must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject. Consequently, next-generation electric grids require novel methodologies for studying and managing congestion. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission line capacity to add. The traditional questions requiring answers are "where" to add, "how much capacity" to add, and "at which voltage level". Because of electric grid deregulation, transmission line expansion is more complicated, as it is now open to investors, whose main interest is to generate revenue, to build new transmission lines. Adding new transmission capacity will help the system to relieve transmission congestion, create
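The congestion screening described above can be illustrated with a toy DC power-flow check. This is a hypothetical 3-bus example under the usual DC approximations (lossless lines, small angles, unit voltages); real congestion studies use AC models and full solvers, and all numbers here are invented for illustration.

```python
# Minimal DC power flow on a hypothetical 3-bus network: lines (0,1),
# (1,2), (0,2), all with susceptance b, bus 0 as the slack.
# Solves the reduced system B * theta = P for buses 1 and 2 in closed form.

def dc_power_flow_3bus(p1, p2, b=10.0):
    """Return per-line flows for injections p1, p2 at buses 1 and 2."""
    # Reduced susceptance matrix is [[2b, -b], [-b, 2b]]; invert the 2x2.
    det = 3 * b * b
    theta1 = (2 * b * p1 + b * p2) / det
    theta2 = (b * p1 + 2 * b * p2) / det
    theta = [0.0, theta1, theta2]          # slack angle fixed at 0
    lines = [(0, 1), (1, 2), (0, 2)]
    return {(i, j): b * (theta[i] - theta[j]) for i, j in lines}

def congested(flows, limit):
    """Lines whose |flow| exceeds an (assumed) thermal limit."""
    return [line for line, f in flows.items() if abs(f) > limit]
```

With a 1.0 p.u. generator at bus 1 and a 1.0 p.u. load at bus 2, two thirds of the power takes the direct line (1,2); a 0.5 p.u. limit would flag exactly that line, which is the kind of screening that motivates capacity-expansion questions.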

  10. Large scale rigidity-based flexibility analysis of biomolecules

    PubMed Central

    Streinu, Ileana

    2016-01-01

    KINematics And RIgidity (KINARI) is an on-going project for in silico flexibility analysis of proteins. The new version of the software, Kinari-2, extends the functionality of our free web server KinariWeb, incorporates advanced web technologies, emphasizes the reproducibility of its experiments, and makes substantially improved tools available to the user. It is designed specifically for large scale experiments, in particular, for (a) very large molecules, including bioassemblies with high degree of symmetry such as viruses and crystals, (b) large collections of related biomolecules, such as those obtained through simulated dilutions, mutations, or conformational changes from various types of dynamics simulations, and (c) working as seamlessly as possible with the large, idiosyncratic, publicly available repository of biomolecules, the Protein Data Bank. We describe the system design, along with the main data processing, computational, mathematical, and validation challenges underlying this phase of the KINARI project. PMID:26958583

  11. Computational Issues in Damping Identification for Large Scale Problems

    NASA Technical Reports Server (NTRS)

    Pilkey, Deborah L.; Roe, Kevin P.; Inman, Daniel J.

    1997-01-01

    Two damping identification methods are tested for efficiency in large-scale applications. One is an iterative routine, and the other a least squares method. Numerical simulations have been performed on multiple degree-of-freedom models to test the effectiveness of the algorithms and the usefulness of parallel computation for the problems. High Performance Fortran is used to parallelize the algorithms. Tests were performed using the IBM-SP2 at NASA Ames Research Center. The least squares method tested incurs high communication costs, which reduces the benefit of high performance computing. This method's memory requirement grows at a very rapid rate, meaning that larger problems can quickly exceed available computer memory. The iterative method's memory requirement grows at a much slower pace and is able to handle problems with 500+ degrees of freedom on a single processor. This method benefits from parallelization, and significant speedup can be seen for problems of 100+ degrees of freedom.
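The least-squares structure of damping identification can be sketched for a single degree of freedom, m x'' + c x' + k x = 0: each sampled time point gives one row of an overdetermined linear system in the unknowns (c, k). This is an illustrative simplification of the regression idea only, not the paper's multi-degree-of-freedom algorithm; the data below are synthetic.

```python
# Hedged sketch: identify c and k from sampled displacement x, velocity v
# and acceleration a of a 1-DOF system m*x'' + c*x' + k*x = 0, i.e. solve
#   [v_i  x_i] [c, k]^T = -m * a_i
# in the least-squares sense via the 2x2 normal equations.

def identify_damping(x, v, a, m=1.0):
    """Return (c, k) minimizing sum_i (c*v_i + k*x_i + m*a_i)^2."""
    svv = sum(vi * vi for vi in v)
    svx = sum(vi * xi for vi, xi in zip(v, x))
    sxx = sum(xi * xi for xi in x)
    bv = sum(-m * ai * vi for ai, vi in zip(a, v))
    bx = sum(-m * ai * xi for ai, xi in zip(a, x))
    det = svv * sxx - svx * svx          # assumes x and v are not collinear
    c = (sxx * bv - svx * bx) / det
    k = (svv * bx - svx * bv) / det
    return c, k
```

For n degrees of freedom the unknown matrices make the regressor much larger, which is where the memory-growth contrast between the least-squares and iterative methods discussed above comes from.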

  12. Large-scale experience with biological treatment of contaminated soil

    SciTech Connect

    Schulz-Berendt, V.; Poetzsch, E.

    1995-12-31

    The efficiency of biological methods for the cleanup of soil contaminated with total petroleum hydrocarbons (TPH) and polycyclic aromatic hydrocarbons (PAH) was demonstrated by a large-scale example in which 38,000 tons of TPH- and PAH-polluted soil was treated onsite with the TERRAFERM{reg_sign} degradation system to reach the target values of 300 mg/kg TPH and 5 mg/kg PAH. Measurements of the ecotoxicological potential (Microtox{reg_sign} assay) showed a significant decrease during the remediation. Low concentrations of PAH in the ground were treated by an in situ technology. The in situ treatment was combined with mechanical measures (a slurry wall) to prevent the contamination from dispersing from the site.

  13. Distributed Coordinated Control of Large-Scale Nonlinear Networks

    SciTech Connect

    Kundu, Soumya; Anghel, Marian

    2015-11-08

    We provide a distributed coordinated approach to the stability analysis and control design of large-scale nonlinear dynamical systems by using a vector Lyapunov functions approach. In this formulation the large-scale system is decomposed into a network of interacting subsystems and the stability of the system is analyzed through a comparison system. However, finding such a comparison system is not trivial. In this work, we propose a sum-of-squares based, completely decentralized approach for computing the comparison systems for networks of nonlinear systems. Moreover, based on the comparison systems, we introduce a distributed optimal control strategy in which the individual subsystems (agents) coordinate with their immediate neighbors to design local control policies that can exponentially stabilize the full system under initial disturbances. We illustrate the control algorithm on a network of interacting Van der Pol systems.
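The flavor of decentralized stabilization on a Van der Pol network can be sketched with a toy simulation. This is an illustrative stand-in, not the paper's sum-of-squares design: two diffusively coupled Van der Pol oscillators, each with a purely local velocity feedback u_i = -g v_i. Choosing g > mu makes the total energy V = sum_i (x_i^2 + v_i^2)/2 + k (x_1 - x_2)^2 / 2 non-increasing, so the states decay; all parameter values are invented for the example.

```python
# Hedged sketch: two coupled Van der Pol oscillators
#   x_i'' = mu*(1 - x_i^2)*v_i - x_i + k*(x_j - x_i) - g*v_i
# stabilized by local velocity feedback only, integrated by forward Euler.

def simulate(mu=1.0, g=3.0, k=0.5, dt=0.01, steps=5000):
    x = [1.0, -0.5]   # positions
    v = [0.0, 0.0]    # velocities
    for _ in range(steps):
        # Compute both accelerations from the current state (synchronous update).
        acc = []
        for i in range(2):
            j = 1 - i  # the single neighbor in this 2-node network
            a = (mu * (1 - x[i] ** 2) * v[i] - x[i]
                 + k * (x[j] - x[i]) - g * v[i])
            acc.append(a)
        for i in range(2):
            x[i] += dt * v[i]
            v[i] += dt * acc[i]
    return x, v
```

Each agent uses only its own velocity and its neighbor's position, mirroring the "coordinate with immediate neighbors" structure; without the feedback (g = 0) the oscillators would instead settle onto limit cycles.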

  14. A large-scale evaluation of computational protein function prediction

    PubMed Central

    Radivojac, Predrag; Clark, Wyatt T; Ronnen Oron, Tal; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kassner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Böhm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-01-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based Critical Assessment of protein Function Annotation (CAFA) experiment. Fifty-four methods representing the state of the art for protein function prediction were evaluated on a target set of 866 proteins from eleven organisms. Two findings stand out: (i) today's best protein function prediction algorithms significantly outperformed widely used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is significant need for improvement of currently available tools. PMID:23353650

  15. Battery technologies for large-scale stationary energy storage.

    PubMed

    Soloveichik, Grigorii L

    2011-01-01

    In recent years, with the deployment of renewable energy sources, advances in electrified transportation, and development in smart grids, the markets for large-scale stationary energy storage have grown rapidly. Electrochemical energy storage methods are strong candidate solutions due to their high energy density, flexibility, and scalability. This review provides an overview of mature and emerging technologies for secondary and redox flow batteries. New developments in the chemistry of secondary and flow batteries as well as regenerative fuel cells are also considered. Advantages and disadvantages of current and prospective electrochemical energy storage options are discussed. The most promising technologies in the short term are high-temperature sodium batteries with β″-alumina electrolyte, lithium-ion batteries, and flow batteries. Regenerative fuel cells and lithium metal batteries with high energy density require further research to become practical.

  16. Large-scale structure in f(T) gravity

    SciTech Connect

    Li Baojiu; Sotiriou, Thomas P.; Barrow, John D.

    2011-05-15

    In this work we study the cosmology of the general f(T) gravity theory. We express the modified Einstein equations using covariant quantities, and derive the gauge-invariant perturbation equations in covariant form. We consider a specific choice of f(T), designed to explain the observed late-time accelerating cosmic expansion without including an exotic dark energy component. Our numerical solution shows that the extra degree of freedom of such f(T) gravity models generally decays as one goes to smaller scales, and consequently its effects on scales such as galaxies and galaxy clusters are small. But on large scales, this degree of freedom can produce large deviations from the standard {Lambda}CDM scenario, leading to severe constraints on the f(T) gravity models as an explanation of the cosmic acceleration.
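For reference, in a flat FRW background the torsion scalar reduces to T = -6H^2, and in the common convention where the Lagrangian is T + f(T) the modified Friedmann equation reads (a standard background result from the f(T) literature, not this paper's more general covariant treatment):

```latex
H^2 \;=\; \frac{8\pi G}{3}\,\rho \;-\; \frac{f(T)}{6} \;-\; 2H^2 f_T , \qquad T = -6H^2 ,
```

where $f_T = \mathrm{d}f/\mathrm{d}T$. Setting $f$ to a constant $-2\Lambda$ reproduces $\Lambda$CDM, which is why viable $f(T)$ models can mimic dark energy at the background level while differing, as the abstract notes, in their large-scale perturbations.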

  17. Thermophoretically induced large-scale deformations around microscopic heat centers

    NASA Astrophysics Data System (ADS)

    Puljiz, Mate; Orlishausen, Michael; Köhler, Werner; Menzel, Andreas M.

    2016-05-01

    Selectively heating a microscopic colloidal particle embedded in a soft elastic matrix is a situation of high practical relevance. For instance, during hyperthermic cancer treatment, cell tissue surrounding heated magnetic colloidal particles is destroyed. Experiments on soft elastic polymeric matrices suggest a very long-ranged, non-decaying radial component of the thermophoretically induced displacement fields around the microscopic heat centers. We theoretically confirm this conjecture using a macroscopic hydrodynamic two-fluid description. Both thermophoretic and elastic effects are included in this theory. Indeed, we find that the elasticity of the environment can cause the experimentally observed large-scale radial displacements in the embedding matrix. Additional experiments confirm the central role of elasticity. Finally, a linearly decaying radial component of the displacement field in the experiments is attributed to the finite size of the experimental sample. Similar results are obtained from our theoretical analysis under modified boundary conditions.

  18. Large scale Hugoniot material properties for Danby Marble

    SciTech Connect

    Rinehart, E.J.

    1993-11-01

    This paper presents the results of simulation experiments of nuclear underground testing carried out using the HYDROPLUS methodology for yield verifications of non-standard tests. The objective of this test series was to demonstrate the accuracy of stress and velocity measurements in hard, low porosity rock, to obtain comparisons of large-scale material properties with those obtained from laboratory testing of the same material, and to address the problems posed by a material having a clear precursor wave preceding the main shock wave. The test series consisted of three individual experimental tests. The first established material properties of the Danby marble selected for use in the experiments. The second and third tests looked at stress and velocity gage errors obtained when gages were placed in boreholes and grouted into place.

  19. A large-scale crop protection bioassay data set.

    PubMed

    Gaulton, Anna; Kale, Namrata; van Westen, Gerard J P; Bellis, Louisa J; Bento, A Patrícia; Davies, Mark; Hersey, Anne; Papadatos, George; Forster, Mark; Wege, Philip; Overington, John P

    2015-01-01

    ChEMBL is a large-scale drug discovery database containing bioactivity information primarily extracted from scientific literature. Due to the medicinal chemistry focus of the journals from which data are extracted, the data are currently of most direct value in the field of human health research. However, many of the scientific use-cases for the current data set are equally applicable in other fields, such as crop protection research: for example, identification of chemical scaffolds active against a particular target or endpoint, the de-convolution of the potential targets of a phenotypic assay, or the potential targets/pathways for safety liabilities. In order to broaden the applicability of the ChEMBL database and allow more widespread use in crop protection research, an extensive data set of bioactivity data of insecticidal, fungicidal and herbicidal compounds and assays was collated and added to the database.

  20. Tools for Large-Scale Mobile Malware Analysis

    SciTech Connect

    Bierma, Michael

    2014-01-01

    Analyzing mobile applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the major operating systems. There are currently over 1.2 million apps available in both the Google Play and Apple App stores (the respective official marketplaces for the Android and iOS operating systems) [1, 2]. Our research provides two large-scale analysis tools to aid in the detection and analysis of mobile malware. The first tool we present, Andlantis, is a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. Traditionally, Android dynamic analysis techniques have been relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. Andlantis is the most scalable Android dynamic analysis framework to date, and is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples through the system, and provide examples of malware analysis performed with the resulting data. While techniques exist to perform static analysis on a large number of applications, large-scale analysis of iOS applications has been relatively small scale due to the closed nature of the iOS ecosystem, and the difficulty of acquiring applications for analysis. The second tool we present, iClone, addresses the challenges associated with iOS research in order to detect application clones within a dataset of over 20,000 iOS applications.